US20200074184A1 - Information processing apparatus, control method, and program - Google Patents

Information processing apparatus, control method, and program

Info

Publication number
US20200074184A1
US20200074184A1 (application US 16/674,082)
Authority
US
United States
Prior art keywords
video
display
target object
summary information
displayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/674,082
Inventor
Ryo Kawai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp
Priority to US 16/674,082
Publication of US20200074184A1
Priority to US 18/074,700 (US20230103243A1)
Priority to US 18/241,760 (US20230410510A1)
Priority to US 18/243,357 (US20230419665A1)
Status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/46 Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • G06V20/47 Detecting features for summarising video content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G06K9/00751
    • G06K9/00765
    • G06K9/00771
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/21805 Source of audio or video content, e.g. local disk arrays enabling multiple viewpoints, e.g. using a plurality of cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to an information processing apparatus, a control method, and a program.
  • a video of a camera is used in various scenes.
  • video surveillance using the video of a camera (a so-called surveillance camera) which images a place to be surveilled is performed.
  • Patent Document 1 discloses a technology of detecting an important scene from a surveillance video and generating a summary video in which frames other than the important scene are omitted.
  • Patent Document 1 Japanese Unexamined Patent Publication No. 2012-205097
  • the present invention is provided in view of the problem described above.
  • One object of the present invention is to provide a technology for easily viewing a plurality of videos.
  • An information processing apparatus of the present invention includes: (1) a summarizing unit which obtains videos and generates summary information of the obtained video by performing a summarizing process on the obtained video, each of a plurality of cameras generating the video; and (2) a display control unit which causes a display unit to display the video.
  • in response to a change in a display state of the video on the display unit satisfying a predetermined condition, the display control unit causes the display unit to display the summary information of that video.
  • a control method of the present invention is executed by a computer.
  • the control method includes: (1) a summarizing step of obtaining videos and generating summary information of the obtained video by performing a summarizing process on the obtained video, each of a plurality of cameras generating the video; and (2) a display control step of causing a display unit to display the video.
  • in response to a change in a display state of the video on the display unit satisfying a predetermined condition, the display unit displays the summary information of that video.
  • a program of the present invention causes a computer to execute each step included in the control method of the invention.
  • FIG. 1 is a diagram conceptually illustrating an operation of an information processing apparatus according to Example Embodiment 1.
  • FIG. 2 is a diagram illustrating the information processing apparatus according to Example Embodiment 1 and a use environment of the information processing apparatus.
  • FIG. 3 is a diagram illustrating a computer for realizing the information processing apparatus.
  • FIG. 4 is the first diagram illustrating a display state of a video in a display system.
  • FIG. 5 is the second diagram illustrating the display state of the video in the display system.
  • FIG. 6 is a third diagram illustrating the display state of the video in the display system.
  • FIG. 7 is a fourth diagram illustrating the display state of the video in the display system.
  • FIG. 8 is a flowchart illustrating a flow of a process executed by the information processing apparatus according to Example Embodiment 1.
  • FIG. 9 is a diagram illustrating summary information in a table format.
  • FIG. 10 is the first diagram illustrating timing when displaying summary information of a video based on the first example of a predetermined condition.
  • FIG. 11 is a diagram illustrating a scene in which the summary information of the video is displayed based on the second example of the predetermined condition.
  • FIG. 12 is a diagram illustrating a scene in which the summary information of the video is displayed based on a third example of the predetermined condition.
  • FIG. 13 is the first diagram illustrating a display state of the summary information.
  • FIG. 14 is the second diagram illustrating the display state of the summary information.
  • FIG. 15 is the first diagram illustrating a scene of generating the summary information.
  • FIG. 16 is a diagram illustrating a scene in which a display control unit selects the summary information to be displayed to a display system.
  • FIG. 17 is the first diagram illustrating a scene of updating the summary information.
  • FIG. 18 is the second diagram illustrating a scene of updating the summary information.
  • FIG. 19 is a diagram illustrating a supposed environment of an information processing apparatus according to Example Embodiment 2.
  • FIG. 20 is a diagram illustrating priority information in a table format.
  • FIG. 21 is a diagram illustrating a relationship between a staying time and a priority of summary information.
  • FIG. 22 is a diagram illustrating a temporal change in a score of summary information.
  • each of the blocks represents not a hardware unit but a functional unit.
  • FIG. 1 is a diagram conceptually illustrating an operation of an information processing apparatus 2000 according to Example Embodiment 1. Note that, FIG. 1 is a diagram for facilitating understanding of the operation of the information processing apparatus 2000 , and the operation of the information processing apparatus 2000 is not limited by FIG. 1 .
  • a camera 10 performs imaging and generates still image data or video data.
  • a video 30 is video data based on an imaging result of the camera 10 .
  • the video 30 is displayed on a display system 20 . Accordingly, a user of the information processing apparatus 2000 can view the video 30 .
  • the information processing apparatus 2000 is an apparatus which provides a surveillance video to a surveillant.
  • the camera 10 is a surveillance camera which images a place to be surveilled.
  • the user of the information processing apparatus 2000 is a surveillant or the like who surveils the surveillance place by viewing the video 30 .
  • the information processing apparatus 2000 generates summary information of the video 30 .
  • the summary information of the video 30 indicates any information obtained from contents of the video 30 .
  • the summary information indicates a staying time, a trace of movement, and the like of a person captured in the video 30 .
  • the information processing apparatus 2000 causes the display system 20 to display the summary information of the video 30 .
  • the summary information in FIG. 1 is an arrow indicating a trace of movement of the person captured in the video 30 .
  • the summary information of the video 30 is displayed in response to a predetermined condition being satisfied regarding a change in a display state of the video 30 on the display system 20.
  • the user of the information processing apparatus 2000 can easily recognize contents of the video 30 in the past.
  • FIG. 2 is a diagram illustrating the information processing apparatus 2000 according to Example Embodiment 1 and a use environment of the information processing apparatus 2000 .
  • the information processing apparatus 2000 includes a summarizing unit 2040 and a display control unit 2060 .
  • the summarizing unit 2040 obtains the video 30 generated by each of a plurality of cameras 10 . Furthermore, the summarizing unit 2040 performs a summarizing process on the video 30 and generates summary information of the video 30 .
  • the display control unit 2060 causes the display system 20 to display the video 30 .
  • the display control unit 2060 causes the display system 20 to display the summary information of the video 30 in response to a predetermined condition being satisfied regarding a change in the display state of the video 30 on the display system 20.
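As an illustrative sketch only (not part of the patent disclosure), the trigger logic of the display control unit 2060 can be modeled as follows; all names (DisplayController, update, and so on) are hypothetical, and the condition implemented is the first example of the predetermined condition described later, namely the video switching from "not displayed" to "displayed":

```python
# Hypothetical sketch of the display control unit's trigger logic.
# Names are illustrative and not taken from the patent.

class DisplayController:
    def __init__(self, summaries):
        # summaries: dict mapping a video id to its summary information
        self.summaries = summaries
        self.displayed = set()  # video ids currently shown on the display system

    def condition_satisfied(self, video_id, now_displayed):
        # First example of the predetermined condition: the video switches
        # from a state of not being displayed to a state of being displayed.
        return now_displayed and video_id not in self.displayed

    def update(self, video_id, now_displayed):
        # Returns True when the summary information of the video should
        # also be displayed.
        show_summary = self.condition_satisfied(video_id, now_displayed)
        if now_displayed:
            self.displayed.add(video_id)
        else:
            self.displayed.discard(video_id)
        return show_summary
```

For example, a controller sees a video become displayed, hidden, and displayed again, and signals a summary display at each reappearance.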
  • each of the plurality of cameras 10 generates the video 30 .
  • a user (for example, a surveillant) of the information processing apparatus 2000 who views the videos 30 has to recognize the occurrence of an abnormality or the like from a plurality of videos 30.
  • a large workload is required to recognize the contents of the plurality of videos 30.
  • an important scene in the video 30 may be overlooked.
  • the summary information, in which the contents of the video 30 are summarized, is generated.
  • the summary information is displayed on the display system 20 in response to a predetermined condition being satisfied regarding a change in the display state of the video 30 on the display system 20.
  • if the predetermined condition is appropriately determined, it is possible to display the summary information of the video 30 on the display system 20 at a timing appropriate for the user to easily recognize the contents of the video 30. Therefore, according to the information processing apparatus 2000 of the present example embodiment, it becomes easy to view the plurality of videos 30. As a result, it is possible to decrease the workload of the user who wants to view the plurality of videos 30 and to prevent an important scene from being overlooked.
  • Each of function configuration units of the information processing apparatus 2000 may be realized by hardware (for example, hard-wired electronic circuit or the like) which realizes each of the function configuration units or may be realized by a combination (for example, a combination of the electronic circuit and a program controlling the electronic circuit or the like) of hardware and software.
  • FIG. 3 is a diagram illustrating a computer 1000 for realizing the information processing apparatus 2000 .
  • the computer 1000 is a predetermined computer.
  • the computer 1000 is a personal computer (PC), a server machine, a tablet terminal, a smartphone, or the like.
  • the computer 1000 may be a dedicated computer designed to realize the information processing apparatus 2000 or may be a general purpose computer.
  • the computer 1000 includes a bus 1020 , a processor 1040 , a memory 1060 , a storage 1080 , an input and output interface 1100 , and a network interface 1120 .
  • the bus 1020 is a data transmission line through which the processor 1040 , the memory 1060 , the storage 1080 , the input and output interface 1100 , and the network interface 1120 mutually transmit and receive data.
  • a method of connecting the processor 1040 and the like to each other is not limited to bus connection.
  • the processor 1040 is an arithmetic apparatus such as a central processing unit (CPU), a graphics processing unit (GPU), or the like.
  • the memory 1060 is a main storage device realized by using a random access memory (RAM) or the like.
  • the storage 1080 is an auxiliary storage device realized by using a hard disk, a solid state drive (SSD), a memory card, a read only memory (ROM), or the like.
  • the storage 1080 may be configured with the same hardware as the hardware constituting the main storage device such as a RAM.
  • the input and output interface 1100 is an interface for connecting the computer 1000 and an input and output device.
  • the network interface 1120 is an interface for connecting the computer 1000 to a communications network.
  • the communications network is, for example, a local area network (LAN) or a wide area network (WAN).
  • a method by which the network interface 1120 connects to the communication network may be a wireless connection or a wired connection.
  • the computer 1000 is communicably connected to the camera 10 through a network.
  • a method of communicably connecting the computer 1000 to the camera 10 is not limited to a connection through the network.
  • the computer 1000 does not have to be communicably connected to the camera 10.
  • the storage 1080 stores a program module which realizes each of the function configuration units (the summarizing unit 2040 and the display control unit 2060 ) of the information processing apparatus 2000 . By reading each of these program modules into the memory 1060 and executing the program module, the processor 1040 realizes a function corresponding to each of the program modules.
  • the information processing apparatus 2000 may be realized by using a plurality of computers 1000 .
  • the information processing apparatus 2000 can be realized by two computers, that is, the first computer 1000 which realizes a function of the summarizing unit 2040 and the second computer 1000 which realizes a function of the display control unit 2060 .
  • the first computer is a computer which performs a process for generating summary information.
  • the second computer is a computer which performs a process for displaying the summary information to the display system 20 .
  • the second computer obtains the summary information from the first computer by a predetermined method.
  • the first computer 1000 and the second computer 1000 are realized by a PC, a server machine, a tablet terminal, a smartphone, or the like.
  • the first computer 1000 may be realized by the camera 10 .
  • the camera 10 performs the summarizing process on the video 30 generated by the camera 10 and generates summary information.
  • the second computer 1000 obtains the summary information generated by the camera 10 .
  • the camera 10 having a function of the summarizing unit 2040 in this manner is, for example, a camera called an intelligent camera, a network camera, an internet protocol (IP) camera, or the like.
  • the camera 10 is any camera which performs imaging and generates still image data or video data.
  • the video 30 is configured based on the data generated by the camera 10 .
  • the video 30 is the video data generated by the camera 10 .
  • the video 30 is configured with a sequence of a plurality of pieces of still image data generated by the camera 10 .
  • the camera 10 may be a camera whose position is fixed (hereinafter, referred to as a fixed camera) or whose position is not fixed (hereinafter, referred to as a moving camera).
  • the fixed camera is a camera installed in various places such as a wall, a pillar, or a ceiling. A place at which the fixed camera is installed may be indoor or outdoor.
  • the wall or the like on which the fixed camera is installed is not limited to real estate, and may be an object whose position is fixed for a certain period.
  • the wall or the like on which the fixed camera is installed may be a partition, a pillar, or the like temporarily installed at an event hall or the like.
  • for example, it is possible to stop a moving object equipped with a camera usable also as a moving camera (to be described below) at a certain place and to use that camera as a fixed camera.
  • the moving object is, for example, a car, a motorcycle, a robot, a flying object (for example, a drone or an airship), or the like.
  • the moving camera is, for example, a camera which is put to a person or attached to the moving object or the like described above.
  • the moving camera carried or worn by a person is, for example, a camera held by a hand (a camera of a mobile terminal such as a video camera or a smartphone), a camera fixed to a head, a chest, or the like (a wearable camera or the like), or the like.
  • the camera attached to the car, the motorcycle, the robot, the flying object, or the like may be a camera attached for use as a so-called drive recorder, or may be a camera attached separately for generating the video 30 to be provided to the information processing apparatus 2000 .
  • a place imaged by the camera 10 is arbitrary.
  • the camera 10 images a place to be surveilled.
  • the place to be surveilled is, for example, a route in or around the event hall, a route between the event hall and a nearest station of the event hall, or the like.
  • the place imaged by the camera 10 may be indoor or outdoor.
  • the display system 20 is configured to include one or a plurality of display devices. Hereinafter, some examples of a display state of the video 30 in the display system 20 will be described. Note that, hereinafter, an example in which the display system 20 is configured to include one display device 22 will be described.
  • FIG. 4 is the first diagram illustrating the display state of the video 30 in the display system 20 .
  • the display device 22 in FIG. 4 includes a display area 24 in which the video 30 is displayed.
  • the display control unit 2060 sequentially displays the plurality of videos 30 in the display area 24 .
  • a video 30-1 and a video 30-2 are respectively generated by two cameras, a camera 10-1 and a camera 10-2.
  • the video 30 is displayed in the display area 24 in order of the video 30 - 1 , the video 30 - 2 , the video 30 - 1 , the video 30 - 2 , . . . .
  • FIG. 5 is the second diagram illustrating the display state of the video 30 in the display system 20 .
  • the plurality of videos 30 are displayed on the display device 22 at the same time. Specifically, a plurality of display areas 24 having the same size are provided to the display device 22 , and each of the different videos 30 is displayed in each of the display areas 24 .
  • in a case where the number of videos 30 (the number of cameras 10) is larger than the number of display areas 24, the plurality of videos 30 are sequentially displayed in each display area 24.
  • the display control unit 2060 alternately displays two videos 30 in each of the display areas 24 .
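The sequential display described above, in which more videos 30 exist than display areas 24, can be sketched as a simple rotation schedule; this is an illustrative model only, and the function name and the step-based interface are assumptions, not part of the patent:

```python
from itertools import cycle

# Illustrative sketch: rotating more videos than display areas, so that
# each display area alternates between the videos assigned to it.

def rotation_schedule(video_ids, num_areas, steps):
    """Return, for each step, the list of videos shown per display area."""
    # Distribute the videos round-robin across the display areas, then
    # cycle through each area's own videos step by step.
    per_area = [video_ids[i::num_areas] for i in range(num_areas)]
    cycles = [cycle(videos) for videos in per_area]
    return [[next(c) for c in cycles] for _ in range(steps)]
```

With four videos and two display areas, each area alternates between its two assigned videos, matching the alternating display described above.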
  • FIG. 6 is a third diagram illustrating the display state of the video 30 in the display system 20 .
  • the display device 22 in FIG. 6 also includes the plurality of display areas 24 .
  • different videos 30 are respectively displayed in the display areas 24-1 to 24-8.
  • the video 30 displayed in the display area 24 - 1 is automatically determined, for example, by the display control unit 2060 .
  • the display control unit 2060 displays the plurality of videos 30 in the display area 24 - 1 in turn.
  • the video 30 displayed in the display area 24 - 1 may be selected by the user of the information processing apparatus 2000 .
  • the display device 22 includes a touch panel.
  • the user performs an operation of touching any one of the display area 24 - 2 to the display area 24 - 8 .
  • the display control unit 2060 changes a display position of the video 30 displayed in the touched display area 24 to the display area 24 - 1 .
  • FIG. 7 is a fourth diagram illustrating the display state of the video 30 in the display system 20 .
  • FIG. 7 is the same as FIG. 6 except that FIG. 7 includes a plurality of display areas 24 having a large size.
  • the display system 20 may be configured with a plurality of display devices 22 .
  • each of the plurality of display areas 24 in the example described above is realized by one display device 22 .
  • the display control unit 2060 handles each of the display devices 22 in the same manner as the display area 24 in the example described above.
  • FIG. 8 is a flowchart illustrating a flow of a process executed by the information processing apparatus 2000 according to Example Embodiment 1.
  • the summarizing unit 2040 obtains the video 30 from each of the cameras 10 (S 102 ).
  • the summarizing unit 2040 generates summary information of the video 30 (S 104 ).
  • the display control unit 2060 causes the display system 20 to display the summary information of the video 30 (S 108 ).
  • the timing when the process (S102 and S104) for generating the summary information is executed and the timing when the process (S106 and S108) for displaying the summary information on the display system 20 is executed can take various forms. Thus, these processes do not have to be executed sequentially as illustrated in FIG. 8.
  • the timing when generating the summary information and the timing when displaying the summary information will be specifically described below.
  • the summarizing unit 2040 obtains the video 30 (S 102 ).
  • a method by which the summarizing unit 2040 obtains the video 30 is arbitrary.
  • the summarizing unit 2040 receives the video 30 transmitted from the camera 10 .
  • the summarizing unit 2040 accesses the camera 10 and obtains the video 30 stored in the camera 10 .
  • the camera 10 may store the video 30 in a storage device provided outside the camera 10 .
  • the summarizing unit 2040 accesses the storage device and obtains the video 30 .
  • each of the videos 30 generated by the plurality of cameras 10 may be stored in the same storage device or may be respectively stored in different storage devices.
  • the summarizing unit 2040 obtains the video 30 stored in a storage device (for example, the memory 1060 or the storage 1080 in FIG. 3) inside the camera 10.
  • the summarizing unit 2040 performs the summarizing process on the video 30 and generates summary information of the video 30 (S 104 ).
  • contents of the summary information generated by the summarizing unit 2040 will be described.
  • the summary information indicates any information obtained from contents of the video 30 .
  • the contents of the summary information briefly represent, among the contents of the video 30, contents that are important for the user.
  • the content important for the user is, for example, a feature of an object captured in the video 30 .
  • in a case where the summary information indicates a feature of a certain object, the object is referred to as a "target object". Various objects can be handled as the target object.
  • the target object is a person.
  • the target object is any moving object described above.
  • the target object may be luggage (a package such as a bag or the like) carried by a person, a moving object, or the like.
  • the feature of the target object is, for example, a staying time, a moving time, a moving velocity, a moving state, or the like.
  • the staying time represents a length of a period when the target object stays in the video 30 .
  • here, staying means that the target object stops or hardly moves (for example, a size of a moving range is equal to or less than a predetermined value).
  • the moving time represents a length of a period when the target object moves in the video 30 .
  • here, moving means that the target object does not stay (for example, the size of the moving range is larger than the predetermined value).
  • the moving velocity represents a moving velocity (for example, an average velocity) of the target object during a period when the target object moves.
  • the moving state represents, for example, a trace of movement (such as whether the target object moves straight or meanderingly).
  • the staying time indicated in the summary information may be each of a plurality of staying times, or may be a statistical value of the plurality of staying times (a total value, a mode, an average value, or the like). The same applies to the moving time, the moving velocity, and the moving state.
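As a minimal illustration of the statistical values just mentioned (the function name and dictionary keys are assumptions for the sketch, not terms from the patent), aggregating a target object's multiple staying times might look like:

```python
from statistics import mean, mode

# Sketch: aggregating several staying times of one target object into
# the statistical values mentioned above (total value, mode, average).

def summarize_staying_times(staying_times):
    return {
        "each": list(staying_times),        # each individual staying time
        "total": sum(staying_times),        # total value
        "mode": mode(staying_times),        # most frequent value
        "average": mean(staying_times),     # average value
    }
```

The same aggregation could be applied to moving times or moving velocities.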
  • by using the summary information indicating the feature of staying or movement of the target object, for example, it is possible to determine a target object to be focused on and to intensively surveil that target object. For example, in a case where a person stays for a long time in a place at which a person normally does not stop, it is conceivable that the person is a person to be focused on. In addition, in a case where a bag or the like is left in a place at which luggage is not normally left, it can be said that the luggage is suspicious and to be focused on.
  • the feature of the target object is not limited to the example described above. Another example of the feature of the target object will be described below.
  • the summarizing unit 2040 detects a target object from the video 30 and computes a feature of the target object. For example, the summarizing unit 2040 computes a change in a position of the target object by detecting the target object from each of frames constituting the video 30 . The summarizing unit 2040 computes the staying time, the moving time, the moving velocity, the moving state, and the like from the change in the position of the target object. Note that, in a case of detecting a plurality of different target objects from the video 30 , the summarizing unit 2040 computes a feature for each of the target objects.
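The computation described above can be sketched as follows, assuming the target object has already been detected in each frame (the detection step itself is outside this sketch); the function name, the per-frame (x, y) representation, and the displacement threshold are all assumptions for illustration:

```python
import math

def staying_and_moving(positions, fps, move_threshold=5.0):
    """Derive staying time, moving time, and moving velocity from the
    change in a target object's position across frames.

    positions: per-frame (x, y) coordinates of one target object.
    fps: frames per second of the video.
    move_threshold: displacement (per frame) at or below which the
        object is considered to stay; the value is an assumption.
    """
    dt = 1.0 / fps
    staying = moving = distance = 0.0
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        step = math.hypot(x1 - x0, y1 - y0)
        if step <= move_threshold:
            staying += dt       # hardly moved: counts toward staying time
        else:
            moving += dt        # moved: counts toward moving time
            distance += step
    # average velocity over the period when the target object moves
    velocity = distance / moving if moving else 0.0
    return {"staying_time": staying, "moving_time": moving,
            "moving_velocity": velocity}
```

In a case of detecting a plurality of target objects, this computation would simply be repeated per object.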
  • the summarizing unit 2040 may compute values of various attributes (hereinafter, referred to as attribute values) for the target object and include these attribute values in the features of the target object.
  • An attribute of the person is, for example, an age group, a gender, a nationality, the presence or absence of belongings, whether or not the person is a person with difficulty in walking, or the like.
  • the person with difficulty in walking means a person who walks with assistance from an animal or another person, or a person who walks using an assistance tool.
  • the animal supporting the person with difficulty in walking is a guide dog, for example.
  • the assistance tool used by the person with difficulty in walking is, for example, a crutch or a wheelchair.
  • the attribute values of the age group are various values representing the age group. For example, an age group (10s or 20s) or a category (a child, a young person, an elderly, or the like) representing an age is exemplified.
  • the attribute value of the gender is male or female.
  • the attribute value of the nationality is a value representing a birth country or a living country, or a feature based on the country.
  • the attribute value of the nationality indicates either Japanese or a foreigner.
  • the attribute value of the nationality indicates a category of countries such as Asia, Europe, or Africa.
  • the attribute value of the nationality may indicate a language to be used (Japanese, English, Chinese, or the like).
  • the attribute value of the presence or absence of belongings indicates, regarding various types of belongings, whether or not such belongings are carried or used.
  • a walking stick, a wheelchair, a baby carriage, and the like correspond to the belongings.
  • the attribute value of the presence or absence of the walking stick represents whether or not the walking stick is carried or used.
  • the attribute value as to whether or not a person is a person with difficulty in walking represents whether the person is supported by an animal or another person, whether or not the person uses the assistance tool, or the like. For example, whether or not a certain person is a person with difficulty in walking can be determined based on the presence or absence of an animal or another person who supports the person. For example, in a case where the summarizing unit 2040 detects a scene in which a person A is supported by another person B from the video 30 , the summarizing unit 2040 determines that the person A is a person with difficulty in walking.
  • similarly, in a case of detecting from the video 30 a person supported by an animal such as a guide dog, the summarizing unit 2040 determines that the person is a person with difficulty in walking.
  • whether or not a person is a person with difficulty in walking can be determined based on the presence or absence of use of the assistance tool. For example, in a case of detecting a person using a predetermined tool such as a crutch or a wheelchair from the video 30 , the summarizing unit 2040 determines that the person is a person with difficulty in walking.
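The two determinations above (support by an animal or another person, and use of a predetermined assistance tool) amount to a simple rule; the following sketch assumes a hypothetical detection result format, since the patent does not specify one:

```python
# Rule-based sketch of the "person with difficulty in walking" attribute.
# The detections dict is a hypothetical stand-in for the output of an
# object detector run on the video.

ASSISTANCE_TOOLS = {"crutch", "wheelchair"}  # predetermined tools

def has_walking_difficulty(detections):
    """detections example: {"supported_by": ["person B"], "tools": ["crutch"]}"""
    supported = bool(detections.get("supported_by"))
    uses_tool = bool(ASSISTANCE_TOOLS & set(detections.get("tools", [])))
    return supported or uses_tool
```

Either condition alone is sufficient, matching the "or the like" phrasing of the determination described above.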
  • by using the summary information indicating such attributes of a person, for example, it is possible to determine a person who may need assistance, such as an elderly person, a foreigner, a missing child, or a person with difficulty in walking, and to focus on and surveil that person. In addition, in order to assist such a person, it is possible to take measures such as having staff go to the place at which the person is located.
  • FIG. 9 is a diagram illustrating the summary information in a table format.
  • the table in FIG. 9 is referred to as a table 500 .
  • the table 500 has fields of an identifier 502 and a feature 504 .
  • the identifier 502 is an identifier of the target object.
  • the feature 504 indicates a feature of the target object determined by the identifier 502 .
  • the feature 504 includes a staying time 506 , a moving time 508 , and the like.
  • the display control unit 2060 detects that a change in the display state of a certain video 30 in the display system 20 satisfies a predetermined condition (S 106 ).
  • various conditions can be adopted as the predetermined condition.
  • the video 30 - 1 is the video 30 generated by the camera 10 - 1 .
  • the predetermined condition is, for example, a condition that “the video 30 is switched from a state in which the video 30 is not displayed to the display system 20 to a state in which the video 30 is displayed to the display system 20 .”
  • FIG. 10 is the first diagram illustrating timing when displaying the summary information of the video 30 based on the first example of the predetermined condition.
  • the video 30 - 1 is displayed to the display system 20 between the time t 1 and the time t 2 .
  • the video 30 - 1 is not displayed to the display system 20 between the time t 2 and the time t 3 .
  • from the time t 3 , the video 30 - 1 is displayed to the display system 20 again.
  • the display control unit 2060 causes the display system 20 to display the summary information of the video 30 - 1 at timing when the predetermined condition is satisfied, that is, at the time t 3 .
  • the summary information to be displayed to the display system 20 by the display control unit 2060 preferably includes summary information generated during a period between the first time at which the display state of the video 30 is switched from the first display state into the second display state and the second time at which the display state of the video 30 is switched from the second display state into the first display state.
  • the first time is the time when a state in which the video 30 is displayed to the display system 20 is switched into a state in which the video 30 is not displayed to the display system 20 : that is, at the time t 2 .
  • the second time is the time when a state in which the video 30 is not displayed to the display system 20 is switched into a state in which the video 30 is displayed to the display system 20 : that is, at the time t 3 . That is, summary information generated during a period between the time t 2 and the time t 3 is displayed to the display system 20 .
  • by displaying the summary information of such a period to the display system 20 , summary information of the video 30 during a period when the video 30 - 1 is not displayed to the display system 20 , i.e. a period when the user cannot view the video 30 - 1 , is displayed to the display system 20 .
  • the user can easily recognize what happened in the imaging range of the camera 10 - 1 during the period when the video 30 - 1 could not be viewed.
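As an illustrative sketch (not part of the original disclosure; the class name `DisplayStateWatcher` and its interface are hypothetical), the first example of the predetermined condition, i.e. detecting the switch from a non-displayed state to a displayed state and obtaining the period whose summary information should be displayed, can be expressed as follows:

```python
class DisplayStateWatcher:
    """Tracks whether a video is shown and reports the hidden period
    when the video becomes visible again (the first example condition)."""

    def __init__(self):
        self.visible = True
        self.hidden_since = None

    def update(self, visible, now):
        """Returns (t2, t3) when the video switches from hidden to shown,
        i.e. the period whose summary information should be displayed."""
        if self.visible and not visible:
            self.hidden_since = now          # time t2: shown -> hidden
        elif not self.visible and visible:
            t2, t3 = self.hidden_since, now  # time t3: hidden -> shown
            self.visible = True
            self.hidden_since = None
            return (t2, t3)
        self.visible = visible
        return None

watcher = DisplayStateWatcher()
assert watcher.update(False, 100) is None       # t2 = 100: video hidden
assert watcher.update(False, 150) is None       # still hidden
assert watcher.update(True, 200) == (100, 200)  # t3 = 200: summarize [100, 200]
```

Here the returned pair corresponds to the period between the time t 2 and the time t 3 in FIG. 10.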
  • the predetermined condition is, for example, a condition that “in the display system 20 , a state in which the video 30 is displayed in a relatively small size is switched into a state in which the video 30 is displayed in a relatively large size.”
  • FIG. 11 is a diagram illustrating a scene in which the summary information of the video 30 is displayed based on the second example of the predetermined condition.
  • the video 30 - 1 is displayed in the display area 24 - 1 of the display system 20 between the time t 1 and the time t 2 .
  • the video 30 - 1 is displayed in the display area 24 - 2 of the display system 20 between the time t 2 and the time t 3 .
  • from the time t 3 , the video 30 - 1 is displayed in the display area 24 - 1 again.
  • the size of the display area 24 - 1 is larger than the size of the display area 24 - 2 . Therefore, the video 30 - 1 is displayed to the display system 20 in a relatively small size between the time t 2 and the time t 3 .
  • the display control unit 2060 causes the display system 20 to display the summary information of the video 30 - 1 at timing when the predetermined condition is satisfied: that is, at the time t 3 .
  • the display control unit 2060 causes the display system 20 to display summary information generated for the video 30 during a period between the time t 2 and the time t 3 .
  • the time t 2 is the time when a condition that “in the display system 20 , a state in which the video 30 is displayed in a relatively large size is switched into a state in which the video 30 is displayed in a relatively small size” is satisfied. Since the period between the time t 2 and the time t 3 is a period when the video 30 - 1 is displayed in a small size to the display system 20 , the period is a period when it is not easy for the user to view the video 30 - 1 . Therefore, by watching the summary information regarding the video 30 during that period at the time t 3 , the user can easily recognize what happened in the imaging range of the camera 10 - 1 during the period when it was not easy to view the video 30 - 1 .
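The second example of the predetermined condition can be sketched in the same manner (an illustrative sketch; the display area sizes are assumed values, not values from the disclosure):

```python
def size_condition_satisfied(prev_area, cur_area):
    """Second example condition: the video switches from being displayed
    in a relatively small size to a relatively large size."""
    return cur_area > prev_area

# Assumed pixel sizes of display area 24-1 (large) and 24-2 (small).
AREA_LARGE = 960 * 540
AREA_SMALL = 320 * 180

# At t2 the video moves to the small area (condition not satisfied);
# at t3 it moves back to the large area, so the summary information
# for the period [t2, t3] is displayed.
assert not size_condition_satisfied(AREA_LARGE, AREA_SMALL)  # at t2
assert size_condition_satisfied(AREA_SMALL, AREA_LARGE)      # at t3
```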
  • the predetermined condition is, for example, a condition that “in the display system 20 , a state in which the video 30 is displayed at a position being less likely to come into sight of the user is switched into a state in which the video 30 is displayed at a position being more likely to come into sight of the user”.
  • FIG. 12 is a diagram illustrating a scene in which the summary information of the video 30 is displayed based on the third example of the predetermined condition.
  • the video 30 - 1 is displayed in the display area 24 - 1 of the display system 20 between the time t 1 and the time t 2 .
  • the video 30 - 1 is displayed in the display area 24 - 2 of the display system 20 between the time t 2 and the time t 3 .
  • from the time t 3 , the video 30 - 1 is displayed in the display area 24 - 1 again.
  • the display area 24 - 1 is at a position at which a front direction of the user crosses the display system 20 . Therefore, the display area 24 - 2 is far from the position at which the front direction of the user of the information processing apparatus 2000 crosses the display system 20 , as compared with the display area 24 - 1 . Thus, it can be said that it is more difficult for the user to view the video 30 - 1 during the period between the time t 2 and the time t 3 than during other periods.
  • the display control unit 2060 causes the display system 20 to display the summary information of the video 30 - 1 at the timing when the predetermined condition is satisfied: that is, at the time t 3 .
  • the display control unit 2060 causes the display system 20 to display summary information generated for the video 30 during a period between the time t 2 and the time t 3 .
  • the time t 2 is the time when a condition that “in the display system 20 , a state in which the video 30 is displayed at a position being more likely to come into sight of the user is switched into a state in which the video 30 is displayed at a position being less likely to come into sight of the user” is satisfied.
  • the front direction of the user may be, for example, a front direction of the user's face, a front direction of the user's body, or a gaze direction of the user.
  • in a case where the position of the user is fixed (for example, a case where the position of a chair on which the user sits is fixed), a relationship between each of the display areas 24 and the position at which the front direction of the user crosses the display system 20 can be determined in advance.
  • the summarizing unit 2040 may determine the front direction of the user by analyzing an image generated by a camera which images the user. In this manner, the summarizing unit 2040 can compute the relationship between each of the display areas 24 and the position at which the front direction of the user crosses the display system 20 . Note that, the camera which images the user is provided in the vicinity of the display system 20 , for example.
  • an existing method can be used as a specific method of determining the front direction or the like of the user's face described above.
  • alternatively, a degree indicating how likely each display area is to come into the sight of the user may be associated with each of the display areas 24 in advance.
  • the association information is stored in advance in a storage device accessible from the display control unit 2060 .
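A minimal sketch of computing the position at which the front direction of the user crosses the display system, assuming the display system is a flat wall along the x axis and the front direction is given as an angle (the geometry and all names here are assumptions for illustration, not part of the disclosure):

```python
import math

def crossing_point(user_xy, direction_deg, screen_y):
    """Returns the x coordinate at which the user's front direction
    crosses a flat display wall located at y = screen_y.
    direction_deg = 0 means the user faces straight ahead (+y)."""
    x, y = user_xy
    dx = math.sin(math.radians(direction_deg))
    dy = math.cos(math.radians(direction_deg))
    t = (screen_y - y) / dy
    return x + t * dx

def area_in_front(areas, x):
    """areas: {name: (x_min, x_max)} on the wall; returns the display
    area 24 containing the crossing point, or None."""
    for name, (x_min, x_max) in areas.items():
        if x_min <= x < x_max:
            return name
    return None

# Assumed layout: display area 24-1 spans x in [0, 1), 24-2 spans [1, 2).
areas = {"24-1": (0.0, 1.0), "24-2": (1.0, 2.0)}
x = crossing_point((0.5, -2.0), 0.0, 0.0)  # user at (0.5, -2), facing ahead
assert area_in_front(areas, x) == "24-1"
```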
  • Timing when the summary information is displayed to the display system 20 may not be limited to the timing when the display state of the video 30 satisfies the predetermined condition.
  • the information processing apparatus 2000 may display the summary information of the video 30 to the display system 20 in response to receiving an input from the user to select the video 30 displayed to the display system 20 .
  • the display control unit 2060 causes the display system 20 to display the summary information of the video 30 (S 108 ).
  • various states can be adopted as the display state of the summary information.
  • an example of the specific display state of the summary information will be described. Note that, in each of the following examples, the summary information is generated for the video 30 - 1 generated by the camera 10 - 1 .
  • FIG. 13 is the first diagram illustrating the display state of the summary information.
  • the video 30 - 1 generated in real time by the camera 10 - 1 (so-called live video) is displayed in the display area 24 of the display system 20 .
  • the summary information of the video 30 - 1 is also displayed in the display area 24 .
  • the summary information of the video 30 - 1 is superimposed and displayed on the live video generated by the camera 10 - 1 .
  • the summary information of FIG. 13 represents that a target object 40 captured in the video 30 - 1 acts in order of (1) staying for 10 seconds, (2) moving for 2 seconds, (3) staying for 13 seconds, and (4) moving for 1 second.
  • an arrow represents a trace of movement of the target object 40 .
  • the summary information is displayed for each of the target objects 40 .
  • FIG. 14 is the second diagram illustrating the display state of the summary information.
  • the live video generated by the camera 10 - 1 is displayed in the display area 24 - 1 .
  • summary information 50 - 1 of the video 30 - 1 is displayed in the display area 24 - 2 instead of the display area 24 - 1 . That is, in this example, the display area 24 in which the summary information of the video 30 - 1 is displayed is different from the display area 24 in which the video 30 - 1 is displayed.
  • summary information on each of a plurality of target objects 40 is generated for the video 30 - 1 .
  • a plurality of pieces of summary information may be displayed in one display area 24 (the display area 24 - 2 in FIG. 14 ) or may be displayed in different display areas 24 .
  • the plurality of pieces of summary information may be displayed at the same time or may be displayed in order.
  • the summary information of the video 30 - 1 may be displayed to the display system 20 at timing when the video 30 - 1 is not displayed to the display system 20 .
  • the display control unit 2060 displays the summary information of the video 30 - 1 in the display area 24 - 1 of the display device 22 during a predetermined period from the timing when the predetermined condition described above for the video 30 - 1 is satisfied. Meanwhile, the display control unit 2060 does not display the video 30 to the display system 20 . After the predetermined period elapses, the display control unit 2060 displays the video 30 in the display area 24 - 1 .
  • in the examples described above, the summary information is represented by still data such as characters and figures.
  • the summary information may be generated as video data.
  • the summary information of the video 30 is generated by omitting some of frames of the video 30 .
  • the summarizing unit 2040 omits one or more frames other than the frame in which the target object starts stopping and the frame in which the target object ends stopping.
  • the summarizing unit 2040 omits one or more frames other than the frame in which the target object starts moving and the frame in which the target object ends moving.
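The frame omission described above can be sketched as follows (an illustrative sketch; the per-frame "stay"/"move" labels are an assumed input format, not part of the disclosure):

```python
def summarize_frames(states):
    """states: per-frame labels ('stay' or 'move') of the target object.
    Keeps only the frame indices at which a staying or moving segment
    starts or ends, omitting the frames in between."""
    kept = []
    for i, state in enumerate(states):
        first = i == 0 or states[i - 1] != state               # segment start
        last = i == len(states) - 1 or states[i + 1] != state  # segment end
        if first or last:
            kept.append(i)
    return kept

# stay (frames 0-3), move (frames 4-5), stay (frames 6-9):
# only the boundary frames of each segment remain.
states = ["stay"] * 4 + ["move"] * 2 + ["stay"] * 4
assert summarize_frames(states) == [0, 3, 4, 5, 6, 9]
```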
  • it is assumed here that the target object is a person.
  • the summarizing unit 2040 generates summary information of the video 30 (S 104 ). Timing when the summarizing unit 2040 generates the summary information is various. Hereinafter, some examples of the timing will be described.
  • the summarizing unit 2040 repeatedly analyzes the video 30 at a predetermined cycle to individually generate summary information for a plurality of time ranges of the video 30 .
  • FIG. 15 is the first diagram illustrating a scene of generating the summary information.
  • the summarizing unit 2040 analyzes the video 30 from the time t 1 to the time t 2 at the time t 2 , and generates the summary information 50 - 1 based on the result.
  • the summarizing unit 2040 analyzes the video 30 from the time t 2 to the time t 3 at the time t 3 , and generates summary information 50 - 2 based on the result.
  • the target object stays for 20 seconds from the time t 1 to the time t 2
  • the target object stays for 30 seconds from the time t 2 to the time t 3 . Therefore, the summarizing unit 2040 respectively generates the summary information 50 - 1 indicating “staying time: 20 seconds” and the summary information 50 - 2 indicating “staying time: 30 seconds”.
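The periodic generation of summary information per time range can be sketched as follows (an illustrative sketch; the per-second "stay"/"move" input and the fixed cycle length are assumptions for illustration):

```python
def summarize_periodically(states, cycle):
    """states: per-second activity of the target object ('stay'/'move').
    Splits the video into intervals of `cycle` seconds and generates one
    piece of summary information (the staying time) per interval,
    mirroring the repeated analysis at a predetermined cycle."""
    summaries = []
    for start in range(0, len(states), cycle):
        chunk = states[start:start + cycle]
        summaries.append({"period": (start, start + len(chunk)),
                          "staying_time": chunk.count("stay")})
    return summaries

# 50 seconds of staying, analyzed every 20 seconds: one piece of
# summary information is generated per analysis interval.
out = summarize_periodically(["stay"] * 50, 20)
assert out[0] == {"period": (0, 20), "staying_time": 20}
assert out[2] == {"period": (40, 50), "staying_time": 10}
```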
  • FIG. 16 is a diagram illustrating a scene in which the display control unit 2060 selects the summary information 50 to be displayed to the display system 20 .
  • the display control unit 2060 causes the display system 20 to display the summary information 50 on the video 30 between time T 1 and time T 2 .
  • the summary information 50 on the video 30 between the time T 1 and the time T 2 is the summary information 50 - 2 and summary information 50 - 3 . Therefore, the display control unit 2060 causes the display system 20 to display the summary information 50 - 2 and the summary information 50 - 3 .
  • the display control unit 2060 may cause the display system 20 to display the summary information 50 - 1 in addition to the summary information 50 - 2 and the summary information 50 - 3 .
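Selecting the pieces of summary information whose generation periods overlap the display period, as in FIG. 16, can be sketched as follows (an illustrative sketch with hypothetical identifiers and a dictionary-based summary format):

```python
def select_summaries(summaries, t_start, t_end):
    """Selects the pieces of summary information whose generation period
    overlaps the display period [t_start, t_end]."""
    return [s for s in summaries
            if s["period"][0] < t_end and s["period"][1] > t_start]

summaries = [
    {"id": "50-1", "period": (0, 20)},
    {"id": "50-2", "period": (20, 40)},
    {"id": "50-3", "period": (40, 60)},
]
# The display period T1 = 30 to T2 = 50 overlaps 50-2 and 50-3,
# so those two pieces of summary information are displayed.
chosen = select_summaries(summaries, 30, 50)
assert [s["id"] for s in chosen] == ["50-2", "50-3"]
```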
  • the display control unit 2060 may cause the display system 20 to individually display the plurality of pieces of summary information 50 , or may perform a process (for example, a statistical process) of integrating the plurality of pieces of summary information 50 into one and cause the display system 20 to display the one piece of summary information 50 generated as a result.
  • generation of the summary information that is periodically performed may keep being repeatedly executed (for example, from the time when the information processing apparatus 2000 is activated) or may be started from a specified timing.
  • the specified timing is “the first time when the display state of the video 30 is switched from the first display state to the second display state” described above.
  • it may be “a time when a state in which the video 30 is displayed to the display system 20 is switched into a state in which the video 30 is not displayed to the display system 20 ”, “a time when in the display system 20 , a state in which the video 30 is displayed in a relatively large size is switched into a state in which the video 30 is displayed in a relatively small size”, or “a time when in the display system 20 , a state in which the video 30 is displayed at a position at which it is more likely to come into sight of the user is switched into a state in which the video 30 is displayed at a position at which it is less likely to come into sight of the user.”
  • the summarizing unit 2040 repeatedly analyzes the video 30 at a predetermined cycle. However, the summarizing unit 2040 repeatedly updates one piece of summary information 50 based on the analysis result of the video 30 .
  • FIG. 17 is the first diagram illustrating a scene of updating the summary information.
  • the summarizing unit 2040 analyzes the video 30 from the time t 1 to the time t 2 at the time t 2 , and generates the summary information 50 based on the result.
  • the summarizing unit 2040 analyzes the video 30 from the time t 2 to the time t 3 at the time t 3 , and updates the summary information 50 based on the result. It is assumed that a staying time of the target object between the time t 1 and the time t 2 is 20 seconds and the staying time of the target object between the time t 2 and the time t 3 is 30 seconds.
  • the summary information 50 is updated by overwriting.
  • the summarizing unit 2040 overwrites contents of the summary information 50 indicating “staying time: 20 seconds” with information indicating “staying time: 30 seconds”.
  • the summarizing unit 2040 may perform a process of overwriting the summary information 50 with 25 seconds, which is an average value of the staying times during the two periods.
  • FIG. 18 is the second diagram illustrating the scene of updating the summary information.
  • FIG. 18 illustrates the same contents as in FIG. 17 except for the updating method.
  • the summary information 50 is updated by integrating a new analysis result into the previous analysis result.
  • the summarizing unit 2040 adds information indicating “staying time: 30 seconds” to the summary information 50 .
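The two updating methods (overwriting as in FIG. 17, integrating as in FIG. 18) and the averaging variant can be sketched as follows (an illustrative sketch; the dictionary-based summary format is an assumption):

```python
def update_by_overwrite(summary, new_staying_time):
    """FIG. 17: the previous contents are overwritten by the new result."""
    summary["staying_time"] = new_staying_time
    return summary

def update_by_average(summary, new_staying_time):
    """Overwriting variant: overwrite with the average of the staying
    times of the two periods (25 seconds for 20 s and 30 s)."""
    summary["staying_time"] = (summary["staying_time"] + new_staying_time) / 2
    return summary

def update_by_integration(summary, new_staying_time):
    """FIG. 18: the new analysis result is integrated into the previous
    result (here the staying times are accumulated as a list)."""
    summary.setdefault("staying_times", []).append(new_staying_time)
    return summary

assert update_by_overwrite({"staying_time": 20}, 30) == {"staying_time": 30}
assert update_by_average({"staying_time": 20}, 30)["staying_time"] == 25
assert update_by_integration({"staying_times": [20]}, 30) == {"staying_times": [20, 30]}
```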
  • the summarizing unit 2040 may generate summary information to be displayed at timing when the predetermined condition described above (the condition for displaying the summary information of the video 30 to the display system 20 ) is satisfied. In this case, for example, the summarizing unit 2040 generates the summary information for the video 30 during a period between a predetermined time before the above-mentioned timing and the above-mentioned timing.
  • the display control unit 2060 may transmit a request to generate summary information to the camera 10 .
  • the summarizing unit 2040 generates the summary information for the video 30 during a period between a predetermined time before the timing and the timing.
  • the display control unit 2060 obtains the summary information generated by the camera 10 .
  • the information processing apparatus 2000 according to Example Embodiment 2 is illustrated in FIG. 1 in the same manner as the information processing apparatus 2000 of Example Embodiment 1.
  • the information processing apparatus 2000 according to Example Embodiment 2 has the same functions as the information processing apparatus 2000 of Example Embodiment 1 except for items to be described below.
  • the display control unit 2060 causes the display system 20 to display the summary information in consideration of a priority of each of the pieces of summary information.
  • FIG. 19 is a diagram illustrating a supposed environment of the information processing apparatus 2000 according to Example Embodiment 2.
  • the summary information of the video 30 is displayed in another display area 24 different from the display area 24 in which the video 30 is displayed.
  • the display system 20 includes one display device 22
  • the display device 22 includes three display areas 24 - 1 to 24 - 3 .
  • the video 30 - 1 and the video 30 - 2 are alternately displayed in the display area 24 - 1
  • a video 30 - 3 and a video 30 - 4 are alternately displayed in the display area 24 - 2
  • any one of the pieces of summary information is displayed in the display area 24 - 3 . It is assumed that a priority of the summary information of the video 30 - 1 is higher than a priority of the summary information of the video 30 - 3 .
  • the display control unit 2060 displays the summary information of the video 30 - 1 having a higher priority among the summary information of the video 30 - 1 and the summary information of the video 30 - 3 in the display area 24 - 3 .
  • the display control unit 2060 may display only the summary information of the video 30 - 1 in the display area 24 - 3 , or may display the summary information of the video 30 - 1 in the display area 24 - 3 first and then display the summary information of the video 30 - 3 in the display area 24 - 3 .
  • the process of displaying the summary information based on the priority is necessary in a case where the number of pieces of the summary information to be displayed is larger than the number of display areas to be used for displaying the summary information.
  • Such a case is not limited to the case illustrated by using FIG. 19 .
  • the display system 20 has a layout illustrated in FIG. 7 . It is assumed that two videos 30 are alternately displayed in each of display areas 24 - 1 to 24 - 9 . On the other hand, the summary information of one of the videos 30 is displayed in the display area 24 - 10 .
  • the display control unit 2060 determines the summary information to be displayed in the display area 24 - 10 according to the priority of the summary information to be displayed.
  • various methods can be used to determine the priority of the summary information. Hereinafter, examples of the method of determining the priority of the summary information will be described.
  • a priority is set for each of the cameras 10 .
  • the priority of the camera 10 is set as a priority of summary information of the video 30 generated by the camera 10 .
  • a priority of the summary information of the video 30 - 1 is a priority associated with the camera 10 - 1 which generates the video 30 - 1 .
  • information indicating the priority of the camera 10 is referred to as priority information.
  • FIG. 20 is a diagram illustrating the priority information in a table format.
  • the table in FIG. 20 is referred to as a table 600 .
  • the table 600 includes a camera identifier 602 and a priority 604 .
  • the camera identifier 602 represents an identifier of the camera 10 .
  • the priority 604 indicates a priority associated with the camera 10 .
  • the priority information (for example, the table 600 ) is stored in advance in a storage device accessible from the display control unit 2060 .
  • This storage device may be provided inside the information processing apparatus 2000 or may be provided outside the information processing apparatus 2000 .
  • a priority of summary information may be determined based on contents of the summary information.
  • the display control unit 2060 handles any numerical value indicated in the summary information as a priority of summary information.
  • the display control unit 2060 handles a value of the staying time as the priority of the summary information. In this manner, as the summary information of the video 30 has a longer staying time of the target object, the priority becomes higher.
  • the numerical value handled as a priority is not limited to the staying time.
  • the display control unit 2060 computes a score of the summary information by using a rule (for example, a function) for computing the score of the summary information from the contents of the summary information, and handles the score as the priority of the summary information.
  • FIG. 21 is a diagram illustrating a relationship between the staying time and a priority of the summary information.
  • the horizontal axis indicates the staying time of a person captured in the video 30
  • the vertical axis indicates the score of the summary information of the video 30 .
  • a maximum value of the score is 100.
  • the score of the summary information is the maximum when the person starts staying. As the staying time becomes longer, the score of the summary information becomes smaller.
  • on the other hand, when the staying time exceeds a predetermined value, the score of the summary information increases. In this manner, the summary information of a person who stays for a time longer than the predetermined value is easily displayed to the display system 20 .
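The shape of the score function in FIG. 21 can be sketched as a piecewise-linear function (an illustrative sketch; the threshold and the decrease/increase rate are assumed parameters, while the maximum score of 100 follows the description above):

```python
def staying_score(staying_time, threshold=60, max_score=100, rate=1.0):
    """Score of summary information as a function of the staying time:
    maximum when the person starts staying, decreasing while the staying
    time grows, then increasing again once the staying time exceeds a
    predetermined threshold; clamped to [0, max_score]."""
    if staying_time <= threshold:
        score = max_score - rate * staying_time
    else:
        score = (max_score - rate * threshold) + rate * (staying_time - threshold)
    return max(0.0, min(max_score, score))

assert staying_score(0) == 100    # maximum at the start of staying
assert staying_score(30) == 70    # decreasing with the staying time
assert staying_score(90) == 70    # increasing again past the threshold
assert staying_score(200) == 100  # capped at the maximum value
```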
  • the rule for computing the score of the summary information may include a rule for increasing the score in response to occurrence of a predetermined event.
  • the predetermined event is contact with another person.
  • FIG. 22 is a diagram illustrating a temporal change in a score of summary information.
  • the rule for computing the score of the summary information in FIG. 22 is defined by a combination of (1) a rule illustrated in FIG. 21 and (2) a rule for increasing the score according to contact with another person.
  • the priority of the summary information may be computed by using each of the scores computed from a plurality of pieces of information included in the summary information.
  • the display control unit 2060 computes the priority of the summary information by using the following Equation (1):

    p = Σ_i w_i · f_i(d_i)  (1)

  • in Equation (1), p is the priority of the summary information.
  • w_i is a weight given to each piece of information i (the staying time and the like) included in the summary information.
  • d_i is the value of the information i included in the summary information.
  • f_i is a function for computing the score of the summary information for the information i.
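The weighted sum described by Equation (1) can be sketched as follows (an illustrative sketch; the particular items, weights, and score functions are assumed examples, not values from the disclosure):

```python
def priority(summary, weights, score_funcs):
    """Priority p of summary information as the weighted sum of
    per-item scores: p = sum over i of w_i * f_i(d_i), where d_i is the
    value of item i in the summary information."""
    return sum(weights[i] * score_funcs[i](summary[i]) for i in weights)

# Assumed items, weights, and score functions for illustration.
weights = {"staying_time": 0.7, "contacts": 0.3}
score_funcs = {
    "staying_time": lambda d: min(100.0, d),  # longer stay -> higher score
    "contacts": lambda d: 50.0 * d,           # contact with another person
}
summary = {"staying_time": 80, "contacts": 1}
assert priority(summary, weights, score_funcs) == 0.7 * 80 + 0.3 * 50
```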
  • the display control unit 2060 may determine a display position of summary information based on a priority of the summary information. For example, a priority is associated in advance with each of the display areas 24 for displaying the summary information. The display control unit 2060 matches the summary information with the display area 24 so that the summary information having a higher priority is displayed in the display area 24 having a higher priority. Here, it is preferable that the display area 24 which the user more easily watches has a higher priority.
  • the priority of the display area 24 is stored in advance in a storage device accessible from the display control unit 2060 .
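Matching summary information to display areas so that summary information having a higher priority is displayed in the display area having a higher priority (i.e. the area the user more easily watches) can be sketched as follows (an illustrative sketch with hypothetical identifiers):

```python
def assign_by_priority(summaries, areas):
    """summaries: {summary_id: priority}; areas: {area_id: priority}.
    Ranks both by descending priority and pairs them, so the summary
    with the highest priority goes to the highest-priority area."""
    ranked_summaries = sorted(summaries, key=summaries.get, reverse=True)
    ranked_areas = sorted(areas, key=areas.get, reverse=True)
    return dict(zip(ranked_areas, ranked_summaries))

# Display area 24-3 is easier for the user to watch than 24-4, so the
# summary information of the video 30-1 (higher priority) goes to 24-3.
summaries = {"30-1": 0.9, "30-3": 0.4}
areas = {"24-3": 2, "24-4": 1}
assert assign_by_priority(summaries, areas) == {"24-3": "30-1", "24-4": "30-3"}
```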
  • the information processing apparatus 2000 according to Example Embodiment 2 is realized by using the computer 1000 in the same manner as Example Embodiment 1 (see FIG. 4 ).
  • each of the program modules stored in the storage 1080 described above further includes a program for realizing each of the functions described in the present example embodiment.
  • the information processing apparatus 2000 determines a display method of the summary information based on the priority of the summary information. In this manner, for example, it is possible to make the summary information having a higher priority easier for the user of the information processing apparatus 2000 to watch. Therefore, it is possible to more reliably prevent important information from being overlooked. In addition, the convenience of the information processing apparatus 2000 is improved for the user.
  • An information processing apparatus comprising:
  • a summarizing unit which obtains videos and generates summary information of the obtained video by performing a summarizing process on the obtained video, each of a plurality of cameras generating the video;
  • a display control unit which causes a display unit to display the video
  • the display control unit causes the display unit to display the summary information of that video.
  • the display control unit causes the display unit to display summary information of a first video in response to that a display state of the first video is switched from a state not being displayed on the display unit into a state being displayed on the display unit.
  • the display control unit causes the display unit to display summary information of a first video in response to that a display state of the first video is switched from a state being displayed in a first size on the display unit into a state being displayed in a second size on the display unit, the second size being larger than the first size.
  • the display control unit causes the display unit to display summary information of a first video generated during a period between a first time when the display state of the first video is switched from a first display state into a second display state on the display unit and a second time when the display state of the first video is switched from the second display state into the first display state on the display unit.
  • the display control unit causes the display unit to display the summary information having a higher priority between the summary information of the first video and the summary information of the second video.
  • the display control unit obtains a priority of summary information of each of the videos from a storage unit which stores the priority of the summary information of the video for each of the videos.
  • the display control unit computes the priority of the summary information based on contents of each piece of the summary information.
  • the display control unit displays the video and the summary information generated during a past period of the video in display areas different from each other on the display unit.
  • the display control unit superimposes the summary information generated during a past period of the video on the video and causes the display unit to display the superimposed video.
  • the display control unit causes the display unit not to display the video while the display unit displays the summary information of the video.
  • a control method executed by a computer comprising:
  • the display unit in response to that a change in a display state of the video on the display unit satisfies a predetermined condition, the display unit displays the summary information of that video.
  • the display unit displays summary information of a first video in response to that a display state of the first video is switched from a state not being displayed on the display unit into a state being displayed on the display unit.
  • the display unit displays summary information of a first video in response to that a display state of the first video is switched from a state being displayed in a first size on the display unit into a state being displayed in a second size on the display unit, the second size being larger than the first size.
  • the display unit displays the summary information of a first video generated during a period between a first time when the display state of the first video is switched from a first display state into a second display state on the display unit and a second time when the display state of the first video is switched from the second display state into the first display state on the display unit.
  • the display unit displays the summary information having a higher priority from the summary information of the first video and the summary information of the second video.
  • a priority of summary information of each of the videos is obtained from a storage unit which stores the priority of the summary information of the video for each of the videos.
  • the priority of the summary information is computed based on contents of each piece of the summary information.
  • the video and the summary information generated during a past period of the video are displayed in display areas different from each other on the display unit.
  • the summary information generated during a past period of the video is superimposed on the video and the display unit displays the superimposed video.
  • the display unit does not display the video while the display unit displays the summary information of the video.
  • a program causing a computer to execute each step of the control method according to any one of 11 to 20.

Abstract

An information processing apparatus (2000) includes a summarizing unit (2040) and a display control unit (2060). The summarizing unit (2040) obtains a video (30) generated by each of a plurality of cameras (10). Furthermore, the summarizing unit (2040) performs a summarizing process on the video (30) and generates summary information of the video (30). The display control unit (2060) causes a display system (20) to display the video (30). Here, the display control unit (2060) causes the display system (20) to display the summary information of the video (30) in response to that a change in a display state of the video (30) in the display system (20) satisfies a predetermined condition.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a Continuation of U.S. application Ser. No. 16/347,262 filed May 3, 2019, which is a National Stage of International Application No. PCT/JP2016/082950 filed Nov. 7, 2016.
  • TECHNICAL FIELD
  • The present invention relates to an information processing apparatus, a control method, and a program.
  • BACKGROUND ART
  • A video of a camera is used in various scenes. For example, video surveillance is performed using the video of a camera (a so-called surveillance camera) which images a place to be surveilled.
  • Against such a background, technologies for easily handling the video of the camera have been developed. For example, Patent Document 1 discloses a technology of detecting an important scene from a surveillance video and generating a summary video in which frames other than the important scene are omitted.
  • RELATED DOCUMENT Patent Document
  • [Patent Document 1] Japanese Unexamined Patent Publication No. 2012-205097
  • SUMMARY OF THE INVENTION Technical Problem
  • In the video surveillance or the like, there is a situation in which one person (for example, a surveillant) has to watch a plurality of videos. For example, in a case where a plurality of places to be surveilled are imaged by different surveillance cameras, the surveillant has to view all of the videos generated by the plurality of cameras and recognize a place at which an abnormality occurs. This operation requires a lot of workload for the person who watches the videos.
  • The present invention has been made in view of the problem described above. One object of the present invention is to provide a technology for easily viewing a plurality of videos.
  • Solution to Problem
  • An information processing apparatus of the present invention includes: (1) a summarizing unit which obtains videos and generates summary information of the obtained video by performing a summarizing process on the obtained video, each of a plurality of cameras generating the video; and (2) a display control unit which causes a display unit to display the video.
  • In response to a change in a display state of the video on the display unit satisfying a predetermined condition, the display control unit causes the display unit to display the summary information of that video.
  • A control method of the present invention is executed by a computer.
  • The control method includes: (1) a summarizing step of obtaining videos and generating summary information of the obtained video by performing a summarizing process on the obtained video, each of a plurality of cameras generating the video; and (2) a display control step of causing a display unit to display the video.
  • In the display control step, in response to a change in a display state of the video on the display unit satisfying a predetermined condition, the display unit displays the summary information of that video.
  • A program of the present invention causes a computer to execute each step included in the control method of the invention.
  • Advantageous Effects of Invention
  • According to the present invention, there is provided a technology for easily viewing a plurality of videos.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above objects and other objects, features and advantages will become more apparent from the following description of the preferred embodiments and the accompanying drawings.
  • FIG. 1 is a diagram conceptually illustrating an operation of an information processing apparatus according to Example Embodiment 1.
  • FIG. 2 is a diagram illustrating the information processing apparatus according to Example Embodiment 1 and a use environment of the information processing apparatus.
  • FIG. 3 is a diagram illustrating a computer for realizing the information processing apparatus.
  • FIG. 4 is the first diagram illustrating a display state of a video in a display system.
  • FIG. 5 is the second diagram illustrating the display state of the video in the display system.
  • FIG. 6 is a third diagram illustrating the display state of the video in the display system.
  • FIG. 7 is a fourth diagram illustrating the display state of the video in the display system.
  • FIG. 8 is a flowchart illustrating a flow of a process executed by the information processing apparatus according to Example Embodiment 1.
  • FIG. 9 is a diagram illustrating summary information in a table format.
  • FIG. 10 is the first diagram illustrating timing when displaying summary information of a video based on the first example of a predetermined condition.
  • FIG. 11 is a diagram illustrating a scene in which the summary information of the video is displayed based on the second example of the predetermined condition.
  • FIG. 12 is a diagram illustrating a scene in which the summary information of the video is displayed based on a third example of the predetermined condition.
  • FIG. 13 is the first diagram illustrating a display state of the summary information.
  • FIG. 14 is the second diagram illustrating the display state of the summary information.
  • FIG. 15 is the first diagram illustrating a scene of generating the summary information.
  • FIG. 16 is a diagram illustrating a scene in which a display control unit selects the summary information to be displayed to a display system.
  • FIG. 17 is the first diagram illustrating a scene of updating the summary information.
  • FIG. 18 is the second diagram illustrating a scene of updating the summary information.
  • FIG. 19 is a diagram illustrating a supposed environment of an information processing apparatus according to Example Embodiment 2.
  • FIG. 20 is a diagram illustrating priority information in a table format.
  • FIG. 21 is a diagram illustrating a relationship between a staying time and a priority of summary information.
  • FIG. 22 is a diagram illustrating a temporal change in a score of summary information.
  • DESCRIPTION OF EMBODIMENTS
  • Hereinafter, example embodiments according to the present invention will be described by using the drawings. In all of the drawings, the same components are denoted by the same reference numerals, and description thereof is not repeated as appropriate. In addition, unless otherwise described, in each of block diagrams, each of blocks represents not a hardware unit but a functional unit configuration.
  • Example Embodiment 1
  • <Outline of Operation of Information Processing Apparatus 2000>
  • FIG. 1 is a diagram conceptually illustrating an operation of an information processing apparatus 2000 according to Example Embodiment 1. Note that, FIG. 1 is a diagram for facilitating understanding of the operation of the information processing apparatus 2000, and the operation of the information processing apparatus 2000 is not limited by FIG. 1.
  • A camera 10 performs imaging and generates still image data or video data. A video 30 is video data based on an imaging result of the camera 10.
  • The video 30 is displayed on a display system 20. Accordingly, a user of the information processing apparatus 2000 can view the video 30. For example, the information processing apparatus 2000 is an apparatus which provides a surveillance video to a surveillant. In this case, the camera 10 is a surveillance camera which images a place to be surveilled. In addition, in this case, the user of the information processing apparatus 2000 is a surveillant or the like who surveils the surveillance place by viewing the video 30.
  • The information processing apparatus 2000 generates summary information of the video 30. The summary information of the video 30 indicates any information obtained from contents of the video 30. For example, the summary information indicates a staying time, a trace of movement, and the like of a person captured in the video 30.
  • The information processing apparatus 2000 causes the display system 20 to display the summary information of the video 30. The summary information in FIG. 1 is an arrow indicating a trace of movement of the person captured in the video 30.
  • Here, the summary information of the video 30 is displayed in response to a change in a display state of the video 30 on the display system 20 satisfying a predetermined condition. By watching the summary information of the video 30, the user of the information processing apparatus 2000 can easily recognize past contents of the video 30.
  • <Outline of Configuration of Information Processing Apparatus 2000>
  • FIG. 2 is a diagram illustrating the information processing apparatus 2000 according to Example Embodiment 1 and a use environment of the information processing apparatus 2000. The information processing apparatus 2000 includes a summarizing unit 2040 and a display control unit 2060. The summarizing unit 2040 obtains the video 30 generated by each of a plurality of cameras 10. Furthermore, the summarizing unit 2040 performs a summarizing process on the video 30 and generates summary information of the video 30. The display control unit 2060 causes the display system 20 to display the video 30. Here, the display control unit 2060 causes the display system 20 to display the summary information of the video 30 in response to a change in the display state of the video 30 on the display system 20 satisfying a predetermined condition.
  • Advantageous Effect
  • In the information processing apparatus 2000 according to the present example embodiment, each of the plurality of cameras 10 generates the video 30. In such a case, a user (for example, a surveillant) of the information processing apparatus 2000 who views the videos 30 has to recognize the occurrence of an abnormality or the like from a plurality of videos 30. However, a lot of workload is required to recognize the contents of the plurality of videos 30. There is also a concern that an important scene in a video 30 may be overlooked.
  • In the information processing apparatus 2000 according to the present example embodiment, the summary information, in which the contents of the video 30 are summarized, is generated. In addition, the summary information is displayed on the display system 20 in response to a change in the display state of the video 30 on the display system 20 satisfying a predetermined condition. Here, if the predetermined condition is appropriately determined, it is possible to display the summary information of the video 30 on the display system 20 at timing appropriate for the user to easily recognize the contents of the video 30. Therefore, according to the information processing apparatus 2000 of the present example embodiment, it becomes easy to view the plurality of videos 30. As a result, it is possible to decrease the workload of the user who views the plurality of videos 30 and to prevent an important scene from being overlooked.
  • Hereinafter, the present example embodiment will be described in detail.
  • <Hardware Configuration Example of Information Processing Apparatus 2000>
  • Each of function configuration units of the information processing apparatus 2000 may be realized by hardware (for example, hard-wired electronic circuit or the like) which realizes each of the function configuration units or may be realized by a combination (for example, a combination of the electronic circuit and a program controlling the electronic circuit or the like) of hardware and software. Hereinafter, a case where each of the function configuration units in the information processing apparatus 2000 is realized by a combination of hardware and software will be further described.
  • FIG. 3 is a diagram illustrating a computer 1000 for realizing the information processing apparatus 2000. The computer 1000 is a predetermined computer. For example, the computer 1000 is a personal computer (PC), a server machine, a tablet terminal, a smartphone, or the like. The computer 1000 may be a dedicated computer designed to realize the information processing apparatus 2000 or may be a general purpose computer.
  • The computer 1000 includes a bus 1020, a processor 1040, a memory 1060, a storage 1080, an input and output interface 1100, and a network interface 1120. The bus 1020 is a data transmission line through which the processor 1040, the memory 1060, the storage 1080, the input and output interface 1100, and the network interface 1120 mutually transmit and receive data. However, a method of connecting the processors 1040 and the like to each other is not limited to bus connection. The processor 1040 is an arithmetic apparatus such as a central processing unit (CPU), a graphics processing unit (GPU), or the like. The memory 1060 is a main storage device realized by using a random access memory (RAM) or the like. The storage 1080 is an auxiliary storage device realized by using a hard disk, a solid state drive (SSD), a memory card, a read only memory (ROM), or the like. However, the storage 1080 may be configured with the same hardware as the hardware constituting the main storage device such as a RAM.
  • The input and output interface 1100 is an interface for connecting the computer 1000 and an input and output device. The network interface 1120 is an interface for connecting the computer 1000 to a communications network. The communications network is, for example, a local area network (LAN) or a wide area network (WAN). A method by which the network interface 1120 connects to the communication network may be a wireless connection or a wired connection.
  • For example, the computer 1000 is communicably connected to the camera 10 through a network. However, a method of communicably connecting the computer 1000 to the camera 10 is not limited to a connection through the network. In addition, the computer 1000 may not be communicably connected to the camera 10.
  • The storage 1080 stores a program module which realizes each of the function configuration units (the summarizing unit 2040 and the display control unit 2060) of the information processing apparatus 2000. By reading each of these program modules into the memory 1060 and executing the program module, the processor 1040 realizes a function corresponding to each of the program modules.
  • Note that, the information processing apparatus 2000 may be realized by using a plurality of computers 1000. For example, the information processing apparatus 2000 can be realized by two computers, that is, the first computer 1000 which realizes a function of the summarizing unit 2040 and the second computer 1000 which realizes a function of the display control unit 2060. In this case, the first computer is a computer which performs a process for generating summary information. On the other hand, the second computer is a computer which performs a process for displaying the summary information to the display system 20. The second computer obtains the summary information from the first computer by a predetermined method.
  • As described above, for example, the first computer 1000 and the second computer 1000 are realized by a PC, a server machine, a tablet terminal, a smartphone, or the like. However, the first computer 1000 may be realized by the camera 10. In this case, the camera 10 performs the summarizing process on the video 30 generated by the camera 10 and generates summary information. The second computer 1000 obtains the summary information generated by the camera 10. The camera 10 having a function of the summarizing unit 2040 in this manner is, for example, a camera called an intelligent camera, a network camera, an internet protocol (IP) camera, or the like.
  • «Camera 10»
  • The camera 10 is any camera which performs imaging and generates still image data or video data. The video 30 is configured based on the data generated by the camera 10. For example, the video 30 is the video data generated by the camera 10. In another example, the video 30 is configured with a sequence of a plurality of pieces of still image data generated by the camera 10.
  • The camera 10 may be a camera whose position is fixed (hereinafter, referred to as a fixed camera) or whose position is not fixed (hereinafter, referred to as a moving camera). The fixed camera is a camera installed in various places such as a wall, a pillar, or a ceiling. A place at which the fixed camera is installed may be indoor or outdoor.
  • Note that, the wall or the like on which the fixed camera is installed is not limited to real property, and may be anything fixed for a certain period. For example, the wall or the like on which the fixed camera is installed may be a partition, a pillar, or the like temporarily installed at an event hall or the like.
  • In another example, a moving object equipped with a camera usable as a moving camera (described below) may be stopped at a certain place, and that camera may be used as a fixed camera. The moving object is, for example, a car, a motorcycle, a robot, a flying object (for example, a drone or an airship), or the like.
  • The moving camera is, for example, a camera worn by a person or attached to the moving object described above. The camera worn by a person is, for example, a camera held in a hand (a camera of a mobile terminal such as a video camera or a smartphone), a camera fixed to a head, a chest, or the like (a wearable camera or the like), or the like. The camera attached to the car, the motorcycle, the robot, the flying object, or the like may be a camera attached for use as a so-called drive recorder, or may be a camera attached separately for generating the video 30 to be provided to the information processing apparatus 2000.
  • A place imaged by the camera 10 is arbitrary. For example, in a case where the camera 10 is a surveillance camera, the camera 10 images a place to be surveilled. The place to be surveilled is, for example, a route in or around the event hall, a route between the event hall and a nearest station of the event hall, or the like. Note that, the place imaged by the camera 10 may be indoor or outdoor.
  • <Display System 20>
  • The display system 20 is configured to include one or a plurality of display devices. Hereinafter, some examples of the display state of the video 30 in the display system 20 will be described. Note that, hereinafter, an example will be described in which the display system 20 is configured to include one display device 22.
  • FIG. 4 is the first diagram illustrating the display state of the video 30 in the display system 20. The display device 22 in FIG. 4 includes a display area 24 in which the video 30 is displayed. The display control unit 2060 sequentially displays the plurality of videos 30 in the display area 24.
  • For example, it is assumed that a video 30-1 and a video 30-2 are respectively generated by two cameras, a camera 10-1 and a camera 10-2. In this case, the video 30 is displayed in the display area 24 in order of the video 30-1, the video 30-2, the video 30-1, the video 30-2, . . . .
  • FIG. 5 is the second diagram illustrating the display state of the video 30 in the display system 20. The plurality of videos 30 are displayed on the display device 22 at the same time. Specifically, a plurality of display areas 24 having the same size are provided to the display device 22, and each of the different videos 30 is displayed in each of the display areas 24.
  • Here, it is assumed that the number of videos 30 (the number of cameras 10) is larger than the number of display areas 24. In this case, in each of the display areas 24, the plurality of videos 30 are sequentially displayed. For example, it is assumed that the number of videos 30 is 8 and the number of display areas 24 is 4. In this case, the display control unit 2060 alternately displays two videos 30 in each of the display areas 24.
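The rotation of more videos than display areas can be sketched as a simple round-robin schedule. This is an illustrative sketch only, not the patented implementation; the function name and the assignment policy (area i cycles through videos i, i+num_areas, . . .) are assumptions.

```python
def videos_for_tick(video_ids, num_areas, tick):
    """Return the video shown in each display area at a given switch tick.

    With more videos than areas, each area cycles through its share of the
    videos: area i alternates among videos i, i + num_areas, and so on.
    (Illustrative policy; the specification only requires that each area
    sequentially displays a plurality of videos.)
    """
    assignment = []
    for area in range(num_areas):
        # Videos assigned to this area, in rotation order.
        candidates = video_ids[area::num_areas]
        assignment.append(candidates[tick % len(candidates)])
    return assignment

# Eight videos, four areas: each area alternates between two videos.
videos = [f"video-{i}" for i in range(1, 9)]
print(videos_for_tick(videos, 4, 0))  # ['video-1', 'video-2', 'video-3', 'video-4']
print(videos_for_tick(videos, 4, 1))  # ['video-5', 'video-6', 'video-7', 'video-8']
```

With 8 videos and 4 areas, each area alternates between exactly two videos, matching the example in the text.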
  • FIG. 6 is a third diagram illustrating the display state of the video 30 in the display system 20. The display device 22 in FIG. 6 also includes the plurality of display areas 24. However, in the display device 22 of FIG. 6, there are two types of display areas 24 having different sizes. Sizes of a display area 24-2 to a display area 24-8 are all the same size. On the other hand, the size of a display area 24-1 is larger than the sizes of the other display areas 24.
  • Different videos 30 are respectively displayed in the display area 24-1 to the display area 24-8. The video 30 displayed in the display area 24-1 is automatically determined, for example, by the display control unit 2060. For example, the display control unit 2060 displays the plurality of videos 30 in the display area 24-1 in turn.
  • In another example, the video 30 displayed in the display area 24-1 may be selected by the user of the information processing apparatus 2000. For example, it is assumed that the display device 22 includes a touch panel. In this case, the user performs an operation of touching any one of the display area 24-2 to the display area 24-8. According to this operation, the display control unit 2060 changes a display position of the video 30 displayed in the touched display area 24 to the display area 24-1.
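The touch operation that promotes a video to the large display area 24-1 can be sketched as follows. This is a hedged illustration: the area names are hypothetical, and the swap-back of the previously enlarged video is one possible policy (the specification only states that the touched video's display position is changed to the display area 24-1).

```python
def handle_touch(layout, touched_area, large_area="area-1"):
    """Move the video in the touched small area to the large display area.

    layout: mapping of area name -> video id; large_area is the large area.
    The two videos are swapped so that no video disappears from the screen
    (an assumed policy, not prescribed by the specification).
    Returns a new layout without mutating the input.
    """
    new_layout = dict(layout)
    new_layout[large_area], new_layout[touched_area] = (
        new_layout[touched_area],
        new_layout[large_area],
    )
    return new_layout

layout = {"area-1": "video-A", "area-3": "video-C"}
print(handle_touch(layout, "area-3"))  # {'area-1': 'video-C', 'area-3': 'video-A'}
```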
  • FIG. 7 is a fourth diagram illustrating the display state of the video 30 in the display system 20. FIG. 7 is the same as FIG. 6 except that FIG. 7 includes a plurality of display areas 24 having a large size.
  • As described above, here, the display system 20 may be configured with a plurality of display devices 22. For example, in this case, each of the plurality of display areas 24 in the example described above is realized by one display device 22. In this case, the display control unit 2060 handles each of the display devices 22 in the same manner as the display area 24 in the example described above.
  • <Flow of Process>
  • FIG. 8 is a flowchart illustrating a flow of a process executed by the information processing apparatus 2000 according to Example Embodiment 1. The summarizing unit 2040 obtains the video 30 from each of the cameras 10 (S102). The summarizing unit 2040 generates summary information of the video 30 (S104). In a case where a change in the display state of the video 30 satisfies a predetermined condition (YES in S106), the display control unit 2060 causes the display system 20 to display the summary information of the video 30 (S108).
  • Note that, as described below, the timing at which the process (S102 and S104) of generating the summary information is executed and the timing at which the process (S106 and S108) of displaying the summary information to the display system 20 is executed can vary. Thus, these processes do not have to be executed sequentially as illustrated in FIG. 8. The timing of generating the summary information and the timing of displaying the summary information will be specifically described below.
  • <Method of Obtaining Video 30: S102>
  • The summarizing unit 2040 obtains the video 30 (S102). A method by which the summarizing unit 2040 obtains the video 30 is arbitrary. For example, the summarizing unit 2040 receives the video 30 transmitted from the camera 10. In another example, the summarizing unit 2040 accesses the camera 10 and obtains the video 30 stored in the camera 10.
  • Note that, the camera 10 may store the video 30 in a storage device provided outside the camera 10. In this case, the summarizing unit 2040 accesses the storage device and obtains the video 30. Note that, each of the videos 30 generated by the plurality of cameras 10 may be stored in the same storage device or may be respectively stored in different storage devices.
  • In a case where the camera 10 has a function of the summarizing unit 2040 (a case where the first computer 1000 is realized by the camera 10), the summarizing unit 2040 obtains the video 30 stored in a storage device (for example, the memory 1060 or the storage 1080 in FIG. 3) inside the camera 10.
  • <Contents of Summary Information: S104>
  • The summarizing unit 2040 performs the summarizing process on the video 30 and generates summary information of the video 30 (S104). Here, contents of the summary information generated by the summarizing unit 2040 will be described. As described above, the summary information indicates any information obtained from the contents of the video 30. However, it is preferable that viewing the summary information of the video 30 enables the user to recognize the contents of the video 30 more easily than viewing the video 30 itself. In other words, it is preferable that the summary information briefly represents those contents of the video 30 which are important for the user.
  • The contents important for the user are, for example, features of an object captured in the video 30. Hereinafter, in a case where the summary information indicates a feature of a certain object, the object is referred to as a "target object".
  • As the target object, various objects can be handled. For example, the target object is a person. In another example, the target object is any moving object described above. In another example, the target object may be luggage (a package such as a bag or the like) carried by a person, a moving object, or the like.
  • The feature of the target object is, for example, a staying time, a moving time, a moving velocity, a moving state, or the like. The staying time represents a length of a period when the target object stays in the video 30. The staying here means that the target object stops or hardly moves (for example, a size of a moving range is equal to or less than a predetermined value). The moving time represents a length of a period when the target object moves in the video 30. The moving here means that the target object does not stay (for example, the size of the moving range is larger than the predetermined value). The moving velocity represents a moving velocity (for example, an average velocity) of the target object during a period when the target object moves. The moving state represents, for example, a trace of movement (such as whether the target object moves straight or meanderingly).
  • Here, in a case where the target object repeatedly moves and stays, the staying time indicated in the summary information may be each of a plurality of staying times, or a statistical value of the plurality of staying times (total value, mode, average value, or the like) may be used. The same applies to the moving time, the moving velocity, and the moving state.
  • By using the summary information indicating the feature of staying or movement of the target object, for example, it is possible to determine a target object to be focused on and to intensively surveil that target object. For example, in a case where a person stays for a long time in a place at which a person normally does not stop, it is conceivable that the person is a person to be focused on. In addition, in a case where a bag or the like is left in a place at which luggage is not normally left, it can be said that the luggage is suspicious and should be focused on.
  • Note that, the feature of the target object is not limited to the example described above. Another example of the feature of the target object will be described below.
  • The summarizing unit 2040 detects a target object from the video 30 and computes a feature of the target object. For example, the summarizing unit 2040 computes a change in a position of the target object by detecting the target object from each of frames constituting the video 30. The summarizing unit 2040 computes the staying time, the moving time, the moving velocity, the moving state, and the like from the change in the position of the target object. Note that, in a case of detecting a plurality of different target objects from the video 30, the summarizing unit 2040 computes a feature for each of the target objects.
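The computation of the staying time and the moving time from the change in the position of the target object can be sketched as follows. This is an illustrative sketch under stated assumptions: the per-frame (x, y) positions are taken as given (detection itself is out of scope here), and the movement threshold and frame rate are hypothetical parameters.

```python
def staying_and_moving_time(positions, fps, move_threshold):
    """Split a target object's track into staying time and moving time.

    positions: per-frame (x, y) centers of the detected target object.
    A frame transition counts as "staying" when the object moved less than
    move_threshold (e.g. pixels) since the previous frame, matching the
    notion that staying means the object stops or hardly moves.
    Returns (staying_seconds, moving_seconds).
    """
    staying_frames = moving_frames = 0
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        displacement = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        if displacement < move_threshold:
            staying_frames += 1
        else:
            moving_frames += 1
    return staying_frames / fps, moving_frames / fps

# A track that stays for 30 frames, then moves 5 px per frame for 30 frames.
track = [(0.0, 0.0)] * 31 + [(5.0 * i, 0.0) for i in range(1, 31)]
stay, move = staying_and_moving_time(track, fps=30, move_threshold=1.0)
print(stay, move)  # 1.0 1.0
```

In a case where the target object repeatedly moves and stays, the per-interval durations could likewise be accumulated separately to produce the statistical values (total, mode, average) mentioned above.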
  • Here, in a case where the target object is a person, the summarizing unit 2040 may compute values of various attributes (hereinafter, referred to as attribute values) for the target object and include these attribute values in the features of the target object. An attribute of the person is, for example, an age group, a gender, a nationality, the presence or absence of belongings, whether or not the person is a person with difficulty in walking, or the like. Here, the person with difficulty in walking means a person who walks with assistance from an animal or another person, or a person who walks using an assistance tool. The animal supporting the person with difficulty in walking is a guide dog, for example. The assistance tool used by the person with difficulty in walking is, for example, a crutch or a wheelchair.
  • The attribute values of the age group are various values representing the age group. For example, an age group (10s or 20s) or a category (a child, a young person, an elderly, or the like) representing an age is exemplified. The attribute value of the gender is male or female.
  • The attribute value of the nationality is a value representing a birth country or a living country, or a feature based on the country. For example, the attribute value of the nationality indicates either Japanese or a foreigner. In another example, the attribute value of the nationality indicates a category of countries such as Asia, Europe, or Africa. In another example, the attribute value of the nationality may indicate a language to be used (Japanese, English, Chinese, or the like).
  • The attribute value of the presence or absence of belongings indicates, regarding various types of belongings, whether or not such belongings are carried or used. For example, a walking stick, a wheelchair, a baby carriage, and the like correspond to the belongings. For example, the attribute value of the presence or absence of the walking stick represents whether or not the walking stick is carried or used.
  • The attribute value as to whether or not a person is a person with difficulty in walking represents whether the person is supported by an animal or another person, whether or not the person uses the assistance tool, or the like. For example, whether or not a certain person is a person with difficulty in walking can be determined based on the presence or absence of an animal or another person who supports the person. For example, in a case where the summarizing unit 2040 detects a scene in which a person A is supported by another person B from the video 30, the summarizing unit 2040 determines that the person A is a person with difficulty in walking. In addition, in a case where the summarizing unit 2040 detects a scene in which a person moves together with an animal having a predetermined feature such as a guide dog from the video 30, the summarizing unit 2040 determines that the person is a person with difficulty in walking.
  • In another example, whether or not a person is a person with difficulty in walking can be determined based on the presence or absence of use of the assistance tool. For example, in a case of detecting a person using a predetermined tool such as a crutch or a wheelchair from the video 30, the summarizing unit 2040 determines that the person is a person with difficulty in walking.
  • By using the summary information indicating such attributes of a person, for example, it is possible to determine a person who may need assistance, such as an elderly person, a foreigner, a missing child, or a person with difficulty in walking, and to focus on and surveil that person. In addition, in order to assist such a person, it is possible to take measures such as having staff go to the place at which the person is located.
  • FIG. 9 is a diagram illustrating the summary information in a table format. The table in FIG. 9 is referred to as a table 500. The table 500 has fields of an identifier 502 and a feature 504. The identifier 502 is an identifier of the target object. The feature 504 indicates a feature of the target object identified by the identifier 502. In FIG. 9, the feature 504 includes a staying time 506, a moving time 508, and the like.
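The structure of table 500 — per-target-object records keyed by the identifier 502, with the feature 504 holding values such as the staying time 506 and the moving time 508 — can be modeled in memory as follows. The identifiers, field names, and threshold are illustrative assumptions, not values from the specification.

```python
# Summary information mirroring table 500: identifier -> feature record.
# (Identifiers and field names are hypothetical examples.)
summary_info = {
    "person0001": {"staying_time_s": 120, "moving_time_s": 15},
    "person0002": {"staying_time_s": 3, "moving_time_s": 95},
}

def long_stayers(table, threshold_s):
    """Identifiers of target objects whose staying time exceeds a threshold.

    A query like this supports the use case of picking out a target object
    to be focused on, e.g. a person staying unusually long in one place.
    """
    return [obj_id for obj_id, feat in table.items()
            if feat["staying_time_s"] > threshold_s]

print(long_stayers(summary_info, 60))  # ['person0001']
```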
  • <Timing of Displaying Summary Information: S106>
  • The display control unit 2060 detects that a change in the display state of a certain video 30 in the display system 20 satisfies a predetermined condition (S106). As this predetermined condition, various conditions can be adopted. Hereinafter, some examples of the predetermined condition will be described. In the following description, the video 30-1 is the video 30 generated by the camera 10-1.
  • «Example 1 of Predetermined Condition»
  • The predetermined condition is, for example, a condition that “the video 30 is switched from a state in which the video 30 is not displayed to the display system 20 to a state in which the video 30 is displayed to the display system 20.”
  • FIG. 10 is the first diagram illustrating the timing of displaying the summary information of the video 30 based on the first example of the predetermined condition. The video 30-1 is displayed to the display system 20 between the time t1 and the time t2. On the other hand, the video 30-1 is not displayed to the display system 20 between the time t2 and the time t3. After the time t3, the video 30-1 is displayed to the display system 20 again.
  • The display control unit 2060 causes the display system 20 to display the summary information of the video 30-1 at timing when the predetermined condition is satisfied, that is, at the time t3.
  • Here, the summary information to be displayed to the display system 20 by the display control unit 2060 preferably includes summary information generated during a period between the first time at which the display state of the video 30 is switched from the first display state into the second display state and the second time at which the display state of the video 30 is switched from the second display state into the first display state. For example, in the example in FIG. 10, the first time is the time when a state in which the video 30 is displayed to the display system 20 is switched into a state in which the video 30 is not displayed to the display system 20: that is, at the time t2. On the other hand, the second time is the time when a state in which the video 30 is not displayed to the display system 20 is switched into a state in which the video 30 is displayed to the display system 20: that is, at the time t3. That is, summary information generated during a period between the time t2 and the time t3 is displayed to the display system 20.
  • By displaying the summary information of such a period to the display system 20, summary information of the video 30-1 during the period when the video 30-1 is not displayed to the display system 20, i.e. the period when the user cannot view the video 30-1, is displayed to the display system 20. By watching the summary information at the time t3, the user can easily recognize what happened in the imaging range of the camera 10-1 during the period when the video 30-1 could not be viewed.
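  • The first example of the predetermined condition can be sketched as a small state tracker: it records the first time (t2, hidden) and, at the second time (t3, shown again), reports the period whose summary information should be displayed. The class and method names are illustrative assumptions.

```python
# Minimal sketch of Example 1: when a video 30 switches from "not displayed"
# back to "displayed", report the hidden period [t2, t3] to summarize.
class DisplayTracker:
    def __init__(self):
        self.displayed = True
        self.hidden_since = None

    def on_state_change(self, displayed, now):
        """Returns a (start, end) period to summarize, or None."""
        if self.displayed and not displayed:
            self.hidden_since = now            # first time: t2
        period = None
        if not self.displayed and displayed:
            period = (self.hidden_since, now)  # second time: summarize [t2, t3]
            self.hidden_since = None
        self.displayed = displayed
        return period
```

For example, hiding the video at time 2 and showing it again at time 3 yields the period (2, 3), matching the t2-to-t3 span of FIG. 10.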
  • «Example 2 of Predetermined Condition»
  • The predetermined condition is, for example, a condition that “in the display system 20, a state in which the video 30 is displayed in a relatively small size is switched into a state in which the video 30 is displayed in a relatively large size.”
  • FIG. 11 is a diagram illustrating a scene in which the summary information of the video 30 is displayed based on the second example of the predetermined condition. The video 30-1 is displayed in the display area 24-1 of the display system 20 between the time t1 and the time t2. On the other hand, the video 30-1 is displayed in the display area 24-2 of the display system 20 between the time t2 and the time t3. After the time t3, the video 30-1 is displayed in the display area 24-1 again.
  • Here, the size of the display area 24-1 is larger than the size of the display area 24-2. Therefore, the video 30-1 is displayed to the display system 20 in a relatively small size between the time t2 and the time t3.
  • The display control unit 2060 causes the display system 20 to display the summary information of the video 30-1 at timing when the predetermined condition is satisfied: that is, at the time t3.
  • In this case, for example, the display control unit 2060 causes the display system 20 to display summary information generated for the video 30 during a period between the time t2 and the time t3. The time t2 is the time when a condition that “in the display system 20, a state in which the video 30 is displayed in a relatively large size is switched into a state in which the video 30 is displayed in a relatively small size” is satisfied. Since the period between the time t2 and the time t3 is a period when the video 30-1 is displayed in a small size to the display system 20, it is a period when it is not easy for the user to view the video 30-1. Therefore, by watching the summary information regarding the video 30 during that period at the time t3, the user can easily recognize what happened in the imaging range of the camera 10-1 during the period when it was not easy to view the video 30-1.
  • «Example 3 of Predetermined Condition»
  • The predetermined condition is, for example, a condition that “in the display system 20, a state in which the video 30 is displayed at a position being less likely to come into sight of the user is switched into a state in which the video 30 is displayed at a position being more likely to come into sight of the user”.
  • FIG. 12 is a diagram illustrating a scene in which the summary information of the video 30 is displayed based on the third example of the predetermined condition. The video 30-1 is displayed in the display area 24-1 of the display system 20 between the time t1 and the time t2. On the other hand, the video 30-1 is displayed in the display area 24-2 of the display system 20 between the time t2 and the time t3. After the time t3, the video 30-1 is displayed in the display area 24-1 again.
  • Here, it is assumed that the display area 24-1 is at the position at which a front direction of the user crosses the display system 20. Therefore, the display area 24-2 is farther from the position at which the front direction of the user of the information processing apparatus 2000 crosses the display system 20 than the display area 24-1 is. Thus, it can be said that it is more difficult for the user to view the video 30-1 during the period between the time t2 and the time t3 than during other periods.
  • The display control unit 2060 causes the display system 20 to display the summary information of the video 30-1 at the timing when the predetermined condition is satisfied: that is, at the time t3.
  • In this case, for example, the display control unit 2060 causes the display system 20 to display summary information generated for the video 30 during a period between the time t2 and the time t3. The time t2 is the time when a condition that “in the display system 20, a state in which the video 30 is displayed at a position being more likely to come into sight of the user is switched into a state in which the video 30 is displayed at a position being less likely to come into sight of the user” is satisfied. By generating the summary information in this manner, the user who watches the summary information can easily recognize what happened in the imaging range of the camera 10-1 during the period when it was difficult to view the video 30-1.
  • Here, the “front direction of the user” described above may be, for example, a front direction of the user's face, a front direction of the user's body, or a gaze direction of the user. In a case where the position of the user is fixed (for example, a case where the position of a chair on which the user sits is fixed), a relationship between each of the display areas 24 and the position at which the front direction of the user crosses the display system 20 can be determined in advance.
  • In another example, the summarizing unit 2040 may determine the front direction of the user by analyzing an image generated by a camera which images the user. In this manner, the summarizing unit 2040 can compute the relationship between each of the display areas 24 and the position at which the front direction of the user crosses the display system 20. Note that, the camera which images the user is provided in the vicinity of the display system 20, for example. Here, as a specific method of determining the front direction or the like of the user's face described above, an existing method can be used.
  • In addition, the degree of how likely it comes into sight of the user may be associated with each of the display areas 24 in advance. The association information is stored in advance in a storage device accessible from the display control unit 2060.
  • «Other Example»
  • Timing when the summary information is displayed to the display system 20 may not be limited to the timing when the display state of the video 30 satisfies the predetermined condition. For example, the information processing apparatus 2000 may display the summary information of the video 30 to the display system 20 in response to receiving an input from the user to select the video 30 displayed to the display system 20.
  • <Display State of Summary Information: S108>
  • When a change in the display state of the video 30 satisfies the predetermined condition, the display control unit 2060 causes the display system 20 to display the summary information of the video 30 (S108). As the display state of the summary information, various states can be adopted. Hereinafter, examples of specific display states of the summary information will be described. Note that, in each of the following examples, the summary information is generated for the video 30-1 generated by the camera 10-1.
  • FIG. 13 is the first diagram illustrating the display state of the summary information. In this example, the video 30-1 generated in real time by the camera 10-1 (so-called live video) is displayed in the display area 24 of the display system 20. The summary information of the video 30-1 is also displayed in the display area 24.
  • More specifically, the summary information of the video 30-1 is superimposed and displayed on the live video generated by the camera 10-1. The summary information of FIG. 13 represents that a target object 40 captured in the video 30-1 acts in order of (1) staying for 10 seconds, (2) moving for 2 seconds, (3) staying for 13 seconds, and (4) moving for 1 second. In addition, an arrow represents a trace of movement of the target object 40.
  • Note that, in a case where a plurality of target objects 40 are included in the video 30-1, the summary information is displayed for each of the target objects 40.
  • FIG. 14 is the second diagram illustrating the display state of the summary information. In this example, the live video generated by the camera 10-1 is displayed in the display area 24-1. In addition, summary information 50-1 of the video 30-1 is displayed in the display area 24-2 instead of the display area 24-1. That is, in this example, the display area 24 in which the summary information of the video 30-1 is displayed is different from the display area 24 in which the video 30-1 is displayed.
  • Here, it is assumed that summary information on each of a plurality of target objects 40 is generated for the video 30-1. In this case, a plurality of pieces of summary information may be displayed in one display area 24 (the display area 24-2 in FIG. 14) or may be displayed in different display areas 24. Note that, in the former case, the plurality of pieces of summary information may be displayed at the same time or may be displayed in order.
  • Note that, the summary information of the video 30-1 may be displayed to the display system 20 at timing when the video 30-1 is not displayed to the display system 20. For example, the display control unit 2060 displays the summary information of the video 30-1 in the display area 24-1 of the display device 22 during a predetermined period from the timing when the predetermined condition described above for the video 30-1 is satisfied. Meanwhile, the display control unit 2060 does not display the video 30 to the display system 20. After the predetermined period elapses, the display control unit 2060 displays the video 30 in the display area 24-1.
  • Note that, in each of the examples described above, the summary information is represented by still data such as a character and a figure. However, the summary information may be generated as video data. In this case, for example, the summary information of the video 30 is generated by omitting some of frames of the video 30. For example, regarding a period when the target object stops, the summarizing unit 2040 omits one or more frames other than the frame in which the target object starts stopping and the frame in which the target object ends stopping. In another example, regarding frames during a period when the target object moves, the summarizing unit 2040 omits one or more frames other than the frame in which the target object starts moving and the frame in which the target object ends moving.
  • Note that, when omitting some of the frames of the video 30, it is preferable not to omit a frame including characteristic movement of the target object. For example, in a case where the target object is a person, it is preferable not to omit frames during a period when the person is in contact with another person or takes a look around.
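  • The frame-omission rule described above can be sketched as follows: within a run of consecutive frames sharing the same state (staying or moving), only the first and last frames of the run are kept, and frames flagged as characteristic (e.g. contact with another person) are never omitted. The state labels and function name are assumptions for illustration.

```python
# Sketch of generating a summary video by omitting frames: keep the frame in
# which the target object starts and ends each stay/move run, plus any
# characteristic frames.
def summarize_frames(states, characteristic=()):
    """states: per-frame labels such as "stay"/"move".
    Returns the indices of the frames to keep."""
    keep = []
    for i, s in enumerate(states):
        run_start = i == 0 or states[i - 1] != s
        run_end = i == len(states) - 1 or states[i + 1] != s
        if run_start or run_end or i in characteristic:
            keep.append(i)
    return keep
```

For a sequence of three staying frames followed by two moving frames, the middle staying frame is the only one omitted.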
  • <Timing of Generating Summary Information: S104>
  • The summarizing unit 2040 generates summary information of the video 30 (S104). There are various timings at which the summarizing unit 2040 can generate the summary information. Hereinafter, some examples of the timing will be described.
  • «Timing 1 of Generating Summary Information»
  • For example, the summarizing unit 2040 repeatedly analyzes the video 30 at a predetermined cycle to individually generate summary information for a plurality of time ranges of the video 30. FIG. 15 is the first diagram illustrating a scene of generating the summary information. In this example, the summarizing unit 2040 analyzes the video 30 from the time t1 to the time t2 at the time t2, and generates the summary information 50-1 based on the result. In addition, the summarizing unit 2040 analyzes the video 30 from the time t2 to the time t3 at the time t3, and generates summary information 50-2 based on the result.
  • Here, the target object stays for 20 seconds from the time t1 to the time t2, and the target object stays for 30 seconds from the time t2 to the time t3. Therefore, the summarizing unit 2040 respectively generates the summary information 50-1 indicating “staying time: 20 seconds” and the summary information 50-2 indicating “staying time: 30 seconds”.
  • Note that, the display control unit 2060 selects summary information 50 to be displayed to the display system 20 from a plurality of pieces of summary information 50 periodically generated for the video 30 as described above. FIG. 16 is a diagram illustrating a scene in which the display control unit 2060 selects the summary information 50 to be displayed to the display system 20.
  • In this example, the display control unit 2060 causes the display system 20 to display the summary information 50 on the video 30 between time T1 and time T2. The summary information 50 on the video 30 between the time T1 and the time T2 is the summary information 50-2 and summary information 50-3. Therefore, the display control unit 2060 causes the display system 20 to display the summary information 50-2 and the summary information 50-3.
  • However, a portion (the time T1 to the time t2) of the period from the time t1 to the time t2, which is the target period of the summary information 50-1, overlaps with the period from the time T1 to the time T2. Thus, the display control unit 2060 may cause the display system 20 to display the summary information 50-1 in addition to the summary information 50-2 and the summary information 50-3.
  • In a case of selecting a plurality of pieces of summary information 50 in this manner, the display control unit 2060 may cause the display system 20 to individually display the plurality of pieces of summary information 50, or may perform a process (for example, a statistical process) of integrating the plurality of pieces of summary information 50 into one and cause the display system 20 to display the single piece of summary information 50 generated as a result.
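  • The selection illustrated in FIG. 16 can be sketched as picking the periodically generated pieces of summary information 50 whose target periods overlap the display period [T1, T2]. The tuple layout is an assumption; periods are treated as half-open intervals.

```python
# Sketch of selecting the summary information 50 to display (FIG. 16): a
# piece is selected when its target period overlaps the period [T1, T2].
def select_summaries(summaries, T1, T2):
    """summaries: list of (start, end, payload) with half-open periods."""
    return [s for s in summaries if s[0] < T2 and s[1] > T1]
```

With target periods t1-t2, t2-t3, t3-t4 and T1 falling inside the first period, all three pieces are selected, matching the note that the partially overlapping summary information 50-1 may also be displayed.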
  • Note that, generation of the summary information that is periodically performed may keep being repeatedly executed (for example, from the time when the information processing apparatus 2000 is activated) or may be started from a specified timing. For example, the specified timing is “the first time when the display state of the video 30 is switched from the first display state to the second display state” described above. More specifically, it may be “a time when a state in which the video 30 is displayed to the display system 20 is switched into a state in which the video 30 is not displayed to the display system 20”, “a time when in the display system 20, a state in which the video 30 is displayed in a relatively large size is switched into a state in which the video 30 is displayed in a relatively small size”, or “a time when in the display system 20, a state in which the video 30 is displayed at a position at which it is more likely to come into sight of the user is switched into a state in which the video 30 is displayed at a position at which it is less likely to come into sight of the user.”
  • «Timing 2 of Generating Summary Information»
  • Also in this example, the summarizing unit 2040 repeatedly analyzes the video 30 at a predetermined cycle. However, the summarizing unit 2040 repeatedly updates one piece of summary information 50 based on the analysis result of the video 30.
  • FIG. 17 is the first diagram illustrating a scene of updating the summary information. In this example, the summarizing unit 2040 analyzes the video 30 from the time t1 to the time t2 at the time t2, and generates the summary information 50 based on the result. After then, the summarizing unit 2040 analyzes the video 30 from the time t2 to the time t3 at the time t3, and updates the summary information 50 based on the result. It is assumed that a staying time of the target object between the time t1 and the time t2 is 20 seconds and the staying time of the target object between the time t2 and the time t3 is 30 seconds.
  • In FIG. 17, the summary information 50 is updated by overwriting. Thus, the summarizing unit 2040 overwrites contents of the summary information 50 indicating “staying time: 20 seconds” with information indicating “staying time: 30 seconds”. In another example, the summarizing unit 2040 may perform a process of overwriting the summary information 50 with 25 seconds, which is an average value of the staying times during the two periods.
  • FIG. 18 is the second diagram illustrating the scene of updating the summary information. FIG. 18 illustrates the same contents as in FIG. 17 except for the updating method.
  • In FIG. 18, the summary information 50 is updated by integrating a new analysis result into the previous analysis result. Thus, the summarizing unit 2040 adds information indicating “staying time: 30 seconds” to the summary information 50.
  • «Timing 3 of Generating Summary Information»
  • The summarizing unit 2040 may generate summary information to be displayed at timing when the predetermined condition described above (the condition for displaying the summary information of the video 30 to the display system 20) is satisfied. In this case, for example, the summarizing unit 2040 generates the summary information for the video 30 during a period between a predetermined time before the above-mentioned timing and the above-mentioned timing.
  • Note that, in a case where the first computer 1000 for realizing the summarizing unit 2040 is the camera 10, when the predetermined condition is satisfied, the display control unit 2060 (the second computer 1000) may transmit a request of generating summary information to the camera 10. For example, at timing when receiving the request, the summarizing unit 2040 generates the summary information for the video 30 during a period between a predetermined time before the timing and the timing. The display control unit 2060 obtains the summary information generated by the camera 10.
  • Example Embodiment 2
  • The information processing apparatus 2000 according to Example Embodiment 2 is illustrated in FIG. 1 in the same manner as the information processing apparatus 2000 of Example Embodiment 1. The information processing apparatus 2000 according to Example Embodiment 2 has the same functions as the information processing apparatus 2000 of Example Embodiment 1 except for items to be described below.
  • In a case of causing the display system 20 to display a plurality of pieces of summary information respectively generated from different videos 30, the display control unit 2060 according to Example Embodiment 2 causes the display system 20 to display the summary information in consideration of a priority of each of the pieces of summary information.
  • FIG. 19 is a diagram illustrating a supposed environment of the information processing apparatus 2000 according to Example Embodiment 2. In this example, the summary information of the video 30 is displayed in another display area 24 different from the display area 24 in which the video 30 is displayed. More specifically, the display system 20 includes one display device 22, and the display device 22 includes three display areas 24-1 to 24-3. The video 30-1 and the video 30-2 are alternately displayed in the display area 24-1, a video 30-3 and a video 30-4 are alternately displayed in the display area 24-2, and any one of the pieces of summary information is displayed in the display area 24-3. It is assumed that a priority of the summary information of the video 30-1 is higher than a priority of the summary information of the video 30-3.
  • In this case, it is assumed that both of a change in the display state of the video 30-1 and a change in the display state of the video 30-3 satisfy the predetermined condition. Here, only one of the summary information of the video 30-1 and the summary information of the video 30-3 is displayed in the display area 24-3.
  • Therefore, the display control unit 2060 displays the summary information of the video 30-1 having a higher priority among the summary information of the video 30-1 and the summary information of the video 30-3 in the display area 24-3. In this case, the display control unit 2060 may display only the summary information of the video 30-1 in the display area 24-3, or may display the summary information of the video 30-1 in the display area 24-3 first and then display the summary information of the video 30-3 in the display area 24-3.
  • Here, the process of displaying the summary information based on the priority is necessary in a case where the number of pieces of the summary information to be displayed is larger than the number of display areas to be used for displaying the summary information. Such a case is not limited to the case illustrated by using FIG. 19.
  • For example, it is assumed that the display system 20 has a layout illustrated in FIG. 7. It is assumed that two videos 30 are alternately displayed in each of display areas 24-1 to 24-9. On the other hand, the summary information of one of the videos 30 is displayed in the display area 24-10.
  • In this case, when each of changes in the display states of two or more videos 30 satisfies the predetermined condition, the number of pieces of summary information to be displayed is two or more, whereas the number of display areas which can be used for displaying the summary information is one. Thus, the number of pieces of summary information to be displayed may be larger than the number of display areas 24 which can be used for displaying the summary information. Therefore, the display control unit 2060 determines the summary information to be displayed in the display area 24-10 according to the priority of the summary information to be displayed.
  • <Method of Determining Priority of Summary Information>
  • Various methods can be used to determine the priority of the summary information. Hereinafter, examples of the method of determining the priority of the summary information will be described.
  • «Method 1 of Determining Priority»
  • A priority is set for each of the cameras 10. The priority of the camera 10 is set as a priority of summary information of the video 30 generated by the camera 10. For example, a priority of the summary information of the video 30-1 is a priority associated with the camera 10-1 which generates the video 30-1. Hereinafter, information indicating the priority of the camera 10 is referred to as priority information.
  • FIG. 20 is a diagram illustrating the priority information in a table format. The table in FIG. 20 is referred to as a table 600. The table 600 includes a camera identifier 602 and a priority 604. The camera identifier 602 represents an identifier of the camera 10. The priority 604 indicates a priority associated with the camera 10.
  • The priority information (for example, the table 600) is stored in advance in a storage device accessible from the display control unit 2060. This storage device may be provided inside the information processing apparatus 2000 or may be provided outside the information processing apparatus 2000.
  • «Method 2 of Determining Priority»
  • A priority of summary information may be determined based on contents of the summary information. For example, the display control unit 2060 handles any numerical value indicated in the summary information as a priority of summary information. For example, in a case where the summary information indicates a staying time of the target object, the display control unit 2060 handles a value of the staying time as the priority of the summary information. In this manner, as the summary information of the video 30 has a longer staying time of the target object, the priority becomes higher. However, the numerical value handled as a priority is not limited to the staying time.
  • In another example, the display control unit 2060 computes a score of the summary information by using a rule (for example, a function) for computing the score from the contents of the summary information, and handles the score as the priority of the summary information. Hereinafter, an example of a rule associating a staying time with a score of summary information will be described.
  • FIG. 21 is a diagram illustrating a relationship between the staying time and a priority of the summary information. The horizontal axis indicates the staying time of a person captured in the video 30, and the vertical axis indicates the score of the summary information of the video 30. Here, a maximum value of the score is 100.
  • In this example, the score of the summary information is the maximum when the person starts staying. As the staying time becomes longer, the score of the summary information becomes smaller.
  • However, at timing when the staying time reaches a predetermined value t1, the score of the summary information increases. In this manner, the summary information of the person who stays for a time longer than the predetermined value is easily displayed to the display system 20.
  • Note that, the rule for computing the score of the summary information may include a rule for increasing the score in response to occurrence of a predetermined event. For example, the predetermined event is contact with another person.
  • FIG. 22 is a diagram illustrating a temporal change in a score of summary information. The rule for computing the score of the summary information in FIG. 22 is defined by a combination of (1) a rule illustrated in FIG. 21 and (2) a rule for increasing the score according to contact with another person.
  • In this example, a person who stays is in contact with another person at the time t2. Thus, the score of the summary information increases at the time t2.
  • The priority of the summary information may be computed by using each of the scores computed from a plurality of pieces of information included in the summary information. For example, the display control unit 2060 computes the priority of the summary information by using the following Equation (1).
  • p = Σ_i w_i · f_i(d_i)   (1)
  • In Equation (1), p is the priority of the summary information. w_i is a weight given to each piece of information i (the staying time and the like) included in the summary information. d_i is the value of the information i included in the summary information. f_i is a function for computing the score of the summary information for the information i.
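  • A direct reading of Equation (1): the priority p is the weighted sum, over the pieces of information i, of the scores f_i(d_i). The concrete weights and score functions in the usage example below are illustrative assumptions.

```python
# Sketch of Equation (1): priority = sum over i of w_i * f_i(d_i).
def priority(items):
    """items: list of (w_i, f_i, d_i) triples."""
    return sum(w * f(d) for w, f, d in items)
```

For example, with two pieces of information weighted 0.5 and 1.0 and simple score functions, the priority is just the weighted sum of the two scores.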
  • <Determination of Display Position Based on Priority>
  • The display control unit 2060 may determine a display position of summary information based on a priority of the summary information. For example, a priority is associated in advance with each of the display areas 24 for displaying the summary information. The display control unit 2060 matches the summary information with the display area 24 so that the summary information having a higher priority is displayed in the display area 24 having a higher priority. Here, it is preferable that the display area 24 which the user more easily watches has a higher priority. The priority of the display area 24 is stored in advance in a storage device accessible from the display control unit 2060.
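  • The matching described above can be sketched by sorting both the pieces of summary information and the display areas 24 by priority and pairing them in order, so that the highest-priority summary information lands in the highest-priority (most easily watched) display area. The data layout is an assumption for this sketch.

```python
# Sketch of matching summary information with display areas 24 by priority.
def assign_to_areas(summaries, areas):
    """summaries: list of (priority, name); areas: list of (priority, area_id).
    Returns {area_id: name}, pairing highest priorities first."""
    s = sorted(summaries, key=lambda x: x[0], reverse=True)
    a = sorted(areas, key=lambda x: x[0], reverse=True)
    return {area_id: name for (_, area_id), (_, name) in zip(a, s)}
```

If there are more pieces of summary information than display areas, the lowest-priority pieces are simply not assigned, consistent with the priority-based selection described earlier.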
  • <Hardware Configuration Example>
  • The information processing apparatus 2000 according to Example Embodiment 2 is realized by using the computer 1000 in the same manner as Example Embodiment 1 (see FIG. 4). In the present example embodiment, each of the program modules stored in the storage 1080 described above further includes a program for realizing each of the function described in the present example embodiment.
  • Advantageous Effect
  • When causing the display system 20 to display pieces of summary information respectively generated for different videos 30, the information processing apparatus 2000 according to the present example embodiment determines the display method of the summary information based on the priority of the summary information. In this manner, for example, it is possible to ensure that the user of the information processing apparatus 2000 more easily watches the summary information having a higher priority. Therefore, it is possible to more reliably prevent important information from being overlooked. In addition, the convenience of the information processing apparatus 2000 for the user is improved.
  • Although the example embodiments of the present invention are described with reference to the drawings, these are examples of the present invention, and a combination of the respective example embodiments or various other configurations other than the example embodiment described above may be adopted.
  • A part or all of the example embodiments may also be described as the following appendixes, but are not limited to the following.
  • 1. An information processing apparatus comprising:
  • a summarizing unit which obtains videos and generates summary information of the obtained video by performing a summarizing process on the obtained video, each of a plurality of cameras generating the video; and
  • a display control unit which causes a display unit to display the video,
  • wherein, in response to that a change in a display state of the video on the display unit satisfies a predetermined condition, the display control unit causes the display unit to display the summary information of that video.
  • 2. The information processing apparatus according to 1,
  • wherein the display control unit causes the display unit to display summary information of a first video in response to a display state of the first video being switched from a state of not being displayed on the display unit into a state of being displayed on the display unit.
  • 3. The information processing apparatus according to 1,
  • wherein the display control unit causes the display unit to display summary information of a first video in response to a display state of the first video being switched from a state of being displayed in a first size on the display unit into a state of being displayed in a second size on the display unit, the second size being larger than the first size.
  • 4. The information processing apparatus according to any one of 1 to 3,
  • wherein the summarizing unit causes the display unit to display summary information of a first video generated during a period between a first time when the display state of the first video is switched from a first display state into a second display state on the display unit and a second time when the display state of the first video is switched from the second display state into the first display state on the display unit.
  • 5. The information processing apparatus according to any one of 1 to 4,
  • wherein in a case where a change in the display state of a first video generated by a first camera satisfies the predetermined condition and a change in the display state of a second video generated by a second camera satisfies the predetermined condition, the display control unit causes the display unit to display the summary information having a higher priority between the summary information of the first video and the summary information of the second video.
  • 6. The information processing apparatus according to 5,
  • wherein the display control unit obtains a priority of summary information of each of the videos from a storage unit which stores the priority of the summary information of the video for each of the videos.
  • 7. The information processing apparatus according to 5,
  • wherein the display control unit computes the priority of the summary information based on contents of each piece of the summary information.
  • 8. The information processing apparatus according to any one of 1 to 7,
  • wherein the display control unit displays the video and the summary information generated during a past period of the video in display areas different from each other on the display unit.
  • 9. The information processing apparatus according to any one of 1 to 7,
  • wherein the display control unit superimposes the summary information generated during a past period of the video on the video and causes the display unit to display the superimposed video.
  • 10. The information processing apparatus according to any one of 1 to 8,
  • wherein the display control unit causes the display unit not to display the video while the display unit displays the summary information of the video.
  • 11. A control method executed by a computer, the control method comprising:
  • a summarizing step of obtaining videos, each of which is generated by one of a plurality of cameras, and generating summary information of each obtained video by performing a summarizing process on the obtained video; and
  • a display control step of causing a display unit to display the video,
  • wherein in the display control step, in response to a change in a display state of the video on the display unit satisfying a predetermined condition, the display unit displays the summary information of that video.
  • 12. The control method according to 11,
  • wherein in the display control step, the display unit displays summary information of a first video in response to a display state of the first video being switched from a state of not being displayed on the display unit into a state of being displayed on the display unit.
  • 13. The control method according to 11,
  • wherein in the display control step, the display unit displays summary information of a first video in response to a display state of the first video being switched from a state of being displayed in a first size on the display unit into a state of being displayed in a second size on the display unit, the second size being larger than the first size.
  • 14. The control method according to any one of 11 to 13,
  • wherein in the summarizing step, the display unit displays the summary information of a first video generated during a period between a first time when the display state of the first video is switched from a first display state into a second display state on the display unit and a second time when the display state of the first video is switched from the second display state into the first display state on the display unit.
  • 15. The control method according to any one of 11 to 14,
  • wherein in a case where a change in the display state of a first video generated by a first camera satisfies the predetermined condition and a change in the display state of a second video generated by a second camera satisfies the predetermined condition, in the display control step, the display unit displays the summary information having a higher priority between the summary information of the first video and the summary information of the second video.
  • 16. The control method according to 15,
  • wherein in the display control step, a priority of summary information of each of the videos is obtained from a storage unit which stores the priority of the summary information of the video for each of the videos.
  • 17. The control method according to 15,
  • wherein in the display control step, the priority of the summary information is computed based on contents of each piece of the summary information.
  • 18. The control method according to any one of 11 to 17,
  • wherein in the display control step, the video and the summary information generated during a past period of the video are displayed in display areas different from each other on the display unit.
  • 19. The control method according to any one of 11 to 17,
  • wherein in the display control step, the summary information generated during a past period of the video is superimposed on the video and the display unit displays the superimposed video.
  • 20. The control method according to any one of 11 to 18,
  • wherein in the display control step, the display unit does not display the video while the display unit displays the summary information of the video.
  • 21. A program causing a computer to execute each step of the control method according to any one of 11 to 20.
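The display-control logic described in appendix items 1 to 5 above can be sketched in code. The following is a minimal, illustrative sketch only — all identifiers (`VideoFeed`, `state_change_satisfies_condition`, `select_summary_to_display`) are hypothetical names chosen for this example and do not appear in the patent text. It models the "predetermined condition" as a video becoming visible (item 2) or being enlarged (item 3), and picks the highest-priority summary when multiple videos qualify at once (item 5):

```python
# Illustrative sketch of appendix items 1-5; names are hypothetical,
# not taken from the patent. A display state is modeled as a
# (displayed, size) pair.
from dataclasses import dataclass, field

@dataclass
class VideoFeed:
    camera_id: int
    # Summary entries generated for this video, each carrying a priority.
    summaries: list = field(default_factory=list)

def state_change_satisfies_condition(old, new):
    """Predetermined condition (items 2-3): the video becomes visible,
    or is switched from a smaller display size to a larger one."""
    became_visible = (not old[0]) and new[0]
    enlarged = old[0] and new[0] and new[1] > old[1]
    return became_visible or enlarged

def select_summary_to_display(qualifying_feeds):
    """Among videos whose display-state change qualifies, return the
    summary with the highest priority (item 5)."""
    best = None
    for feed in qualifying_feeds:
        for summary in feed.summaries:
            if best is None or summary["priority"] > best["priority"]:
                best = summary
    return best

# Example: two videos qualify simultaneously; the higher-priority
# summary is the one displayed.
feed_a = VideoFeed(1, summaries=[{"text": "person entered", "priority": 2}])
feed_b = VideoFeed(2, summaries=[{"text": "object left behind", "priority": 5}])
chosen = select_summary_to_display([feed_a, feed_b])
```

Under these assumptions, enlarging a tile in a multi-camera grid view would trigger the same summary display as switching a hidden camera on screen, which matches the parallel structure of items 2 and 3.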

Claims (20)

1. A data processing system comprising:
at least one memory configured to store a computer program; and
at least one processor configured to execute the computer program to perform:
generating a feature of a movement of a target object; and
displaying information regarding the feature of the movement of the target object together with a trajectory of the target object on a screen.
2. The data processing system according to claim 1, wherein
the at least one processor is configured to execute the computer program to perform:
displaying, as the information regarding the feature of the movement of the target object, a moving time and a staying time of the target object.
3. The data processing system according to claim 2, wherein
the at least one processor is configured to execute the computer program to perform:
displaying, as the information regarding the feature of the movement of the target object, a symbol indicating a position at which the target object temporarily stayed.
4. The data processing system according to claim 3, wherein
the at least one processor is configured to execute the computer program to perform:
displaying a moving velocity of the target object.
5. The data processing system according to claim 1, wherein
the at least one processor is configured to execute the computer program to perform:
determining a higher priority for the target object whose staying time is longer; and
selecting, in accordance with the priority, information regarding the feature of the movement of the target object to be displayed.
6. The data processing system according to claim 5, wherein
the at least one processor is configured to execute the computer program to perform:
reducing the priority of the target object for which a predetermined event occurred.
7. The data processing system according to claim 6, wherein
the at least one processor is configured to execute the computer program to perform:
reducing the priority of the target object when the target object comes into contact with another target object.
8. The data processing system according to claim 1, wherein
the at least one processor is configured to execute the computer program to perform:
displaying summary information of a video in response to a display state being switched from a state in which the video is not being displayed into a state in which the video is being displayed.
9. The data processing system according to claim 1, wherein
the at least one processor is configured to execute the computer program to perform:
displaying summary information of a video in response to a display state being switched from a state in which the video is being displayed in a first size into a state in which the video is being displayed in a second size, the second size being larger than the first size.
10. The data processing system according to claim 1, wherein
the at least one processor is configured to execute the computer program to perform:
displaying summary information of a video generated during a period between a first time when the video is switched from a first display state into a second display state and a second time when the video is switched from the second display state into the first display state.
11. A data processing method comprising:
generating a feature of a movement of a target object; and
displaying information regarding the feature of the movement of the target object together with a trajectory of the target object on one image.
12. The data processing method according to claim 11, comprising:
displaying, as the information regarding the feature of the movement of the target object, a moving time and a staying time of the target object.
13. The data processing method according to claim 12, comprising:
displaying a symbol indicating a position at which the target object temporarily stayed.
14. The data processing method according to claim 13, comprising:
displaying a moving velocity of the target object.
15. The data processing method according to claim 11, comprising:
determining a higher priority for the target object whose staying time is longer; and
selecting, in accordance with the priority, information regarding the feature of the movement of the target object to be displayed.
16. A non-transitory computer-readable storage medium storing a computer program causing a computer to execute:
generating a feature of a movement of a target object; and
displaying information regarding the feature of the movement of the target object together with a trajectory of the target object on one image.
17. The non-transitory computer-readable storage medium according to claim 16, wherein
the computer program causes the computer to execute:
displaying, as the information regarding the feature of the movement of the target object, a moving time and a staying time of the target object.
18. The non-transitory computer-readable storage medium according to claim 17, wherein
the computer program causes the computer to execute:
displaying a symbol indicating a position at which the target object temporarily stayed.
19. The non-transitory computer-readable storage medium according to claim 18, wherein
the computer program causes the computer to execute:
displaying a moving velocity of the target object.
20. The non-transitory computer-readable storage medium according to claim 16, wherein
the computer program causes the computer to execute:
determining a higher priority for the target object whose staying time is longer; and
selecting, in accordance with the priority, information regarding the feature of the movement of the target object to be displayed.
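The priority rules of claims 5 to 7 can be illustrated with a short sketch. This is an assumed, simplified model — the function names (`compute_priority`, `select_features_to_display`) and the multiplicative penalty are hypothetical choices for illustration, not taken from the claims. It encodes the two stated rules: a longer staying time yields a higher priority (claim 5), and the priority is reduced when a predetermined event occurs, such as the object coming into contact with another object (claims 6-7):

```python
# Illustrative sketch of claims 5-7; identifiers and the penalty
# scheme are hypothetical assumptions, not from the claims.

def compute_priority(staying_time, contacted_other=False, penalty=0.5):
    """Longer staying time -> higher priority (claim 5).
    A contact event reduces the priority (claims 6-7)."""
    priority = float(staying_time)
    if contacted_other:
        priority *= penalty  # reduce priority on the predetermined event
    return priority

def select_features_to_display(objects, limit=1):
    """Select, in accordance with the priority, the movement features
    of the top-ranked target objects to display (claim 5)."""
    ranked = sorted(
        objects,
        key=lambda o: compute_priority(o["staying_time"],
                                       o.get("contacted", False)),
        reverse=True,
    )
    return [o["feature"] for o in ranked[:limit]]

# Example: a loitering object outranks one that stayed equally long
# but contacted another object.
objects = [
    {"staying_time": 10, "feature": "loitering trajectory"},
    {"staying_time": 10, "contacted": True, "feature": "walking trajectory"},
]
shown = select_features_to_display(objects)
```

The key design point this sketch mirrors is that priority is derived from the staying time and only adjusted downward by events, so an object that merely passes through never outranks one that lingers.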
US16/674,082 2016-11-07 2019-11-05 Information processing apparatus, control method, and program Abandoned US20200074184A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US16/674,082 US20200074184A1 (en) 2016-11-07 2019-11-05 Information processing apparatus, control method, and program
US18/074,700 US20230103243A1 (en) 2016-11-07 2022-12-05 Information processing apparatus, control method, and program
US18/241,760 US20230410510A1 (en) 2016-11-07 2023-09-01 Information processing apparatus, control method, and program
US18/243,357 US20230419665A1 (en) 2016-11-07 2023-09-07 Information processing apparatus, control method, and program

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
PCT/JP2016/082950 WO2018083793A1 (en) 2016-11-07 2016-11-07 Information processing device, control method, and program
US201916347262A 2019-05-03 2019-05-03
US16/674,082 US20200074184A1 (en) 2016-11-07 2019-11-05 Information processing apparatus, control method, and program

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2016/082950 Continuation WO2018083793A1 (en) 2016-11-07 2016-11-07 Information processing device, control method, and program
US16/347,262 Continuation US11532160B2 (en) 2016-11-07 2016-11-07 Information processing apparatus, control method, and program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/074,700 Continuation US20230103243A1 (en) 2016-11-07 2022-12-05 Information processing apparatus, control method, and program

Publications (1)

Publication Number Publication Date
US20200074184A1 true US20200074184A1 (en) 2020-03-05

Family

ID=62076745

Family Applications (5)

Application Number Title Priority Date Filing Date
US16/347,262 Active 2037-01-22 US11532160B2 (en) 2016-11-07 2016-11-07 Information processing apparatus, control method, and program
US16/674,082 Abandoned US20200074184A1 (en) 2016-11-07 2019-11-05 Information processing apparatus, control method, and program
US18/074,700 Pending US20230103243A1 (en) 2016-11-07 2022-12-05 Information processing apparatus, control method, and program
US18/241,760 Pending US20230410510A1 (en) 2016-11-07 2023-09-01 Information processing apparatus, control method, and program
US18/243,357 Pending US20230419665A1 (en) 2016-11-07 2023-09-07 Information processing apparatus, control method, and program

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/347,262 Active 2037-01-22 US11532160B2 (en) 2016-11-07 2016-11-07 Information processing apparatus, control method, and program

Family Applications After (3)

Application Number Title Priority Date Filing Date
US18/074,700 Pending US20230103243A1 (en) 2016-11-07 2022-12-05 Information processing apparatus, control method, and program
US18/241,760 Pending US20230410510A1 (en) 2016-11-07 2023-09-01 Information processing apparatus, control method, and program
US18/243,357 Pending US20230419665A1 (en) 2016-11-07 2023-09-07 Information processing apparatus, control method, and program

Country Status (3)

Country Link
US (5) US11532160B2 (en)
JP (1) JP6740539B2 (en)
WO (1) WO2018083793A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10740446B2 (en) * 2017-08-24 2020-08-11 International Business Machines Corporation Methods and systems for remote sensing device control based on facial information

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10796163B2 (en) * 2014-03-07 2020-10-06 Eagle Eye Networks, Inc. Surveillance video activity summary system and access method of operation (VASSAM)
JP7321050B2 (en) * 2019-10-15 2023-08-04 水ing株式会社 System for inspecting water supply facilities

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110001824A1 (en) * 2009-07-03 2011-01-06 Samsung Techwin Co., Ltd. Sensing apparatus, event sensing method, and photographing system
US20140146998A1 (en) * 2012-11-28 2014-05-29 Dieter Wieser Systems and methods to classify moving airplanes in airports
US20150356840A1 (en) * 2013-02-06 2015-12-10 Sony Corporation Information processing apparatus, information processing method, program, and information processing system
US9378632B2 (en) * 2000-10-24 2016-06-28 Avigilon Fortress Corporation Video surveillance system employing video primitives
US20170358103A1 (en) * 2016-06-09 2017-12-14 California Institute Of Technology Systems and Methods for Tracking Moving Objects
US20180285633A1 (en) * 2017-03-31 2018-10-04 Avigilon Corporation Unusual motion detection method and system
US10645350B2 (en) * 2000-10-24 2020-05-05 Avigilon Fortress Corporation Video analytic rule detection system and method

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0644470A (en) 1992-07-24 1994-02-18 Ibiden Co Ltd Security system
JP2000308042A (en) * 1999-04-21 2000-11-02 Nippon Telegr & Teleph Corp <Ntt> Device and method for displaying monitor video and recording medium recording program of the method
JP3758511B2 (en) 2000-02-28 2006-03-22 株式会社日立国際電気 Object detection apparatus and object detection program
JP2002157599A (en) * 2000-11-17 2002-05-31 Nippon Telegr & Teleph Corp <Ntt> Method for detecting and recognizing object, recording medium recording its program and object monitoring and tracking device
JP2004078762A (en) * 2002-08-21 2004-03-11 Telecommunication Advancement Organization Of Japan Device and method for retrieving image
JP2006202062A (en) 2005-01-20 2006-08-03 Toshiba Corp Facility monitoring system
JP5213123B2 (en) * 2009-01-15 2013-06-19 株式会社日立製作所 Video output method and video output device
JP5818445B2 (en) * 2011-01-26 2015-11-18 京セラ株式会社 Mobile terminal device
JP5738028B2 (en) 2011-03-25 2015-06-17 セコム株式会社 Video processing device
RU2015106938A (en) * 2012-07-31 2016-09-20 Нек Корпорейшн IMAGE PROCESSING SYSTEM, IMAGE PROCESSING METHOD AND PROGRAM
JP5994612B2 (en) * 2012-12-04 2016-09-21 富士通株式会社 Video editing apparatus, video editing method, and video editing program
KR102070924B1 (en) * 2014-01-20 2020-01-29 한화테크윈 주식회사 Image Recoding System
JP6074395B2 (en) 2014-09-12 2017-02-01 富士フイルム株式会社 Content management system, managed content generation method, managed content playback method, program, and recording medium
JP5999394B2 (en) * 2015-02-20 2016-09-28 パナソニックIpマネジメント株式会社 Tracking support device, tracking support system, and tracking support method
US10219026B2 (en) * 2015-08-26 2019-02-26 Lg Electronics Inc. Mobile terminal and method for playback of a multi-view video
KR20180075506A (en) * 2015-10-27 2018-07-04 소니 주식회사 Information processing apparatus, information processing method, and program
WO2017142143A1 (en) * 2016-02-19 2017-08-24 Samsung Electronics Co., Ltd. Method and apparatus for providing summary information of a video



Also Published As

Publication number Publication date
US20230419665A1 (en) 2023-12-28
US20190286913A1 (en) 2019-09-19
US11532160B2 (en) 2022-12-20
WO2018083793A1 (en) 2018-05-11
US20230103243A1 (en) 2023-03-30
JPWO2018083793A1 (en) 2019-10-03
JP6740539B2 (en) 2020-08-19
US20230410510A1 (en) 2023-12-21

Similar Documents

Publication Publication Date Title
US20230410510A1 (en) Information processing apparatus, control method, and program
US20200250462A1 (en) Key point detection method and apparatus, and storage medium
US9762851B1 (en) Shared experience with contextual augmentation
US9992429B2 (en) Video pinning
WO2018090912A1 (en) Target object detection method, apparatus and system and neural network structure
US9678342B2 (en) Information processing device, display control method, and program
US10192128B2 (en) Mobile surveillance apparatus, program, and control method
US11443116B2 (en) Electronic apparatus and control method thereof
US10956763B2 (en) Information terminal device
JP2021531589A (en) Motion recognition method, device and electronic device for target
US20230409632A1 (en) Systems and methods for using conjunctions in a voice input to cause a search application to wait for additional inputs
KR20140052263A (en) Contents service system, method and apparatus for service contents in the system
KR102532230B1 (en) Electronic device and control method thereof
US20210208773A1 (en) Display apparatus and controlling method thereof
JP7052833B2 (en) Information processing equipment, control methods, and programs
CN114296627B (en) Content display method, device, equipment and storage medium
US11604830B2 (en) Systems and methods for performing a search based on selection of on-screen entities and real-world entities
CN114895813A (en) Information display method and device, electronic equipment and readable storage medium
WO2021056165A1 (en) Zoom based on gesture detection
US11380187B2 (en) Information processing apparatus, control method, and program
US20170078618A1 (en) Display control device, display control system, and display control method
KR102438132B1 (en) Electronic device and control method thereof
CN114779935A (en) Hot search entry interaction method and device, equipment and medium thereof
JP2023129657A (en) Information processing apparatus, control method, and program
JP2020091527A (en) Monitoring apparatus, monitoring system, monitoring method and monitoring program

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION