WO2017154845A1 - Reproducing device - Google Patents


Info

Publication number
WO2017154845A1
WO2017154845A1 (PCT/JP2017/008816)
Authority
WO
WIPO (PCT)
Prior art keywords
time
event information
information
playback
data
Prior art date
Application number
PCT/JP2017/008816
Other languages
French (fr)
Japanese (ja)
Inventor
雅春 奧
Original Assignee
株式会社smart-FOA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2016078322A external-priority patent/JP2017169181A/en
Application filed by 株式会社smart-FOA
Publication of WO2017154845A1

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems, electric
    • G05B19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to a playback device that plays back time-series data such as moving image data.
  • the data generation device is a facility sensor, a programmable logic controller that controls factory automation (FA), a computer that counts products, a mobile terminal that allows a user to input information, and the like.
  • the on-site data is output as codes or as numerical observation results, and is monitored by the facility manager as event information to which occurrence-location and date/time information have been added.
  • the facility manager is, for example, a maintenance manager of equipment in a factory, or an administrator of facilities such as servers in an office.
  • This event information is valuable information that contributes to operations such as factory and office operation management, product quality management, and business management.
  • the event information output directly from the facility carries only direct meanings, such as codes or raw observation values together with the occurrence location and occurrence date and time, and has no secondary or tertiary meaning.
  • the information obtained at the site level is sequentially reinterpreted at each layer until it reaches the management level, where the event information acquires secondary and tertiary meanings, and the reinterpreted event information was used at each level. Therefore, a time lag tends to occur in information transmission between the site level and the management level, and the site level and the business-management level tend to become disconnected from each other.
  • This information collection system stores a dictionary in which models that define the types of supplementary information added to field data are accommodated.
  • a database storing supplementary information of various contents is prepared. Then, according to the dictionary, each piece of supplementary information is retrieved from the database and added to the field data to generate event information.
  • the event information is not only simple background information for the field data but also explanatory information such as interpretation and evaluation of the contents of the field data, so that the information transmission has immediacy.
  • event information describing the required contents directly and in an easy-to-understand manner is generated, and when it arrives, the recipient can understand the secondary and tertiary meanings of the information without any interpretation work. Therefore, from the site level to the business-management level, not only the arrival time lag of event information but also the time lag in understanding it can be eliminated.
  • conventionally, the user displayed both event information and moving image data on a viewing terminal and tried to obtain useful information by shifting attention back and forth between the two, changing the event information of interest and the portion of the moving image data being watched.
  • moving image data is bulky data that records not only useful information reinforcing the content of event information but also a large amount of unnecessary information. Moreover, the useful information may or may not exist at all; where it appears in the field of view and in the time series is unknown, and it is scattered among the unnecessary information with no clear distinction.
  • An object of the present invention is to provide a playback device that supports detection of useful information from time-series data such as moving image data by a user in order to solve the above-described problems of the prior art.
  • a playback device according to the present invention plays back time-series data having time stamps, and includes a first acquisition unit that acquires event information having occurrence time information, and a display unit that reproduces, from the time-series data, a predetermined time range including the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information acquired by the first acquisition unit.
  • the apparatus may further include a second acquisition unit that acquires one or more other pieces of event information related to the event information acquired by the first acquisition unit, and the display unit may sequentially jump to, and reproduce, predetermined time ranges of the time-series data, each including the time indicated by the time stamp that is the same as or closest to the occurrence time information of each piece of event information acquired by the second acquisition unit.
  • the display unit may perform jump playback to a predetermined time range including the time when the event information is generated during playback of the time series data, and return to playback of the latest time when the jump playback ends.
  • the display unit may perform jump reproduction to a predetermined time range including the time when the event information occurred and, when the jump reproduction is completed, return to reproduction from the playback position immediately before the jump reproduction.
  • the display unit may superimpose and display the acquired event information on the image of the time-series data.
  • the event information includes supplementary information of various contents, and the second acquisition unit may acquire event information having the same supplementary information as one or more pieces of the supplementary information included in the event information acquired by the first acquisition unit.
  • the supplementary information may include event type information indicating the type of event.
  • the apparatus may further include a database associating a plurality of pieces of the event information with one another, and the second acquisition unit may acquire the event information associated in the database with the event information acquired by the first acquisition unit.
  • the time series data may be moving image data, flow line trajectory data, or audio data.
  • the predetermined time range may start from a time indicated by a time stamp that is the same as or closest to the occurrence time information of the event information.
  • the predetermined time range may start at a time earlier than the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information, and may end at that time or at a later time.
  • the predetermined time range may include only a time indicated by a time stamp that is the same as or closest to the occurrence time information of the event information, and the display unit may statically display the time-series data.
  • the apparatus may further include a generation unit that extracts time-series data in a predetermined time range including the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information acquired by the first acquisition unit, and a storage unit that stores the time-series data extracted by the generation unit, and the display unit may display the time-series data stored in the storage unit.
  • the first acquisition unit may acquire a plurality of pieces of the event information, and the generation unit may extract, from the time-series data, continuous predetermined time ranges that together include all the times indicated by the time stamps that are the same as or closest to the occurrence time information of the pieces of event information acquired by the first acquisition unit.
  • the generation unit may extract, from the time-series data, time-series data in continuous predetermined time ranges that include all the times indicated by the time stamps that are the same as or closest to the occurrence time information of the event information acquired by the first acquisition unit and the second acquisition unit.
  • the apparatus may further include an operation unit that receives a user operation; the generation unit may further generate association data that associates the extracted time-series data with the event information acquired by the first acquisition unit; the storage unit may further store the association data; and the display unit may display the time-series data associated, by the association data, with the event information selected by the user using the operation unit.
  • according to the present invention, a time range including the time when event information occurred can be reproduced easily, the time range of the time-series data to be observed can be narrowed down, and the user can notice useful information in the time-series data without performing complicated operations.
  • FIG. 10 is a block diagram illustrating the configuration of a playback device according to Modification 4. Schematic diagrams show the database according to Modification 4 and the display screen of the playback device.
  • FIG. 10 is a schematic diagram of moving image data generated by a playback device according to Modification 6.
  • FIG. 16 is a flowchart showing the moving-image-data generation operation of a playback device according to Modification 6.
  • FIG. 1 is a schematic diagram showing an installation environment of the playback apparatus 1.
  • a playback apparatus 1 shown in FIG. 1 is connected to a network 100 in which moving image data 200 and event information 300 are generated.
  • the playback apparatus 1 determines the playback start frame of the moving image data 200 using the occurrence date and time of the event information 300, and plays back the moving image data 200 on the network 100.
  • the moving image data 200 is distributed in real time from the surveillance camera that is capturing it.
  • the moving image data 200 is passed through an encoder in the network 100, and each frame data is given a time stamp.
  • the time stamp directly or indirectly indicates the occurrence date and time of each frame data.
  • the event information 300 is error information, status information, or the like, and is a measurement or observation result output from the measurement or observation equipment. In addition to the measurement or observation result, at least occurrence time information is added to the event information 300.
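  • the structure just described (a measurement or observation result with, at minimum, occurrence time information attached) might be sketched as a record like the following. This is an illustrative sketch only; the field names are assumptions and do not appear in the patent.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class EventInfo:
    """Event information 300: a measurement/observation result with, at
    minimum, occurrence time information attached."""
    result: str               # e.g. an error code, status code, or value
    occurrence_time: datetime
    location: str = ""        # optional occurrence-location supplement

# example record for a product-defect event
ev = EventInfo("NG", datetime(2017, 3, 7, 10, 30, 0), "tightening station")
```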
  • FIG. 2 is a block diagram showing the configuration of the playback device 1.
  • the playback device 1 is a so-called computer having a communication interface with the network 100 and a man-machine input / output interface, and is, for example, a desktop computer, a laptop computer, a smartphone, or a tablet terminal.
  • the playback apparatus 1 includes a playback processing unit 11, a display unit 12, an operation unit 13, and a playback control unit 14.
  • the reproduction processing unit 11 is a so-called graphic engine, and includes an arithmetic control device such as a CPU and a GPU and a frame memory.
  • the reproduction processing unit 11 decodes the moving image data 200, generates frame data in chronological order, and sequentially updates the frame memory read by the display unit 12, such as VRAM or RAMDAC, with newly generated frame data.
  • the reproduction processing unit 11 displays moving image data in an overlay display in a reproduction area defined on the display screen.
  • the display unit 12 is a screen that visualizes the frame data in the frame memory, and is a liquid crystal display, an organic EL display, or the like.
  • the display unit 12 reads frame data from the frame memory at a predetermined refresh rate and sequentially visualizes the frame data.
  • the operation unit 13 is an input interface that receives a user operation, and is a keyboard, a mouse, a touch panel, a microphone, or a camera.
  • the operation unit 13 is used for selecting event information 300 by the user during reproduction of the moving image data 200.
  • the reproduction control unit 14 mainly includes a CPU, and instructs the reproduction processing unit 11 to change the reproduction position within the time range in which the moving image data 200 is captured.
  • the playback control unit 14 temporarily inserts jump playback during real-time playback of the moving image data 200 in response to a user operation.
  • playback from an arbitrary point selected by the user, or cue playback, may also be controlled.
  • the playback control unit 14 plays back the latest frame data of the moving image data 200 generated by the surveillance camera.
  • the reproduction control unit 14 temporarily reproduces a predetermined time range including the generation time of the event information 300 selected by the user using the operation unit 13 during reproduction of the moving image data 200.
  • the reproduction control unit 14 returns to the reproduction of the latest frame data in the moving image data 200. If it is real-time reproduction, the reproduction control unit 14 returns to real-time reproduction of the latest frame data.
  • the predetermined time range is, for example, 3 seconds, but the user can change it with the operation unit 13 simply by changing the value of the parameter that specifies the predetermined time range.
  • the predetermined time range may be a range starting from the occurrence time of the event information 300 or a range including around the occurrence time of the event information 300.
  • the position of the predetermined time range relative to the occurrence time of the event information 300 can also be changed by the user using the operation unit 13. For example, a parameter value is added to or subtracted from the occurrence time of the event information 300 to determine the start of the predetermined time range; this parameter value may be changed, and if it is set to zero, the occurrence time of the event information 300 itself becomes the start time.
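  • the start-time calculation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and the signed offset parameter are assumptions.

```python
from datetime import datetime, timedelta

def jump_start(occurrence_time: datetime, offset_seconds: float = 0.0) -> datetime:
    """Start of the predetermined time range: the occurrence time of the
    event information shifted back by a user-adjustable offset parameter.
    With offset_seconds == 0 the occurrence time itself is the start time."""
    return occurrence_time - timedelta(seconds=offset_seconds)
```

For example, `jump_start(t, 3.0)` would begin jump playback 3 seconds before the event occurred.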
  • the reproduction control unit 14 includes a data collection unit 141, a screen generation unit 142, and a reproduction frame determination unit 143.
  • the data collection unit 141 includes an input / output interface with the network 100.
  • the data collection unit 141 receives distribution of the moving image data 200 from the network 100. Further, the data collection unit 141 collects event information 300 in the network 100.
  • the screen generation unit 142 includes a CPU and lays out a display screen.
  • the display screen has a playback area for moving image data 200 and a list display area for event information 300.
  • the screen generation unit 142 expands the contents of the event information 300 and arranges them in the list display area.
  • an event detail area may be provided on the display screen, and the content of the event information 300 selected by the user may be displayed in the event detail area.
  • the playback frame determination unit 143 includes a CPU and a man-machine input interface. The playback frame determination unit 143 determines the timing for jumping the playback position, and determines from the moving image data 200 the frame data serving as the start point of the playback position of the jump destination.
  • the first timing of jump playback is the occurrence of a jump occurrence event.
  • the jump occurrence event is a drag and drop of the event information 300 from the list display area or the event detail area to the reproduction area.
  • the playback frame determination unit 143 monitors the user's operation using the operation unit 13 and determines the occurrence of a jump playback event.
  • the second timing of jump playback is the end of jump playback. Specifically, the second timing is the end of reproduction within a predetermined time range after jump reproduction.
  • the playback frame determination unit 143 monitors the time stamp of the frame data to be written to the frame memory after starting the jump playback, and determines whether the time stamp corresponds to the end of the predetermined time range.
  • the first jump destination frame data is frame data corresponding to the start of a predetermined time range for jump reproduction.
  • the frame data corresponding to the start end is frame data to which a time stamp indicating the start end of the predetermined time range is added.
  • the playback frame determination unit 143 extracts occurrence time information from the event information 300 selected by the user operation.
  • the starting end of the predetermined time range is calculated from this occurrence time information.
  • the start of the predetermined time range is the time itself indicated by the occurrence time information.
  • the playback frame determination unit 143 obtains the start of the predetermined time range by subtracting a prestored specific time from the time indicated by the occurrence time information.
  • the playback frame determination unit 143 searches for frame data having a time stamp at the same time as the start end of the predetermined time range. When there is no time stamp at the same time, the playback frame determination unit 143 searches for frame data having a time stamp indicating the time closest to the same time. The search result is taken over by the playback processing unit 11, and the playback processing unit 11 draws the frame data indicated by the search result in the frame memory in chronological order.
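  • the same-or-closest time stamp search can be sketched as a binary search over the chronologically sorted frame time stamps. The helper below is a hypothetical illustration under that assumption, not code from the patent.

```python
import bisect

def find_jump_frame(timestamps, target):
    """Return the index of the frame whose time stamp equals `target`, or,
    when no exact match exists, the frame whose time stamp is closest.
    `timestamps` must be sorted in chronological order."""
    i = bisect.bisect_left(timestamps, target)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # pick whichever neighbour is nearer to the target time
    return i if timestamps[i] - target < target - timestamps[i - 1] else i - 1
```

The playback processing unit would then draw frames in chronological order starting from the returned index.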
  • the second jump destination frame data is the latest frame data in the moving image data 200.
  • the playback frame determination unit 143 determines the end of jump playback, the playback frame determination unit 143 instructs the playback processing unit 11 to play back the most recently delivered frame data.
  • the reproduction processing unit 11 draws in the frame memory in chronological order from the most recently delivered frame data.
  • the playback device 1 shifts processing from the routine that plays back the moving image data 200 being distributed in real time to the jump-playback routine and, when it determines that jump playback has ended, simply returns to the routine that plays back in real time.
  • FIGS. 3 and 4 are flowcharts showing the jump playback operation by the playback apparatus 1.
  • the data collection unit 141 receives the moving image data 200 from the network 100 (step S01), and the reproduction processing unit 11 sequentially updates the frame memory with the latest frame data of the moving image data 200 (step S02).
  • the display unit 12 displays the latest frame in the frame memory (step S03).
  • the playback control unit 14 may display a camera icon or thumbnail on the display screen so that the moving image data 200 can be selected.
  • the data collection unit 141 collects the event information 300 from the network 100 in parallel with the reproduction processing of the moving image data 200 (step S04), and the screen generation unit 142 lists the event information 300 in the list display area of the display screen. And the contents are displayed (step S05).
  • the playback frame determination unit 143 extracts the occurrence time information included in the selected event information 300 (step S07). Then, the playback frame determination unit 143 refers to the occurrence time information and the time stamps of the moving image data 200 to search for the frame data captured at the same time as, or the time closest to, the occurrence time information (step S08).
  • the time stamp indicates the occurrence date and time of the frame data to which it belongs.
  • representative occurrence date and time information such as the top frame data is attached to the metadata of the moving image data 200, and each time stamp indicates, for example, the order in which each frame data is reproduced with reference to the representative.
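  • when each time stamp only records playback order relative to the representative date/time in the metadata, the absolute occurrence time of a frame can be recovered as sketched below. The fixed frame rate and the function name are assumptions for illustration; real encoders may store explicit per-frame offsets instead.

```python
from datetime import datetime, timedelta

def frame_time(representative: datetime, frame_index: int, fps: float = 30.0) -> datetime:
    """Absolute occurrence time of the frame at `frame_index`, given the
    representative occurrence date/time of the top frame and an assumed
    constant frame rate."""
    return representative + timedelta(seconds=frame_index / fps)
```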
  • the reproduction processing unit 11 updates the frame memory in chronological order starting from the frame data found by the search (step S09). The display unit 12 therefore reproduces and displays the moving image data 200 from the time when the selected event information 300 occurred (step S10). If the frame data exists on the network 100, it is delivered through the data collection unit 141.
  • the playback frame determination unit 143 acquires the end time of jump playback by adding a predetermined time range to the time indicated by the occurrence time information (step S11).
  • the time stamp of the frame data to be displayed next is compared with the end time of jump reproduction (step S12).
  • when the playback frame determination unit 143 determines that the time stamp has reached the end time of jump playback (step S13), the data collection unit 141 receives the latest frame data of the moving image data 200 (step S14).
  • the playback processing unit 11 updates the frame memory in chronological order with the latest frame data following the frame data whose time stamp reached the end time of jump playback (step S15). Therefore, when the jump reproduction ends (step S16), the display unit 12 returns to real-time reproduction of the moving image data 200 (step S17).
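  • steps S11 to S13 amount to computing an end time and comparing each frame's time stamp against it; a minimal sketch follows, with names that are illustrative assumptions rather than identifiers from the patent.

```python
from datetime import datetime, timedelta

def jump_window(frame_stamps, occurrence_time, time_range_s=3.0):
    """Step S11: end time = occurrence time + predetermined time range.
    Steps S12-S13: collect the frame time stamps inside the jump-playback
    window; playback returns to real time once a stamp passes end_time."""
    end_time = occurrence_time + timedelta(seconds=time_range_s)
    window = [t for t in frame_stamps if occurrence_time <= t <= end_time]
    return window, end_time
```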
  • FIG. 5 is a schematic diagram showing a display screen according to the first application example of the playback apparatus 1, and shows a jump playback event during real-time playback.
  • FIG. 6 is a schematic diagram showing reproduction of the moving image data 200 according to the first application example.
  • a surveillance camera is installed on the production assembly line.
  • a product is tightened by a worker with a designated tool, and a dimension measuring device determines whether the product is good or bad downstream of the tightening operation.
  • the surveillance camera is installed with the photographing optical axis facing the tightening work place.
  • when the dimensions of the product are within the allowable range, the dimension measuring device generates event information 300 indicating a non-defective product and sends it to the network 100. On the other hand, if the dimensions are out of the allowable range, the dimension measuring device generates event information 300 indicating a product defect and sends it to the network 100.
  • a list of event information 300 generated one after another from the dimension measuring instrument is displayed on the display screen.
  • the moving image data 200 of the production assembly line captured by the monitoring camera is also reproduced in real time.
  • the worker performing the tightening work at the tightening work place is shown, down to his hands.
  • the product defect event information 300 is collected by the data collection unit 141 and displayed on the display screen.
  • a monitor who watches the site using the playback device 1 monitors the moving image data 200 in real time, and when event information 300 of a product defect, marked with the code "NG", appears in the list, drags and drops that event information 300 onto the playback area of the moving image data 200.
  • the event information 300 operated by the jump playback event has occurrence time information indicating, for example, time Tn-a-4.
  • the playback device 1 is set with parameters such that the predetermined time range is 3 seconds and jump playback starts from the occurrence of the event information 300. Therefore, from the state in which the frame data at the latest time Tn is being played back, the playback device 1 moves to jump playback starting from the frame data of the tightening work place to which the time stamp of time Tn-a-4 is added.
  • the real-time playback is resumed from the frame data of the latest time Tn + 5.
  • in the 3-second video during the jump playback, a scene in which the worker holds a tool different from the designated one appears immediately before the product-defect event information 300 occurred. Therefore, the observer who watched the jump-played 3-second video, combined with the recognition that the product-defect event information 300 had occurred, estimated that the cause of the product defect was a tightening abnormality arising from the product having been tightened with a tool other than the designated one.
  • the moving image data 200 contains not only information that reinforces the content of the event information 300, namely that the worker is using another tool, but is also bulky data including a large amount of unnecessary information.
  • that specific information is scattered both spatially and temporally in the photographing field of view and is mixed with the other, unnecessary information with no clear distinction.
  • based on the occurrence time information included in the product-defect event information 300, the playback device 1 supports the user in narrowing the playback range of the moving image data 200 down to the time when another tool is most likely to have been used, and the user can obtain useful information not included in the event information 300 merely by selecting the event information 300.
  • since the time range in which the situation of another tool being used is most likely to appear is reproduced in response to the selection of the product-defect event information 300, the content of the event information 300 and the video of that time range catch the user's eye together, and the content of the event information 300 appears to include the video.
  • the information value of the event information 300 is thereby improved for the user. Furthermore, since the user becomes aware of the information implied by the event information 300, the information value of the event information 300 is greatly improved.
  • a sensor that detects removal of a tool may be installed in the tool storage area, and the type information of the removed tool detected by the sensor may be included in the event information 300 of the dimension measuring instrument.
  • as described above, the playback device 1 is a playback device that plays back the moving image data 200 having time stamps, and includes a first acquisition unit that acquires the event information 300 having occurrence time information, and the display unit 12 that jump-reproduces the time range of the moving image data 200 including the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information 300 acquired by the first acquisition unit.
  • the first acquisition unit is the playback frame determination unit 143 in the present embodiment.
  • the playback frame determination unit 143 acquires the event information 300 selected when a jump-playback event occurs. Alternatively, the event information 300 may be acquired and jump-played as it arrives; that is, the data collection unit 141 that captures the event information 300 into the playback device 1 is also an example of the first acquisition unit.
  • the playback apparatus 1 can assist the user so that useful information can be easily extracted from the moving image data 200, and is excellent in user friendliness for a person who monitors the moving image data 200.
  • the content of the event information 300 and the video of the time range in which the useful information is most likely to appear catch the user's eye together, so the content of the event information 300 appears to include the video. The information value of the event information 300 is therefore improved. Furthermore, since the user becomes aware of the information implied by the event information 300, its information value is greatly improved.
  • the moving image data 200 is an example of time-series data having a time stamp, which is a result of measurement or observation in the order of time with respect to a phenomenon that varies with time.
  • Examples of the time series data include moving image data 200, flow line image data, electrocardiogram data, voice data, voice recorder data, and the like.
  • the playback device 1 can perform jump playback in response to the selection of event information 300 for any data that is arranged in time-series order with a time stamp added to each piece of data, and is therefore applicable to such time-series data.
  • the flow line trajectory image data is data plotted so as to update the presence positions of objects and people in time series within a certain range of space.
  • the flow line trajectory image data includes map image data and a plurality of pieces of presence position information.
  • a time stamp is added to each location information.
  • the reproduction processing unit 11 generates frame data in which the existing position indicated by the existing position information is plotted on the map image data, and updates the frame memory.
  • the nurse carries a device for notifying the location of the RFID tag or the like, and a reader for reading the nurse's identification information from the RFID tag is installed in each place in the hospital.
  • a server that collects the output of each reader generates the flow line trajectory image data by adding the presence position information of each reader that has read the identification information to the in-hospital map image data stored in advance.
  • various medical devices are installed in the hospital, and the various medical devices generate event information 300 and output it to the network 100.
  • the event information 300 transmitted to the network 100 includes patient abnormality event information 300 indicating a patient abnormality detected by the medical device.
  • the monitoring person operating the playback apparatus 1 selects the event information 300 of the patient abnormality that occurred the previous night from the list display area on the display screen, and executes the jump playback event operation.
  • the data collection unit 141 acquires the flow line trajectory image data from the network 100 if it has not already been acquired.
  • the playback frame determination unit 143 extracts the occurrence time information from the selected patient abnormality event information 300, refers to the time stamps included in the flow line trajectory image data, and searches for the location information whose time matches the occurrence time information.
  • the reproduction processing unit 11 generates frame data in which the existing position indicated by the searched existing position information is plotted on the in-hospital map indicated by the hospital map image data, and stores the frame data in the frame memory.
  • the reproduction processing unit 11 generates frame data over a predetermined time range in chronological order, by re-plotting, in chronological order, the location information whose time stamps fall within a predetermined time range covering the prescribed rush time, and updates the frame memory.
  • the flow line trajectory of each nurse in a predetermined time range from the time when the patient abnormality event information 300 occurs is displayed in an overlay manner.
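The window selection behind this overlay playback can be sketched as below: pick the location records whose time stamps fall in a predetermined range starting at the occurrence time of the event information, ordered chronologically. The record shape (timestamp, tag_id, x, y) is an assumption carried over for illustration.

```python
def jump_playback_window(records, occurrence_time, time_range):
    """Select records in [occurrence_time, occurrence_time + time_range], in time order."""
    window = [r for r in records
              if occurrence_time <= r[0] <= occurrence_time + time_range]
    return sorted(window, key=lambda r: r[0])   # replay in chronological order

records = [(5, "A", 0, 0), (7, "A", 1, 0), (12, "A", 2, 0), (3, "B", 9, 9)]
trajectory = jump_playback_window(records, 5, 5)   # e.g. a 5-second range
```

Re-plotting the returned records in order reproduces each nurse's flow line from the occurrence time onward.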
  • jump reproduction is performed.
  • the reproduction starts from the time when the event information 300 indicating the patient abnormality occurred.
  • the monitoring person who observed this playback found that the nurse in charge had been in a separate ward away from the medical device that warned of the patient's abnormality, and that, although the nurse in charge immediately started rushing from that ward, the rush took longer than the prescribed time. In addition, it was found that there was a shortage of nurses in the hospital when the patient abnormality event information 300 occurred. In other words, it was found that, because of the nurse shortage, the nurse in charge had to carry out work outside her own duties, which led to the delay in rushing.
  • the playback device 1 can thus assist the user in easily extracting useful information even from time-series data other than the moving image data 200, and is excellent in user-friendliness for those who analyze time-series data.
  • the information value of the event information 300 is improved.
  • the information value of the event information 300 is also greatly improved.
  • the time-series data such as the moving image data 200 may be generated in real time and distributed over the network, or may be stored in the network 100 after generation of shooting or the like is completed. Time series data generated in real time and distributed over the network is often used for monitoring in real time, and it is desirable to reproduce the latest frame data of the time series data after jump reproduction.
  • the frame data at the time following the predetermined time range after the jump reproduction may be reproduced in time series.
  • the playback frame determination unit 143 does not need to determine the end of the predetermined time range.
  • for time-series data that has already been generated, playback may return, after the jump playback, to the frame data that was being played immediately before the jump playback, or to frame data just before or after it, as shown in FIG.
  • the playback frame determination unit 143 returns the playback of the time series data to the time when playback was performed immediately before the jump playback.
  • the playback frame determination unit 143 stores the time stamp of the frame data played immediately before the jump playback, or a time stamp of a time just before or after it, and, after completion of the jump playback, resumes playback from the frame data to which the stored time stamp is added.
  • the predetermined time range can be set to, for example, zero seconds. In other words, only the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information 300 may be jump-reproduced. In this case, a still image is displayed.
  • when the playback frame determination unit 143 designates the frame data of the jump playback destination, it starts measuring a predetermined time. While the time is being counted, the next frame data is not instructed; when the predetermined time elapses, the next frame data is instructed. As a result, after the still image has been displayed for the predetermined time, real-time playback, or playback of the portion that was being played immediately before the jump playback, is resumed.
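The still-image variant can be sketched as below: show the frame closest to the occurrence time for a fixed dwell, then resume from the frame played just before the jump. Frames are (timestamp, payload) pairs; the names and the discrete "dwell steps" model are illustrative assumptions.

```python
def jump_play_still(frames, occurrence_time, resume_index, dwell_steps):
    """Return the sequence of payloads shown: a held still image, then resumed playback."""
    jump_index = min(range(len(frames)),
                     key=lambda i: abs(frames[i][0] - occurrence_time))
    shown = [frames[jump_index][1]] * dwell_steps          # still-image phase
    shown.extend(f[1] for f in frames[resume_index:])      # resume prior playback
    return shown

frames = [(0, "f0"), (1, "f1"), (2, "f2"), (3, "f3")]
sequence = jump_play_still(frames, 2.1, 1, 2)
```

Here the frame nearest time 2.1 is held for two ticks before playback resumes at index 1.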
  • the measurement or observation device that generates the event information 300 is, for example, a temperature sensor, a dimension measurement device, a medical device, or the like.
  • the measurement or observation equipment monitors the operating status of various facilities operating in factories, offices, and the like, or the condition of patients, and outputs event information 300 when an event occurs in those facilities or patients.
  • an event is, for example, a periodic status report, or the fulfillment (or the failure of fulfillment) of a predetermined condition concerning the internal or external situation being monitored.
  • FIG. 9 is a schematic diagram showing the data structure of the event information 300.
  • in addition to the measurement or observation results and the occurrence time information, the event information 300 can be supplemented with other information, such as event type information indicating the type of event, explanatory information explaining the measurement or observation results, and information according to the purpose of monitoring or the situation and properties of the equipment or the patient.
  • the identification information of the worker who assembled the product may be added as supplementary information to the event information indicating the quality of the product.
  • Such supplemental information is typically added as a result of the information collection system in the network 100 reconstructing the event information 300.
  • the information collection system has a model after reconstruction for each event information 300 output by the measurement or observation equipment.
  • the model lists types of supplementary information to be added to the event information 300.
  • the information collection system refers to the model and determines the type of supplemental information.
  • the information collection system narrows down, from the data of various contents existing in the network 100, the group of supplementary information of the determined type, and searches that group for supplementary information matching the contents of the event information 300, using the contents of the event information 300 as a search key. Then, the information collection system reconstructs the event information 300 by adding the corresponding supplementary information to the event information 300 output from the measurement or observation equipment.
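The reconstruction steps above can be sketched as follows: the model lists, per event type, the supplementary fields to attach, and matching records are found with the event's own contents as the search key. The field names, the model contents, and the use of a product identifier as the key are all invented for illustration.

```python
# Hypothetical model: event type -> supplementary fields the system should attach.
MODEL = {"quality_check": {"worker_id", "line_id"}}

def reconstruct(event, supplementary_pool):
    """Add to the event every pool record of a wanted type that matches its contents."""
    wanted = MODEL.get(event["event_type"], set())
    enriched = dict(event)
    for rec in supplementary_pool:
        if rec["product_id"] == event["product_id"] and rec["field"] in wanted:
            enriched[rec["field"]] = rec["value"]
    return enriched

pool = [
    {"product_id": "P1", "field": "worker_id", "value": "W42"},
    {"product_id": "P1", "field": "color", "value": "red"},      # type not in the model
    {"product_id": "P2", "field": "worker_id", "value": "W07"},  # different product
]
event = {"event_type": "quality_check", "product_id": "P1", "result": "NG"}
enriched = reconstruct(event, pool)
```

Only the record that matches both the search key and a model-listed type is added.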
  • FIG. 10 is a block diagram showing the configuration of the playback apparatus 1.
  • the playback control unit 14 of the playback device 1 further includes a search unit 144.
  • the search unit 144 includes a CPU, and searches for event information 300 belonging to the same category within a certain range as the event information 300 selected by the jump playback event.
  • what constitutes event information 300 belonging to the same category within a certain range is described below.
  • the search unit 144 searches the event information 300 belonging to the same category within a predetermined range specified in advance or a predetermined range specified by an operation using the user operation unit 13.
  • the event information 300 belonging to the same category within a predetermined range specified in advance is the event information 300 having the same event type information indicating the type of event added as supplementary information.
  • the search unit 144 searches the event information 300 to which the same event type information is added using the event type information of the event information 300 selected in the jump playback event as a search key.
  • the search destination is in the playback device 1 when the data collection unit 141 has collected the event information 300, and the network 100 when the data collection unit 141 has not collected the event information 300.
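The type-based search above amounts to filtering by a single key, which can be sketched as below; the dictionary keys are illustrative, not from the patent.

```python
def search_same_type(selected, candidates):
    """Use the selected event's type information as the search key; same-type events are hits."""
    key = selected["event_type"]
    return [e for e in candidates if e is not selected and e["event_type"] == key]

events = [
    {"id": 1, "event_type": "defect"},
    {"id": 2, "event_type": "defect"},
    {"id": 3, "event_type": "status"},
]
hits = search_same_type(events[0], events)
```

The selected event itself is excluded, so only the other same-type events are returned.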
  • the search unit 144 monitors the contents of the jump playback event. When one or more pieces of supplementary information are selected from a single piece of event information 300 using the operation unit 13 in the list display area or the event detail area, and a jump playback event occurs while the selection is valid, the search unit 144 searches for event information 300 to which the same supplementary information is added, using the selected supplementary information as a search key.
  • otherwise, the search unit 144 searches for event information 300 to which the same event type information is added, using the event type information as a search key.
  • the playback frame determination unit 143 treats the event information 300 selected by the jump playback event and the event information 300 found by the search unit 144 on an equal footing. That is, each occurrence time of the event information 300 obtained by selection and by search is jump-played in turn. The timing of each jump playback is the end of the predetermined time range of the preceding jump playback: when the previous jump playback ends, the next jump playback is performed.
  • the jump playback order may be the selected event information 300 first, followed by the searched event information 300 in the order found, or the ascending or descending order of occurrence times of all the event information 300.
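The orderings just described can be sketched as a small scheduler. The function name and the order labels are invented for illustration; the selected event and the search hits are treated on an equal footing.

```python
def jump_playback_schedule(selected_time, hit_times, order="ascending"):
    """Order occurrence times for sequential jump playback."""
    if order == "selected_first":
        return [selected_time] + list(hit_times)   # selection first, then hits as found
    return sorted([selected_time] + list(hit_times),
                  reverse=(order == "descending"))

times = jump_playback_schedule(30, [10, 50], order="ascending")
```

Each returned time then anchors one jump playback, started when the previous one ends.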
  • FIG. 11 is a flowchart showing the jump playback operation by the playback apparatus 1.
  • the search unit 144 extracts the event type information included in the selected event information 300 (step S23).
  • the search unit 144 searches the other event information 300 using the extracted event type information as a search key (step S24).
  • when the other event information 300 has already been collected, the search destination is within the playback device 1; when it has not been collected or is still being collected, the search destination is the network 100.
  • the playback frame determination unit 143 extracts each occurrence time information included in the corresponding event information 300 by selection and search (step S25). Then, the playback frame determination unit 143 refers to the occurrence time information and the time stamp of the moving image data 200, and searches each frame data captured at the same time or the closest time to each occurrence time information (step S26).
  • the playback processing unit 11 starts jump playback from the Nth jump playback order of the frame data corresponding to the search (step S27).
  • the playback frame determination unit 143 acquires the end time of the current jump playback (step S28).
  • the jump playback order may be ascending or descending order based on the occurrence times of the frame data found by the search; it is desirable to start with the frame data at the same time as the event information 300 selected by the user.
  • the playback frame determination unit 143 compares the time stamp of the next frame data to be displayed with the end time of the jump playback (step S29). When the playback frame determination unit 143 determines that the time stamp has reached the end time of the jump playback (step S30, Yes), it increments the jump playback order N by 1 (step S31). Then, it determines whether or not the jump playback order N exceeds the sum of the number of selections and the number of hits (step S32).
  • if the jump playback order N does not exceed this sum (step S32, No), the playback processing unit 11 proceeds to jump playback of the Nth frame data in the jump playback order (step S27). If the jump playback order N exceeds the sum of the number of selections and the number of hits (step S32, Yes), the jump playback is terminated (step S33).
  • FIG. 12 shows another example of the operation of such a playback apparatus 1.
  • FIG. 12 is a flowchart showing the jump playback operation by the playback apparatus 1.
  • the user selects the event information 300 using the operation unit 13 (step S42), and further selects one or more supplemental information from the selected event information 300 (step S43).
  • the search unit 144 extracts supplementary information selected by the user from the selected event information 300 (step S45).
  • the search unit 144 searches the other event information 300 using the extracted supplementary information as a search key (step S46).
  • the search method is an AND search.
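The AND search over the selected supplementary information can be sketched as below: an event is a hit only when every selected field agrees. The field names are invented for illustration.

```python
def and_search(selected_fields, candidates):
    """AND search: every selected supplementary field must match for a hit."""
    return [e for e in candidates
            if all(e.get(k) == v for k, v in selected_fields.items())]

events = [
    {"id": 1, "worker_id": "W42", "line_id": "L1"},
    {"id": 2, "worker_id": "W42", "line_id": "L2"},
    {"id": 3, "worker_id": "W07", "line_id": "L1"},
]
hits = and_search({"worker_id": "W42", "line_id": "L1"}, events)
```

With a single selected field the same function degenerates to an ordinary key search.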
  • the playback frame determination unit 143 extracts each occurrence time information included in the corresponding event information 300 by selection and search (step S47). Then, the playback frame determination unit 143 refers to the occurrence time information and the time stamp of the moving image data 200 to search for each frame data captured at the same time or the closest time to each occurrence time information (step S48).
  • the playback processing unit 11 starts jump playback from the Nth jump playback order of the frame data corresponding to the search (step S49).
  • the playback frame determination unit 143 acquires the current jump playback end time (step S50).
  • the jump playback order may be ascending or descending order based on the occurrence times of the frame data found by the search; it is desirable to start with the frame data at the same time as the event information 300 selected by the user.
  • the playback frame determination unit 143 compares the time stamp of the next frame data to be displayed with the end time of the jump playback (step S51). If the playback frame determination unit 143 determines that the time stamp has reached the end time of the jump playback (step S52, Yes), it increments the jump playback order N by 1 (step S53). Then, it determines whether or not the jump playback order N exceeds the sum of the number of selections and the number of hits (step S54).
  • if the jump playback order N is less than or equal to the sum of the number of selections and the number of hits (step S54, No), the playback processing unit 11 proceeds to jump playback of the Nth frame data in the jump playback order (step S49). If the jump playback order N exceeds the sum (step S54, Yes), the jump playback is terminated (step S55).
  • a third application example of the playback apparatus 1 will be described.
  • a monitoring camera is installed on the production assembly line.
  • a worker tightens a product with a designated tool, and downstream of the tightening operation a dimension measuring device judges whether the product is good or defective.
  • the surveillance camera is installed with the photographing optical axis facing the tightening work place.
  • when the dimensions of the product are within the allowable range, the dimension measuring device generates event information 300 indicating a non-defective product and sends it to the network 100. On the other hand, if the dimensions of the product are out of the allowable range, the dimension measuring device generates event information 300 indicating a defective product and sends it to the network 100.
  • a list of event information 300 generated one after another from the dimension measuring instrument is displayed.
  • the moving image data 200 of the production assembly line captured by the monitoring camera is also reproduced in real time.
  • the worker performing the tightening work at the tightening work place is captured in the video, down to his or her hands.
  • the product defect event information 300 is collected by the data collection unit 141 and displayed on the display screen.
  • the monitoring person who monitors the site using the playback apparatus 1 watches the moving image data 200 in real time, and when product defect event information 300 appears in the list, drags and drops that event information 300 into the playback area of the moving image data 200.
  • the playback device 1 searches for event information 300 having event type information indicating a product defect, like the dragged-and-dropped event information 300. Then, as shown in FIG. 13, the predetermined time ranges including the occurrence time Tn-a-4 of the selected event information 300 and the occurrence times Tn-b-4 and Tn-c-3 of all the event information 300 found by the search are jump-played in order. In every jump-played image, a specific worker is holding a tool different from the designated one. The monitoring person therefore estimated that the dimensional abnormality occurred because of the specific worker's lack of skill and/or the use of the other tool.
  • the monitoring person then selected, from event information 300 indicating a non-defective product, the supplementary information identifying the specific worker who had caused the product defect (the supplementary information identifying the assembly worker of the product), and, while keeping that selection valid, dragged and dropped the event information 300 into the playback area.
  • the playback device 1 further includes a second acquisition unit that acquires one or a plurality of other event information related to the event information 300 acquired by the first acquisition unit.
  • the predetermined time ranges including the times indicated by the time stamps that are the same as or closest to the occurrence time information of the other event information 300 acquired by the second acquisition unit are sequentially jump-played.
  • the search unit 144 corresponds to a second acquisition unit.
  • the playback apparatus 1 also searches for event information 300 belonging to the same category, within a certain range, as the selected event information 300, and sequentially jump-plays predetermined time ranges including the same times as the event information 300 belonging to the same category. Therefore, information that is difficult to narrow down using only the selected event information 300 is brought to light, and the user's chances of gaining awareness increase dramatically.
  • FIG. 14 is a block diagram showing the configuration of the playback apparatus 1 according to this modification.
  • the playback device 1 includes a database 16.
  • the database 16 may be included in a computer including the display unit 12 or may be included in a computer different from the computer including the display unit 12.
  • Another computer is a component of the information collection system, for example, and may be installed in the network 100.
  • the database 16 is mainly configured to include an external storage device such as an HDD, and stores association of event information 300 in the network 100 as shown in FIG.
  • the association links, for example, the pieces of event information 300 generated in the respective manufacturing processes of the same product. The association may also link pieces of event information 300 generated within a predetermined range, such as within the same facility or the same factory.
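One hypothetical shape for the association stored in the database 16 is a set of identifier groups, as sketched below; the identifiers and grouping are invented for illustration.

```python
# Each group links identifiers of event information generated for the same
# product (or within the same facility/factory).
ASSOCIATIONS = [
    {"EV-001", "EV-007", "EV-013"},   # same product, successive processes
    {"EV-002", "EV-008"},             # same facility
]

def associated_events(event_id, associations=ASSOCIATIONS):
    """Return the other identifiers linked to event_id, if any."""
    for group in associations:
        if event_id in group:
            return sorted(group - {event_id})
    return []

linked = associated_events("EV-007")
```

The search unit can resolve a selected event to its associated events with one lookup of this kind.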
  • the search unit 144 refers to the database 16 and searches the event information 300 associated with the event information 300 selected in the jump playback event.
  • the playback frame determination unit 143 treats the event information 300 selected by the jump playback event and the event information 300 that the search unit 144 acquired by referring to the database 16 on an equal footing, and sequentially jump-plays the occurrence times of each piece of event information 300.
  • FIG. 16 is a schematic diagram showing a display screen of the playback apparatus 1 according to the modified example 5, and shows a jump playback event during real-time playback.
  • the event information 300 selected by the user at the time of the jump playback event is not limited to one, and the user can select a plurality of pieces of event information.
  • the playback frame determination unit 143 extracts occurrence time information from all selected event information 300. Then, the reproduction processing unit 11 sequentially performs jump reproduction within a predetermined time range from the frame data to which the same or closest time stamp as each occurrence time information is added.
  • the search unit 144 is not essential.
  • FIG. 17 is a block diagram showing the configuration of the playback device 1.
  • the playback device 1 further includes a synthesis unit 15.
  • the synthesis unit 15 includes a CPU.
  • the synthesis unit 15 superimposes the event information 300 on the video of the moving image data 200 during jump playback.
  • the synthesizing unit 15 writes the content of the event information 300 selected by the jump playback event in the frame memory.
  • the timing of writing is during reproduction within a predetermined time range including the same or the closest time as the event information 300.
  • the synthesis unit 15 writes into the frame memory the contents of the event information 300 belonging to the same category, within a certain range, as the event information 300 selected in the jump playback event. This writing timing is during reproduction within a predetermined time range including the same or closest time as that event information 300.
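The compositing rule can be sketched in text-only form: while a time window is being jump-played, the contents of every event whose occurrence time falls inside that window are overlaid on the frame. The data structures here are illustrative.

```python
def composite(frame_label, events, window_start, window_end):
    """Overlay onto the frame the contents of events occurring in the played window."""
    overlays = [e["content"] for e in events
                if window_start <= e["time"] <= window_end]
    return frame_label + "".join(f" [{c}]" for c in overlays)

events = [
    {"time": 10, "content": "defect"},
    {"time": 11, "content": "temp high"},
    {"time": 99, "content": "unrelated"},
]
shown = composite("frame@10", events, 9, 12)
```

Events outside the window are simply not written, matching the timing condition above.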
  • FIG. 18 is a flowchart showing the composite display operation by the playback apparatus 1.
  • the search unit 144 searches for event information 300 related to the selected event information 300 (step S73).
  • the reproduction processing unit 11 performs jump reproduction of the video in the same time zone as the corresponding event information 300 by the selection and search (step S74).
  • the synthesizing unit 15 composites and displays, on the video being jump-played, the contents of the event information 300 that occurred in the same time zone as the video and was obtained by selection and search (step S75).
  • FIG. 19 is a schematic diagram showing a display screen of the playback apparatus 1 during jump playback. As shown in FIG. 19, on the video being jump-played, the contents of the event information 300 obtained by selection and search whose occurrence time information falls within the time range being jump-played are displayed in overlay at the right end.
  • the synthesis unit 15 may display the event information 300 not in the playback area of the moving image data 200 but in another area, for example, a detailed display area.
  • the jump playback event has a property as a trigger for providing a moving image in a predetermined time range requested by the user at a timing requested by the user. On the other hand, even if there is no request from the user, a moving image in a predetermined time range including the time when the event information 300 is generated may be prepared in advance.
  • FIG. 20 is a block diagram showing the configuration of the playback apparatus 1.
  • the playback device 1 further includes a playback candidate generation unit 17 and a moving image storage unit 18.
  • the reproduction candidate generation unit 17 mainly includes a CPU.
  • the moving image storage unit 18 includes an external storage device such as an HDD.
  • the reproduction candidate generation unit 17 trims, from the moving image data 200, moving image data 201 in a predetermined time range including the occurrence time of the event information 300, and stores the trimmed moving image data 201 in the moving image storage unit 18. After trimming the moving image data 201, the reproduction candidate generation unit 17 deletes the original moving image data 200 from the storage area of the reproducing apparatus 1 or the network 100.
  • the reproduction candidate generation unit 17 trims the moving image data 201 for all the event information 300 that has occurred.
  • the moving image data 201 may be trimmed using only the event information 300 having specific event type information. For example, if the time range for jump reproduction is 3 seconds as in the first embodiment, the predetermined time range to be trimmed by the reproduction candidate generation unit 17 is desirably set to be long, such as about 5 minutes. This is because the original moving image data 200 is deleted, so that the range that the user wants to browse is given a margin.
  • the reproduction candidate generation unit 17 generates association data 400 between the event information 300 used for generating the moving image data 201 and the moving image data 201 generated using the event information 300.
  • the association data 400 is data in which the identification information of each of the moving image data 201 and the event information 300 is paired, or directory data indicating a folder in which both the moving image data 201 and the event information 300 are stored.
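The first form of the association data 400, pairs of identifiers, can be sketched as below; the identifier values and helper names are invented for illustration.

```python
def make_association(event_id, movie_id):
    """One association record: the ID pair linking event information and trimmed movie."""
    return {"event_id": event_id, "movie_id": movie_id}

def lookup_movie(associations, event_id):
    """Resolve the moving image data 201 tied to a selected event."""
    for a in associations:
        if a["event_id"] == event_id:
            return a["movie_id"]
    return None

table = [make_association("EV-001", "MOV-001"), make_association("EV-002", "MOV-002")]
movie = lookup_movie(table, "EV-002")
```

The directory form mentioned in the text would instead co-locate both files under one folder, with the folder path playing the role of the pair.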
  • the screen generation unit 142 refers to the association data 400 and displays the event information 300 specified by the association data 400 in the list display area.
  • the playback frame determination unit 143 refers to the association data 400 and identifies the moving image data 201 associated with the selected event information 300.
  • the reproduction processing unit 11 takes over from the reproduction frame determination unit 143 and updates the frame memory in order from the first frame data of the moving image data 201 so that the specified moving image data 201 is reproduced.
  • FIG. 21 is a flowchart showing the generation operation of the moving image data 201 in the reproducing apparatus 1.
  • the data collection unit 141 monitors the network 100, and when event information 300 is generated (step S81), generation of the moving image data 201 is started.
  • the moving image data 201 may be generated periodically, or a generation button for the moving image data 201 may be prepared on the display screen so that generation of the moving image data 201 starts when this button is pressed.
  • the reproduction candidate generation unit 17 calculates the start of the moving image data 201 corresponding to the event information 300 (step S82).
  • the reproduction candidate generation unit 17 stores in advance a time parameter from the occurrence time of the event information 300 back to the start of the moving image data 201.
  • the reproduction candidate generation unit 17 subtracts this start-side time parameter from the occurrence time information of the event information 300 to derive the start time of the moving image data 201.
  • the reproduction candidate generation unit 17 calculates the end of the moving image data 201 (step S83).
  • the reproduction candidate generation unit 17 stores in advance a time parameter from the occurrence time of the event information 300 to the end of the moving image data 201.
  • the reproduction candidate generation unit 17 adds this end-side time parameter to the occurrence time information of the event information 300 to derive the end time of the moving image data 201.
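The start/end derivation and the trimming of the window's frames can be sketched together: start = occurrence time minus the start-side parameter, end = occurrence time plus the end-side parameter, and every frame inside the window is duplicated. Frames as (timestamp, payload) pairs are an assumption for this example.

```python
def trim(frames, occurrence_time, start_param, end_param):
    """Derive the window, then keep every frame whose time stamp falls inside it."""
    start = occurrence_time - start_param
    end = occurrence_time + end_param
    return [f for f in frames if start <= f[0] <= end]

source = [(t, f"frame{t}") for t in range(10)]
clip = trim(source, 5, 2, 1)     # window [3, 6]
```

The returned clip plays the role of the moving image data 201 stored in the moving image storage unit 18.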
  • the reproduction candidate generation unit 17 duplicates, from the moving image data 200, all the frame data from the starting frame data to the ending frame data of the moving image data 201 (step S84), and stores them in the moving image storage unit 18 (step S85).
  • the reproduction candidate generation unit 17 generates association data 400 by associating the identification information of the event information 300 used for the start and end calculations of the moving image data 201 with the identification information of the moving image data 201 (step S86), and stores it in the moving image storage unit 18 (step S87). When the generation of the moving image data 201 and the association data 400 is completed (step S88), the reproduction candidate generation unit 17 deletes the moving image data 200 from the reproduction apparatus 1 or the network 100 (step S89).
  • FIG. 22 is a flowchart showing the reproduction operation of the moving image data 201 in the reproduction apparatus 1.
  • the screen generation unit 142 refers to the association data 400 in the moving image storage unit 18 (step S91), and displays the event information 300 specified by the association data 400 in the list display area (step S92).
  • the playback frame determination unit 143 refers to the association data 400 of the moving image storage unit 18 (step S94) and specifies the moving image data 201 associated with the selected event information 300 (step S95).
  • the reproduction processing unit 11 reads out the moving image data 201 specified by the reproduction frame determination unit 143 from the moving image storage unit 18 (step S96), and the display unit 12 displays the moving image data 201 that is associated with the selected event information 300 and includes its occurrence time (step S97).
  • the playback candidate generation unit 17 extracts, from the moving image data 200, the moving image data 201 in a predetermined time range including the time indicated by the time stamp that is the same as or closest to the occurrence time information of the acquired event information 300, and stores the extracted moving image data 201 in the moving image storage unit 18.
  • the display unit 12 displays the moving image data 201 stored in the moving image storage unit 18 in accordance with a user operation.
  • of the moving image data 200, only valuable information that contributes to operations such as operation management, product quality management, and business management is stored as the moving image data 201, which contributes to reducing the amount of data.
  • the reproduction candidate generation unit 17 further generates association data 400 that associates the extracted moving image data 201 with the acquired event information 300, and the moving image storage unit 18 further stores the association data 400.
  • the display unit 12 displays the moving image data 201 associated, through the association data 400, with the event information 300 selected by the user using the operation unit 13.
  • the event information 300 and the moving image data 201 are integrated through the association data 400, and the event information 300 and the moving image data 201 are organically connected to synergistically improve the value of the information.
  • FIG. 23 is a schematic diagram of moving image data 201 generated by the playback device 1 of Modification 6 according to the fourth embodiment.
  • one piece of moving image data 201 may be created for each piece of event information 300, or, as in the playback apparatus 1 according to Modification 6, one piece of moving image data 201 may be generated for a plurality of pieces of event information 300.
  • the reproduction candidate generating unit 17 trims one moving image data 201 in a time range including all occurrence times of the plurality of event information 300.
  • in the association data 400, all the identification information of the pieces of event information 300 and the identification information of the one piece of moving image data 201 are associated with each other.
  • the plurality of event information 300 may be a group selected by the user, a group generated before the generation timing of the moving image data 201, or the event information 300 belonging to the same category as the event information 300 selected by the user within a certain range. Any of the group which combined.
  • The reproduction candidate generation unit 17 stores in advance a time parameter from the earliest occurrence time among these pieces of event information 300 back to the start of the moving image data 201.
  • The reproduction candidate generation unit 17 subtracts this start-side time parameter from the earliest occurrence time information of the event information 300 to derive the start time of the moving image data 201.
  • The reproduction candidate generation unit 17 also stores in advance a time parameter from the latest occurrence time of these pieces of event information 300 to the end of the moving image data 201.
  • The reproduction candidate generation unit 17 adds this end-side time parameter to the latest occurrence time information to derive the end time of the moving image data 201.
  • FIG. 24 is a flowchart showing an example of the generation operation of the moving image data 201 of the playback apparatus 1.
  • The data collection unit 141 collects the moving image data 200 and the event information 300 existing on the network 100 (step S102).
  • A plurality of pieces of event information 300 is selected, for example, from the list display area. Alternatively, other event information 300 having the supplementary information selected by the user is acquired. Alternatively, by referring to the database 16, the event information 300 selected by the user and a plurality of pieces of event information 300 related to it on the database 16 are acquired.
  • The playback frame determination unit 143 acquires the event information 300 (step S103) and extracts the occurrence time information included in each piece of event information 300 (step S104). The reproduction candidate generation unit 17 then searches the pieces of occurrence time information for the earliest occurrence time information (step S105) and for the latest occurrence time information (step S106).
  • The reproduction candidate generation unit 17 calculates the start time of the moving image data 201 from the earliest occurrence time information (step S107) and calculates the end time of the moving image data 201 from the latest occurrence time information (step S108).
  • The reproduction candidate generating unit 17 duplicates all the frame data from the start frame data to the end frame data (step S109) and stores the result in the moving image storage unit 18 as the moving image data 201 (step S110).
  • The reproduction candidate generation unit 17 associates all the identification information of the plurality of pieces of event information 300 with the identification information of the moving image data 201 to generate the association data 400 (step S111) and stores it in the moving image storage unit 18 (step S112).
  • Thereafter, the reproduction candidate generation unit 17 deletes the moving image data 200 from the reproduction apparatus 1 or the network 100 (step S114).
  • As described above, the first acquisition unit, exemplified by the playback frame determination unit 143, acquires a plurality of pieces of event information 300, and the playback candidate generation unit 17 extracts from the moving image data 200 the moving image data 201 including all the occurrence times of these pieces of event information 300 and stores the extracted moving image data 201 in the moving image storage unit 18.
  • That is, from the moving image data 200, the reproduction candidate generation unit 17 extracts the moving image data 201 including all the occurrence times of the event information 300 and stores the extracted moving image data 201 in the moving image storage unit 18.
  • Each piece of event information 300 and the one piece of moving image data 201 are organically connected, synergistically improving the value of the information.
  • 1 playback device; 11 playback processing unit; 12 display unit; 13 operation unit; 14 playback control unit; 141 data collection unit; 142 screen generation unit; 143 playback frame determination unit; 144 search unit; 15 combination unit; 16 database; 17 playback candidate generation unit; 18 video storage unit; 100 network; 200 moving image data; 201 moving image data; 300 event information; 400 association data
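The generation flow summarized in the bullets above (steps S105–S112: find the earliest and latest occurrence times, derive the start and end of the trimmed moving image data 201 with pre-stored time parameters, duplicate the frames in that range, and build the association data 400) can be sketched roughly as follows. This is an illustrative outline only, not the patented implementation; the function name, the clip identifier, and the 3-second parameters are all assumptions.

```python
from datetime import datetime, timedelta

def generate_clip(frames, events, pre_seconds=3, post_seconds=3):
    """frames: list of (timestamp, frame_data) sorted by timestamp.
    events: dict mapping event identifier -> occurrence time.
    Returns the trimmed frame list (moving image data 201) and the
    association data (400) linking all event ids to the one clip."""
    earliest = min(events.values())                    # step S105
    latest = max(events.values())                      # step S106
    start = earliest - timedelta(seconds=pre_seconds)  # step S107
    end = latest + timedelta(seconds=post_seconds)     # step S108
    # duplicate all frame data from the start frame to the end frame (S109)
    clip = [(ts, f) for ts, f in frames if start <= ts <= end]
    association = {                                    # step S111
        "movie_id": "movie-201-0001",                  # illustrative identifier
        "event_ids": sorted(events),
    }
    return clip, association

t0 = datetime(2017, 3, 9, 10, 0, 0)
frames = [(t0 + timedelta(seconds=s), "frame%d" % s) for s in range(30)]
events = {"EV-1": t0 + timedelta(seconds=10), "EV-2": t0 + timedelta(seconds=15)}
clip, assoc = generate_clip(frames, events)
# clip covers 10:00:07 through 10:00:18 (12 frames at 1 frame per second)
```

Because every event identifier maps to the single clip identifier, a later lookup of any one of the events can retrieve the same trimmed video, which matches the one-clip-per-group behavior of Modification 6.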


Abstract

Provided is a reproducing device which assists a user in detecting useful information from time-series data such as moving image data. A reproducing device 1 reproduces time-stamped moving image data 200 and includes: a reproduction frame determining unit 143 which acquires event information 300 including occurrence time information; and a display unit 12 which jumps to and reproduces, within the moving image data 200, a time span including the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information 300 acquired by the reproduction frame determining unit 143.

Description

Playback device
The present invention relates to a playback device that plays back time-series data such as moving image data.
In factories, offices, and the like, various facilities are in operation, and data generation devices that monitor those facilities output field data indicating observation results for the facilities, such as observed values, error information, and status information. A data generation device is, for example, a facility sensor, a programmable logic controller that controls factory automation (FA), a computer that counts products, or a mobile terminal into which a user can input information.
The field data is output in coded form or as numerical observation results, and is monitored by a fixed asset manager as event information to which occurrence location and date/time information are added. The fixed asset manager is, for example, a facility maintenance manager in a factory, or a manager of facilities such as servers in an office.
This event information is valuable information that contributes to operations such as factory and office operation management, product quality management, and business management. However, the event information output directly from the facilities has no secondary or tertiary meaning beyond codes or simple observed values and direct attributes such as the occurrence location and the occurrence date and time. Conventionally, information obtained at the site level was reinterpreted in sequence at each layer on its way up to the management and executive levels, turning event information into information with secondary and tertiary meanings, and the event information was used at each level in that form. As a result, a time lag tends to arise in how information is transmitted between the site level and the management level, and the site level and the business management level tend to drift apart.
Therefore, an information collection system that can share event information in real time at each level has been proposed (see, for example, Patent Document 1). This information collection system stores a dictionary containing models that define the types of supplementary information to be added to field data. A database storing supplementary information of various types and contents is also prepared. Then, in accordance with the dictionary, each piece of supplementary information is retrieved from the database and added to the field data to generate event information.
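As a rough sketch of the dictionary-and-database mechanism described above: the dictionary model lists which supplementary fields to attach to a given piece of field data, and each field value is looked up in the database. All names, codes, and contents below are invented for illustration; the actual system of Patent Document 1 is not reproduced here.

```python
# Illustrative dictionary: which supplementary fields to attach per field-data code.
dictionary = {
    "E-101": ["facility_name", "interpretation"],
}
# Illustrative database: supplementary information of various types and contents.
database = {
    "facility_name": {"PLC-7": "Press machine No. 7"},
    "interpretation": {"E-101": "hydraulic pressure below threshold"},
}

def to_event(field_code, source, occurred_at):
    """Add the supplementary information defined by the dictionary to the
    field data, producing event information (sketch only)."""
    event = {"code": field_code, "source": source, "occurred_at": occurred_at}
    for field in dictionary.get(field_code, []):
        # look up by source device for facility names, by code otherwise (assumed rule)
        key = source if field == "facility_name" else field_code
        event[field] = database[field].get(key)
    return event

ev = to_event("E-101", "PLC-7", "2017-03-09T10:00:01")
```

The point of the sketch is that the event information carries its explanation with it, so the recipient needs no reinterpretation step.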
As a result, the event information carries not only simple background information for the field data but also explanatory information such as interpretations and evaluations of its contents, giving information transmission immediacy. In other words, from the user's point of view, event information appears that describes the required contents directly and in an easy-to-understand manner, and the user can understand the secondary and tertiary meanings of the event information at the moment it arrives, without any interpretation work. Therefore, from the site level up to the business management level, not only the time lag in the arrival of event information but also the time lag in understanding it can be eliminated.
However, particularly in the initial stage of starting up an information collection system, it is not easy, when creating the dictionary, to define in the models all of the types of supplementary information to be added to the field data without omission or excess. Therefore, observation of moving image data output from surveillance cameras is often used in combination, because useful information not yet contained in the event information may exist in the moving image data.
The user displays both the event information and the moving image data on a viewing terminal and tries to notice useful information by paying attention to the event information, paying attention to the moving image data, going back and forth between the two, changing the event information of interest, or changing the moving image data of interest.
JP 2012-234496 A
Moving image data is bulky data that records not only useful information reinforcing the content of event information but also a large amount of unnecessary information. Moreover, the presence of such useful information is only a possibility: it is unknown where it exists within the field of view, unknown where it exists in the time series, and it lies scattered, indistinguishably mixed with other unnecessary information.
The observer therefore had no choice but to observe every part of the video at every time without any clue. Even when useful information was detected by observation, in order to confirm it the user had to manually rewind the video to the desired position over and over until a cue point could be set there, and this cumbersome work was one factor reducing the user's opportunity to notice useful information.
An object of the present invention is to solve the above problems of the prior art by providing a playback device that supports a user in detecting useful information from time-series data such as moving image data.
To achieve the above object, a playback device according to the present invention is a playback device that plays back time-series data having time stamps, and includes: a first acquisition unit that acquires event information having occurrence time information; and a display unit that plays back, from the time-series data, a predetermined time range including the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information acquired by the first acquisition unit.
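As a rough illustration of "the time stamp that is the same as or closest to the occurrence time information", the frame whose time stamp is nearest to an event's occurrence time can be found by binary search over the sorted time stamps. This is a sketch with assumed names and sample times, not the claimed implementation:

```python
from bisect import bisect_left

def nearest_frame_index(timestamps, occurrence_time):
    """Return the index of the frame whose time stamp is the same as or
    closest to the occurrence time (timestamps sorted ascending)."""
    i = bisect_left(timestamps, occurrence_time)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    before, after = timestamps[i - 1], timestamps[i]
    # pick whichever neighbor is strictly closer; ties go to the earlier frame
    return i if after - occurrence_time < occurrence_time - before else i - 1

ts = [0.0, 0.5, 1.0, 1.5, 2.0]  # frame time stamps in seconds, illustrative
# exact match: occurrence at 0.5 maps to index 1
# near match: occurrence at 1.2 maps to index 2 (time stamp 1.0 is closest)
```

Playback of the predetermined time range would then begin at (or around) the returned frame index.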
The playback device may further include a second acquisition unit that acquires one or more other pieces of event information related to the event information acquired by the first acquisition unit, and the display unit may sequentially jump to and play back, from the time-series data, predetermined time ranges each including the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information acquired by the second acquisition unit.
During playback of the time-series data, the display unit may jump-play a predetermined time range including the time at which the event information occurred and, when the jump playback ends, return to playback of the latest time.
During playback of the time-series data, the display unit may jump-play a predetermined time range including the time at which the event information occurred and, when the jump playback ends, return to the time that was being played immediately before the jump playback, or to a point just before or after it.
The display unit may display the acquired event information combined with the image of the time-series data.
The event information may include supplementary information of various types and contents, and the second acquisition unit may acquire event information having the same supplementary information as one or more pieces of the supplementary information included in the event information acquired by the first acquisition unit.
The supplementary information may include event type information indicating the type of the event.
The playback device may further include a database that associates a plurality of pieces of event information with one another, and the second acquisition unit may acquire the event information associated in the database with the event information acquired by the first acquisition unit.
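A minimal sketch of this database-based second acquisition unit: events related in the database to the selected event are looked up by identifier. The identifiers and the dict-shaped "database" are assumptions for illustration only.

```python
# Hypothetical association database: event id -> set of related event ids.
related_db = {
    "EV-001": {"EV-007", "EV-031"},
    "EV-007": {"EV-001"},
}

def acquire_related(event_id, db):
    """Second acquisition unit (sketch): return the ids of the events
    associated in the database with the selected event."""
    return sorted(db.get(event_id, set()))

ids = acquire_related("EV-001", related_db)
```

The display unit could then jump sequentially through the time ranges of the returned events, as described above.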
The time-series data may be moving image data, flow line (trajectory) data, or audio data.
The predetermined time range may start at the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information.
The predetermined time range may start at a time earlier than, and end at a time later than, the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information.
The predetermined time range may include only the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information, and the display unit may display the time-series data as a still image.
The playback device may further include: a generation unit that extracts, from the time-series data, time-series data in a predetermined time range including the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information acquired by the first acquisition unit; and a storage unit that stores the time-series data extracted by the generation unit; and the display unit may display the time-series data stored in the storage unit.
The first acquisition unit may acquire a plurality of pieces of event information, and the generation unit may extract, from the time-series data, time-series data in a continuous predetermined time range including all of the times indicated by the time stamps that are the same as or closest to the occurrence time information of the acquired pieces of event information.
The generation unit may extract, from the time-series data, time-series data in a continuous predetermined time range including all of the times indicated by the time stamps that are the same as or closest to the occurrence time information of the event information acquired by the first acquisition unit and the second acquisition unit.
The playback device may further include an operation unit that receives a user operation; the generation unit may further generate association data that associates the extracted time-series data with the event information acquired by the first acquisition unit; the storage unit may further store the association data; and the display unit may display the time-series data associated, by the association data, with the event information selected by the user using the operation unit.
According to the present invention, the time range including the time at which event information occurred can easily be played back, so the time range of the time-series data to be observed can be narrowed down, and the user can be given the chance to notice useful information in the time-series data without cumbersome work.
FIG. 1 is a schematic diagram showing the installation environment of the playback device. FIG. 2 is a block diagram showing the configuration of the playback device according to the first embodiment. FIG. 3 is the first half of a flowchart showing the jump playback operation of the playback device according to the first embodiment. FIG. 4 is the latter half of the flowchart showing the jump playback operation of the playback device according to the first embodiment. FIG. 5 is a schematic diagram showing the display screen of the playback device, illustrating a jump playback event during real-time playback. FIG. 6 is a transition diagram showing playback of moving image data according to the first embodiment. FIG. 7 is a transition diagram showing playback of moving image data according to a modification of the first embodiment. FIG. 8 is a transition diagram showing playback of moving image data according to a modification of the first embodiment. FIG. 9 is a schematic diagram showing the data structure of event information. FIG. 10 is a block diagram showing the configuration of the playback device according to the second embodiment. FIG. 11 is a flowchart showing an operation example of jump playback by the playback device according to the second embodiment.
FIG. 12 is a flowchart showing another operation example of jump playback by the playback device according to the second embodiment. FIG. 13 is a transition diagram showing playback of moving image data according to the second embodiment. FIG. 14 is a block diagram showing the configuration of the playback device according to Modification 4. FIG. 15 is a schematic diagram showing a database according to Modification 4. FIG. 16 is a schematic diagram showing the display screen of the playback device 1 according to Modification 5, illustrating a jump playback event during real-time playback. FIG. 17 is a block diagram showing the configuration of the playback device according to the third embodiment. FIG. 18 is a flowchart showing the composite display operation of the playback device according to the third embodiment. FIG. 19 is a schematic diagram showing the composite display screen of the playback device according to the third embodiment during real-time playback. FIG. 20 is a block diagram showing the configuration of the playback device according to the fourth embodiment. FIG. 21 is a flowchart showing the moving image data creation operation of the playback device according to the fourth embodiment.
FIG. 22 is a flowchart showing the moving image data playback operation of the playback device according to the fourth embodiment. FIG. 23 is a schematic diagram of moving image data generated by the playback device according to Modification 6. FIG. 24 is a flowchart showing the moving image data generation operation of the playback device according to Modification 6.
 (First Embodiment)
 (Configuration)
 Hereinafter, a first embodiment of the playback device according to the present invention will be described in detail with reference to the drawings. FIG. 1 is a schematic diagram showing the installation environment of the playback device 1. The playback device 1 shown in FIG. 1 is connected to a network 100 in which moving image data 200 and event information 300 are generated. The playback device 1 determines the playback start frame of the moving image data 200 using the occurrence date and time of the event information 300, and plays back the moving image data 200 on the network 100.
The moving image data 200 is distributed in real time, originating from a surveillance camera that is shooting. The moving image data 200 is passed through an encoder on the network 100, and a time stamp is attached to each piece of frame data. The time stamp directly or indirectly indicates the generation date and time of each piece of frame data. The event information 300 is error information, status information, or the like: a measurement or observation result output by measurement or observation equipment. In addition to the measurement or observation result, at least occurrence time information is attached to the event information 300.
FIG. 2 is a block diagram showing the configuration of the playback device 1. The playback device 1 is a so-called computer having a communication interface with the network 100 and a man-machine input/output interface, for example a desktop computer, a laptop computer, a smartphone, or a tablet terminal. The playback device 1 includes a playback processing unit 11, a display unit 12, an operation unit 13, and a playback control unit 14.
The playback processing unit 11 is a so-called graphics engine, comprising an arithmetic control device such as a CPU or GPU and a frame memory. The playback processing unit 11 decodes the moving image data 200, generates frame data in chronological order, and sequentially updates the frame memory read by the display unit 12, such as a VRAM or RAMDAC, with the newly generated frame data. The playback processing unit 11 overlays the moving image data in a playback area defined on the display screen.
The display unit 12 is a screen that visualizes the frame data in the frame memory, such as a liquid crystal display or an organic EL display. The display unit 12 reads frame data from the frame memory at a predetermined refresh rate and visualizes it in sequence. The operation unit 13 is an input interface that receives user operations, such as a keyboard, mouse, touch panel, microphone, or camera. The operation unit 13 is used by the user to select event information 300 while the moving image data 200 is being played back.
The playback control unit 14 is configured mainly around a CPU and instructs the playback processing unit 11 to change the playback position within the time range over which the moving image data 200 was captured. In response to a user operation, the playback control unit 14 temporarily inserts jump playback during real-time playback of the moving image data 200. It may also control playback from an arbitrary point selected by the user, or cued playback.
In real-time playback, the playback control unit 14 plays back the latest frame data of the moving image data 200 generated by the surveillance camera. In jump playback, during playback of the moving image data 200, the playback control unit 14 temporarily plays back a predetermined time range including the occurrence time of the event information 300 selected by the user using the operation unit 13. When playback of the predetermined time range ends, the playback control unit 14 returns to playback of the latest frame data of the moving image data 200; in the case of real-time playback, it returns to real-time playback of the latest frame data.
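The behavior in this paragraph — temporary jump playback of a fixed-length range, followed by a return to the latest frame — can be sketched as a small controller. This is an illustrative model only; the class and method names, the float time values, and the one-step-per-call advancement are assumptions, not the embodiment.

```python
class PlaybackController:
    """Sketch: real-time playback follows the latest frame; a jump event
    temporarily plays a predetermined time range, then playback returns
    to the latest frame. Times are in seconds."""

    def __init__(self, jump_seconds=3.0):
        self.jump_seconds = jump_seconds
        self.jump_end = None   # end of the current jump range, if any
        self.position = None   # current jump-playback position

    def on_jump_event(self, occurrence_time):
        # simplest case: the range starts at the event occurrence time
        self.position = occurrence_time
        self.jump_end = occurrence_time + self.jump_seconds

    def next_position(self, latest_time, step=1.0):
        """Return the playback time of the next frame to display."""
        if self.jump_end is None:
            return latest_time              # real-time playback
        pos = self.position
        self.position += step
        if self.position > self.jump_end:   # predetermined range finished:
            self.jump_end = None            # return to the latest frame
            self.position = None
        return pos

p = PlaybackController(jump_seconds=3.0)
p.next_position(100.0)   # real-time: 100.0
p.on_jump_event(50.0)    # user drags an event that occurred at t=50 s
# subsequent calls play 50.0, 51.0, 52.0, 53.0, then follow the latest frame
```

The same controller could implement the variant that returns to the pre-jump position instead of the latest frame, by remembering the position at the moment of the jump.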
The predetermined time range is, for example, 3 seconds, but can be changed by the user using the operation unit 13 by changing the value of the parameter indicating the predetermined time range. The predetermined time range may be a range starting at the occurrence time of the event information 300, or a range including times before and after the occurrence time of the event information 300. The position of the predetermined time range relative to the occurrence time of the event information 300 can also be changed by the user using the operation unit 13. For example, the start of the predetermined time range may be determined by adding a parameter value to, or subtracting it from, the occurrence time of the event information 300, and that parameter value can be changed; if the parameter value is zero, the occurrence time of the event information 300 becomes the start point.
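This parameterization can be sketched as follows: a duration parameter (3 seconds by default) and a start-offset parameter determine the range, and a zero offset makes the occurrence time itself the start point. The parameter names are assumptions for illustration.

```python
def predetermined_range(occurrence_time, pre_offset=0.0, duration=3.0):
    """Times in seconds. The start is the occurrence time minus pre_offset;
    with pre_offset=0 the range starts exactly at the occurrence time."""
    start = occurrence_time - pre_offset
    return start, start + duration

predetermined_range(120.0)                  # (120.0, 123.0): starts at the occurrence
predetermined_range(120.0, pre_offset=1.0)  # (119.0, 122.0): includes time before it
```

Changing `duration` corresponds to the user adjusting the length of the range, and changing `pre_offset` corresponds to shifting where the range sits relative to the occurrence time.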
The playback control unit 14 includes a data collection unit 141, a screen generation unit 142, and a playback frame determination unit 143. The data collection unit 141 includes an input/output interface with the network 100. The data collection unit 141 receives distribution of the moving image data 200 from the network 100 and also collects the event information 300 on the network 100.
The screen generation unit 142 includes a CPU and lays out the display screen. The display screen has a playback area for the moving image data 200 and a list display area for the event information 300. The screen generation unit 142 expands the contents of the event information 300 and arranges them in the list display area. When the contents of the event information 300 are large, an event detail area may be provided on the display screen, and the contents of the event information 300 selected by the user may be displayed in the event detail area.
 The playback frame determination unit 143 includes a CPU and a man-machine input interface. It determines the timing at which the playback position should jump, and selects from the moving image data 200 the frame data that serves as the start of the jump-destination playback position.
 The first timing for jump playback is the occurrence of a jump playback event. In the case of mouse operation, for example, the jump playback event is a drag-and-drop of event information 300 from the list display area or the event detail area into the playback area. The playback frame determination unit 143 monitors the user's operations on the operation unit 13 and determines whether a jump playback event has occurred.
 The second timing for jump playback is the end of the jump playback itself; specifically, the end of playback of the predetermined time range after the jump. Once jump playback has started, the playback frame determination unit 143 monitors the time stamp of each piece of frame data written to the frame memory and determines whether that time stamp corresponds to the end of the predetermined time range.
 The first jump destination is the frame data corresponding to the start of the predetermined time range to be jump-played, that is, the frame data whose time stamp indicates the start of that range. When the playback frame determination unit 143 judges a user operation to be a jump playback event, it extracts the occurrence time information from the event information 300 selected by that operation, and the start of the predetermined time range is calculated from this occurrence time information.
 When jump playback starts from the occurrence time of the event information 300, the start of the predetermined time range is the very time indicated by the occurrence time information. When jump playback starts a specific time before the occurrence time of the event information 300, the playback frame determination unit 143 obtains the start of the predetermined time range by subtracting a prestored specific time from the time indicated by the occurrence time information.
 Having obtained the start of the predetermined time range, the playback frame determination unit 143 searches for frame data whose time stamp matches that start time. If no time stamp matches exactly, it searches for the frame data whose time stamp is closest to that time. The search result is handed over to the playback processing unit 11, which draws frame data into the frame memory in chronological order, starting from the frame data indicated by the search result.
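Assuming the frame time stamps are held in a sorted sequence (an assumption not stated in the patent), the exact-or-nearest search described above can be sketched with a binary search; the function and parameter names are illustrative.

```python
import bisect

def find_start_frame(timestamps, target):
    """Index of the frame whose time stamp equals `target`, or,
    failing an exact match, the frame whose time stamp is closest.

    timestamps: sorted sequence of frame time stamps.
    target: start of the predetermined time range.
    """
    i = bisect.bisect_left(timestamps, target)
    if i == 0:
        return 0                      # target precedes every frame
    if i == len(timestamps):
        return len(timestamps) - 1    # target follows every frame
    before, after = timestamps[i - 1], timestamps[i]
    # take the later frame only when it is strictly closer
    return i if after - target < target - before else i - 1
```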
 The second jump destination is the latest frame data in the moving image data 200. When the playback frame determination unit 143 determines that jump playback has ended, it instructs the playback processing unit 11 to play the most recently delivered frame data, and the playback processing unit 11 draws into the frame memory in chronological order starting from that frame data.
 Typically, when a jump playback event occurs, the playback device 1 switches from the routine that plays the delivered moving image data 200 in real time to the jump playback routine, and when it determines that jump playback has ended, it returns to the real-time playback routine.
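The switch between the two routines can be illustrated with a deliberately simplified simulation. Everything here is an assumption for illustration: real playback is driven by timers and network delivery rather than list iteration, and `live_frames`, `recorded`, and `jump_windows` are invented names.

```python
def playback(live_frames, recorded, jump_windows):
    """Simulate real-time playback interrupted by jump playback.

    live_frames:  iterable of (timestamp, frame) arriving in real time.
    recorded:     stored (timestamp, frame) pairs, sorted by time.
    jump_windows: maps a live timestamp to a (start, end) window;
                  reaching that timestamp triggers jump playback.
    Returns the sequence of frames actually drawn.
    """
    drawn = []
    for ts, frame in live_frames:
        window = jump_windows.get(ts)
        if window:
            start, end = window
            # jump-playback routine: replay the recorded window
            drawn.extend(f for t, f in recorded if start <= t <= end)
        # real-time routine resumes with the latest delivered frame
        drawn.append(frame)
    return drawn
```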
 (Operation)
 An example of the operation of the playback device 1 is shown in FIGS. 3 and 4, which are flowcharts of the jump playback operation. First, the data collection unit 141 receives the moving image data 200 from the network 100 (step S01), the playback processing unit 11 sequentially updates the frame memory with the latest frame data of the moving image data 200 (step S02), and the display unit 12 displays the latest frame in the frame memory (step S03).
 When there are multiple kinds of moving image data 200, for example because multiple surveillance cameras exist, the playback control unit 14 may display camera icons or thumbnails on the display screen so that the user can select which moving image data 200 to play.
 In parallel with the playback of the moving image data 200, the data collection unit 141 collects the event information 300 from the network 100 (step S04), and the screen generation unit 142 displays a list of the event information 300, together with its contents, in the list display area of the display screen (step S05).
 When the user selects a piece of event information 300 with the operation unit 13 and performs an operation that constitutes a jump playback event (step S06), the playback frame determination unit 143 extracts the occurrence time information contained in the selected event information 300 (step S07). It then refers to the occurrence time information and the time stamps of the moving image data 200 and searches for the frame data captured at the same time as, or the time closest to, the occurrence time information (step S08).
 Typically, a time stamp indicates the date and time at which the frame data it belongs to was generated. Alternatively, representative generation date-and-time information, for example that of the first frame data, may be attached to the metadata of the moving image data 200, in which case each time stamp indicates, as a time offset relative to that representative, the order in which each frame data is played. Adding the time indicated by a time stamp to the representative date-and-time information yields the generation date and time of the frame data to which that time stamp belongs.
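The two time-stamp conventions just described reduce to one small conversion; this is an illustrative sketch with times simplified to epoch seconds, and the names are assumptions.

```python
def frame_datetime(time_stamp, representative=None):
    """Occurrence date and time of a frame.

    If the time stamp is itself absolute, return it unchanged.
    If the metadata carries a representative date and time (e.g. of
    the first frame) and the stamp is a relative offset, the frame's
    occurrence time is representative + offset.
    """
    return time_stamp if representative is None else representative + time_stamp
```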
 The playback processing unit 11 updates the frame memory in chronological order starting from the frame data found by the search (step S09). The display unit 12 therefore plays back and displays the moving image data 200 from the time at which the selected event information 300 occurred (step S10). If the frame data resides on the network 100, it is delivered through the data collection unit 141.
 The playback frame determination unit 143 obtains the end time of jump playback by adding the predetermined time range to the time indicated by the occurrence time information (step S11). Once jump playback has started, it compares the time stamp of the frame data to be displayed next with the end time of jump playback (step S12). When the playback frame determination unit 143 determines that the time stamp has reached the end time of jump playback (step S13), the data collection unit 141 receives delivery of the latest frame data of the moving image data 200 (step S14).
 After the frame data whose time stamp reached the end time of jump playback, the playback processing unit 11 updates the frame memory in chronological order starting from the latest frame data (step S15). Thus, when jump playback ends (step S16), the display unit 12 returns to real-time playback of the moving image data 200 (step S17).
 (Function)
 FIG. 5 is a schematic diagram showing a display screen according to a first application example of the playback device 1, illustrating a jump playback event during real-time playback. FIG. 6 is a schematic diagram showing the playback of the moving image data 200 in this first application example.
 In this application example, a surveillance camera is installed on a production assembly line. On this line, workers tighten products using a designated tool, and downstream of the tightening work a dimension measuring instrument judges whether each product is good or defective. The surveillance camera is installed with its imaging optical axis directed at the tightening work area.
 When a product's dimensions fall within the allowable range, the dimension measuring instrument generates event information 300 indicating a good product and sends it to the network 100. When a product's dimensions fall outside the allowable range, it generates event information 300 indicating a defective product and sends that to the network 100.
 As shown in FIG. 5, the display screen lists the event information 300 generated one after another by the dimension measuring instrument. The moving image data 200 of the production assembly line captured by the surveillance camera is also played in real time on the display screen; it shows the worker performing the tightening work at the tightening work area, down to the worker's hands.
 When the dimension measuring instrument sends defective-product event information 300 bearing the code "NG" to the network 100, that event information 300 is collected by the data collection unit 141 and displayed on the display screen. An observer who monitors the site using the playback device 1 watches the moving image data 200 in real time, and when the defective-product event information 300 appears in the list, drags and drops the event information 300 bearing the code "NG" onto the playback area of the moving image data 200.
 As shown in FIGS. 5 and 6, the event information 300 operated on by the jump playback event has occurrence time information indicating, for example, time Tn-a-4. In the playback device 1, parameters are set so that the predetermined time range is 3 seconds and jump playback starts from the occurrence of the event information 300. The playback device 1 therefore transitions from playing the frame data at the latest time Tn to jump playback starting from the frame data, within the moving image data 200 monitoring the tightening work area, that bears the time stamp Tn-a-4.
 Playback then returns to real time from the frame data at the latest time Tn+5. The 3-second video played during the jump showed a scene, immediately before the defective-product event information 300 occurred, in which the worker was holding a tool different from the designated one. The observer who watched the jump-played 3-second video, combined with the knowledge that the defective-product event information 300 had occurred, inferred the cause of the defect: the product had been tightened with a tool other than the designated one, producing the dimensional abnormality.
 The moving image data 200 is thus bulky data that records not only information reinforcing the contents of the event information 300, namely that the worker was using a different tool, but also a large amount of unnecessary information. Moreover, that specific information is scattered both spatially and temporally across the imaging field of view, intermingled with other unnecessary information with no clear distinction.
 However, based on the occurrence time information contained in the defective-product event information 300, the playback device 1 assists the user by narrowing the playback range of the moving image data 200 to the time range most likely to show the other tool in use, and the user obtains useful information not contained in the event information 300 almost immediately after selecting it.
 Furthermore, since selecting the defective-product event information 300 causes the time range most likely to show the other tool in use to be played, the contents of the event information 300 and the video of that time range reach the user as a unit, so that the video appears to be contained within the event information 300.
 The information value of the event information 300 to the user therefore increases. Moreover, the playback makes the user aware of information that ought to be included in the event information 300, so its information value genuinely improves. For example, a sensor that detects the removal of a tool may be installed at the tool storage area, and the type of the removed tool detected by that sensor may be included in the event information 300 of the dimension measuring instrument.
 (Effect)
 As described above, the playback device 1 plays back moving image data 200 having time stamps, and includes a first acquisition unit that acquires event information 300 having occurrence time information, and a display unit 12 that jump-plays the time period of the moving image data 200 containing the time indicated by the time stamp identical or closest to the occurrence time information of the event information 300 acquired by the first acquisition unit.
 In this embodiment, the first acquisition unit is the playback frame determination unit 143, which acquires the event information 300 selected by the jump playback event when that event occurs. Alternatively, whenever event information 300 appears on the network 100, that event information 300 may be acquired and jump-played; that is, the data collection unit 141, which brings the event information 300 into the playback device 1, is also an example of the first acquisition unit.
 The playback device 1 can thus assist the user in easily extracting useful information from the moving image data 200, and is highly user-friendly for anyone monitoring the moving image data 200.
 In addition, the contents of the event information 300 and the video of the time range most likely to contain useful information reach the user as a unit, so that the video appears to be contained within the event information 300, which increases the information value of the event information 300. Moreover, the playback makes the user aware of information that ought to be included in the event information 300, so its information value genuinely improves.
 (Modification 1)
 The moving image data 200 is the result of measuring or observing, in time order, a phenomenon that varies with time, and is one example of time-series data having time stamps. Besides moving image data 200, time-series data includes flow-line trajectory image data, electrocardiogram data, audio data, voice recorder data, and so on. The playback device 1 can perform jump playback triggered by the selection of event information 300 on any data whose items are arranged in chronological order with a time stamp attached to each item, and is therefore applicable to time-series data in general.
 Flow-line trajectory image data is data plotted so as to update, in chronological order, the positions of objects or people within a certain spatial range. It consists of map image data and multiple pieces of position information, each with a time stamp attached. The playback processing unit 11 generates frame data by plotting the position indicated by each piece of position information onto the map image data, and updates the frame memory accordingly.
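Frame generation for flow-line trajectory data can be sketched as a cumulative plot; the data layout below, a list of `(timestamp, (x, y))` pairs, is an assumption chosen for illustration.

```python
def trajectory_frames(positions, start, end):
    """Yield one frame per position update inside [start, end].

    positions: (timestamp, (x, y)) pairs sorted by time.
    Each yielded frame is the list of points plotted so far, i.e.
    the trail to overlay on the map image at that moment.
    """
    window = [(t, p) for t, p in positions if start <= t <= end]
    trail = []
    for t, p in window:
        trail.append(p)
        yield list(trail)  # snapshot of the trail for this frame
```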
 A second application example of the playback device 1 will now be described. In this example, nurses carry devices, such as RFID tags, that report their locations, and readers that read a nurse's identification information from an RFID tag are installed at various points in a hospital. A server that collects the output of each reader generates flow-line trajectory image data by adding, to prestored map image data of the hospital, the position information of each reader that read the identification information.
 Various medical devices are also installed in the hospital, each generating event information 300 and outputting it to the network 100. The event information 300 sent to the network 100 includes patient-abnormality event information 300 indicating a patient abnormality detected by a medical device.
 At the morning briefing, it was reported that although a medical device had raised a patient-abnormality alarm the previous night, no nurse had arrived within the prescribed response time. The observer operating the playback device 1 therefore selected, from the list display area on the display screen, the patient-abnormality event information 300 that had occurred the previous night and performed the jump playback event operation.
 If the flow-line trajectory image data is not already being played, the data collection unit 141 acquires it from the network 100. The playback frame determination unit 143 extracts the occurrence time information from the selected patient-abnormality event information 300 and, referring to the time stamps contained in the flow-line trajectory image data, searches for the position information of the same time as the occurrence time information.
 The playback processing unit 11 generates frame data by plotting the position indicated by the found position information onto the in-hospital map represented by the hospital map image data, and stores it in the frame memory. By re-plotting, in chronological order, the position information whose time stamps fall within a predetermined time range exceeding the prescribed response time, the playback processing unit 11 generates the frame data of the predetermined time range in chronological order and keeps updating the frame memory.
 As a result, the flow-line trajectory of each nurse over the predetermined time range starting from the time at which the patient-abnormality event information 300 occurred is displayed as an overlay in the playback area. If the flow-line trajectory image data was already being played, this is a jump playback; if it was not, playback begins from the time at which the patient-abnormality event information 300 occurred.
 The observer watching this playback found that the nurse in charge had been in an annex ward far from the medical device that raised the patient-abnormality alarm; the nurse set out immediately, but because of coming from the annex ward, the prescribed response time was exceeded. It was also found that the hospital had been short of nurses when the patient-abnormality event information 300 occurred; because of that shortage, the nurse in charge also had to perform duties outside the assigned ones, which delayed the response.
 In this way, even for time-series data other than moving image data 200, the playback device 1 can assist the user in easily extracting useful information from the time-series data, and is highly user-friendly for anyone analyzing it. Since the contents of the event information 300 and the time range most likely to contain useful information reach the user as a unit, the information value of the event information 300 increases. Moreover, the playback makes the user aware of information that ought to be included in the event information 300, so its information value genuinely improves.
 (Modification 2)
 Time-series data such as the moving image data 200 may be generated in real time and delivered over the network, or its generation, such as shooting, may already have finished, with the data stored within the network 100. Time-series data generated in real time and delivered over the network is often used for real-time monitoring, so after jump playback it is desirable to play the latest frame data of the time-series data.
 For time-series data whose generation has already finished, on the other hand, the frame data at the times following the jump-played predetermined time range may be played in chronological order after jump playback ends, as shown in FIG. 7. When playback simply continues in chronological order after the predetermined time range, the playback frame determination unit 143 does not need to determine the end of that range.
 Also, for time-series data whose generation has already finished, playback may return, after jump playback ends, to the frame data that was being played immediately before the jump, or to a point just before or after it, as shown in FIG. 8.
 That is, when jump playback ends, the playback frame determination unit 143 returns the playback of the time-series data to the point that was being played immediately before the jump. It stores the time stamp of the frame data played immediately before jump playback, or a time stamp of a time just before or after it, and after jump playback ends, resumes playback from the frame data bearing the stored time stamp.
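The post-jump behaviors described in the embodiment and this modification differ only in which frame playback resumes from; a sketch under the assumptions that frame time stamps sit in a sorted sequence and that the mode names are invented labels.

```python
import bisect

def resume_index(mode, timestamps, window_end=None, pre_jump_ts=None):
    """Index of the frame where playback continues after a jump.

    mode 'latest':   newest frame (real-time monitoring).
    mode 'continue': first frame after the jump window (FIG. 7).
    mode 'return':   frame played just before the jump (FIG. 8).
    timestamps: sorted frame time stamps.
    """
    if mode == 'latest':
        return len(timestamps) - 1
    if mode == 'continue':
        return bisect.bisect_right(timestamps, window_end)
    if mode == 'return':
        return bisect.bisect_left(timestamps, pre_jump_ts)
    raise ValueError(mode)
```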
 (Modification 3)
 The predetermined time range can also be set to zero seconds, for example. In other words, only the time indicated by the time stamp identical or closest to the occurrence time information of the event information 300 may be jump-played. In this case, a still image is displayed.
 When the playback frame determination unit 143 designates the jump-destination frame data, it starts timing a predetermined period. While timing, it withholds the instruction for the next frame data, and issues it once the predetermined period has elapsed. As a result, after the still image has been displayed for the predetermined period, playback returns to real-time playback or to the point that was being played immediately before the jump.
 (Second Embodiment)
 (Configuration)
 Next, the playback device 1 according to a second embodiment will be described in detail with reference to the drawings. Functions and configurations identical to those of the first embodiment and Modifications 1 to 3 are given the same reference numerals, and detailed description of them is omitted.
 First, the event information 300 is described in detail. The measuring or observing devices that generate the event information 300 are, for example, temperature sensors, dimension measuring instruments, medical devices, and the like. They monitor the operating status of equipment running in factories, offices, and so on, or of patients, and output event information 300 when an event occurs in that equipment or patient. An event is a periodic status report, the fulfillment of a predetermined condition on the internal or external state of the monitored target, or the non-fulfillment of such a condition.
 FIG. 9 is a schematic diagram showing the data structure of the event information 300. As shown in FIG. 9, in addition to the measurement or observation result and the occurrence time information, the event information 300 can carry event type information indicating the kind of event, explanatory information describing the measurement or observation result, information suited to the purpose of monitoring and to the state and characteristics of the equipment or patient, and other supplementary information.
Specifically, at a product assembly site, identification information of the worker who assembled a product may be attached as supplementary information to event information indicating whether the product is good or defective. Such supplementary information is typically added when an information collection system in the network 100 reconstructs the event information 300.
The information collection system holds a post-reconstruction model for each kind of event information 300 output by the measurement or observation devices. Each model lists the types of supplementary information to be added to that event information 300, and the information collection system refers to the model to determine which types of supplementary information to attach.
From the data of various kinds and contents existing on the network 100, the information collection system narrows the candidates down to the group of supplementary information of the determined types, and then, using the content of the event information 300 as a search key, searches that narrowed group for supplementary information matching the content of the event information 300. The information collection system then reconstructs the event information 300 by adding the retrieved supplementary information to the event information 300 output by the measurement or observation device.
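The narrowing-and-search step can be sketched as follows. This is a minimal illustration under assumed data shapes (a model mapping event kinds to wanted supplement types, and network items tagged with a type and search keys); none of these names come from the specification.

```python
# Sketch of the reconstruction step: the model lists which supplement types
# to attach, candidates on the network are narrowed to those types, and the
# event's own content serves as the search key. All names are illustrative.
def reconstruct(event, model, network_data):
    wanted_types = model.get(event["type"], [])   # types listed in the model
    for item in network_data:
        if item["supp_type"] in wanted_types and event["key"] in item["keys"]:
            event.setdefault("supplements", {})[item["supp_type"]] = item["value"]
    return event

model = {"quality": ["worker_id"]}
net = [{"supp_type": "worker_id", "keys": ["P-100"], "value": "W-012"},
       {"supp_type": "temperature", "keys": ["P-100"], "value": 23.5}]
ev = reconstruct({"type": "quality", "key": "P-100"}, model, net)
```

Here the temperature item is discarded because the model only lists `worker_id` for quality events, even though its key matches.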
FIG. 10 is a block diagram showing the configuration of this playback device 1. As shown in FIG. 10, the playback control unit 14 of the playback device 1 further includes a search unit 144. The search unit 144 includes a CPU and searches for event information 300 that belongs, within a certain range, to the same category as the event information 300 selected by the jump playback event.
When some or all of the measurement or observation results and supplementary information contained in pieces of event information 300 are identical, those pieces of event information 300 belong to the same category within a certain range. The search unit 144 searches for event information 300 belonging to the same category within a range specified in advance or within a range specified by a user operation on the operation unit 13.
For example, event information 300 belonging to the same category within a range specified in advance is event information 300 whose attached event type information, indicating the type of event, is identical. The search unit 144 uses the event type information of the event information 300 selected by the jump playback event as a search key and searches for event information 300 carrying the same event type information. The search destination is the playback device 1 itself when the data collection unit 141 has already collected the event information 300, and the network 100 when it has not.
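The behavior of the search unit 144 just described can be sketched briefly. This is a hedged illustration, not the actual implementation: events are plain dictionaries and the collected/uncollected distinction is reduced to choosing a candidate pool.

```python
# Minimal sketch of the search unit 144: events with identical event type
# information belong to the same category; the search destination is the
# local store when events are already collected, otherwise the network.
def same_category(selected, local_events, network_events, collected):
    pool = local_events if collected else network_events
    key = selected["event_type"]
    return [e for e in pool if e["event_type"] == key and e is not selected]

sel = {"event_type": "defect", "t": 3}
local = [sel, {"event_type": "defect", "t": 7}, {"event_type": "ok", "t": 8}]
hits = same_category(sel, local, [], collected=True)
```

The selected event itself is excluded from the hits, since the playback frame determination unit later handles it alongside the search results.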
The search unit 144 also monitors the content of jump playback events. When one or more pieces of supplementary information are selected from a single piece of event information 300 using the operation unit 13 in the list display area or the event detail area, and a jump playback event occurs while that selection is active, the search unit 144 uses the selected supplementary information as a search key and searches for event information 300 carrying the same supplementary information.
In other words, if no supplementary information is selected from a single piece of event information 300, the search unit 144 falls back to using the event type information as the search key for its search.
The playback frame determination unit 143 treats the event information 300 selected by the jump playback event and the event information 300 found by the search unit 144 equally. That is, it jump-plays, one after another, the times at which the selected and retrieved pieces of event information 300 occurred. Each jump playback is triggered by the end of the predetermined time range of the jump playback in progress: when one jump playback ends, the next follows immediately.
The jump playback order may begin with the occurrence time of the selected event information 300 and then follow the order in which the search hits were found, or it may follow the occurrence times of the retrieved event information 300 in ascending or descending order.
(Operation)
An operation example of such a playback device 1 is shown in FIG. 11, a flowchart of the jump playback operation. While the moving image data 200 is being played back (step S21), when the user selects event information 300 with the operation unit 13 and performs an operation constituting a jump playback event (step S22), the search unit 144 extracts the event type information contained in the selected event information 300 (step S23).
The search unit 144 searches for other event information 300 using the extracted event type information as a search key (step S24). When the other event information 300 has already been collected, the search destination is the playback device 1 itself; when it has not been collected or is still being collected, the search destination is the network 100.
The playback frame determination unit 143 extracts the occurrence time information contained in each piece of event information 300 identified by the selection and the search (step S25). Referring to the occurrence time information and the time stamps of the moving image data 200, the playback frame determination unit 143 then searches for the frame data captured at the same time as, or the time closest to, each piece of occurrence time information (step S26).
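The same-or-closest-time lookup of step S26 reduces to a nearest-timestamp search. A minimal sketch, assuming frames are (timestamp, frame identifier) pairs with illustrative values:

```python
# Sketch of step S26: for each occurrence time, pick the frame whose time
# stamp is identical to or closest to it. Frame times are illustrative.
def nearest_frame(frames, occurred_at):
    # frames: non-empty list of (timestamp, frame_id) pairs
    return min(frames, key=lambda f: abs(f[0] - occurred_at))

frames = [(10.0, "f1"), (10.5, "f2"), (11.0, "f3")]
```

An exact match simply yields a distance of zero, so no separate equality case is needed.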
The playback processing unit 11 then starts jump playback from the frame data that is Nth in the jump playback order among the frames found by the search (step S27). N is initialized to 1 at the start. The playback frame determination unit 143 acquires the end time of the current jump playback (step S28). The jump playback order may be ascending or descending by the occurrence times of the matching frame data, but it is desirable that playback start with the frame data at the same time as the event information 300 selected by the user.
When jump playback starts, the playback frame determination unit 143 compares the time stamp of the frame data to be displayed next with the end time of the jump playback (step S29). When the playback frame determination unit 143 determines that the time stamp has reached the end time of the jump playback (step S30, Yes), it increments the jump playback order N by 1 (step S31) and determines whether N now exceeds the sum of the number of selected items and the number of search hits (step S32).
If the jump playback number N does not exceed this sum (step S32, No), the playback processing unit 11 moves on to jump playback of the frame data that is Nth in the jump playback order (step S27). If N exceeds the sum of the number of selected items and the number of search hits (step S32, Yes), jump playback ends (step S33).
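The control flow of steps S27 through S33 can be sketched as a simple loop over N. This models only the loop of Fig. 11; actual playback is stubbed out, segment times are illustrative, and the occurrence times are assumed distinct.

```python
# Sketch of the loop S27-S33: jump playback proceeds in order N = 1, 2, ...
# until N exceeds (number selected) + (number of search hits).
def run_jump_playback(selected_times, hit_times, play_range):
    # Ascending occurrence order; times are assumed distinct here.
    order = sorted(set(selected_times) | set(hit_times))
    played = []
    n = 1  # step S27: N is initialized to 1
    while n <= len(selected_times) + len(hit_times):  # step S32 test
        start = order[n - 1]
        played.append((start, start + play_range))    # one jump segment
        n += 1                                        # step S31: count up
    return played

segments = run_jump_playback([10.0], [20.0, 30.0], play_range=3.0)
```

Each tuple stands for one jump-played time range; the next segment begins as soon as the previous one's end time is reached.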
FIG. 12 shows another operation example of such a playback device 1 as a flowchart of the jump playback operation. While the moving image data 200 is being played back (step S41), the user selects event information 300 with the operation unit 13 (step S42), further selects one or more pieces of supplementary information from the selected event information 300 (step S43), and performs an operation constituting a jump playback event (step S44). The search unit 144 then extracts the supplementary information selected by the user from the selected event information 300 (step S45) and searches for other event information 300 using the extracted supplementary information as a search key (step S46). When multiple pieces of supplementary information are selected, the search is an AND search.
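The AND search of step S46 means an event matches only when every selected supplement matches. A minimal sketch under assumed dictionary shapes:

```python
# Sketch of the AND search in step S46: when several supplements are
# selected, an event matches only if every selected key/value pair is
# present in it. Data shapes and values are illustrative.
def and_search(events, selected_supplements):
    return [e for e in events
            if all(e.get("supplements", {}).get(k) == v
                   for k, v in selected_supplements.items())]

evs = [{"id": 1, "supplements": {"worker_id": "W-012", "line": "A"}},
       {"id": 2, "supplements": {"worker_id": "W-012", "line": "B"}},
       {"id": 3, "supplements": {"worker_id": "W-099", "line": "A"}}]
hits = and_search(evs, {"worker_id": "W-012", "line": "A"})
```

With a single selected supplement the same function degenerates to an ordinary keyword match.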
The playback frame determination unit 143 extracts the occurrence time information contained in each piece of event information 300 identified by the selection and the search (step S47). Referring to the occurrence time information and the time stamps of the moving image data 200, it then searches for the frame data captured at the same time as, or the time closest to, each piece of occurrence time information (step S48).
The playback processing unit 11 then starts jump playback from the frame data that is Nth in the jump playback order among the frames found by the search (step S49). N is initialized to 1 at the start. The playback frame determination unit 143 acquires the end time of the current jump playback (step S50). The jump playback order may be ascending or descending by the occurrence times of the matching frame data, but it is desirable that playback start with the frame data at the same time as the event information 300 selected by the user.
When jump playback starts, the playback frame determination unit 143 compares the time stamp of the frame data to be displayed next with the end time of the jump playback (step S51). When the playback frame determination unit 143 determines that the time stamp has reached the end time of the jump playback (step S52, Yes), it increments the jump playback order N by 1 (step S53) and determines whether N now exceeds the sum of the number of selected items and the number of search hits (step S54).
If the jump playback number N is less than or equal to the sum of the number of selected items and the number of search hits (step S54, No), the playback processing unit 11 moves on to jump playback of the frame data that is Nth in the jump playback order (step S49). If N exceeds that sum (step S54, Yes), jump playback ends (step S55).
(Function)
A third application example of the playback device 1 will be described. In this application example, a monitoring camera is installed on a production assembly line. On this line, workers tighten products with a designated tool, and a dimension measuring instrument downstream of the tightening operation judges whether each product is good or defective. The monitoring camera is installed with its imaging optical axis directed at the tightening work area.
When the dimensions of a product are within the allowable range, the dimension measuring instrument generates event information 300 indicating a good product and sends it to the network 100. When the dimensions are outside the allowable range, the dimension measuring instrument generates event information 300 indicating a defective product and sends that to the network 100.
The display screen shows a list of the event information 300 generated one after another by the dimension measuring instrument. The moving image data 200 of the production assembly line captured by the monitoring camera is also played back in real time on the display screen. The moving image data 200 shows the worker performing the tightening operation at the tightening work area, down to the worker's hands.
When the dimension measuring instrument sends defective-product event information 300 to the network 100, that event information 300 is collected by the data collection unit 141 and displayed on the display screen. A supervisor monitoring the site with the playback device 1 watches the moving image data 200 in real time, and when defective-product event information 300 appears in the list, drags and drops it onto the playback area of the moving image data 200.
The playback device 1 searches for event information 300 carrying the same defective-product event type information as the dragged-and-dropped event information 300. Then, as shown in FIG. 13, it jump-plays, in order, the predetermined time ranges containing the occurrence time Tn-a-4 of the selected event information 300 and the occurrence times Tn-b-4 and Tn-c-3 of all the event information 300 found by the search. All of the jump-played footage had one point in common: a particular worker was holding a tool different from the designated tool. The supervisor therefore surmised that the dimensional defects arose from a lack of skill on the part of that worker, from the use of the different tool, or from both.
The supervisor then took event information 300 indicating a good product, which contained supplementary information identifying the particular worker implicated in the defects as the assembler, selected that worker-identifying supplementary information, and, with the selection active, dragged and dropped the event information 300 onto the playback area.
All of the resulting jump-played footage then had a different point in common: the particular worker was using the designated tool. The supervisor was therefore convinced that the dimensional defects had been caused by the use of a different tool, not by a lack of skill on the worker's part, and enforced strict use of the designated tool.
(Effect)
As described above, the playback device 1 further includes a second acquisition unit that acquires one or more pieces of other event information related to the event information 300 acquired by the first acquisition unit, and sequentially jumps to and plays back the time slots of the time-series data containing the times indicated by the time stamps identical or closest to the occurrence time information of the other event information 300 acquired by the second acquisition unit. In this embodiment, the search unit 144 corresponds to the second acquisition unit.
The playback device 1 thus also searches for event information 300 belonging, within a certain range, to the same category as the selected event information 300, and sequentially jump-plays the predetermined time ranges containing the same times as the same-category event information 300. Information that was difficult to pin down from the selected event information 300 alone therefore stands out clearly, dramatically increasing the user's opportunities for insight.
(Modification 4)
FIG. 14 is a block diagram showing the configuration of the playback device 1 according to this modification. As shown in FIG. 14, the playback device 1 includes a database 16. The database 16 may be provided in the computer that includes the display unit 12, or in a computer separate from that computer. The separate computer is, for example, a component of the information collection system and may be located on the network 100.
The database 16 is configured mainly around an external storage device such as an HDD and, as shown in FIG. 15, stores associations between pieces of event information 300 on the network 100. An association links, for example, the pieces of event information 300 generated in the respective manufacturing processes of the same product, or the pieces of event information 300 generated within a given scope such as the same facility or the same factory.
The search unit 144 refers to the database 16 and searches for event information 300 associated with the event information 300 selected by the jump playback event. The playback frame determination unit 143 treats the event information 300 selected by the jump playback event and the event information 300 that the search unit 144 obtained by referring to the database 16 equally, and sequentially jump-plays the times at which those pieces of event information 300 occurred.
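The lookup against the database 16 can be sketched as follows. Representing each association as a set of linked event identifiers is an assumption made for illustration; the specification does not fix the storage format beyond Fig. 15.

```python
# Sketch of the database 16 lookup: each association groups the event IDs
# that belong together (the same product's processes, the same facility,
# etc.). The set-of-IDs representation is an assumption.
def related_events(event_id, associations):
    related = set()
    for group in associations:       # each group is one stored association
        if event_id in group:
            related |= group - {event_id}
    return related

assoc = [{"e1", "e2", "e3"}, {"e3", "e9"}]
```

An event appearing in several associations, such as "e3" above, pulls in the partners from every group it belongs to.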
(Modification 5)
FIG. 16 is a schematic diagram showing the display screen of the playback device 1 according to Modification 5, during a jump playback event in real-time playback. The event information 300 selected by the user at the time of a jump playback event is not limited to a single piece; the user can also select several.
The playback frame determination unit 143 extracts the occurrence time information from all of the selected event information 300. The playback processing unit 11 then performs, in order, jump playback of the predetermined time range starting from the frame data carrying the time stamp identical or closest to each piece of occurrence time information. In this modification, the playback frame determination unit 143, serving as the first acquisition unit, acquires multiple pieces of event information 300 and the display unit 12 performs the sequential jump playback, so the search unit 144 is not essential.
(Third Embodiment)
(Configuration)
Next, the playback device 1 according to the third embodiment will be described in detail with reference to the drawings. Functions and configurations identical to those of the first and second embodiments and Modifications 1 to 5 are given the same reference numerals, and their detailed description is omitted.
FIG. 17 is a block diagram showing the configuration of the playback device 1. The playback device 1 further includes a synthesis unit 15, which includes a CPU. During jump playback, the synthesis unit 15 displays the event information 300 overlaid on the video of the moving image data 200.
That is, the synthesis unit 15 writes the content of the event information 300 selected by the jump playback event into the frame memory. The writing takes place during playback of the predetermined time range containing the time identical or closest to that of the event information 300. The synthesis unit 15 likewise writes into the frame memory the content of the event information 300 belonging, within a certain range, to the same category as the selected event information 300; this writing takes place during playback of the predetermined time range containing the time identical or closest to that of the same-category event information 300.
(Operation)
An operation example of this playback device 1 is shown in FIG. 18, a flowchart of the composite display operation. While the moving image data 200 is being played back (step S71), when the user selects event information 300 with the operation unit 13 and performs an operation constituting a jump playback event (step S72), the search unit 144 searches for event information 300 related to the selected event information 300 (step S73).
The playback processing unit 11 jump-plays the video in the same time slot as the event information 300 identified by the selection and the search (step S74). The synthesis unit 15 composites and displays, on the video being jump-played, the event information 300 that occurred in the same time slot as that video and was identified by the selection and the search (step S75).
(Function and Effect)
FIG. 19 is a schematic diagram showing the display screen of this playback device 1 during jump playback. As shown in FIG. 19, within the jump-played video, the content of the event information 300 identified by the selection and the search, whose occurrence time information falls within the jump-played time range, is overlaid at the right edge.
In particular, when multiple jump playbacks are performed in sequence, the composite display of the event information 300 makes it easy for the supervisor to recognize which event information 300 a given piece of footage corresponds to. Note that the synthesis unit 15 may display the event information 300 not in the playback area of the moving image data 200 but in another area, for example the detail display area.
(Fourth Embodiment)
(Configuration)
Next, the playback device 1 according to the fourth embodiment will be described in detail with reference to the drawings. Functions and configurations identical to those of the first to third embodiments and Modifications 1 to 5 are given the same reference numerals, and their detailed description is omitted.
A jump playback event serves as a trigger for providing, at the timing the user requests, moving images in the predetermined time range the user requests. Alternatively, moving images in a predetermined time range containing the occurrence time of event information 300 may be prepared in advance without any user request. FIG. 20 is a block diagram showing the configuration of this playback device 1. The playback device 1 further includes a playback candidate generation unit 17 and a moving image storage unit 18. The playback candidate generation unit 17 is configured mainly around a CPU, and the moving image storage unit 18 around an external storage device such as an HDD.
The playback candidate generation unit 17 trims, from the moving image data 200, moving image data 201 covering a predetermined time range containing the occurrence time of the event information 300, and stocks it in the moving image storage unit 18. After trimming the moving image data 201, the playback candidate generation unit 17 deletes the original moving image data 200 from the storage area of the playback device 1, the network 100, or the like.
Regardless of any selection by the user, whenever event information 300 occurs, the playback candidate generation unit 17 trims moving image data 201 for every piece of event information 300 that has occurred. Alternatively, moving image data 201 may be trimmed only for event information 300 carrying specific event type information. The predetermined time range trimmed by the playback candidate generation unit 17 is desirably set rather long, for example about 5 minutes when the jump playback range is 3 seconds as in the first embodiment. Because the original moving image data 200 is deleted, the range the user may wish to view should be given some margin.
The playback candidate generation unit 17 also generates association data 400 linking the event information 300 used to generate a piece of moving image data 201 with the moving image data 201 generated from it. The association data 400 is either data pairing the identification information of the moving image data 201 with that of the event information 300, or directory data indicating a folder storing both the moving image data 201 and the event information 300.
The screen generation unit 142 refers to the association data 400 and displays the event information 300 specified by the association data 400 in the list display area. When event information 300 is selected from the list display area, the playback frame determination unit 143 refers to the association data 400 and identifies the moving image data 201 associated with the selected event information 300. Taking over from the playback frame determination unit 143, the playback processing unit 11 updates the frame memory in order from the first frame data of the identified moving image data 201 so that the moving image data 201 is played back.
FIG. 21 is a flowchart showing how the playback device 1 generates the moving image data 201. The data collection unit 141 monitors the network 100, and when event information 300 occurs (step S81), generation of the moving image data 201 starts. Alternatively, the moving image data 201 may be generated periodically, or a generation button for the moving image data 201 may be provided on the display screen so that pressing it starts the generation.
 When generation of the moving image data 201 starts, for example upon occurrence of event information 300, the playback candidate generation unit 17 computes the start time of the moving image data 201 corresponding to that event information 300 (step S82). The playback candidate generation unit 17 stores in advance a time parameter defining how far back from the time of the event information 300 the moving image data 201 should begin. It subtracts this parameter from the occurrence time information of the event information 300 to derive the start time of the moving image data 201.
 The playback candidate generation unit 17 then computes the end time of the moving image data 201 (step S83). It stores in advance a time parameter from the time of the event information 300 to the end of the moving image data 201, and adds this parameter to the occurrence time information of the event information 300 to derive the end time of the moving image data 201.
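The start/end computation of steps S82 and S83 amounts to simple timestamp arithmetic with the two pre-stored time parameters. A minimal sketch, assuming hypothetical 30-second and 60-second parameter values (the patent does not specify concrete values):

```python
from datetime import datetime, timedelta

# Pre-stored time parameters (hypothetical values): how far the clip
# extends before and after the event's occurrence time.
PRE_EVENT = timedelta(seconds=30)
POST_EVENT = timedelta(seconds=60)

def clip_range(event_time: datetime):
    start = event_time - PRE_EVENT   # step S82: subtract to derive the start time
    end = event_time + POST_EVENT    # step S83: add to derive the end time
    return start, end

start, end = clip_range(datetime(2017, 3, 6, 12, 0, 0))
# start is 11:59:30 and end is 12:01:00 on the same day
```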
 Having computed the start and end times of the moving image data 201, the playback candidate generation unit 17 copies from the moving image data 200 all frame data from the starting frame to the ending frame of the moving image data 201 (step S84) and stores them in the moving image storage unit 18 (step S85).
 The playback candidate generation unit 17 also associates the identification information of the event information 300 used in computing the start and end times with the identification information of the moving image data 201, generating the association data 400 (step S86), which it stores in the moving image storage unit 18 (step S87). When generation of the moving image data 201 and the association data 400 is complete (step S88), the playback candidate generation unit 17 deletes the moving image data 200 from the playback device 1 or the network 100 (step S89).
 FIG. 22 is a flowchart showing how the playback device 1 plays back the moving image data 201. The screen generation unit 142 refers to the association data 400 in the moving image storage unit 18 (step S91) and displays the event information 300 identified by the association data 400 in the list display area (step S92).
 When the user selects a piece of event information 300 via the operation unit 13 (step S93), the playback frame determination unit 143 refers to the association data 400 in the moving image storage unit 18 (step S94) and identifies the moving image data 201 associated with the selected event information 300 (step S95).
 The playback processing unit 11 reads the moving image data 201 identified by the playback frame determination unit 143 from the moving image storage unit 18 (step S96), and the display unit 12 displays the moving image data 201 that is associated with the selected event information 300 and that includes the occurrence time of that event information (step S97).
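The playback flow of steps S93 to S96 is essentially a lookup through the association data: the selected event's identifier is resolved to its associated video. A hypothetical sketch, with the association data represented as a mapping (identifiers are illustrative):

```python
# Hypothetical association data 400 as a mapping from event identifiers
# (event information 300) to the video identifier (moving image data 201).
association_data = {
    "event_0042": "clip_0001",
    "event_0043": "clip_0001",
}

def video_for_event(event_id: str) -> str:
    # Steps S94-S95: resolve the selected event to its associated video.
    return association_data[event_id]

selected = video_for_event("event_0042")  # step S93: user selects an event
# step S96 would then read the clip identified by `selected` from storage
```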
 Thus, in this playback device 1, when the first acquisition unit, exemplified by the playback candidate generation unit 17, acquires event information 300, the playback candidate generation unit 17 extracts from the moving image data 200 the moving image data 201 covering a predetermined time range that includes the time indicated by the time stamp identical or closest to the occurrence time information of the acquired event information 300, and stores the extracted moving image data 201 in the moving image storage unit 18. The display unit 12 then displays the moving image data 201 stored in the moving image storage unit 18 in response to a user operation.
 As a result, only the portion of the moving image data 200 that constitutes valuable information contributing to operations such as operation management, product quality management, and business management is kept as the moving image data 201, which reduces the amount of data.
 The playback candidate generation unit 17 further generates association data 400 linking the extracted moving image data 201 with the acquired event information 300, and the moving image storage unit 18 additionally stores this association data 400. The display unit 12 then displays the moving image data 201 associated, via the association data 400, with the event information 300 selected by the user through the operation unit 13.
 In this way, the association data 400 integrates the event information 300 and the moving image data 201, linking them organically so that the value of the information increases synergistically.
 (Modification 6)
 FIG. 23 is a schematic diagram of moving image data 201 generated by the playback device 1 of Modification 6 of the fourth embodiment. The playback device 1 may create one piece of moving image data 201 per piece of event information 300, or, as in the playback device 1 of Modification 6, generate a single piece of moving image data 201 for a plurality of pieces of event information 300.
 That is, as shown in FIG. 23, the playback candidate generation unit 17 trims a single piece of moving image data 201 over a time range that includes all the occurrence times of the plurality of pieces of event information 300. In the association data 400, the identification information of all these pieces of event information 300 is associated with the identification information of the single piece of moving image data 201.
 The plurality of pieces of event information 300 may be any of the following: a group selected by the user, the group that had occurred by the generation timing of the moving image data 201, or a group combining the event information 300 selected by the user with event information 300 belonging to the same category within a certain range.
 The playback candidate generation unit 17 stores in advance a time parameter defining how far back from the earliest occurrence time among these pieces of event information 300 the moving image data 201 should begin. It subtracts this parameter from that earliest occurrence time information to derive the start time of the moving image data 201.
 The playback candidate generation unit 17 also stores in advance a time parameter from the latest occurrence time among these pieces of event information 300 to the end of the moving image data 201. It adds this parameter to the latest occurrence time information to derive the end time of the moving image data 201.
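For Modification 6, the same timestamp arithmetic is applied to the earliest and latest of the collected occurrence times so that the single clip covers every event. A minimal sketch, again with hypothetical time-parameter values:

```python
from datetime import datetime, timedelta

PRE_EVENT = timedelta(seconds=30)   # hypothetical pre-roll before the earliest event
POST_EVENT = timedelta(seconds=60)  # hypothetical post-roll after the latest event

def clip_range_multi(event_times):
    # The clip must cover every event: the start is derived from the
    # earliest occurrence time and the end from the latest.
    start = min(event_times) - PRE_EVENT
    end = max(event_times) + POST_EVENT
    return start, end

times = [datetime(2017, 3, 6, 12, 0, 0), datetime(2017, 3, 6, 12, 5, 0)]
start, end = clip_range_multi(times)
# start is 11:59:30 and end is 12:06:00
```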
 FIG. 24 is a flowchart showing an example of how this playback device 1 generates the moving image data 201. When the periodic generation timing of the moving image data 201 arrives (step S101), the data collection unit 141 collects the moving image data 200 and the event information 300 present on the network 100 (step S102).
 Besides acquiring the plurality of pieces of event information 300 that existed at the periodic generation timing of the moving image data 201, other methods are possible: for example, the plurality of pieces of event information 300 may be selected from the list display area; other event information 300 having the supplementary information selected by the user may be acquired; or, by referring to the database 16, a plurality of pieces of event information 300 related on the database 16 to the event information 300 selected by the user may be acquired.
 When a plurality of pieces of event information 300 have been collected, the playback frame determination unit 143 acquires them (step S103) and extracts the occurrence time information contained in each (step S104). The playback candidate generation unit 17 then searches the occurrence time information for the earliest occurrence time (step S105) and for the latest occurrence time (step S106).
 The playback candidate generation unit 17 then computes the start time of the moving image data 201 from the earliest occurrence time information (step S107), and further computes the end time of the moving image data 201 from the latest occurrence time information (step S108).
 Having computed the start and end times of the moving image data 201, the playback candidate generation unit 17 copies all frame data from the starting frame to the ending frame of the moving image data 201 (step S109) and stores them in the moving image storage unit 18 (step S110).
 The playback candidate generation unit 17 also associates all the identification information of the plurality of pieces of event information 300 with the identification information of the moving image data 201, generating the association data 400 (step S111), which it stores in the moving image storage unit 18 (step S112). When generation of the moving image data 201 and the association data 400 is complete (step S113), the playback candidate generation unit 17 deletes the moving image data 200 from the playback device 1 or the network 100 (step S114).
 Thus, in this playback device 1, the first acquisition unit, exemplified by the playback frame determination unit 143, acquires a plurality of pieces of event information 300, and the playback candidate generation unit 17 extracts from the moving image data 200 the moving image data 201 that includes all the occurrence times of these pieces of event information 300 and stores the extracted moving image data 201 in the moving image storage unit 18.
 Alternatively, when the first acquisition unit, exemplified by the playback frame determination unit 143, and the second acquisition unit, exemplified by the search unit 144, acquire event information 300, the playback candidate generation unit 17 extracts from the moving image data 200 the moving image data 201 that includes all the occurrence times of these pieces of event information 300 and stores the extracted moving image data 201 in the moving image storage unit 18.
 This contributes to reducing the amount of data, and each piece of event information 300 and the single piece of moving image data 201 are organically linked so that the value of the information increases synergistically.
1 playback device
11 playback processing unit
12 display unit
13 operation unit
14 playback control unit
141 data collection unit
142 screen generation unit
143 playback frame determination unit
144 search unit
15 synthesis unit
16 database
17 playback candidate generation unit
18 moving image storage unit
100 network
200 moving image data
201 moving image data
300 event information
400 association data

Claims (16)

  1.  A playback device for playing back time-series data having time stamps, the playback device comprising:
     a first acquisition unit that acquires event information having occurrence time information; and
     a display unit that plays back, out of the time-series data, a predetermined time range including the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information acquired by the first acquisition unit.
  2.  The playback device according to claim 1, further comprising a second acquisition unit that acquires one or more pieces of other event information related to the event information acquired by the first acquisition unit,
     wherein the display unit sequentially jumps to and plays back, within the time-series data, predetermined time ranges each including a time indicated by a time stamp that is the same as or closest to the occurrence time information of the event information acquired by the second acquisition unit.
  3.  The playback device according to claim 1 or 2, wherein, during playback of the time-series data, the display unit jump-plays a predetermined time range including the time at which the event information occurred, and returns to playback of the latest time when the jump playback ends.
  4.  The playback device according to claim 1 or 2, wherein, during playback of the time-series data, the display unit jump-plays a predetermined time range including the time at which the event information occurred, and, when the jump playback ends, returns to the time that was being played back immediately before the jump playback, or to a point just before or after that time.
  5.  The playback device according to any one of claims 1 to 4, wherein the display unit superimposes the acquired event information on the image of the time-series data.
  6.  The playback device according to claim 2, wherein the event information includes supplementary information of various kinds and contents, and
     the second acquisition unit acquires event information having the same supplementary information as one or more pieces of the supplementary information included in the event information acquired by the first acquisition unit.
  7.  The playback device according to claim 6, wherein the supplementary information includes event type information indicating the type of the event.
  8.  The playback device according to claim 2, further comprising a database that associates a plurality of pieces of the event information with one another,
     wherein the second acquisition unit acquires the event information associated in the database with the event information acquired by the first acquisition unit.
  9.  The playback device according to any one of claims 1 to 8, wherein the time-series data is moving image data, flow line trajectory data, or audio data.
  10.  The playback device according to any one of claims 1 to 9, wherein the predetermined time range begins at the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information.
  11.  The playback device according to any one of claims 1 to 9, wherein the predetermined time range begins at a time earlier than, and ends at a time later than, the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information.
  12.  The playback device according to any one of claims 1 to 9, wherein the predetermined time range includes only the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information, and
     the display unit displays the time-series data as a still image.
  13.  The playback device according to claim 1, further comprising:
     a generation unit that extracts, from the time-series data, time-series data of a predetermined time range including the time indicated by the time stamp that is the same as or closest to the occurrence time information of the event information acquired by the first acquisition unit; and
     a storage unit that stores the time-series data extracted by the generation unit,
     wherein the display unit displays the time-series data stored in the storage unit.
  14.  The playback device according to claim 13, wherein the first acquisition unit acquires a plurality of pieces of the event information, and
     the generation unit extracts, from the time-series data, time-series data of a continuous predetermined time range including all the times indicated by the time stamps that are the same as or closest to the occurrence time information of the event information acquired by the first acquisition unit.
  15.  The playback device according to claim 2, wherein the generation unit extracts, from the time-series data, time-series data of a continuous predetermined time range including all the times indicated by the time stamps that are the same as or closest to the occurrence time information of the event information acquired by the first acquisition unit and the second acquisition unit.
  16.  The playback device according to any one of claims 13 to 15, further comprising an operation unit that receives a user operation,
     wherein the generation unit further generates association data associating the extracted time-series data with the event information acquired by the first acquisition unit,
     the storage unit further stores the association data, and
     the display unit displays the time-series data associated, via the association data, with the event information selected by the user using the operation unit.
PCT/JP2017/008816 2016-03-10 2017-03-06 Reproducing device WO2017154845A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2016047249 2016-03-10
JP2016-047249 2016-03-10
JP2016-078322 2016-04-08
JP2016078322A JP2017169181A (en) 2016-03-10 2016-04-08 Reproduction device

Publications (1)

Publication Number Publication Date
WO2017154845A1 2017-09-14

Family

ID=59789557

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/008816 WO2017154845A1 (en) 2016-03-10 2017-03-06 Reproducing device

Country Status (1)

Country Link
WO (1) WO2017154845A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000224542A (en) * 1999-01-29 2000-08-11 Hitachi Ltd Image storage device, monitor system and storage medium
JP2007266685A (en) * 2006-03-27 2007-10-11 Toshiba Corp Image recording system and program
JP2008178009A (en) * 2007-01-22 2008-07-31 Seiko Precision Inc Apparatus, method and program for storing/retrieving video image
JP2010075279A (en) * 2008-09-24 2010-04-08 Glory Ltd Game parlor management system and game parlor management method
JP2010233680A (en) * 2009-03-30 2010-10-21 Glory Ltd Game parlor management system and game parlor management method
JP2010258704A (en) * 2009-04-23 2010-11-11 Canon Inc Reproducing unit and reproduction method



Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17763185

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17763185

Country of ref document: EP

Kind code of ref document: A1