WO2013001702A1 - Information processing device - Google Patents

Information processing device (Dispositif de traitement d'informations)

Info

Publication number
WO2013001702A1
Authority
WO
WIPO (PCT)
Prior art keywords
analysis
analysis result
data
analysis engine
predetermined
Prior art date
Application number
PCT/JP2012/003015
Other languages
English (en)
Japanese (ja)
Inventor
淑子 松川
亀井 真一郎
清孝 今野
Original Assignee
日本電気株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電気株式会社
Publication of WO2013001702A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00 Burglar, theft or intruder alarms
    • G08B 13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B 13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B 13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B 13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B 13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B 13/19613 Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
    • G08B 13/19615 Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion wherein said pattern is defined by the user

Definitions

  • The present invention relates to an information processing apparatus, and more particularly to an information processing apparatus that can use an analysis result from an analysis engine.
  • An analysis engine performs, for example, an image recognition process for detecting the presence of a person or a pre-registered person in an input moving image, an image recognition process for analyzing the movement trajectory of a person, or a voice recognition process for detecting the voice of a pre-registered person in input audio, and the analysis results are used for various purposes.
  • Patent Document 1 (Japanese Patent Application Laid-Open No. 2004-228561) discloses a technique in which, when the input image of a camera installed in an elevator is analyzed by an analysis engine and a suspicious person is detected, the recording density of the camera input image is increased above the usual level. The input image from the camera is then stored in an image recording apparatus.
  • However, the technique described above does not specify how the camera input images stored in the image recording apparatus are subsequently handled. If the camera input images are simply kept in the image recording apparatus, the required capacity of the recording apparatus grows. On the other hand, if the camera input images are deleted, they can no longer be used even when they are needed for later analysis processing or confirmation.
  • In view of the above problem, an object of the present invention is to use the storage device efficiently while suppressing the influence on the analysis processing by the analysis engine.
  • In order to achieve this object, an information processing apparatus according to one aspect of the present invention includes: analysis result acquisition means for acquiring an analysis result from a predetermined analysis engine; and storage data control means for controlling, based on the analysis result acquired by the analysis result acquisition means, the storage state of storage data stored in a predetermined storage device, the storage data being the processing target data input to the analysis engine and/or the processing result data output from the analysis engine.
  • A program according to another aspect of the present invention causes an information processing apparatus to realize: analysis result acquisition means for acquiring an analysis result from a predetermined analysis engine; and storage data control means for controlling, based on the acquired analysis result, the storage state of storage data stored in a predetermined storage device, the storage data being the processing target data input to the analysis engine and/or the processing result data output from the analysis engine.
  • A storage data control method according to a further aspect of the present invention includes: the information processing apparatus acquiring an analysis result from a predetermined analysis engine; and controlling, based on the acquired analysis result, the storage state of storage data stored in a predetermined storage device, the storage data being the processing target data input to the analysis engine and/or the processing result data output from the analysis engine.
  • With the configuration described above, the present invention can use the storage device efficiently while suppressing the influence on the analysis processing by the analysis engine.
  • FIG. 1 is a block diagram showing the configuration of the information processing system of the present invention. FIGS. 2 and 3 are flowcharts showing the operation of the control device. FIGS. 4 to 14 are diagrams illustrating examples of control rules stored in the control device.
  • The information processing system is a system that includes a plurality of analysis engines and integrates the results analyzed by the respective analysis engines. As shown in FIG. 1, the information processing system includes a first analysis engine 10, a second analysis engine 20, a control device 30, a raw data acquisition device 40, a raw data storage device 50, and an analysis result storage device 60.
  • the information processing system in the present invention is not necessarily limited to a system that integrates analysis results of a plurality of analysis engines.
  • the control device 30 may be configured to use the analysis result of at least one analysis engine.
  • In FIG. 1, two analysis engines 10 and 20 are illustrated, but the information processing system of the present invention may include only one analysis engine, or three or more analysis engines.
  • each device will be described in detail.
  • the first analysis engine 10 and the second analysis engine 20 are analysis engines that perform analysis processing on raw data that is analysis target data to be analyzed.
  • Each of the analysis engines 10 and 20 is, for example, an image recognition engine that detects a person from moving image data, an action tracking engine that tracks the movement trajectory of a person from moving image data, a voice recognition engine that recognizes utterance contents from audio data, or the like.
  • the analysis engines 10 and 20 are not limited to those described above, and the number is not limited to two.
  • the first analysis engine 10 and the second analysis engine 20 may be analysis engines that perform different analysis processes, or may be analysis engines that perform the same analysis process.
  • the first analysis engine 10 includes a first analysis unit 11, a first buffer interface unit 12, and a first buffer 13.
  • The first analysis unit 11 acquires raw data, which is the analysis target data, from a raw data DB (database) 52 of the raw data storage device 50, performs the analysis processing set for each analysis engine on the raw data, and outputs the analysis result data to the first buffer interface unit 12. Note that the raw data that is the analysis target data is first acquired by the first raw data acquisition unit 41 provided in the raw data acquisition device 40, and is accumulated in the raw data DB 52 from the raw data acquisition device 40 via the raw data DB interface unit 51 of the raw data storage device 50.
  • The first buffer interface unit 12 outputs the analysis result data received from the first analysis unit 11 to the first buffer 13, temporarily accumulates the analysis result data in the first buffer 13, and then outputs the data to the analysis result storage device 60.
  • The analysis result storage device 60 accumulates the analysis result data output from the first analysis engine 10 in the analysis result DB (database) 62 via the analysis result DB interface unit 61.
  • the first buffer interface unit 12 also outputs the analysis result data received from the first analysis unit 11 to the control device 30.
  • the second analysis engine 20 may be different in content of the analysis process from the first analysis engine 10 depending on the type thereof, but basically has the same configuration as the first analysis engine 10. That is, the second analysis engine 20 includes a second analysis unit 21, a second buffer interface unit 22, and a second buffer 23 as shown in FIG.
  • The second analysis unit 21 acquires raw data, which is the analysis target data, from the raw data DB (database) 52 of the raw data storage device 50, performs the analysis processing set for each analysis engine on the raw data, and outputs the analysis result data to the second buffer interface unit 22.
  • the raw data that is the analysis target data is first acquired by the second raw data acquisition unit 42 provided in the raw data acquisition device 40.
  • the raw data is accumulated in the raw data DB 52 from the raw data acquisition device 40 via the raw data DB interface unit 51 of the raw data storage device 50.
  • When a control command is received from the control device 30 described later, the second analysis unit 21 executes the analysis process according to the control command; for example, it performs analysis with higher accuracy than the normal analysis processing. Specific examples will be described later.
  • The second buffer interface unit 22 outputs the analysis result data received from the second analysis unit 21 to the second buffer 23, temporarily accumulates the analysis result data in the second buffer 23, and then outputs the data to the analysis result storage device 60.
  • The analysis result storage device 60 accumulates the analysis result data output from the second analysis engine 20 in the analysis result DB (database) 62 via the analysis result DB interface unit 61.
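  • As a rough, non-authoritative illustration of the data flow just described (raw data acquisition device 40 → raw data DB 52 → analysis engine → buffer → analysis result DB 62), the following minimal Python sketch models the components as plain classes. All class, method, and variable names are hypothetical and introduced only for this sketch; they do not appear in the publication.

```python
# Minimal sketch of the FIG. 1 data flow; all names are illustrative assumptions.

class RawDataDB:
    """Stands in for raw data DB 52: stores raw data records by id."""
    def __init__(self):
        self.records = {}

    def store(self, record_id, raw_data):
        self.records[record_id] = raw_data

    def fetch(self, record_id):
        return self.records[record_id]


class AnalysisResultDB:
    """Stands in for analysis result DB 62."""
    def __init__(self):
        self.results = {}

    def store(self, record_id, result):
        self.results[record_id] = result


class AnalysisEngine:
    """Stands in for analysis engines 10/20: analyzes, buffers, then flushes results."""
    def __init__(self, analyze_fn, result_db, controller=None):
        self.analyze_fn = analyze_fn      # the engine-specific analysis processing
        self.buffer = []                  # stands in for buffers 13/23
        self.result_db = result_db
        self.controller = controller      # stands in for control device 30 (optional)

    def process(self, record_id, raw_db):
        result = self.analyze_fn(raw_db.fetch(record_id))
        self.buffer.append((record_id, result))
        if self.controller is not None:
            self.controller.on_result(record_id, result)   # analysis result acquisition
        self.flush()

    def flush(self):
        for record_id, result in self.buffer:
            self.result_db.store(record_id, result)
        self.buffer.clear()


# Example wiring (illustrative only):
raw_db, result_db = RawDataDB(), AnalysisResultDB()
engine = AnalysisEngine(lambda data: {"person_detected": "person" in data}, result_db)
raw_db.store("frame-001", "image bytes with a person")
engine.process("frame-001", raw_db)
assert result_db.results["frame-001"]["person_detected"] is True
```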
  • The control device 30 is a server computer that includes an arithmetic device and a storage device. As shown in FIG. 1, the control device 30 includes an analysis result acquisition unit 31, a control instruction unit 32, a raw data acquisition control unit 33, a raw data storage control unit 34, and an analysis result storage control unit 35, which are constructed by a program incorporated in the arithmetic device, and holds a control rule 36 in the storage device.
  • the program is stored in a storage device or recorded on a computer-readable recording medium.
  • the recording medium is a portable medium such as a flexible disk, an optical disk, a magneto-optical disk, and a semiconductor memory.
  • the analysis result acquisition unit 31 acquires the analysis result data output from the first analysis engine 10 (predetermined analysis engine) as described above, and passes it to the control instruction unit 32.
  • The control instruction unit 32 controls the analysis processing operation of the second analysis engine 20 (another analysis engine) using the control content set in the control rule 36 corresponding to the analysis result data from the first analysis engine 10 received from the analysis result acquisition unit 31.
  • For example, the control instruction unit 32 refers to the control rule 36 and, according to the value of the analysis result data from the first analysis engine 10, controls whether the analysis process is executed by the second analysis engine 20 and the execution timing of that analysis process.
  • The control instruction unit 32 also refers to the control rule 36 and controls the analysis processing operation so as to change the analysis accuracy of the analysis processing by the second analysis engine 20 according to the value of the analysis result data from the first analysis engine 10.
  • Furthermore, the control instruction unit 32 refers to the control rule 36, designates a predetermined portion of the raw data to be analyzed by the second analysis engine 20 according to the value of the analysis result data from the first analysis engine 10, and controls the analysis processing operation so that the designated portion is analyzed.
  • When the analysis result data from the first analysis engine 10 includes a degree indicating the certainty of the analysis processing result, the control instruction unit 32 controls the operation of the second analysis engine 20 as described above with the control content set in the control rule 36 corresponding to that degree.
  • The specific control instruction operation by the control instruction unit 32 will be described in detail, together with specific examples of the data stored as the control rule 36, when the operation is described later.
  • When the information processing system includes a plurality of analysis engines, the control instruction unit 32 may select an analysis engine from among them and control the analysis processing operation of the selected analysis engine in the same manner as described above, for example by controlling whether or not the analysis process is executed. At this time, the control instruction unit 32 may also control the operation of the first analysis engine 10 itself, from which the analysis result was received.
  • The analysis result acquisition unit 31 of the control device 30 may acquire analysis result data from one or more other analysis engines in addition to the first analysis engine 10; that is, the control device 30 may acquire analysis result data from each of a plurality of analysis engines.
  • In this case, the control instruction unit 32 may control the operation of the other analysis engines as described above with the control content set in the control rule 36 corresponding to the combination of the analysis result data. At this time, the control instruction unit 32 may also control the operation of the first analysis engine 10 itself, from which the analysis result was received.
  • In addition, the control instruction unit 32 issues control commands through the raw data acquisition control unit 33, the raw data storage control unit 34, and the analysis result storage control unit 35 (storage data control means), according to the control content set in the control rule 36 corresponding to the analysis result data acquired from the analysis result acquisition unit 31, so as to control the storage data, such as the raw data and the analysis result data, stored in the raw data storage device 50 and the analysis result storage device 60.
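  • To make the role of the control rule 36 concrete, the following sketch (an illustrative assumption, not the publication's implementation) models it as a lookup from an analysis result value to a control content, which the control instruction unit then dispatches either to an analysis engine or to the storage side. The particular result-to-command pairings and the Controllable helper are invented for illustration.

```python
# Hypothetical sketch of control rule 36 as a lookup table; all names and the particular
# result-to-command pairings are illustrative assumptions, not taken from the publication.

class Controllable:
    """Trivial stand-in for an analysis engine or a storage device that accepts commands."""
    def __init__(self, name):
        self.name, self.last_command = name, None
    def apply(self, command):
        self.last_command = command

CONTROL_RULE = {
    # analysis result value -> (target, control content)
    # which result maps to which interval is an assumption made only for this sketch
    "female":             ("engine",  {"tracking_interval_s": 60}),   # "1 minute (normal)"
    "male":               ("engine",  {"tracking_interval_s": 30}),
    "no person detected": ("storage", {"action": "delete_all"}),
}

def issue_control(analysis_result, engine, storage):
    """Stands in for control instruction unit 32: look up and dispatch the control content."""
    rule = CONTROL_RULE.get(analysis_result)
    if rule is None:
        return                       # no matching rule: operation proceeds as usual
    target, command = rule
    (engine if target == "engine" else storage).apply(command)

engine, storage = Controllable("second analysis engine 20"), Controllable("storage devices 50/60")
issue_control("male", engine, storage)
assert engine.last_command == {"tracking_interval_s": 30}
```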
  • the raw data acquisition control unit 33 controls the operation of the raw data acquisition device 40 so as to stop the acquisition of the selected raw data, for example, in response to a control command from the control instruction unit 32.
  • The raw data storage control unit 34 and the analysis result storage control unit 35, in accordance with a control command from the control instruction unit 32, control the operation of the raw data storage device 50 and the analysis result storage device 60 so as to, for example, select and delete raw data stored in the raw data DB 52 and analysis result data stored in the analysis result DB 62.
  • The raw data storage control unit 34 and the analysis result storage control unit 35 may also process and re-store the raw data stored in the raw data DB 52 and the analysis result data stored in the analysis result DB 62. For example, compression, or trimming of a part of the image data, may be performed so that the data volume of the raw data or the analysis result data is reduced.
  • the raw data storage control unit 34 and the analysis result storage control unit 35 may add additional data to the raw data and analysis result data and re-store them.
  • Further, in accordance with a control command from the control instruction unit 32, the raw data storage control unit 34 and the analysis result storage control unit 35 may set the retention period of the raw data stored in the raw data DB 52 and of the analysis result data stored in the analysis result DB 62 to a predetermined period.
  • The raw data storage control unit 34 and the analysis result storage control unit 35 may also, in accordance with a control command from the control instruction unit 32, control the raw data stored in the raw data DB 52 and the analysis result data stored in the analysis result DB 62 so that they are stored in another storage device different from the one in which they are currently stored.
  • When the analysis result data from the first analysis engine 10 includes a degree indicating the certainty of the analysis processing result, the raw data storage control unit 34 and the analysis result storage control unit 35 may control the storage state of the raw data stored in the raw data DB 52 and of the analysis result data stored in the analysis result DB 62 in the same manner as described above, with the control content set in the control rule 36 corresponding to that degree.
  • As described above, the analysis result acquisition unit 31 of the control device 30 may acquire analysis result data from one or more other analysis engines in addition to the first analysis engine 10; that is, the control device 30 may acquire analysis result data from each of a plurality of analysis engines. In this case, the control instruction unit 32 controls the storage state of the raw data stored in the raw data DB 52 and of the analysis result data stored in the analysis result DB 62 as described above, with the control content set in the control rule 36 corresponding to the combination of the analysis result data.
  • Note that the control device 30 may execute both the operation for controlling the analysis engine shown in the flowchart of FIG. 2 and the operation for controlling the stored data shown in the flowchart of FIG. 3, or may execute only one of them.
  • First, an example of the operation for controlling the analysis engine (the flowchart of FIG. 2) will be described. In this example, it is assumed that the first analysis engine 10 is an image recognition engine, the second analysis engine 20 is an action tracking engine, and a control rule 36 having the contents shown in FIG. 4 is stored.
  • The first analysis engine 10 performs analysis processing on image data, which is the analysis target data (raw data), and recognizes whether a person present in the image data is “female” or “male”. The control device 30 then acquires the analysis result data (step S1).
  • In step S2, the control device 30 refers to the control rule shown in FIG. 4 and selects the control content corresponding to the recognition result: for one of the recognized attributes the corresponding control content “tracking interval 1 minute (normal)” is selected, and for the other the corresponding control content “tracking interval 30 seconds” is selected (step S2).
  • the control device 30 controls the analysis processing by the second analysis engine 20 in accordance with the selected control content (step S3).
  • When the control content “tracking interval 1 minute (normal)” is selected, operation proceeds as usual and the second analysis engine 20 is not specifically controlled; it performs the tracking process on the moving image data that is the analysis target data (raw data) at the normal time interval.
  • When the control content “tracking interval 30 seconds” is selected, a control command is issued to the second analysis engine 20 to perform the tracking process at 30-second intervals. In accordance with this control command, the second analysis engine 20 executes the tracking process on the moving image data that is the analysis target data (raw data) at 30-second intervals, which is shorter than usual.
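  • A minimal sketch of how the second analysis engine might honor such a tracking-interval command follows, assuming frames arrive with timestamps in seconds; the ActionTracker class and its methods are illustrative assumptions, not part of the publication.

```python
# Illustrative sketch: an action-tracking step that runs only at the commanded interval.

class ActionTracker:
    def __init__(self, interval_s=60):          # 60 s corresponds to "1 minute (normal)"
        self.interval_s = interval_s
        self._last_run = None
        self.trajectory = []

    def set_interval(self, interval_s):
        """Applied when the control device issues e.g. 'tracking interval 30 seconds'."""
        self.interval_s = interval_s

    def on_frame(self, timestamp_s, position):
        """Record the tracked position only when the tracking interval has elapsed."""
        if self._last_run is None or timestamp_s - self._last_run >= self.interval_s:
            self.trajectory.append((timestamp_s, position))
            self._last_run = timestamp_s

tracker = ActionTracker()
tracker.set_interval(30)                         # control command: shorter-than-usual interval
for t in range(0, 181, 10):                      # frames every 10 s for 3 minutes
    tracker.on_frame(t, position=(t, 0))
print(tracker.trajectory)                        # positions sampled at 30-second spacing
```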
  • In the next operation example, it is assumed that the first analysis engine 10 is an image recognition engine, the second analysis engine 20 is a speech recognition engine, and a control rule 36 having the contents shown in FIG. 5 is stored.
  • The first analysis engine 10 performs analysis processing on image data, which is the analysis target data (raw data), and recognizes whether or not a pre-registered “suspicious person” is present in the image data. The control device 30 then acquires the analysis result data (step S1).
  • In step S2, when the analysis result from the first analysis engine 10 is “not a suspicious person”, the control device 30 refers to the control rule shown in FIG. 5 and selects the corresponding control content “normal level recognition accuracy”. When the analysis result from the first analysis engine 10 is “suspicious person”, the control content “recognition accuracy one level higher than normal” corresponding to the control rule shown in FIG. 5 is selected (step S2).
  • The control device 30 then controls the analysis processing by the second analysis engine 20 in accordance with the selected control content (step S3).
  • When the control content “normal level recognition accuracy” is selected, operation proceeds as usual and the second analysis engine 20 is not specifically controlled; it performs the speech recognition process with normal accuracy on the audio data that is the analysis target data (raw data).
  • When the control content “recognition accuracy one level higher than normal” is selected, a control command is issued to the second analysis engine 20 to perform the speech recognition process with recognition accuracy one level higher than normal. In accordance with this control command, the second analysis engine 20 executes the speech recognition process on the audio data that is the analysis target data (raw data) with the recognition accuracy raised by one level.
  • For example, the higher recognition accuracy is achieved by shortening the sampling period of the audio data when performing the voice recognition analysis.
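  • The following small sketch shows one way a commanded accuracy level could be mapped to concrete speech-analysis parameters such as the sampling rate; the level names echo the example above, while the parameter values and the beam_width field are invented purely for illustration.

```python
# Hypothetical mapping from a commanded accuracy level to speech-analysis parameters.

ACCURACY_PROFILES = {
    "normal":       {"sample_rate_hz": 8000,  "beam_width": 8},
    "one_level_up": {"sample_rate_hz": 16000, "beam_width": 16},   # shorter sampling period
}

def configure_recognizer(level):
    """Return the analysis parameters for the commanded recognition accuracy."""
    try:
        return ACCURACY_PROFILES[level]
    except KeyError:
        raise ValueError(f"unknown accuracy level: {level}")

params = configure_recognizer("one_level_up")
assert params["sample_rate_hz"] > ACCURACY_PROFILES["normal"]["sample_rate_hz"]
```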
  • In the next operation example, it is assumed that the first analysis engine 10 is an image recognition engine, the second analysis engine 20 is an action tracking engine, and a control rule 36 having the contents shown in FIG. 6 is stored.
  • The first analysis engine 10 performs analysis processing on image data, which is the analysis target data (raw data), recognizes whether or not a pre-registered “suspicious person” is present in the image data, and calculates the “probability of being a suspicious person”. The control device 30 then acquires the analysis result data (step S1).
  • In step S2, the control device 30 refers to the control rule shown in FIG. 6 and selects the control content according to the acquired “probability of being a suspicious person”: in the normal case, the corresponding control content “tracking interval 1 minute (normal)” is selected; when the analysis result from the first analysis engine 10 is “a suspicious person with a probability of 70%”, the corresponding control content “tracking interval 30 seconds” is selected; and when the analysis result is “a suspicious person with a probability of 100%”, the control content corresponding to that result is selected (step S2).
  • The control device 30 then controls the analysis processing by the second analysis engine 20 in accordance with the selected control content (step S3).
  • When the control content “tracking interval 1 minute (normal)” is selected, operation proceeds as usual and the second analysis engine 20 is not specifically controlled; it performs the tracking process on the moving image data that is the analysis target data (raw data) at the normal time interval.
  • When the control content “tracking interval 30 seconds” is selected, a control command is issued to the second analysis engine 20 to perform the tracking process at 30-second intervals. In accordance with this control command, the second analysis engine 20 executes the tracking process on the moving image data that is the analysis target data (raw data) at 30-second intervals, which is shorter than usual.
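  • Because the analysis result here carries a degree (the probability of being a suspicious person), the control rule can be read as a set of thresholds. A hedged sketch under that reading follows; the 70% threshold mirrors the example above, and the interval chosen for the 100% case is an assumption made only for this sketch.

```python
# Illustrative degree-based rule: probability of "suspicious person" -> tracking interval.

def select_tracking_interval(suspicious_probability):
    """Map the certainty reported by the first analysis engine to a tracking interval (s)."""
    if suspicious_probability >= 1.0:
        return 15          # assumed value: tightest tracking at 100% certainty
    if suspicious_probability >= 0.7:
        return 30          # "a suspicious person with a probability of 70%"
    return 60              # normal: "tracking interval 1 minute"

assert select_tracking_interval(0.2) == 60
assert select_tracking_interval(0.7) == 30
assert select_tracking_interval(1.0) == 15
```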
  • In the next operation example, it is assumed that the first analysis engine 10 is an image recognition engine, the second analysis engine 20 is a speech recognition engine, and a third analysis engine (not shown) is an action tracking engine.
  • The analysis result data from the first analysis engine 10 and the third analysis engine are output to the control device 30, and the second analysis engine 20 is controlled by the control device 30 in accordance with the combination of these analysis result data. It is further assumed that a control rule 36 having the contents shown in FIG. 7 is stored.
  • the first analysis engine 10 performs analysis processing on image data that is analysis target data (raw data), and recognizes whether the person existing in the image data is “female” or “male”.
  • the third analysis engine performs analysis processing on moving image data that is analysis target data (raw data), and tracks the movement trajectory of a person existing in the moving image data.
  • Then, the control device 30 acquires these analysis result data (step S1).
  • In step S2, when the analysis result from the first analysis engine 10 is “female” and the movement trajectory that is the analysis result from the third analysis engine is “from point X to point Z”, the control device 30 refers to the control rule shown in FIG. 7 and selects the corresponding control content “normal level recognition accuracy”. When the analysis result from the first analysis engine 10 is “female” and the movement trajectory that is the analysis result from the third analysis engine is “from point Y to point Z”, the control device 30 refers to the control rule shown in FIG. 7 and selects the control content “recognition accuracy one level higher than normal” (step S2).
  • The control device 30 then controls the analysis processing by the second analysis engine 20 in accordance with the selected control content (step S3).
  • When the control content “normal level recognition accuracy” is selected, operation proceeds as usual and the second analysis engine 20 is not specifically controlled; it performs the speech recognition process with normal accuracy on the audio data that is the analysis target data (raw data).
  • When the control content “recognition accuracy one level higher than normal” is selected, a control command is issued to the second analysis engine 20 to perform the speech recognition process with recognition accuracy one level higher than normal. In accordance with this control command, the second analysis engine 20 executes the speech recognition process on the audio data that is the analysis target data (raw data) with the recognition accuracy raised by one level.
  • In the next operation example, it is assumed that the first analysis engine 10 is an image recognition engine, the second analysis engine 20 is an action tracking engine, and a third analysis engine (not shown) is a speech recognition engine.
  • The analysis result data from the first analysis engine 10 and the third analysis engine are output to the control device 30, and the second analysis engine 20 is controlled by the control device 30 in accordance with the combination of these analysis result data. It is further assumed that a control rule 36 having the contents shown in FIG. 8 is stored.
  • The first analysis engine 10 performs analysis processing on image data, which is the analysis target data (raw data), and recognizes whether or not a pre-registered “suspicious person” is present in the image data.
  • The third analysis engine performs analysis processing on audio data, which is also analysis target data (raw data), and recognizes “words” from the audio data.
  • Then, the control device 30 acquires these analysis result data (step S1).
  • In step S2, when the analysis result from the first analysis engine 10 is “not a suspicious person” and no pre-registered “keyword (NG word)” exists among the “words” detected as the analysis result from the third analysis engine, the control device 30 selects the control content “tracking interval 1 minute (normal)” corresponding to the control rule shown in FIG. 8.
  • When the analysis result from the first analysis engine 10 is “suspicious person” and a pre-registered “keyword (NG word)” exists among the “words” detected as the analysis result from the third analysis engine, the control content “tracking interval 15 seconds” corresponding to the control rule shown in FIG. 8 is selected (step S2).
  • The control device 30 then controls the analysis processing by the second analysis engine 20 in accordance with the selected control content (step S3).
  • When the control content “tracking interval 1 minute (normal)” is selected, operation proceeds as usual and the second analysis engine 20 is not specifically controlled; it performs the tracking process on the moving image data that is the analysis target data (raw data) at the normal time interval.
  • When the control content “tracking interval 15 seconds” is selected, a control command is issued to the second analysis engine 20 to perform the tracking process at 15-second intervals. In accordance with this control command, the second analysis engine 20 executes the tracking process on the moving image data that is the analysis target data (raw data) at 15-second intervals, which is shorter than usual.
  • In the next operation example, it is assumed that the first analysis engine 10 is an image recognition engine, the second analysis engine 20 is an action tracking engine, and a third analysis engine (not shown) is a speech recognition engine.
  • The analysis result data from the first analysis engine 10 and the third analysis engine are output to the control device 30, and the second analysis engine 20 is controlled by the control device 30 in accordance with the combination of these analysis result data. It is further assumed that a control rule 36 having the contents shown in FIG. 9 is stored.
  • The first analysis engine 10 performs analysis processing on image data, which is the analysis target data (raw data), and recognizes whether a person present in the image data is “female” or “male”.
  • The third analysis engine performs analysis processing on audio data, which is also analysis target data (raw data), recognizes “words” from the audio data, and calculates the probability that the recognized words are correct.
  • Then, the control device 30 acquires these analysis result data (step S1).
  • In step S2, when the analysis result from the first analysis engine 10 is “male” and no pre-registered “keyword (NG word)” exists among the “words” detected as the analysis result from the third analysis engine, the control device 30 selects the control content “tracking interval 1 minute (normal)” corresponding to the control rule shown in FIG. 9.
  • When the analysis result from the first analysis engine 10 is “male” and a pre-registered “keyword (NG word)” exists with a probability of 70% among the “words” detected as the analysis result from the third analysis engine, the control content “tracking interval 30 seconds” corresponding to the control rule shown in FIG. 9 is selected.
  • When the analysis result from the first analysis engine 10 is “male” and a pre-registered “keyword (NG word)” exists with a probability of 100% among the “words” detected as the analysis result from the third analysis engine, the control content “tracking interval 15 seconds” corresponding to the control rule shown in FIG. 9 is selected (step S2).
  • The control device 30 then controls the analysis processing by the second analysis engine 20 in accordance with the selected control content (step S3).
  • When the control content “tracking interval 1 minute (normal)” is selected, operation proceeds as usual and the second analysis engine 20 is not specifically controlled; it performs the tracking process on the moving image data that is the analysis target data (raw data) at the normal time interval.
  • When the control content “tracking interval 30 seconds” is selected, a control command is issued to the second analysis engine 20 to perform the tracking process at 30-second intervals. In accordance with this control command, the second analysis engine 20 executes the tracking process on the moving image data that is the analysis target data (raw data) at 30-second intervals, which is shorter than usual.
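  • When results from several engines are combined, the control rule can be keyed on the tuple of results. The sketch below loosely mirrors the FIG. 9 example (gender from image recognition plus the certainty of a detected NG word from speech recognition); the rule table, the certainty bands, and the helper names are illustrative assumptions.

```python
# Illustrative combined rule: (gender, NG-word certainty band) -> tracking interval in seconds.

COMBINED_RULE = {
    ("male", "none"):  60,   # no pre-registered keyword detected: normal interval
    ("male", "70%"):   30,
    ("male", "100%"):  15,
}

def certainty_band(ng_word_probability):
    """Bucket the NG-word probability the way the example rule distinguishes it."""
    if ng_word_probability is None or ng_word_probability <= 0.0:
        return "none"
    return "100%" if ng_word_probability >= 1.0 else "70%"

def select_interval(gender, ng_word_probability, default=60):
    key = (gender, certainty_band(ng_word_probability))
    return COMBINED_RULE.get(key, default)

assert select_interval("male", None) == 60
assert select_interval("male", 0.7) == 30
assert select_interval("male", 1.0) == 15
```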
  • As described above, the control instruction unit 32 of the control device 30 controls the second analysis engine 20 in accordance with the analysis result data from the first analysis engine 10 and the like, and can also perform the following kinds of control.
  • For example, when the first analysis engine 10 outputs an analysis result indicating that a “female” is present in the image data, the control instruction unit 32 controls the second analysis engine 20 to extract and analyze the “clothing” of that person from the image data. As another example, when the first analysis engine 10 detects, by face authentication at the time of room entry, a pre-registered “person requiring attention” in the image data, the control instruction unit 32 controls the second analysis engine 20 to execute flow line analysis within the office from the moving image data.
  • The control instruction unit 32 can also control the second analysis engine 20 to start the voice recognition process.
  • When the certainty of recognition in the speech recognition by the first analysis engine 10 is lower than a preset value and therefore insufficient, the control instruction unit 32 controls the second analysis engine 20 to start face authentication processing from the image data.
  • When the first analysis engine 10 recognizes a person's age and sex from the image data, the control instruction unit 32 selects, from among a plurality of other analysis engines, the speech recognition engine preset as optimal for that age and sex, and controls it to start the speech recognition process. As another example, when the first analysis engine 10 distinguishes between a “human” and a “dog” from the audio data, the control instruction unit 32 selects the analysis engine dedicated to “human” or “dog” from among the other analysis engines and controls it to start a process of detecting a smile from the image data.
  • When the first analysis engine 10 can detect a person by analyzing the person's flow line from data such as RFID, the control instruction unit 32 controls the second analysis engine to extract faces from the image data only at the flow line locations where the person was detected. The control instruction unit 32 may also control the second analysis engine to perform voice recognition on the audio data from that time.
  • When a person can be detected by the first analysis engine 10 analyzing the person's flow line, the control instruction unit 32 estimates the time of arrival at a specific place (for example, an entrance) and controls the second analysis engine so that face recognition processing at that specific place is started immediately before that time.
  • Next, an example of the operation for controlling stored data (the flowchart of FIG. 3) will be described. First, the first analysis engine 10 performs analysis processing on image data, which is the analysis target data (raw data), and recognizes the “existence” of a person in the image data and the “identity” of that person. Then, the control device 30 acquires these analysis result data (step S11). At this time, the analysis target data (raw data) used for the analysis processing is accumulated in the raw data DB 52 of the raw data storage device 50, and the resulting analysis result data is accumulated in the analysis result DB 62 of the analysis result storage device 60.
  • In step S12, when the analysis result from the first analysis engine 10 is “no person detected”, the control instruction unit 32 of the control device 30 refers to the control rule shown in FIG. 10 and selects the corresponding control content “data deletion”.
  • When the analysis result from the first analysis engine 10 indicates that the extracted person has been identified, the control content “raw data deletion” corresponding to the control rule shown in FIG. 10 is selected (step S12).
  • The control instruction unit 32 of the control device 30 then issues a control command to the raw data storage device 50 and the analysis result storage device 60 via the raw data storage control unit 34 and the analysis result storage control unit 35 in accordance with the selected control content (step S13).
  • When the control content “data deletion” is selected, control is performed to delete both the raw data stored in the raw data DB 52 of the raw data storage device 50 and the analysis result data stored in the analysis result DB 62 of the analysis result storage device 60.
  • When the control content “raw data deletion” is selected, control is performed to delete only the raw data stored in the raw data DB 52 of the raw data storage device 50.
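  • A minimal sketch of the FIG. 10 behavior is given below, under the assumption that raw data and analysis results are keyed by a shared record id; the dictionaries standing in for the raw data DB 52 and the analysis result DB 62, and the function name, are illustrative.

```python
# Illustrative stored-data control for the FIG. 10 example: delete according to the result.

raw_data_db = {"frame-001": b"...", "frame-002": b"..."}      # stands in for raw data DB 52
analysis_result_db = {"frame-001": {"person": None},          # stands in for analysis result DB 62
                      "frame-002": {"person": "employee A"}}

def control_stored_data(record_id, analysis_result):
    """Apply 'data deletion' or 'raw data deletion' depending on the analysis result."""
    if analysis_result.get("person") is None:
        # "no person detected": neither the raw data nor the result is worth keeping
        raw_data_db.pop(record_id, None)
        analysis_result_db.pop(record_id, None)
    else:
        # person identified: the analysis result suffices, so only the raw data is deleted
        raw_data_db.pop(record_id, None)

control_stored_data("frame-001", analysis_result_db["frame-001"])
control_stored_data("frame-002", analysis_result_db["frame-002"])
assert "frame-001" not in raw_data_db and "frame-001" not in analysis_result_db
assert "frame-002" not in raw_data_db and "frame-002" in analysis_result_db
```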
  • In the next example, the first analysis engine 10 performs analysis processing on image data, which is the analysis target data (raw data), and recognizes whether or not a pre-registered “suspicious person” is present in the image data. The control device 30 then acquires the analysis result data (step S11).
  • In step S12, when the analysis result from the first analysis engine 10 is “not a suspicious person”, the control device 30 refers to the control rule shown in FIG. 11 and selects the corresponding control content “raw data high compression” (step S12).
  • When the analysis result from the first analysis engine 10 is “suspicious person”, there is no corresponding control content in the control rule shown in FIG. 11.
  • Then, the control device 30 controls the stored data in accordance with the selected control content (step S13). When the control content “raw data high compression” is selected, among the raw data stored in the raw data DB 52 of the raw data storage device 50, the raw data analyzed by the first analysis engine 10 as “not a suspicious person” is subjected to high-compression processing and stored again in the raw data DB 52. As a result, the volume of raw data that is unlikely to be checked again can be reduced.
  • In the next example, the first analysis engine 10 performs analysis processing on audio data, which is the analysis target data (raw data), recognizes “words” from the audio data, and calculates the probability that the recognized words are correct. Then, the control device 30 acquires these analysis result data (step S11).
  • In step S12, when the probability of speech recognition, which is the analysis result from the first analysis engine 10, is “less than 30%”, the control device 30 refers to the control rule shown in FIG. 12 and selects the control content “high level sound quality”.
  • When the probability of speech recognition, which is the analysis result from the first analysis engine 10, is “30% or more and less than 70%”, the control content “medium level sound quality” corresponding to the control rule shown in FIG. 12 is selected.
  • When the probability of speech recognition, which is the analysis result from the first analysis engine 10, is “70% or more”, the control content “low level sound quality” corresponding to the control rule shown in FIG. 12 is selected (step S12).
  • Then, the control device 30 controls the stored data in accordance with the selected control content (step S13). When the control content “high level sound quality” is selected, among the raw data stored in the raw data DB 52 of the raw data storage device 50, the raw data for which the speech recognition probability by the first analysis engine 10 was analyzed as “less than 30%” is processed so as to retain high-level sound quality (or left unprocessed so that the sound quality does not deteriorate) and stored again.
  • When the control content “medium level sound quality” is selected, among the raw data stored in the raw data DB 52 of the raw data storage device 50, the raw data for which the speech recognition probability was analyzed as “30% or more and less than 70%” is subjected to compression or similar processing so as to obtain medium-level sound quality, and is stored again in the raw data DB 52.
  • When the control content “low level sound quality” is selected, among the raw data stored in the raw data DB 52 of the raw data storage device 50, the raw data for which the speech recognition probability was analyzed as “70% or more” is subjected to high-compression or similar processing so as to obtain low-level sound quality, and is stored again in the raw data DB 52.
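  • The FIG. 12 example grades how aggressively stored audio may be compressed by how confidently it has already been recognized. A hedged sketch of such a policy follows; the probability bands follow the example above, while the bit rates are invented solely to make the idea concrete.

```python
# Illustrative policy: the higher the recognition probability, the harder the stored audio
# can be compressed, since it is less likely to need re-analysis at full quality.

def target_bitrate_kbps(recognition_probability):
    if recognition_probability < 0.30:
        return 256      # "high level sound quality": keep (nearly) original quality
    if recognition_probability < 0.70:
        return 96       # "medium level sound quality": moderate compression
    return 32           # "low level sound quality": high compression

for p, expected in [(0.1, 256), (0.5, 96), (0.9, 32)]:
    assert target_bitrate_kbps(p) == expected
```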
  • In the next example, the first analysis engine 10 performs analysis processing on image data, which is the analysis target data (raw data), and recognizes an object present in the image data. The control device 30 then acquires the analysis result data (step S11).
  • In step S12, when the analysis result from the first analysis engine 10 is “person”, the control device 30 refers to the control rule shown in FIG. 13 and selects the corresponding control content “retention period 3 years”.
  • When the analysis result from the first analysis engine 10 is “car”, the control content “retention period 1 year” corresponding to the control rule shown in FIG. 13 is selected.
  • When the analysis result from the first analysis engine 10 is “dog”, the control content “retention period half a year” corresponding to the control rule shown in FIG. 13 is selected (step S12).
  • Then, the control device 30 controls the stored data in accordance with the selected control content (step S13). When the control content “retention period 3 years” is selected, among the raw data stored in the raw data DB 52 of the raw data storage device 50, the raw data analyzed by the first analysis engine 10 as “person” is given a retention period of 3 years and stored again in the raw data DB 52. At this time, a retention period of 3 years may also be set for the corresponding analysis result data stored in the analysis result DB 62 of the analysis result storage device 60, which is then stored again in the analysis result DB 62. Similarly, when the control content “retention period 1 year” is selected, a retention period of 1 year is set for the corresponding raw data, which is stored again in the raw data DB 52; and when the control content “retention period half a year” is selected, a retention period of half a year is set for the corresponding raw data, which is stored again in the raw data DB 52.
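  • The FIG. 13 example attaches a retention period to stored data according to the recognized object. The sketch below records an expiry timestamp per record; the object-to-period mapping follows the example above, and the record layout is an assumption.

```python
# Illustrative retention-period control for the FIG. 13 example.
import datetime

RETENTION = {
    "person": datetime.timedelta(days=3 * 365),   # "retention period 3 years"
    "car":    datetime.timedelta(days=365),       # "retention period 1 year"
    "dog":    datetime.timedelta(days=182),       # "retention period half a year"
}

def set_retention(record, recognized_object, now=None):
    """Stamp the record with an expiry time based on what was recognized in it."""
    now = now or datetime.datetime.now(datetime.timezone.utc)
    period = RETENTION.get(recognized_object)
    if period is not None:
        record["expires_at"] = now + period
    return record

record = set_retention({"id": "frame-042"}, "car")
print(record["expires_at"])    # roughly one year from now
```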
  • In the next example, the first analysis engine 10 performs analysis processing on image data, which is the analysis target data (raw data), detects a person present in the image data, and identifies that person. The control device 30 then acquires the analysis result data (step S11).
  • In step S12, when the analysis result from the first analysis engine 10 is “no person”, the control device 30 refers to the control rule shown in FIG. 14 and selects the corresponding control content “server on the network”.
  • When the analysis result from the first analysis engine 10 is “employee”, the control content “tape” corresponding to the control rule shown in FIG. 14 is selected; and when the analysis result is “person outside the company”, the control content “disk” corresponding to the control rule shown in FIG. 14 is selected (step S12).
  • Then, the control device 30 controls the stored data in accordance with the selected control content (step S13). When the control content “server on the network” is selected, among the raw data stored in the raw data DB 52 of the raw data storage device 50, the raw data analyzed by the first analysis engine 10 as “no person” is moved to and stored on the “server on the network”. At this time, the corresponding analysis result data stored in the analysis result DB 62 of the analysis result storage device 60 may also be stored on the “server on the network”. Similarly, when the control content “tape” is selected, the corresponding raw data is stored on “tape”, and when the control content “disk” is selected, the corresponding raw data is stored on “disk”. Note that the server on the network, the tape, and the disk are connected to the control device 30.
  • In this way, raw data that shows no people is of low importance and is therefore stored on a network server with low accessibility, and raw data that shows employees is likewise of low importance and is therefore stored on tape with low accessibility.
  • In contrast, raw data showing persons outside the company may need to be checked in the future, so it is stored on a disk with good accessibility.
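  • The FIG. 14 example routes data to storage media of differing accessibility depending on who appears in it. A minimal sketch of such tiering is shown below; the tier names follow the example, while the routing function and record identifiers are illustrative assumptions.

```python
# Illustrative storage-tier routing for the FIG. 14 example.

TIERS = {
    "no person":              "network_server",   # low accessibility, rarely needed again
    "employee":               "tape",             # low accessibility
    "person outside company": "disk",             # good accessibility, may need review
}

storage = {"network_server": [], "tape": [], "disk": []}

def route_to_tier(record_id, person_classification):
    tier = TIERS.get(person_classification, "disk")   # default to the accessible tier
    storage[tier].append(record_id)
    return tier

assert route_to_tier("frame-007", "no person") == "network_server"
assert route_to_tier("frame-008", "employee") == "tape"
assert route_to_tier("frame-009", "person outside company") == "disk"
```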
  • As described above, the control instruction unit 32 of the control device 30 controls the stored data in accordance with the analysis result data from the first analysis engine 10 and the like, and can also perform the following kinds of control.
  • For example, when the first analysis engine 10 detects a “face portion” in the image data, the control instruction unit 32 controls the storage so that only the “face portion” of the raw data to be analyzed is cut out and saved, or so that the portions of the raw data other than the “face portion” are saved at a lower resolution.
  • When the first analysis engine 10 detects preset “words having specific characteristics” in the audio data, the control instruction unit 32 controls the storage so that only the portions containing those “words having specific characteristics” are saved.
  • When the first analysis engine 10 detects an “important person” in the image data, the control instruction unit 32 controls the storage so that additional data, such as a “circle” around the “face portion” of the raw data to be analyzed, is added before saving.
  • When the first analysis engine 10 detects a preset “specific word” in the audio data, the control instruction unit 32 controls the storage so that additional data, such as a “beep tone”, is inserted immediately before the “specific word” portion of the raw data to be analyzed, which is then saved.
  • (Appendix 1) An information processing apparatus 100 comprising: analysis result acquisition means 101 for acquiring an analysis result from a predetermined analysis engine 111; and storage data control means 102 for controlling, based on the analysis result acquired by the analysis result acquisition means 101, the storage state of storage data stored in a predetermined storage device 112, the storage data being the processing target data input to the analysis engine and/or the processing result data output from the analysis engine.
  • (Appendix 2) The information processing apparatus according to appendix 1, wherein the storage data control means selects storage data stored in the predetermined storage device based on the analysis result obtained by the predetermined analysis engine and acquired by the analysis result acquisition means, and deletes the selected storage data.
  • (Appendix 3) An information processing apparatus in which the storage data control means processes and re-stores the storage data stored in a predetermined storage device based on the analysis result obtained by the predetermined analysis engine and acquired by the analysis result acquisition means.
  • (Appendix 4) The information processing apparatus according to appendix 3, wherein the storage data control means processes and re-stores the storage data stored in the predetermined storage device so as to reduce its data volume, based on the analysis result obtained by the predetermined analysis engine and acquired by the analysis result acquisition means.
  • (Appendix 5) An information processing apparatus in which the storage data control means sets a retention period for the storage data stored in a predetermined storage device based on the analysis result obtained by the predetermined analysis engine and acquired by the analysis result acquisition means.
  • (Appendix 6) An information processing apparatus in which the storage data control means stores the storage data stored in a predetermined storage device into another storage device, based on the analysis result obtained by the predetermined analysis engine and acquired by the analysis result acquisition means.
  • (Appendix 7) The information processing apparatus according to any one of appendices 1 to 6, wherein the storage data control means adds additional data to the storage data stored in the predetermined storage device and re-stores it, based on the analysis result obtained by the predetermined analysis engine and acquired by the analysis result acquisition means.
  • (Appendix 8) An information processing apparatus in which the storage data control means controls the storage state of the storage data stored in a predetermined storage device based on the degree included in the analysis result obtained by the predetermined analysis engine and acquired by the analysis result acquisition means.
  • (Appendix 9) An information processing apparatus in which the analysis result acquisition means acquires analysis results obtained by a plurality of predetermined analysis engines, and the storage data control means controls the storage state of the storage data stored in a predetermined storage device based on a combination of the analysis results from the plurality of predetermined analysis engines.
  • (Appendix 10) A program for causing an information processing apparatus to realize: analysis result acquisition means for acquiring an analysis result from a predetermined analysis engine; and storage data control means for controlling, based on the analysis result acquired by the analysis result acquisition means, the storage state of storage data stored in a predetermined storage device, the storage data being the processing target data input to the analysis engine and/or the processing result data output from the analysis engine.
  • (Appendix 11) A program in which the storage data control means selects storage data stored in a predetermined storage device based on the analysis result obtained by the predetermined analysis engine and acquired by the analysis result acquisition means, and deletes the selected storage data.
  • (Appendix 12) A storage data control method in which the information processing apparatus acquires an analysis result from a predetermined analysis engine and controls, based on the acquired analysis result, the storage state of storage data stored in a predetermined storage device, the storage data being the processing target data input to the analysis engine and/or the processing result data output from the analysis engine.
  • (Appendix 13) The storage data control method according to appendix 12, wherein the information processing apparatus selects storage data stored in the predetermined storage device based on the acquired analysis result from the predetermined analysis engine, and deletes the selected storage data.
  • the program is stored in a storage device or recorded on a computer-readable recording medium.
  • the recording medium is a portable medium such as a flexible disk, an optical disk, a magneto-optical disk, and a semiconductor memory.
  • DESCRIPTION OF REFERENCE SIGNS: 10 first analysis engine; 11 first analysis unit; 12 first buffer interface unit; 13 first buffer; 20 second analysis engine; 21 second analysis unit; 22 second buffer interface unit; 23 second buffer; 30 control device; 31 analysis result acquisition unit; 32 control instruction unit; 33 raw data acquisition control unit; 34 raw data storage control unit; 35 analysis result storage control unit; 36 control rule; 40 raw data acquisition device; 41 first raw data acquisition unit; 42 second raw data acquisition unit; 50 raw data storage device; 51 raw data DB interface unit; 52 raw data DB; 60 analysis result storage device; 61 analysis result DB interface unit; 62 analysis result DB; 100 information processing apparatus; 101 analysis result acquisition means; 102 storage data control means; 111 predetermined analysis engine; 112 predetermined storage device

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The present information processing device (100) comprises: an analysis result acquisition means (101) that acquires an analysis result from a predetermined analysis engine (111); and a storage data control means (102) that controls the storage state of storage data stored in a predetermined storage device (112) on the basis of the analysis result provided by the predetermined analysis engine (111) and acquired by the analysis result acquisition means (101), the storage data being data to be processed that is input to the analysis engine and/or processing result data that is output by the analysis engine.
PCT/JP2012/003015 2011-06-29 2012-05-09 Dispositif de traitement d'informations WO2013001702A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-143816 2011-06-29
JP2011143816 2011-06-29

Publications (1)

Publication Number Publication Date
WO2013001702A1 true WO2013001702A1 (fr) 2013-01-03

Family

ID=47423635

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/003015 WO2013001702A1 (fr) 2011-06-29 2012-05-09 Dispositif de traitement d'informations

Country Status (1)

Country Link
WO (1) WO2013001702A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014191246A (ja) * 2013-03-28 2014-10-06 Nec Corp 認識処理制御装置、認識処理制御方法および認識処理制御プログラム

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05137104A (ja) * 1991-11-15 1993-06-01 Hitachi Ltd 動画データ圧縮蓄積再生装置
JP2000253390A (ja) * 1999-03-01 2000-09-14 Hitachi Denshi Ltd 監視記録装置
JP2004056473A (ja) * 2002-07-19 2004-02-19 Matsushita Electric Ind Co Ltd 監視制御装置
JP2007251324A (ja) * 2006-03-14 2007-09-27 Hitachi Kokusai Electric Inc 記録方法
JP2007251321A (ja) * 2006-03-14 2007-09-27 Hitachi Kokusai Electric Inc 画像記録方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12803693

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12803693

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP