WO2020044646A1 - Image processing device, image processing method, and program - Google Patents

Image processing device, image processing method, and program

Info

Publication number
WO2020044646A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
image processing
photographing
shooting
processing apparatus
Prior art date
Application number
PCT/JP2019/015214
Other languages
French (fr)
Japanese (ja)
Inventor
齋藤 靖
Original Assignee
NEC Corporation (日本電気株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corporation (日本電気株式会社)
Priority to JP2020540045A priority Critical patent/JPWO2020044646A1/en
Publication of WO2020044646A1 publication Critical patent/WO2020044646A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 13/00: Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50: Information retrieval of still image data
    • G06F 16/58: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/583: Retrieval using metadata automatically derived from the content

Definitions

  • the present invention relates to an image processing device, an image processing method, and a program.
  • Patent Literature 1 discloses an image recording device attached to a car, including: an image acquiring unit that acquires an image; an image encoding unit that encodes the acquired image; a generating unit that generates position information and time information indicating the position and time at which the image was acquired; a recording unit that records the encoded image on a recording medium in association with the position information and time information; an extracting unit that extracts an image from among the recorded images based on the position information and time information; and an output unit that outputs the extracted image to the outside.
  • because the device of Patent Literature 1 extracts images based only on position information and time information, the images to be extracted are insufficiently narrowed down.
  • as a result, a large number of images are extracted and transmitted to the outside, and the communication load increases.
  • the present invention therefore has an object of reducing the communication load in a system that transmits a predetermined image, generated by a photographing unit attached to a moving body, to the outside.
  • according to the present invention, there is provided an image processing device comprising: photographing means attached to a moving body, for photographing the outside of the moving body; analyzing means for analyzing the photographing data generated by the photographing means and generating attribute data of an object shown in the photographing data; registration means for registering the object specified by the attribute data as index data of the photographing data; request acquisition means for acquiring a request for photographing data in which an object meeting a predetermined condition is shown; determination means for determining whether an object meeting the predetermined condition is registered in the index data; and transmission means for transmitting at least a part of the photographing data to an external device when it is determined that an object meeting the predetermined condition is registered in the index data.
  • there is further provided a program for causing a computer to function as: photographing means attached to a moving body, for photographing the outside of the moving body; analyzing means for analyzing the photographing data generated by the photographing means and generating attribute data of an object shown in the photographing data; registration means for registering the object specified by the attribute data as index data of the photographing data; request acquisition means for acquiring a request for photographing data in which an object meeting a predetermined condition is shown; determination means for determining whether an object meeting the predetermined condition is registered in the index data; and transmission means for transmitting at least a part of the photographing data to an external device when it is determined that an object meeting the predetermined condition is registered in the index data.
  • according to the present invention, the communication load can be reduced in a system that transmits a predetermined image, generated by a photographing unit attached to a moving body, to the outside.
  • FIG. 1 is an example of a functional block diagram of an image processing system according to an embodiment.
  • FIG. 2 is a diagram illustrating a realization example of the image processing apparatus 10 according to the embodiment.
  • FIG. 3 is a diagram illustrating a realization example of the image processing apparatus 10 according to the embodiment.
  • FIG. 4 is a diagram illustrating a realization example of the image processing apparatus 10 according to the embodiment.
  • FIG. 5 is a diagram illustrating an example of a hardware configuration of the image processing apparatus according to the embodiment.
  • FIG. 6 is an example of a functional block diagram of the image processing apparatus 10 of the present embodiment.
  • FIG. 7 is a diagram schematically illustrating an example of information processed by the image processing apparatus 10 according to the embodiment.
  • FIG. 8 is a sequence diagram illustrating an example of a processing flow of the image processing system according to the embodiment.
  • FIG. 9 is a flowchart illustrating an example of a processing flow of the image processing apparatus 10 according to the embodiment.
  • FIG. 10 is an example of a functional block diagram of a photographing unit 11 of the present embodiment.
  • FIG. 11 is a diagram schematically illustrating an example of information processed by the image processing apparatus 10 according to the embodiment.
  • FIG. 12 is a diagram for describing an example of a process of the image processing apparatus 10 of the present embodiment.
  • the image processing system includes a plurality of image processing devices 10 and a request device 20.
  • Each of the plurality of image processing devices 10 and the request device 20 are configured to be able to communicate with each other.
  • the plurality of image processing apparatuses 10 are used by being attached to different moving bodies, respectively.
  • the moving body is an object configured to be movable on the ground, under the ground, on the water, underwater, or in the air; examples thereof include, but are not limited to, an automobile, a motorcycle, a bicycle, a train, a ship, an airplane, and an unmanned aerial vehicle.
  • the image processing device 10 may be a moving body mounting device 1 as shown in FIG. 2.
  • the moving body mounting device 1 is a device that is mounted on the moving body and is basically always attached to a predetermined position of the moving body.
  • the image processing device 10 may be a portable device 2 as shown in FIG. 3.
  • examples of the portable device 2 include, but are not limited to, a smartphone, a tablet terminal, and a mobile phone.
  • when the owner of the portable device 2 moves on the moving body, the portable device 2 is brought inside the moving body and attached to a predetermined position of the moving body.
  • the image processing apparatus 10 may be realized by the moving body mounting device 1 and the portable device 2 operating in conjunction with each other, as shown in FIG. 4.
  • the moving body mounting device 1 is mounted on a moving body and basically always attached to a predetermined position of the moving body.
  • the portable device 2 is brought inside the moving object when the owner of the portable device 2 moves by the moving object, and is attached to a predetermined position of the moving object.
  • the moving body mounting device 1 and the portable device 2 are connected by wire and/or wirelessly, and execute predetermined processing in conjunction with each other.
  • the image processing device 10 is attached to the moving body and includes: means for photographing the outside of the moving body; means for analyzing the photographing data and generating attribute data of an object shown in the photographing data; and means for registering the object specified by the attribute data as index data of the photographing data.
  • the image processing apparatus 10 further includes: means for acquiring a request for photographing data in which an object meeting a predetermined condition is shown; means for determining whether an object meeting the predetermined condition is registered in the index data; and means for transmitting at least a part of the photographing data to the request device 20 (external device) when it is determined that an object meeting the predetermined condition is registered in the index data.
  • the timing at which the photographing data is transmitted to the request device 20 is, for example, the timing at which the user operates the image processing apparatus 10 to permit the transmission of the photographing data, but is not limited to this.
  • with such an image processing device 10, photographing data in which a predetermined object is shown can be extracted from the photographing data generated by the photographing means attached to the moving body and transmitted to the request device 20. Since the photographing data can be sufficiently narrowed down based on whether a predetermined object is shown in it, the image processing apparatus 10 can reduce the amount of photographing data transmitted to the request device 20 in response to a request. As a result, the communication load between the image processing device 10 and the request device 20 can be reduced.
  • the functional units included in the image processing apparatus 10 of the present embodiment are realized by any combination of hardware and software of an arbitrary computer, centered on a CPU (Central Processing Unit), a memory, a program loaded into the memory, and a storage unit such as a hard disk that stores the program (which can store not only programs stored in the apparatus in advance at the shipping stage, but also programs downloaded from storage media such as CDs (Compact Discs) or from servers on the Internet). It will be understood by those skilled in the art that there are various modifications of the realization method and apparatus.
  • FIG. 5 is a block diagram illustrating a hardware configuration of the image processing apparatus 10 according to the present embodiment.
  • the image processing apparatus 10 includes a processor 1A, a memory 2A, an input / output interface 3A, a peripheral circuit 4A, and a bus 5A.
  • the peripheral circuit 4A includes various modules.
  • the image processing device 10 may not have the peripheral circuit 4A.
  • the image processing device 10 may be configured by a plurality of physically separated devices. In this case, each of the plurality of devices can have the above hardware configuration.
  • the bus 5A is a data transmission path through which the processor 1A, the memory 2A, the peripheral circuit 4A, and the input / output interface 3A mutually transmit and receive data.
  • the processor 1A is an arithmetic processing device such as a CPU or a GPU (Graphics Processing Unit).
  • the memory 2A is a memory such as a RAM (Random Access Memory) or a ROM (Read Only Memory).
  • the input/output interface 3A includes an interface for acquiring information from an input device, an external device, an external server, an external sensor, a camera, and the like, and an interface for outputting information to an output device, an external device, an external server, and the like.
  • the input device is, for example, a keyboard, a mouse, a microphone, and the like.
  • the output device is, for example, a display, a speaker, a printer, a mailer, or the like.
  • the processor 1A can issue commands to each module and perform arithmetic operations based on their calculation results.
  • FIG. 6 shows an example of a functional block diagram of the image processing apparatus 10.
  • the image processing device 10 includes an imaging unit 11, an analysis unit 12, a registration unit 13, a data storage unit 14, a request acquisition unit 15, a determination unit 16, and a transmission unit 17.
  • the image processing apparatus 10 may not have the data storage unit 14.
  • an external device configured to be able to communicate with the image processing device 10 includes the data storage unit 14.
  • the photographing unit 11 is attached to the moving body, and photographs the outside of the moving body. Since the definition of the moving object is as described above, the description here is omitted.
  • the photographing unit 11 may generate moving image data, or may generate still image data regularly or irregularly.
  • the analysis unit 12 analyzes the photographing data (moving image data, still image data, etc.) generated by the photographing unit 11 and generates attribute data of the object reflected in the photographing data.
  • when the photographing unit 11 generates moving image data, the analyzing unit 12 may analyze all of the plurality of frame data included in the moving image data, or may target only a part of the frame data, for example, frame data at every predetermined number of frames.
  • the analysis unit 12 may analyze all still image data.
  • the analysis unit 12 extracts an object shown in the image.
  • the analysis unit 12 can extract the plurality of objects.
  • Object extraction can be realized by using, for example, a contour extraction process, a template matching process, or the like, but the means for realizing the object extraction is not limited.
  • Examples of the object to be extracted from the image include, but are not limited to, a person, another animal, a moving object, a light-emitting object, and the like.
  • after extracting the objects from the image, the analysis unit 12 generates attribute data indicating the appearance characteristics of each extracted object.
  • the analysis unit 12 analyzes an image according to a predetermined rule according to various objects, and generates attribute data of the extracted objects.
  • the analysis unit 12 may detect the face of the person and generate attribute data indicating gender, age, and the like from the features of the appearance of the face.
  • the analysis unit 12 may generate attribute data indicating the presence or absence of a wearing object (eg, glasses, sunglasses, a mask, or the like) on the face.
  • the analysis unit 12 may generate attribute data indicating the characteristics of the clothes of the person (eg, the colors and shapes of the upper body and lower body clothes).
  • the analysis unit 12 may generate attribute data such as a person's body type (eg, normal, thin, thick) and height.
  • the analysis unit 12 may specify who the person shown in the image is by collating reference data held in advance, which indicates the appearance characteristics of each of a plurality of persons, with the appearance characteristics of the person shown in the image, and may generate attribute data identifying that person.
  • if the extracted object is a moving body, the analysis unit 12 may generate attribute data indicating the type of the moving body, the vehicle type, the maker, the color of the moving body, information described on the license plate, the number of occupants, and the like.
  • if the extracted object is an animal, the analysis unit 12 may generate attribute data indicating the type of the animal, the color of its body, and the like.
  • for other objects, the analysis unit 12 may generate attribute data indicating the shape, color, size, emitted light, and the like of the object.
  • the analysis unit 12 can generate the above-described attribute data by analyzing an image using any image analysis technology.
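The patent does not specify an implementation of the analysis unit 12; the following is a minimal Python sketch, in which the detector itself (contour extraction, template matching, etc.) is stubbed out and every name and attribute field is an illustrative assumption. It only shows the shape of the attribute data records such an analysis step might emit per image.

```python
# Illustrative sketch only; names and fields are assumptions, not the patent's design.
from dataclasses import dataclass, field

@dataclass
class AttributeData:
    object_type: str                       # e.g. "person", "moving_body", "animal"
    attributes: dict = field(default_factory=dict)

def analyze_image(image_id, detections):
    """Stand-in for the analysis unit: wraps raw detector output
    (already-classified objects) into (image_id, AttributeData) records."""
    return [(image_id, AttributeData(d["type"], d.get("attrs", {})))
            for d in detections]

# Example: one image containing a person and a car.
records = analyze_image("img001", [
    {"type": "person", "attrs": {"age_band": "30s", "sex": "male", "glasses": True}},
    {"type": "moving_body", "attrs": {"color": "red"}},
])
```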
  • the registration unit 13 registers an object specified by the attribute data generated by the analysis unit 12 as index data of shooting data.
  • FIG. 7 schematically shows an example of the index data.
  • in the index data, an image ID (identifier) and the objects specified by attribute data are registered in association with each other.
  • the image ID is information for identifying each image analyzed by the analysis unit 12.
  • the data storage unit 14 shown in FIG. 6 stores the index data. The data storage unit 14 also stores the photographing data generated by the photographing unit 11. When the photographing unit 11 generates moving image data, the data storage unit 14 may classify the photographing data into a plurality of groups before storing it, or may store it without classification. Examples of classification include, but are not limited to, grouping the data from the start of shooting to the end of shooting into one group, and grouping based on shooting date and time (e.g., grouping by shooting date).
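As a hedged sketch of the registration unit 13 and data storage unit 14 described above (the data layout is an assumption made for illustration, not taken from the patent), the index data can be pictured as a mapping from image ID to detected objects, with the shooting data grouped, here by shooting date:

```python
# Minimal sketch; dictionary layout and group key are illustrative assumptions.
from collections import defaultdict

index_data = defaultdict(list)   # image ID -> list of object attribute dicts
groups = defaultdict(list)       # group key (shooting date) -> image IDs

def register(image_id, objects, shooting_date):
    """Register detected objects as index data and group the image by date."""
    index_data[image_id].extend(objects)
    groups[shooting_date].append(image_id)

register("img001", [{"type": "person", "age_band": "30s"}], "2018-08-20")
register("img002", [{"type": "moving_body", "color": "red"}], "2018-08-20")
```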
  • the request acquisition unit 15 acquires a request for imaging data in which an object meeting a predetermined condition is shown.
  • for example, an operator searching for photographing data in which a predetermined object is shown inputs a predetermined condition via an input device of the request device 20 or an input device connected to the request device 20, and makes an input requesting photographing data in which an object meeting that condition is shown.
  • the request device 20 transmits a request for imaging data in which an object meeting a predetermined condition is shown to a plurality of image processing devices 10 registered in advance.
  • the request acquisition unit 15 acquires the request transmitted from the request device 20 in this manner.
  • the predetermined condition is a condition for specifying an object with the same type of information as the attribute data generated by the analysis unit 12.
  • the predetermined condition may be represented by a logical expression in which information of the same type as the attribute data is connected by a logical operator.
  • An example of such a predetermined condition is “person” + “30s” + “male” + “wearing glasses”.
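One way such a condition could be represented and evaluated is sketched below; the AND-of-attributes dictionary form and the attribute names are assumptions for illustration, not the patent's specification:

```python
# Illustrative representation of the condition "person" + "30s" + "male" + "wearing glasses".
condition = {"type": "person", "age_band": "30s", "sex": "male", "glasses": True}

def matches(obj, cond):
    """True when every attribute required by the condition appears in the
    object's attribute data with the same value (logical AND of all terms)."""
    return all(obj.get(k) == v for k, v in cond.items())
```

Conditions connected by other logical operators (OR, NOT) would need a richer expression tree, but the AND case suffices for the example above.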
  • the determination unit 16 determines whether an object meeting a predetermined condition is registered in the index data.
  • the "object meeting the predetermined condition" may include an object satisfying the predetermined condition at a predetermined level or more, in addition to an object satisfying the predetermined condition 100%.
  • for example, when the predetermined condition is configured by combining a plurality of attribute data, such as “person” + “30s” + “male” + “wearing glasses”, an object that satisfies a predetermined ratio or more (e.g., 60%, 70%) of the plurality of attribute data may be treated as an object that satisfies the predetermined condition at the predetermined level or more.
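The ratio-based relaxation can be sketched as follows; the 60% default threshold and the dictionary representation are illustrative assumptions:

```python
# Sketch of "meets the condition at a predetermined level or more".
def match_ratio(obj, cond):
    """Fraction of the condition's attribute data satisfied by the object."""
    return sum(1 for k, v in cond.items() if obj.get(k) == v) / len(cond)

def meets_condition(obj, cond, threshold=0.6):
    return match_ratio(obj, cond) >= threshold

cond = {"type": "person", "age_band": "30s", "sex": "male", "glasses": True}
obj = {"type": "person", "age_band": "30s", "sex": "male", "glasses": False}
# obj satisfies 3 of the 4 attribute data, i.e. a ratio of 0.75
```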
  • the transmission unit 17 transmits at least a part of the photographing data to the request device 20 when it is determined that an object meeting a predetermined condition is registered in the index data.
  • for example, the transmission unit 17 may transmit all the moving image data generated by the photographing unit 11 and stored in the data storage unit 14 to the request device 20.
  • in that case, however, the amount of data transmitted and received may be enormous.
  • therefore, the transmission unit 17 may transmit only a part of the moving image data generated by the photographing unit 11 and stored in the data storage unit 14 to the request device 20. For example, when the data storage unit 14 stores the moving image data classified into a plurality of groups, the transmission unit 17 may transmit to the request device 20 only the group containing the image ID (see FIG. 7) associated in the index data with an object meeting the predetermined condition.
  • alternatively, the transmission unit 17 may transmit to the request device 20 only the moving image data for a predetermined time before and after the image whose image ID (see FIG. 7) is associated in the index data with an object meeting the predetermined condition.
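The "predetermined time before and after" option can be sketched in terms of frame indices (the frame-index bookkeeping is an assumption for illustration): given the frames whose index data matched, keep only a window around each hit instead of the whole recording.

```python
# Illustrative sketch: select only frames within a margin around each matching frame.
def select_window(num_frames, hit_frames, margin):
    keep = set()
    for h in hit_frames:
        keep.update(range(max(0, h - margin), min(num_frames, h + margin + 1)))
    return sorted(keep)
```

For a 100-frame clip with a match at frame 50 and a margin of 3, this keeps frames 47 through 53 only, which is the data-reduction effect the transmission unit 17 aims for.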
  • first, an operator searching for photographing data in which a predetermined object is shown inputs a predetermined condition via an input device of the request device 20 or an input device connected to the request device 20, and makes an input requesting photographing data in which an object meeting that condition is shown.
  • the request device 20 transmits a request for photographing data in which an object meeting a predetermined condition is shown to a plurality of image processing devices 10 registered in advance (S10).
  • the image processing apparatus 10 determines whether or not an object meeting a predetermined condition is registered in the index data of the photographing data (see FIG. 7) (S11).
  • the image processing device 10 transmits the determination result to the request device 20 (S12). For example, when an object meeting a predetermined condition is registered in the index data of the photographing data, the image processing device 10 can transmit at least a part of the photographing data to the request device 20. On the other hand, when an object matching the predetermined condition is not registered in the index data of the shooting data, the image processing device 10 can notify the request device 20 that there is no shooting data matching the condition.
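The S10 to S12 exchange on the device side can be sketched as follows; the message shapes are assumptions, and the list of matching image IDs stands in for "at least a part of the photographing data":

```python
# Illustrative sketch of the device-side handling of a request (S11/S12).
def handle_request(index_data, cond):
    """Check the index data against the requested condition and build a reply."""
    hits = [img for img, objs in index_data.items()
            if any(all(o.get(k) == v for k, v in cond.items()) for o in objs)]
    if hits:
        return {"status": "match", "image_ids": hits}   # S12: send (part of) the data
    return {"status": "no_matching_data"}               # S12: notify absence

sample_index = {"img001": [{"type": "person", "age_band": "30s"}],
                "img002": [{"type": "moving_body"}]}
```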
  • the image processing apparatus 10 may output information inquiring whether the photographing data may be transmitted to the request device 20 before transmitting at least a part of the photographing data. Then, after obtaining an answer of "transmission permitted" in response to the inquiry, the image processing apparatus 10 may transmit at least a part of the photographing data to the request device 20. When the image processing apparatus 10 obtains an answer of "transmission not permitted", it does not transmit the photographing data to the request device 20. The image processing device 10 outputs the inquiry to the user who has the image processing device 10, and acquires the user's answer to the inquiry.
  • the image processing apparatus 10 may display the inquiry on a display of the image processing apparatus 10 or a display connected to the image processing apparatus 10. Then, the image processing device 10 may receive an input of a response to the inquiry via an input device included in the image processing device 10 or an input device connected to the image processing device 10. Examples of the input device include a touch panel display, physical buttons, and a microphone, but are not limited thereto.
  • together with the inquiry, the image processing apparatus 10 may output information indicating a reward that can be obtained when the photographing data is transmitted to the request device 20.
  • the image processing device 10 can receive information indicating a reward from the request device 20.
  • the inquiry or the information indicating the reward may be displayed on a display provided in the image processing apparatus 10 or a display connected to the image processing apparatus 10, or may be transmitted by e-mail to a pre-registered e-mail address.
  • upon detecting the index data generation timing (Yes in S20), the image processing apparatus 10 analyzes the photographing data to generate attribute data (S21) and registers it as index data (S22).
  • the index data generation timing may be, for example, the timing at which the imaging unit 11 has finished imaging.
  • the image processing apparatus 10 analyzes newly generated imaging data by batch processing, generates attribute data, and registers it as index data.
  • since the analysis processing is performed not in parallel with the photographing processing of the photographing unit 11 but at the timing when the photographing processing is completed, the processing load on the image processing apparatus 10 can be reduced.
  • the index data generation timing may be, for example, the timing at which the imaging unit 11 starts imaging.
  • the image processing apparatus 10 analyzes newly generated imaging data by real-time processing, generates attribute data, and registers it as index data. With such real-time processing, it is possible to respond to a request for real-time imaging data immediately after a predetermined event such as an incident or accident has occurred.
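The two index generation timings described above, batch processing when shooting ends versus real-time processing per frame, can be contrasted in a small sketch; the `Indexer` class and the analyzer callback are illustrative assumptions:

```python
# Illustrative sketch: same indexing logic, two trigger timings.
class Indexer:
    def __init__(self, analyze):
        self.analyze = analyze   # callback: frame -> list of object attribute dicts
        self.index = {}

    def on_frame(self, frame_id, frame):
        # real-time processing: index each frame as it is generated
        self.index[frame_id] = self.analyze(frame)

    def on_shooting_finished(self, frames):
        # batch processing: index everything once shooting ends
        for frame_id, frame in frames:
            self.index[frame_id] = self.analyze(frame)
```

The batch path defers the analysis load until the camera is idle, while the real-time path keeps the index current enough to answer a request immediately after an incident.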
  • as described above, according to the image processing apparatus 10 of the present embodiment, photographing data in which a predetermined object is shown can be extracted from the photographing data generated by the photographing unit attached to the moving body and transmitted to the request device 20. Since the photographing data can be sufficiently narrowed down depending on whether a predetermined object is shown in it, the image processing device 10 can reduce the amount of photographing data transmitted to the request device 20 in response to a request. As a result, the communication load between the image processing device 10 and the request device 20 can be reduced.
  • further, the image processing device 10 of the present embodiment may transmit to the request device 20 not all of the moving image data but only the portion in which a predetermined object is shown. As a result, the communication load between the image processing device 10 and the request device 20 can be reduced.
  • the image processing system including the image processing device 10 and the request device 20 for example, it is possible to efficiently collect moving image data obtained by photographing the site of an incident or accident.
  • the image processing apparatus 10, which generates and stores the images, determines the presence or absence of the corresponding data and transmits the photographing data to the request device 20 only as necessary. For this reason, compared with the case where all of the photographing data stored in the image processing apparatus 10 is transmitted to the request device 20 and the request device 20 determines the presence or absence of the corresponding data, the communication load between the image processing apparatus 10 and the request device 20 can be reduced.
  • the owner of the image processing apparatus 10 can recognize the presence or absence of the corresponding data without the troublesome work of visually checking the images.
  • when the image processing apparatus 10 determines that it holds the corresponding data, it can inquire of the owner of the image processing apparatus 10 whether the photographing data may be transmitted, and perform processing corresponding to the answer. This suppresses the inconvenience of photographing data being transmitted to the outside against the owner's intention.
  • the image processing apparatus 10 can output the reward obtained when the photographing data is provided. By outputting the reward information at an appropriate timing, the owner of the image processing apparatus 10 can quickly decide whether or not to provide the photographing data.
  • the overall image of the image processing system is the same as in the first embodiment.
  • the image processing apparatus 10 of the present embodiment can extract, from the photographing data generated by the photographing means attached to the moving body, photographing data that was shot at a predetermined position and/or at a predetermined date and time and in which a predetermined object is shown, and transmit it to the request device 20.
  • since the photographing data can be further narrowed down using the shooting position and/or the shooting date and time, the image processing apparatus 10 can reduce the number of pieces of photographing data transmitted to the request device 20.
  • an example of a functional block diagram of the image processing apparatus 10 is shown in FIG. 6, as in the first embodiment.
  • in the present embodiment, the photographing unit 11 includes a position detection unit 111 and a date and time detection unit 112, as shown in FIG. 10.
  • the position detection unit 111 detects a current position.
  • the detection of the current position includes, for example, the use of a GPS (global positioning system), but is not limited thereto.
  • the date and time detection unit 112 detects the current date and time.
  • the detection of the current date and time includes, for example, use of a built-in clock incorporated in the image processing apparatus 10, but is not limited thereto.
  • the photographing unit 11 can use the values detected by the position detecting unit 111 and the date and time detecting unit 112 to associate the photographing position and the photographing date and time with the photographing data.
  • Other configurations of the imaging unit 11 are the same as those of the first embodiment.
  • the configuration of the analysis unit 12 is the same as that of the first embodiment.
  • the registration unit 13 registers the shooting position and shooting date and time of the shooting data generated by the shooting unit 11 as index data of the shooting data.
  • FIG. 11 schematically shows an example of the index data.
  • in the index data, date and time information indicating the shooting date and time, position information indicating the shooting position, an image ID, and the objects specified by attribute data are registered in association with each other.
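One possible shape for such an index entry is sketched below; the field names, coordinate representation, and timestamp format are illustrative assumptions, not taken from FIG. 11:

```python
# Illustrative index data entry for the second embodiment: shooting date/time,
# shooting position, and detected objects are associated with an image ID.
index_data = {
    "img101": {
        "shot_at": "2018-08-20T13:05",
        "position": (35.681, 139.767),   # latitude, longitude (example values)
        "objects": [{"type": "person", "age_band": "30s"}],
    },
}
```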
  • the configuration of the data storage unit 14 is the same as that of the first embodiment.
  • the request acquisition unit 15 acquires a request for imaging data in which an object meeting a predetermined condition is shown and at least one of the imaging position and the imaging date and time is further specified. Other configurations of the request acquisition unit 15 are the same as those of the first embodiment.
  • for example, an operator searching for photographing data in which a predetermined object is shown uses an input device of the request device 20 or an input device connected to the request device 20 to input the condition of the object, the shooting position, the shooting date and time, and the like, and makes an input requesting the photographing data.
  • the shooting date and time may be specified as a time zone, such as “13:00 to 14:00 on August 20, 2018”, or as a point in time, such as “13:00 on August 20, 2018”.
  • the shooting location may be designated by a place name, such as “XX city” or “XX town”.
  • the request device 20 may receive an input for designating a shooting location via a UI (user @ interface) screen that accepts an input for designating a predetermined area on the map while displaying the map. For example, a frame whose display position, size, shape, and the like are changed based on a user input may be displayed on a map. Then, an area inside this frame may be designated as a shooting location.
  • the determination unit 16 determines whether photographing data in which an object meeting the predetermined condition appears, and whose shooting position and/or shooting date and time match the specified content, is stored in the data storage unit 14.
  • specifically, the determination unit 16 determines whether an object meeting the predetermined condition is registered in the index data. If it is registered, the determination unit 16 then determines whether the shooting position and/or shooting date and time associated with the image ID (see FIG. 11) of that object match the shooting position and/or shooting date and time specified in the request.
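The two-step determination described above might be sketched as follows. The entry fields, the string-valued position, and the period representation are illustrative assumptions.

```python
from datetime import datetime

def determine(index_entries, condition_object,
              requested_position=None, requested_period=None):
    """Step 1: check whether an object meeting the condition is registered.
    Step 2: for each matching entry, check whether its shooting position
    and/or shooting date and time match the content specified in the request."""
    hits = []
    for entry in index_entries:
        if condition_object not in entry["objects"]:
            continue  # step 1 failed for this entry
        if requested_position is not None and entry["shooting_position"] != requested_position:
            continue  # shooting position does not match the request
        if requested_period is not None:
            start, end = requested_period
            if not (start <= entry["shooting_datetime"] <= end):
                continue  # shooting date and time outside the requested period
        hits.append(entry["image_id"])
    return hits

entries = [
    {"image_id": "IMG0001", "objects": ["person"],
     "shooting_position": "XX city",
     "shooting_datetime": datetime(2018, 8, 20, 13, 30)},
]
period = (datetime(2018, 8, 20, 13, 0), datetime(2018, 8, 20, 14, 0))
print(determine(entries, "person", requested_position="XX city", requested_period=period))
```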
  • the configuration of the transmission unit 17 is the same as that of the first embodiment.
  • Processing flow of the image processing system: an example of the processing flow of the image processing system is the same as in the first embodiment.
  • the index data generation timing of the present embodiment may be the same as in the example described in the first embodiment or, as another example, may be the timing at which a photographing data request (S10 in FIG. 8) is received.
  • in the latter case, the image processing apparatus 10 processes only the photographing data generated at the shooting date and time specified in the photographing data request, generates attribute data, and registers the attribute data as index data. By minimizing the amount of photographing data processed in response to a request in this way, the processing load can be reduced.
  • with the image processing apparatus 10 of the present embodiment, desired photographing data can be sufficiently narrowed down using the shooting position and/or shooting date and time in addition to whether a predetermined object appears in the photographing data. The image processing apparatus 10 can therefore reduce the number of pieces of photographing data transmitted to the request device 20 in response to a request. As a result, the communication load between the image processing device 10 and the request device 20 can be reduced.
  • the overall image of the image processing system is the same as in the first and second embodiments.
  • the image processing apparatus 10 according to the present embodiment can change the type and amount of attribute data registered as index data according to the shooting environment at the time the photographing data was captured, the object appearing in the photographing data, and the like.
  • an example of a functional block diagram of the image processing apparatus 10 is shown in FIG. 6, as in the first and second embodiments.
  • the configurations of the imaging unit 11, the data storage unit 14, the request acquisition unit 15, the determination unit 16, and the transmission unit 17 are the same as in the first and second embodiments.
  • the analysis unit 12 changes at least one of the type and the amount of the generated attribute data according to the shooting environment at the time of shooting the shooting data.
  • the analysis unit 12 may generate attribute data based on a generation rule in which the type and amount of attribute data generated for each shooting environment are determined in advance.
  • the increase or decrease in the amount of attribute data may be realized by increasing or decreasing the number of types of attribute data to be generated, or may be realized by increasing or decreasing the number of frame data to be processed in the moving image data.
  • the registration unit 13 changes at least one of the type and the amount of the attribute data to be registered as the index data according to the shooting environment at the time of shooting the shooting data.
  • examples of the shooting environment include the shooting position, the shooting time, the shooting date, the day of the week, the weather at the time of shooting, and the specifications of the photographing device (image resolution and the like).
  • the analysis unit 12 can acquire information indicating the shooting environment by any means.
  • in a shooting environment in which many people and moving objects appear in the photographing data, the amount of attribute data to be generated may be made relatively small. If many types of attribute data were generated and registered by analyzing such photographing data, the processing load of the image processing apparatus 10 would increase and the free storage capacity would decrease. Therefore, when photographing data is generated in such a shooting environment, the load on the image processing apparatus 10 is reduced by reducing the types of attribute data to be generated or the number of frame data to be processed.
  • likewise, in a shooting environment in which the generated photographing data has a low utility value, the amount of attribute data to be generated may be made relatively small. When photographing data is generated in such a shooting environment, the load on the image processing apparatus 10 is reduced by reducing the types of attribute data to be generated or the number of frame data to be processed.
  • conversely, in a shooting environment in which the generated photographing data has a high utility value, the amount of attribute data to be generated may be made relatively large, and various and large amounts of attribute data are generated and registered.
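One way to express a generation rule in which the type and amount of attribute data to be generated are determined in advance per shooting environment, as described above. The environment categories, attribute types, and frame intervals are all hypothetical values for illustration.

```python
# Hypothetical generation rule: each shooting-environment category maps to
# the attribute types to generate and the frame interval to process.
GENERATION_RULE = {
    # many people and moving objects appear -> reduce types and processed frames
    "crowded":      {"attribute_types": ["object_class"], "frame_interval": 30},
    # low utility value -> likewise reduce the load
    "low_utility":  {"attribute_types": ["object_class"], "frame_interval": 60},
    # high utility value -> generate various and large amounts of attribute data
    "high_utility": {"attribute_types": ["object_class", "color", "clothing",
                                         "vehicle_type", "license_plate"],
                     "frame_interval": 1},
}

def rule_for(environment):
    """Look up the predetermined generation rule for a shooting environment."""
    return GENERATION_RULE[environment]

print(rule_for("high_utility")["frame_interval"])  # every frame is processed
```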
  • the analysis unit 12 can change at least one of the type and the amount of the attribute data to be generated according to the object reflected in the photographing data.
  • the analysis unit 12 may generate the attribute data based on a generation rule in which the type and amount of the attribute data generated for each object reflected in the shooting data are determined in advance.
  • the increase or decrease in the amount of attribute data may be realized by increasing or decreasing the number of types of attribute data to be generated, or may be realized by increasing or decreasing the number of frame data to be processed in the moving image data.
  • the registration unit 13 changes at least one of the type and the amount of the attribute data to be registered as the index data according to the object reflected in the photographing data.
  • for some objects, the number of frame data to be processed may be increased (for example, all frame data are processed, or a relatively small frame interval is set) so that the amount of attribute data to be generated becomes relatively large.
  • for other objects, the number of frame data to be processed may be reduced (for example, a relatively large frame interval is set) so that the amount of attribute data to be generated becomes relatively small.
  • the type of attribute data registered as index data is defined in advance for each object (person, vehicle, etc.) reflected in the photographing data.
  • when relatively increasing the amount of attribute data to be generated, the analysis unit 12 generates all or most of the illustrated types of attribute data. On the other hand, when relatively reducing the amount, the analysis unit 12 generates only some of the illustrated types of attribute data.
  • in the latter case, the analysis unit 12 may be set to generate, for example, attribute data that makes it difficult to identify a specific individual (the attribute data located on the vaguer side in the drawing). This reduces the amount of data and the processing load on the computer. Note that the type of attribute data to be generated may be determined in advance for each shooting environment at the time of shooting and for each object appearing in the photographing data.
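Selecting only the "vaguer" attribute types when reducing the amount, as described above, might look like the following sketch. The ordering of attribute types from vague to specific is an assumption made for illustration.

```python
# Hypothetical: attribute types ordered from the vague side (hard to
# identify a specific individual) to the specific side, per object kind.
ATTRIBUTE_TYPES = {
    "person":  ["physique", "clothing_color", "age_band", "face_identity"],
    "vehicle": ["body_color", "vehicle_type", "license_plate"],
}

def types_to_generate(object_kind, reduce_amount):
    """Return the attribute types to generate for one object kind."""
    types = ATTRIBUTE_TYPES[object_kind]
    if reduce_amount:
        # keep only the vague side: fewer types, harder to identify an individual
        return types[:2]
    return types  # all or most of the illustrated types

print(types_to_generate("person", reduce_amount=True))   # vague side only
print(types_to_generate("vehicle", reduce_amount=False)) # all illustrated types
```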
  • Processing flow of the image processing system: an example of the processing flow of the image processing system is the same as in the first and second embodiments.
  • 1. An image processing apparatus comprising: photographing means attached to a moving body, for photographing the outside of the moving body; analyzing means for analyzing the photographing data generated by the photographing means and generating attribute data of an object appearing in the photographing data; registration means for registering the object specified by the attribute data as index data of the photographing data; request acquisition means for acquiring a request for the photographing data in which an object meeting a predetermined condition appears; determining means for determining whether an object meeting the predetermined condition is registered in the index data; and transmitting means for transmitting at least a part of the photographing data to an external device when it is determined that an object meeting the predetermined condition is registered in the index data.
  • 2. The image processing apparatus according to 1, wherein the photographing means has position detecting means for detecting a current position and date and time detecting means for detecting a current date and time, and the registration means registers the shooting position and the shooting date and time of the photographing data as index data of the photographing data.
  • 3. The image processing apparatus according to 2, wherein the request acquisition means acquires a request for the photographing data further specifying at least one of a shooting position and a shooting date and time.
  • 4. The image processing apparatus according to any one of 1 to 3, wherein the registration means changes the type of the attribute data to be registered as index data of the photographing data according to the shooting environment at the time of shooting.
  • 5. The image processing apparatus according to any one of 1 to 4, wherein the registration means changes the amount of the attribute data to be registered as index data of the photographing data according to the shooting environment at the time of shooting.
  • 6. The image processing apparatus according to any one of 1 to 5, wherein the registration means changes the type of the attribute data to be registered as index data of the photographing data according to the object appearing in the photographing data.
  • 7. The image processing apparatus according to any one of 1 to 6, wherein the registration means changes the amount of the attribute data to be registered as index data of the photographing data according to the object appearing in the photographing data.
  • 8. The image processing apparatus according to any one of 1 to 7, wherein the attribute data includes at least one of information on a person's physique, a feature of clothes, the color of a moving object, a vehicle type, and a license plate.
  • 9. The image processing apparatus according to any one of 1 to 8, wherein the image processing apparatus includes one or both of a moving-body-mounted device and a portable device.
  • 10. A program causing a computer to function as: photographing means attached to a moving body, for photographing the outside of the moving body; analyzing means for analyzing the photographing data generated by the photographing means and generating attribute data of an object appearing in the photographing data; registration means for registering the object specified by the attribute data as index data of the photographing data; request acquisition means for acquiring a request for the photographing data in which an object meeting a predetermined condition appears; determining means for determining whether an object meeting the predetermined condition is registered in the index data; and transmitting means for transmitting at least a part of the photographing data to an external device when it is determined that an object meeting the predetermined condition is registered in the index data.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Library & Information Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Studio Devices (AREA)
  • Information Transfer Between Computers (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The present invention provides an image processing device (10) comprising: an image capture part (11) attached to a mobile body for capturing images of the outside of the mobile body; an analysis part (12) for analyzing captured image data generated by the image capture part (11) and generating attribute data relating to an object appearing in the captured image data; a registration part (13) for registering the object identified with the attribute data as index data relating to the captured image data; a request acquisition part (15) for acquiring a request for captured image data wherein an object matching a prescribed condition appears; a determination part (16) for determining whether the object matching the prescribed condition is registered in the index data; and a transmission part (17) for transmitting at least a portion of the captured image data to an external device if it is determined that the object matching the prescribed condition is registered in the index data.

Description

Image processing apparatus, image processing method, and program
The present invention relates to an image processing apparatus, an image processing method, and a program.
By installing surveillance cameras at various locations, it is possible to record and check events such as incidents and accidents occurring at those locations, as well as moving objects, people, animals, and the like passing through them. However, installing a large number of surveillance cameras and maintaining them after installation requires enormous cost. A means for solving this problem is disclosed in Patent Document 1.
Patent Document 1 discloses an image recording device attached to a car, the device including: image acquiring means for acquiring an image; image encoding means for encoding the acquired image; generating means for calculating the position and time at which the image was acquired and generating position information and time information; recording means for recording the encoded image on a recording medium in association with the position information and the time information; extracting means for extracting, from among the images recorded by the recording means, an image based on the position information and time information associated with it; and output means for outputting the extracted image to the outside.
JP 2010-158038 A
However, in the case of the device described in Patent Document 1, which extracts images based only on position information and time information, the extracted images are insufficiently narrowed down. In this case, a large number of images are extracted and transmitted to the outside, increasing the communication load.
An object of the present invention is to reduce the communication load in a system that transmits to the outside a predetermined image generated by photographing means attached to a moving body.
According to the present invention, there is provided an image processing apparatus comprising:
photographing means attached to a moving body, for photographing the outside of the moving body;
analyzing means for analyzing the photographing data generated by the photographing means and generating attribute data of an object appearing in the photographing data;
registration means for registering the object specified by the attribute data as index data of the photographing data;
request acquisition means for acquiring a request for the photographing data in which an object meeting a predetermined condition appears;
determining means for determining whether an object meeting the predetermined condition is registered in the index data; and
transmitting means for transmitting at least a part of the photographing data to an external device when it is determined that an object meeting the predetermined condition is registered in the index data.
Further, according to the present invention, there is provided an image processing method executed by a computer, the method comprising:
a photographing step of photographing, with photographing means attached to a moving body, the outside of the moving body;
an analyzing step of analyzing the photographing data generated in the photographing step and generating attribute data of an object appearing in the photographing data;
a registration step of registering the object specified by the attribute data as index data of the photographing data;
a request acquisition step of acquiring a request for the photographing data in which an object meeting a predetermined condition appears;
a determining step of determining whether an object meeting the predetermined condition is registered in the index data; and
a transmitting step of transmitting at least a part of the photographing data to an external device when it is determined that an object meeting the predetermined condition is registered in the index data.
Further, according to the present invention, there is provided a program causing a computer to function as:
photographing means attached to a moving body, for photographing the outside of the moving body;
analyzing means for analyzing the photographing data generated by the photographing means and generating attribute data of an object appearing in the photographing data;
registration means for registering the object specified by the attribute data as index data of the photographing data;
request acquisition means for acquiring a request for the photographing data in which an object meeting a predetermined condition appears;
determining means for determining whether an object meeting the predetermined condition is registered in the index data; and
transmitting means for transmitting at least a part of the photographing data to an external device when it is determined that an object meeting the predetermined condition is registered in the index data.
According to the present invention, the communication load can be reduced in a system that transmits to the outside a predetermined image captured by photographing means attached to a moving body.
The above and other objects, features, and advantages will become more apparent from the preferred embodiments described below and the accompanying drawings.
FIG. 1 is an example of a functional block diagram of the image processing system of the present embodiment.
FIG. 2 is a diagram illustrating a realization example of the image processing apparatus 10 of the present embodiment.
FIG. 3 is a diagram illustrating a realization example of the image processing apparatus 10 of the present embodiment.
FIG. 4 is a diagram illustrating a realization example of the image processing apparatus 10 of the present embodiment.
FIG. 5 is a diagram illustrating an example of a hardware configuration of the image processing apparatus 10 of the present embodiment.
FIG. 6 is an example of a functional block diagram of the image processing apparatus 10 of the present embodiment.
FIG. 7 is a diagram schematically illustrating an example of information processed by the image processing apparatus 10 of the present embodiment.
FIG. 8 is a sequence diagram illustrating an example of the processing flow of the image processing system of the present embodiment.
FIG. 9 is a flowchart illustrating an example of the processing flow of the image processing apparatus 10 of the present embodiment.
FIG. 10 is an example of a functional block diagram of the photographing unit 11 of the present embodiment.
FIG. 11 is a diagram schematically illustrating an example of information processed by the image processing apparatus 10 of the present embodiment.
FIG. 12 is a diagram for describing an example of the processing of the image processing apparatus 10 of the present embodiment.
<First Embodiment>
"Overall Image and Overview of the Image Processing System"
First, an overall image of the image processing system of the present embodiment will be described with reference to FIG. 1. As shown in FIG. 1, the image processing system has a plurality of image processing apparatuses 10 and a request device 20. Each of the plurality of image processing apparatuses 10 and the request device 20 are configured to be able to communicate with each other.
The plurality of image processing apparatuses 10 are used by being attached to different moving bodies. A moving body is an object configured to be movable on the ground, under the ground, on the water, under the water, or in the air; examples include an automobile, a motorcycle, a bicycle, a train, a ship, an airplane, and an unmanned aerial vehicle, but the moving body is not limited to these.
For example, the image processing apparatus 10 may be a moving-body-mounted device 1 as shown in FIG. 2. The moving-body-mounted device 1 is mounted on a moving body and is basically always attached to a predetermined position of the moving body.
As another example, the image processing apparatus 10 may be a portable device 2 as shown in FIG. 3. Examples of the portable device 2 include, but are not limited to, a smartphone, a tablet terminal, and a mobile phone. For example, when the owner of the portable device 2 travels on a moving body, the portable device 2 is brought inside the moving body and attached to a predetermined position of the moving body.
As another example, the image processing apparatus 10 may be realized by the moving-body-mounted device 1 and the portable device 2 operating in conjunction with each other, as shown in FIG. 4. The moving-body-mounted device 1 is mounted on a moving body and basically always attached to a predetermined position of the moving body. The portable device 2 is brought inside the moving body when its owner travels on the moving body and is attached to a predetermined position of the moving body. The moving-body-mounted device 1 and the portable device 2 are connected by wire and/or wirelessly and execute predetermined processing in conjunction with each other.
The image processing apparatus 10 has: means attached to a moving body, for photographing the outside of the moving body; means for analyzing the photographing data and generating attribute data of an object appearing in the photographing data; and means for registering the object specified by the attribute data as index data of the photographing data.
The image processing apparatus 10 also has: means for acquiring a request for photographing data in which an object meeting a predetermined condition appears; means for determining whether an object meeting the predetermined condition is registered in the index data; and means for transmitting at least a part of the photographing data to the request device 20 (external device) when it is determined that such an object is registered in the index data. The timing at which the photographing data is transmitted to the request device 20 is, for example, the timing at which the user operates the image processing apparatus 10 to permit the transmission of the photographing data, but is not limited to this.
According to such an image processing apparatus 10, photographing data in which a predetermined object appears can be extracted from the photographing data generated by the photographing means attached to the moving body and transmitted to the request device 20. Since the photographing data can be sufficiently narrowed down based on whether a predetermined object appears in it, the image processing apparatus 10 can reduce the amount of photographing data transmitted to the request device 20 in response to a request. As a result, the communication load between the image processing apparatus 10 and the request device 20 can be reduced.
"Configuration of the Image Processing Apparatus 10"
Next, the configuration of the image processing apparatus 10 will be described in detail. First, an example of the hardware configuration of the image processing apparatus 10 will be described. Each functional unit included in the image processing apparatus 10 of the present embodiment is realized by any combination of hardware and software, centered on the CPU (Central Processing Unit) of an arbitrary computer, a memory, a program loaded into the memory, a storage unit such as a hard disk that stores the program (which can store not only programs stored before the apparatus is shipped, but also programs downloaded from a storage medium such as a CD (Compact Disc) or from a server on the Internet), and a network connection interface. It will be understood by those skilled in the art that there are various modifications of the method and apparatus for realizing them.
FIG. 5 is a block diagram illustrating the hardware configuration of the image processing apparatus 10 of the present embodiment. As shown in FIG. 5, the image processing apparatus 10 has a processor 1A, a memory 2A, an input/output interface 3A, a peripheral circuit 4A, and a bus 5A. The peripheral circuit 4A includes various modules. The image processing apparatus 10 may not have the peripheral circuit 4A. Note that the image processing apparatus 10 may be composed of a plurality of physically separated devices, in which case each of the plurality of devices can have the above hardware configuration.
The bus 5A is a data transmission path through which the processor 1A, the memory 2A, the peripheral circuit 4A, and the input/output interface 3A mutually transmit and receive data. The processor 1A is an arithmetic processing device such as a CPU or a GPU (Graphics Processing Unit). The memory 2A is, for example, a RAM (Random Access Memory) or a ROM (Read Only Memory). The input/output interface 3A includes interfaces for acquiring information from an input device, an external device, an external server, an external sensor, a camera, and the like, and interfaces for outputting information to an output device, an external device, an external server, and the like. The input device is, for example, a keyboard, a mouse, or a microphone. The output device is, for example, a display, a speaker, a printer, or a mailer. The processor 1A can issue commands to each module and perform calculations based on their calculation results.
FIG. 6 shows an example of a functional block diagram of the image processing apparatus 10. As illustrated, the image processing apparatus 10 has a photographing unit 11, an analysis unit 12, a registration unit 13, a data storage unit 14, a request acquisition unit 15, a determination unit 16, and a transmission unit 17. Note that the image processing apparatus 10 may not have the data storage unit 14, in which case an external device configured to be able to communicate with the image processing apparatus 10 has the data storage unit 14.
The photographing unit 11 is attached to a moving body and photographs the outside of the moving body. The moving body is defined as described above, so the description is omitted here. The photographing unit 11 may generate moving image data, or may capture still image data regularly or irregularly.
The analysis unit 12 analyzes the photographing data (moving image data, still image data, etc.) generated by the photographing unit 11 and generates attribute data of an object appearing in the photographing data.
 When the photographing unit 11 generates moving image data, the analysis unit 12 may treat all of the frame data included in the moving image data as analysis targets, or only part of the frame data, for example one frame every predetermined number of frames. When the photographing unit 11 generates still image data, the analysis unit 12 may treat all of the still image data as analysis targets.
 Here, the processing that the analysis unit 12 performs on the data to be analyzed will be described. First, the analysis unit 12 extracts objects appearing in an image. When a plurality of objects appear in the image, the analysis unit 12 can extract all of them. Object extraction can be realized by using, for example, contour extraction processing or template matching processing, but the means of realization is not limited. Examples of objects extracted from an image include, but are not limited to, persons, other animals, moving bodies, and light-emitting objects.
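 As one hedged illustration of the template matching mentioned above, the following pure-Python sketch slides a small template over an image grid and reports every position where the pixels match exactly. A real implementation would operate on camera frames with a similarity threshold rather than exact equality, and none of these function names come from the specification.

```python
def match_template(image, template):
    """Return (row, col) positions where `template` exactly matches `image`.

    `image` and `template` are 2-D lists of pixel values. This models the
    template matching step used for object extraction; production code would
    use a normalized similarity score instead of exact equality.
    """
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    hits = []
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            if all(image[r + dr][c + dc] == template[dr][dc]
                   for dr in range(th) for dc in range(tw)):
                hits.append((r, c))
    return hits

image = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
]
print(match_template(image, [[1, 1], [1, 1]]))  # [(1, 1)]
```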
 After extracting objects from the image, the analysis unit 12 generates attribute data indicating the appearance features of each extracted object. The analysis unit 12 analyzes the image according to rules predetermined for each kind of object and generates attribute data of the extracted objects.
 For example, when an extracted object is a person, the analysis unit 12 may detect the person's face and generate attribute data indicating gender, age, and the like from the appearance features of the face. The analysis unit 12 may also generate attribute data indicating the presence or absence of items worn on the face (e.g., glasses, sunglasses, a mask). The analysis unit 12 may further generate attribute data indicating features of the person's clothing (e.g., the colors and shapes of upper-body and lower-body clothing), or attribute data such as the person's build (e.g., average, slim, heavy) and height. In addition, the analysis unit 12 may identify who the person in the image is by collating reference data, held in advance and indicating the appearance features of each of a plurality of persons, with the appearance features of the person in the image, and may generate attribute data identifying that person.
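 The person attributes above can be carried as a flat record; the sketch below only shows a plausible data shape. The `face` and `clothing` inputs are stand-ins for the outputs of real face- and clothing-analysis models, which the specification does not name.

```python
def build_person_attributes(face, clothing):
    """Assemble attribute data for one detected person.

    `face` and `clothing` are assumed to be the outputs of upstream
    estimators (hypothetical here); this function only packages them
    into the attribute-data record the registration unit consumes.
    """
    attrs = {"type": "person"}
    attrs["gender"] = face.get("gender")          # e.g. "male"
    attrs["age_band"] = face.get("age_band")      # e.g. "30s"
    attrs["glasses"] = face.get("glasses", False)
    attrs["upper_color"] = clothing.get("upper_color")
    return attrs

print(build_person_attributes(
    {"gender": "male", "age_band": "30s", "glasses": True},
    {"upper_color": "blue"},
))
```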
 When an extracted object is a moving body, the analysis unit 12 may generate attribute data indicating the kind of moving body, the vehicle model, the maker, the color of the moving body, the information written on its license plate, the number of occupants, and the like.
 When an extracted object is an animal other than a human, the analysis unit 12 may generate attribute data indicating the kind of animal, its body color, and the like.
 When an extracted object is a light-emitting object, the analysis unit 12 may generate attribute data indicating the shape, color, and size of the object, the fact that it emits light, and the like.
 The analysis unit 12 can generate the attribute data described above by analyzing images with any image analysis technology.
 The registration unit 13 registers the objects specified by the attribute data generated by the analysis unit 12 as index data of the photographing data.
 FIG. 7 schematically shows an example of the index data. In the illustrated example, image IDs (identifiers) and the objects specified by the attribute data are registered in association with each other. An image ID is information identifying each image that the analysis unit 12 has analyzed.
 The data storage unit 14 shown in FIG. 6 stores the index data. The data storage unit 14 also stores the photographing data generated by the photographing unit 11. When the photographing unit 11 generates moving image data, the data storage unit 14 may store the photographing data classified into a plurality of groups, or may store it without classification. Examples of classification include, but are not limited to, treating the data from the start of shooting to the end of shooting as one group, and grouping based on shooting date and time (e.g., one group per shooting day).
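 A minimal in-memory sketch of this index, assuming image IDs map to sets of object labels; the class and method names are illustrative, not taken from the specification:

```python
class IndexStore:
    """Index data: image ID -> labels of objects appearing in that image."""

    def __init__(self):
        self._index = {}

    def register(self, image_id, labels):
        # Registration unit 13: record the objects specified by the
        # attribute data as index data for this image.
        self._index.setdefault(image_id, set()).update(labels)

    def images_with(self, label):
        # Return the image IDs whose index entry contains `label`.
        return sorted(i for i, ls in self._index.items() if label in ls)

store = IndexStore()
store.register("img-001", {"person", "male", "30s", "glasses"})
store.register("img-002", {"car", "white"})
print(store.images_with("person"))  # ['img-001']
```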
 The request acquisition unit 15 acquires a request for photographing data in which an object meeting a predetermined condition appears.
 For example, an operator searching for photographing data in which a predetermined object appears inputs a predetermined condition via an input device of the request device 20 or an input device connected to the request device 20, thereby requesting photographing data in which an object meeting that condition appears. In response to this input, the request device 20 transmits a request for photographing data in which an object meeting the predetermined condition appears to a plurality of image processing apparatuses 10 registered in advance. The request acquisition unit 15 acquires the request transmitted from the request device 20 in this manner.
 The predetermined condition specifies an object with the same kinds of information as the attribute data generated by the analysis unit 12. For example, the predetermined condition may be expressed as a logical expression in which pieces of information of the same kinds as the attribute data are connected by logical operators. One example of such a condition is "person" + "30s" + "male" + "wearing glasses".
 The determination unit 16 determines whether an object meeting the predetermined condition is registered in the index data. An "object meeting the predetermined condition" may include not only an object that satisfies the condition completely but also an object that satisfies it to at least a predetermined level. For example, when the predetermined condition combines a plurality of attribute data, such as "person" + "30s" + "male" + "wearing glasses", an object satisfying the condition to at least a predetermined level may be an object that satisfies at least a predetermined proportion (e.g., 60%, 70%) of those attribute data.
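 The partial-match rule can be sketched as a set-overlap ratio; the 60% threshold below is just the example figure from the text, and the function name is illustrative:

```python
def meets_condition(object_labels, condition_labels, min_ratio=0.6):
    """Return True if the object satisfies at least `min_ratio` of the
    attribute data making up the predetermined condition."""
    matched = len(set(object_labels) & set(condition_labels))
    return matched / len(condition_labels) >= min_ratio

condition = {"person", "30s", "male", "glasses"}
# 3 of the 4 condition terms match -> ratio 0.75 >= 0.6
print(meets_condition({"person", "30s", "male", "hat"}, condition))  # True
print(meets_condition({"car", "white"}, condition))                  # False
```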
 When it is determined that an object meeting the predetermined condition is registered in the index data, the transmission unit 17 transmits at least part of the photographing data to the request device 20.
 When the photographing unit 11 generates moving image data (photographing data), the transmission unit 17 may transmit all of the moving image data generated by the photographing unit 11 and stored in the data storage unit 14 to the request device 20. With this configuration, however, the amount of transmitted and received data can become enormous.
 The transmission unit 17 may therefore transmit only part of the moving image data generated by the photographing unit 11 and stored in the data storage unit 14 to the request device 20. For example, when the data storage unit 14 stores the moving image data classified into a plurality of groups, the transmission unit 17 may transmit to the request device 20 only the moving image data of the groups that contain an image ID (see FIG. 7) associated in the index data with an object meeting the predetermined condition, that is, the groups in which such an object appears.
 Alternatively, the transmission unit 17 may transmit to the request device 20 only the moving image data for a predetermined time before and after each image ID (see FIG. 7) associated in the index data with an object meeting the predetermined condition.
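 One way to realize this before/after-window idea is to expand each matched frame time into an interval and merge overlaps before transmitting. The sketch below assumes matched frames are identified by timestamps in seconds; the names and the 30-second defaults are illustrative.

```python
def clip_windows(match_times, before=30.0, after=30.0):
    """Turn matched-frame timestamps into merged [start, end] intervals
    covering `before` seconds before and `after` seconds after each match."""
    intervals = sorted((t - before, t + after) for t in match_times)
    merged = []
    for start, end in intervals:
        if merged and start <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], end)  # overlapping: extend
        else:
            merged.append([start, end])
    return [tuple(iv) for iv in merged]

print(clip_windows([100.0, 110.0, 500.0]))  # [(70.0, 140.0), (470.0, 530.0)]
```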
"Processing flow of the image processing system"
 Next, an example of the processing flow of the image processing system will be described with reference to the sequence diagram of FIG. 8.
 First, an operator searching for photographing data in which a predetermined object appears inputs a predetermined condition via an input device of the request device 20 or an input device connected to the request device 20, thereby requesting photographing data in which an object meeting that condition appears. In response to this input, the request device 20 transmits a request for photographing data in which an object meeting the predetermined condition appears to a plurality of image processing apparatuses 10 registered in advance (S10).
 Next, the image processing apparatus 10 determines whether an object meeting the predetermined condition is registered in the index data of the photographing data (see FIG. 7) (S11).
 Then, the image processing apparatus 10 transmits the determination result to the request device 20 (S12). For example, when an object meeting the predetermined condition is registered in the index data of the photographing data, the image processing apparatus 10 can transmit at least part of the photographing data to the request device 20. When no object meeting the predetermined condition is registered in the index data, the image processing apparatus 10 can notify the request device 20 that no photographing data meeting the condition exists.
 Note that, before transmitting at least part of the photographing data to the request device 20, the image processing apparatus 10 may output information inquiring whether the photographing data may be transmitted to the request device 20. The image processing apparatus 10 may then transmit at least part of the photographing data to the request device 20 after obtaining an answer permitting transmission; when it obtains an answer refusing transmission, it does not transmit the photographing data to the request device 20. The image processing apparatus 10 outputs this inquiry to the user who owns the image processing apparatus 10 and acquires the user's answer to it. For example, the image processing apparatus 10 may display the inquiry on a display it has or a display connected to it, and may accept input of the answer via an input device it has or an input device connected to it. Examples of the input device include, but are not limited to, a touch-panel display, physical buttons, and a microphone.
 In addition to the inquiry information, the image processing apparatus 10 may output information indicating the reward that can be obtained when the photographing data is transmitted to the request device 20. The image processing apparatus 10 can receive the information indicating the reward from the request device 20.
 The inquiry and the reward information may be output by displaying them on a display of the image processing apparatus 10 or a display connected to it, or by sending an e-mail to a pre-registered e-mail address.
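 The consent check described above amounts to gating transmission on the owner's answer. In this sketch the `ask` callback stands in for whatever display or e-mail channel carries the inquiry, and all names are illustrative assumptions:

```python
def transmit_if_permitted(clip, reward, ask, send):
    """Ask the owner before sending; `ask` returns True ("may transmit")
    or False ("may not transmit"), and the reward is shown with the inquiry."""
    prompt = f"Send matching footage to requester for reward {reward}?"
    if ask(prompt):
        send(clip)
        return True
    return False

sent = []
ok = transmit_if_permitted(
    "clip-42", "500 pt",
    ask=lambda msg: True,  # owner answers "may transmit"
    send=sent.append,
)
print(ok, sent)  # True ['clip-42']
```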
 Next, an example of the processing by which the image processing apparatus 10 generates the index data will be described with reference to the flowchart in FIG. 9.
 Upon detecting an index data generation timing (Yes in S20), the image processing apparatus 10 analyzes the photographing data to generate attribute data (S21) and registers it as index data (S22).
 The index data generation timing may be, for example, the timing at which the photographing unit 11 finishes photographing. In this case, each time the photographing unit 11 finishes photographing, the image processing apparatus 10 analyzes the newly generated photographing data by batch processing, generates attribute data, and registers it as index data. Performing the analysis when the photographing has finished, rather than in parallel with the photographing, reduces the processing load on the image processing apparatus 10.
 Alternatively, the index data generation timing may be, for example, the timing at which the photographing unit 11 starts photographing. In this case, when the photographing unit 11 starts photographing, the image processing apparatus 10 analyzes the newly generated photographing data in real time, generates attribute data, and registers it as index data. Such real-time processing makes it possible to respond to requests for photographing data issued immediately after a predetermined event such as an incident or accident occurs.
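 The two timings can be modeled as event-driven modes of a single indexer; the event names and the `analyze` stub below are illustrative, not from the specification:

```python
class Indexer:
    """Index frames either as they arrive (real-time mode) or in one
    batch when recording stops (batch mode)."""

    def __init__(self, analyze, realtime=False):
        self.analyze = analyze   # frame -> attribute data (stub here)
        self.realtime = realtime
        self.buffer = []
        self.index = []

    def on_frame(self, frame):
        if self.realtime:
            self.index.append(self.analyze(frame))  # index immediately
        else:
            self.buffer.append(frame)               # defer to batch

    def on_recording_stopped(self):
        # Batch mode: analyze everything recorded in this session at once.
        self.index.extend(self.analyze(f) for f in self.buffer)
        self.buffer.clear()

batch = Indexer(analyze=str.upper, realtime=False)
for f in ["a", "b"]:
    batch.on_frame(f)
print(batch.index)             # [] until recording stops
batch.on_recording_stopped()
print(batch.index)             # ['A', 'B']
```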
"Effects"
 According to the image processing apparatus 10 of the present embodiment described above, photographing data in which a predetermined object appears can be extracted from the photographing data generated by the photographing means attached to the moving body and transmitted to the request device 20. Because the photographing data can be sufficiently narrowed down by whether a predetermined object appears in it, the image processing apparatus 10 can reduce the photographing data it transmits to the request device 20 in response to a request. As a result, the communication load between the image processing apparatus 10 and the request device 20 can be reduced.
 Furthermore, when the image processing apparatus 10 of the present embodiment generates moving image data as the photographing data, it can transmit to the request device 20, in response to a request, not all of the moving image data but only a part including the portion in which the predetermined object appears. As a result, the communication load between the image processing apparatus 10 and the request device 20 can be reduced.
 With an image processing system having the image processing apparatus 10 and the request device 20 of the present embodiment, it becomes possible, for example, to efficiently collect moving image data capturing the scene of an incident or accident. It also becomes possible to efficiently collect moving image data that happens to capture a sought person (e.g., a criminal, a runaway, a missing person) or a lost animal, or that happens to capture a meteorite, an unidentified object, or the like.
 In the image processing system of the present embodiment, the image processing apparatus 10, which generates and stores the images, determines whether it holds relevant data and transmits photographing data to the request device 20 only as necessary. Compared with a configuration in which the image processing apparatus 10 transmits all of its stored photographing data to the request device 20 and the request device 20 determines whether relevant data exists, this reduces the communication load between the image processing apparatus 10 and the request device 20.
 Moreover, since the image processing apparatus 10 determines whether relevant data exists in response to a request, the owner of the image processing apparatus 10 can learn whether such data exists without the tedious work of checking the images visually.
 When the image processing apparatus 10 determines that it holds relevant data, it can ask the owner of the image processing apparatus 10 whether the photographing data may be transmitted and act according to the answer. This suppresses the inconvenience of photographing data being unintentionally transmitted to the outside against the owner's wishes.
 In addition to the above inquiry, the image processing apparatus 10 can output the reward obtained when the photographing data is provided. By outputting the reward information at an appropriate timing, the owner of the image processing apparatus 10 can quickly decide whether to provide the data.
<Second embodiment>
"Overall picture and overview of the image processing system"
 The overall picture of the image processing system is the same as in the first embodiment. The image processing apparatus 10 of the present embodiment can extract, from the photographing data generated by the photographing means attached to the moving body, photographing data that was captured at a predetermined position and/or at a predetermined date and time and in which a predetermined object appears, and transmit it to the request device 20. Since the photographing data can be sufficiently narrowed down using the photographing position and/or the photographing date and time in addition to whether a predetermined object appears in it, the image processing apparatus 10 can reduce the photographing data it transmits to the request device 20 in response to a request.
"Configuration of the image processing apparatus 10"
 An example of the hardware configuration of the image processing apparatus 10 is the same as in the first embodiment.
 An example of the functional block diagram of the image processing apparatus 10 is shown in FIG. 6, as in the first embodiment.
 As shown in FIG. 10, the photographing unit 11 has a position detection unit 111 and a date-and-time detection unit 112. The position detection unit 111 detects the current position, for example, but not limited to, by using GPS (global positioning system). The date-and-time detection unit 112 detects the current date and time, for example, but not limited to, by using a clock built into the image processing apparatus 10.
 Using the values detected by the position detection unit 111 and the date-and-time detection unit 112, the photographing unit 11 can associate the photographing position and the photographing date and time with the photographing data. The other configurations of the photographing unit 11 are the same as in the first embodiment.
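 Associating each capture with position and time can be sketched as follows; the provider callbacks stand in for the GPS receiver and the internal clock, and the record layout is only an assumption consistent with FIG. 11:

```python
def tag_capture(image_id, get_position, get_datetime):
    """Attach the current position and date/time to one capture record.

    `get_position` and `get_datetime` model the position detection unit 111
    and the date-and-time detection unit 112, respectively.
    """
    return {
        "image_id": image_id,
        "position": get_position(),   # e.g. (latitude, longitude)
        "datetime": get_datetime(),   # e.g. an ISO 8601 string
    }

record = tag_capture(
    "img-100",
    get_position=lambda: (35.6812, 139.7671),
    get_datetime=lambda: "2018-08-20T13:00:00",
)
print(record["position"], record["datetime"])
```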
 The configuration of the analysis unit 12 is the same as in the first embodiment.
 The registration unit 13 registers the photographing position and the photographing date and time of the photographing data generated by the photographing unit 11 as index data of the photographing data.
 FIG. 11 schematically shows an example of the index data. In the illustrated example, date-and-time information indicating the photographing date and time, position information indicating the photographing position, an image ID, and the objects specified by the attribute data are registered in association with one another.
 The configuration of the data storage unit 14 is the same as in the first embodiment.
 The request acquisition unit 15 acquires a request for photographing data in which an object meeting a predetermined condition appears and which further specifies at least one of the photographing position and the photographing date and time. The other configurations of the request acquisition unit 15 are the same as in the first embodiment.
 For example, an operator searching for photographing data in which a predetermined object appears inputs the object condition, the photographing position, the photographing date and time, and the like via an input device of the request device 20 or an input device connected to the request device 20, thereby requesting photographing data.
 In this input, the photographing date and time may be specified as a time range, such as "August 20, 2018, from 13:00 to 14:00", or pinpointed, such as "August 20, 2018, 13:00".
 Also in this input, the photographing location may be specified by a place name, such as "XX city" or "XX town". Alternatively, the request device 20 may display a map and accept input specifying the photographing location via a UI (user interface) screen that accepts input designating a predetermined area on the map. For example, a frame whose display position, size, shape, and the like change based on user input may be displayed on the map, and the area inside the frame may be designated as the photographing location.
 Based on the index data, the determination unit 16 determines whether the data storage unit 14 stores photographing data in which an object meeting the predetermined condition appears and whose photographing position and/or photographing date and time match the specified contents.
 For example, the determination unit 16 determines whether an object meeting the predetermined condition is registered in the index data. If so, it then determines whether the photographing position and/or photographing date and time of the image IDs (see FIG. 11) associated with that object match the photographing position and/or photographing date and time specified in the request.
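 This two-stage check can be sketched over index rows shaped like FIG. 11; the bounding-box area test, the ISO 8601 date strings, and the field names are assumptions for illustration:

```python
def find_matching_ids(rows, condition, time_range=None, area=None):
    """Filter index rows first by object labels, then by the requested
    date/time range and position bounding box (both optional)."""
    hits = []
    for row in rows:
        if not condition <= row["labels"]:   # object condition first
            continue
        if time_range and not (time_range[0] <= row["datetime"] <= time_range[1]):
            continue                         # ISO 8601 strings compare lexically
        if area:
            (lat_min, lon_min), (lat_max, lon_max) = area
            lat, lon = row["position"]
            if not (lat_min <= lat <= lat_max and lon_min <= lon <= lon_max):
                continue
        hits.append(row["image_id"])
    return hits

rows = [
    {"image_id": "img-1", "labels": {"person", "male"},
     "datetime": "2018-08-20T13:30:00", "position": (35.68, 139.76)},
    {"image_id": "img-2", "labels": {"car"},
     "datetime": "2018-08-20T13:30:00", "position": (35.68, 139.76)},
]
print(find_matching_ids(
    rows, {"person"},
    time_range=("2018-08-20T13:00:00", "2018-08-20T14:00:00"),
    area=((35.0, 139.0), (36.0, 140.0)),
))  # ['img-1']
```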
 The configuration of the transmission unit 17 is the same as in the first embodiment.
"Processing flow of the image processing system"
 An example of the processing flow of the image processing system is the same as in the first embodiment.
 The index data generation timing of the present embodiment may be the same as in the examples described in the first embodiment or, as another example, may be the timing at which a photographing data request (S10 in FIG. 8) is received. In this case, the image processing apparatus 10 processes only the photographing data generated at the photographing date and time specified in the request, generates attribute data, and registers it as index data. Keeping the amount of photographing data processed in response to a request to a minimum in this way reduces the processing load.
"Effects"
 According to the image processing apparatus 10 of the present embodiment described above, the same effects as in the first embodiment can be realized.
 Furthermore, according to the image processing apparatus 10 of the present embodiment, the desired photographing data can be sufficiently narrowed down using the photographing position and/or the photographing date and time in addition to whether a predetermined object appears in it, so the image processing apparatus 10 can reduce the photographing data it transmits to the request device 20 in response to a request. As a result, the communication load between the image processing apparatus 10 and the request device 20 can be reduced.
<Third embodiment>
"Overall picture and overview of the image processing system"
 The overall picture of the image processing system is the same as in the first and second embodiments. The image processing apparatus 10 of the present embodiment can change the kinds and amount of attribute data registered as index data according to the photographing environment at the time the photographing data was captured, the objects appearing in the photographing data, and the like.
"Configuration of the image processing apparatus 10"
 An example of the hardware configuration of the image processing apparatus 10 is the same as in the first and second embodiments.
 An example of the functional block diagram of the image processing apparatus 10 is shown in FIG. 6, as in the first and second embodiments.
 The configurations of the photographing unit 11, the data storage unit 14, the request acquisition unit 15, the determination unit 16, and the transmission unit 17 are the same as in the first and second embodiments.
 The analysis unit 12 changes at least one of the kinds and the amount of attribute data it generates according to the photographing environment at the time the photographing data was captured.
 例えば、解析部12は、撮影環境毎に生成する属性データの種類や量を予め定めた生成ルールに基づき、属性データを生成してもよい。属性データの量の増減は、生成する属性データの種類の数の増減で実現されてもよいし、動画データの中の処理対象とするフレームデータの数の増減で実現されてもよい。 For example, the analysis unit 12 may generate attribute data based on a generation rule in which the type and amount of attribute data generated for each shooting environment are determined in advance. The increase or decrease in the amount of attribute data may be realized by increasing or decreasing the number of types of attribute data to be generated, or may be realized by increasing or decreasing the number of frame data to be processed in the moving image data.
 The registration unit 13 then changes at least one of the type and the amount of the attribute data registered as index data according to the shooting environment at the time the photographing data was generated.
 Examples of the shooting environment include the shooting position, shooting time, shooting date, day of the week, weather at the time of shooting, and the specifications of the photographing device (such as image resolution). The analysis unit 12 can acquire information indicating the shooting environment by any means.
 For example, in an area, time slot, date, or day of the week where the number of people or moving bodies is relatively large, the amount of generated attribute data may be made relatively small. Photographing data generated in such an environment contains many people and moving bodies; analyzing it to generate and register many types of attribute data would increase the processing load of the image processing apparatus 10 and could strain its storage capacity. Therefore, when photographing data is generated in such an environment, the load on the image processing apparatus 10 is reduced by generating fewer types of attribute data or by processing fewer frames.
 Conversely, in an area, time slot, date, or day of the week where the number of people and moving bodies is relatively small, the above problems are unlikely to arise, so the amount of generated attribute data may be made relatively large.
 Similarly, in a shooting environment in which it is difficult to capture objects clearly, for example at night, on a rainy day, or when the photographing device has low specifications (e.g., image resolution below a predetermined level), the amount of generated attribute data may be made relatively small. Photographing data generated in such an environment has low utility value, so the load on the image processing apparatus 10 is reduced by generating fewer types of attribute data or by processing fewer frames.
 Conversely, in a shooting environment in which objects can be captured clearly, for example in the daytime, on a sunny day, or when the photographing device has high specifications (e.g., image resolution at or above a predetermined level), the amount of generated attribute data may be made relatively large. Photographing data generated in such an environment has high utility value, so many types and a large amount of attribute data are generated and registered.
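The environment-dependent generation rule described above can be sketched as a simple lookup. The following is an illustrative sketch only; the function names, attribute type names, and thresholds are assumptions for illustration and are not defined in the specification.

```python
# Hypothetical sketch of an environment-dependent generation rule:
# favorable conditions (clear images, few objects) yield the full set of
# attribute types and every frame; unfavorable conditions yield a reduced
# set and sparser frame sampling. All names/thresholds are illustrative.

FULL_TYPES = ["build", "clothing", "color", "vehicle_model", "license_plate"]
REDUCED_TYPES = ["build", "color"]

def generation_rule(environment):
    """Return (attribute types to generate, frame sampling stride)."""
    favorable = (
        environment.get("daytime", False)
        and environment.get("weather") == "sunny"
        and environment.get("resolution", 0) >= 1080   # assumed threshold
        and not environment.get("crowded", False)
    )
    if favorable:
        return FULL_TYPES, 1    # all types, process every frame
    return REDUCED_TYPES, 10    # fewer types, process every 10th frame

types, stride = generation_rule(
    {"daytime": True, "weather": "sunny", "resolution": 1080})
```

A rainy-night or crowded-area environment would fall into the reduced branch, matching the load-reduction behavior described above.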
 The analysis unit 12 can also change at least one of the type and the amount of the attribute data it generates according to the objects appearing in the photographing data.
 For example, the analysis unit 12 may generate attribute data based on a generation rule that predetermines, for each kind of object appearing in the photographing data, the type and amount of attribute data to generate. As above, the amount of attribute data may be increased or decreased by changing the number of attribute data types to generate, or by changing the number of frames of the moving image data to be processed.
 The registration unit 13 then changes at least one of the type and the amount of the attribute data registered as index data according to the objects appearing in the photographing data.
 For example, when a fast-moving body appears in the photographing data, the amount of generated attribute data may be made relatively large by increasing the number of frames to be processed (e.g., processing every frame, or every few frames). In this way, the objects appearing in the photographing data can be registered in the index data without omission.
 Conversely, when no fast-moving body appears in the photographing data, the amount of generated attribute data may be made relatively small by decreasing the number of frames to be processed (e.g., processing only every several frames). When only slow-moving objects appear, they can still be extracted and registered in the index data without omission even with fewer processed frames.
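The speed-dependent frame sampling described above can be sketched as follows. This is an illustrative sketch only; the speed threshold and stride values are assumptions, not values from the specification.

```python
# Hypothetical sketch of speed-dependent frame sampling: when a fast-moving
# object is detected, process every frame so nothing is missed; otherwise
# skip frames to reduce load. Threshold and strides are illustrative.

FAST_SPEED_THRESHOLD = 30.0  # assumed threshold, e.g. km/h

def frame_stride(detected_speeds):
    """Choose the frame-skipping interval from the fastest detected object."""
    if detected_speeds and max(detected_speeds) >= FAST_SPEED_THRESHOLD:
        return 1   # process all frames
    return 5       # process every 5th frame

def frames_to_process(num_frames, detected_speeds):
    """Indices of the frames the analysis unit would actually process."""
    return list(range(0, num_frames, frame_stride(detected_speeds)))

print(frames_to_process(10, [12.0, 45.0]))  # fast object → all 10 frames
print(frames_to_process(10, [8.0]))         # slow objects only → [0, 5]
```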
 Here, a specific example of changing the type and amount of attribute data registered as index data according to the shooting environment and the objects appearing in the photographing data is described. As shown in FIG. 12, the types of attribute data that can be registered as index data are defined in advance for each kind of object (person, vehicle, etc.) appearing in the photographing data. When the amount of attribute data to be generated is relatively large, the analysis unit 12 generates all or most of the illustrated types of attribute data. When the amount is relatively small, the analysis unit 12 generates only a subset of the illustrated types. When generating only a subset, the analysis unit 12 may be configured to generate attribute data that makes it harder to identify an individual person or object (the attribute data positioned on the "vague" side in the figure). This reduces both the data volume and the processing load on the computer. Which types of attribute data are generated for each shooting environment and each kind of object is determined in advance, and the analysis unit 12 decides the types to generate based on this determination.
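The vague-to-specific ordering suggested by FIG. 12 can be sketched as a per-object list from which a prefix is taken when the amount must be reduced. The orderings and attribute names below are illustrative assumptions; the figure itself is not reproduced here.

```python
# Hypothetical sketch of per-object attribute selection: each object kind
# has attribute types ordered from "vague" (hard to identify an individual)
# to "specific". Reducing the amount keeps only the vague-side prefix.
# Orderings and names are illustrative assumptions.

ATTRIBUTE_ORDER = {
    # vague → specific
    "person":  ["build", "clothing_color", "clothing_type", "face_feature"],
    "vehicle": ["color", "body_shape", "vehicle_model", "license_plate"],
}

def attributes_to_generate(object_kind, reduce_amount):
    """Select which attribute types the analysis unit generates."""
    order = ATTRIBUTE_ORDER[object_kind]
    if reduce_amount:
        # keep the vague half: individuals become harder to identify,
        # and data volume / processing load both shrink
        return order[: len(order) // 2]
    return order

print(attributes_to_generate("person", reduce_amount=True))
# → ['build', 'clothing_color']
```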
"Processing Flow of the Image Processing System"
 An example of the processing flow of the image processing system is the same as in the first and second embodiments.
"Advantageous Effects"
 According to the image processing apparatus 10 of the present embodiment described above, the same advantageous effects as in the first and second embodiments can be realized.
 Furthermore, by changing the type and amount of attribute data registered as index data according to the shooting environment and the objects appearing in the photographing data, the processing load on the image processing apparatus 10 can be reduced while the necessary attribute data is still registered as index data.
 Hereinafter, examples of reference embodiments are additionally described.
1. An image processing apparatus comprising:
 photographing means attached to a moving body, for photographing the outside of the moving body;
 analysis means for analyzing photographing data generated by the photographing means and generating attribute data of objects appearing in the photographing data;
 registration means for registering the objects specified by the attribute data as index data of the photographing data;
 request acquisition means for acquiring a request for photographing data in which an object meeting a predetermined condition appears;
 determination means for determining whether an object meeting the predetermined condition is registered in the index data; and
 transmission means for transmitting at least a part of the photographing data to an external apparatus when it is determined that an object meeting the predetermined condition is registered in the index data.
2. The image processing apparatus according to 1, wherein
 the photographing means includes position detection means for detecting a current position and date-and-time detection means for detecting a current date and time, and
 the registration means registers the photographing position and photographing date and time of the photographing data as index data of the photographing data.
3. The image processing apparatus according to 2, wherein the request acquisition means acquires a request for the photographing data that further specifies at least one of a photographing position and a photographing date and time.
4. The image processing apparatus according to any one of 1 to 3, wherein the registration means changes the type of the attribute data registered as index data of the photographing data according to the shooting environment at the time the photographing data was generated.
5. The image processing apparatus according to any one of 1 to 4, wherein the registration means changes the amount of the attribute data registered as index data of the photographing data according to the shooting environment at the time the photographing data was generated.
6. The image processing apparatus according to any one of 1 to 5, wherein the registration means changes the type of the attribute data registered as index data of the photographing data according to the objects appearing in the photographing data.
7. The image processing apparatus according to any one of 1 to 6, wherein the registration means changes the amount of the attribute data registered as index data of the photographing data according to the objects appearing in the photographing data.
8. The image processing apparatus according to any one of 1 to 7, wherein the attribute data includes at least one of a person's build, clothing features, the color of a moving body, a vehicle model, and information written on a license plate.
9. The image processing apparatus according to any one of 1 to 8, wherein the image processing apparatus is configured as one or both of a device mounted on a moving body and a portable device.
10. An image processing method in which a computer executes:
 a photographing step of photographing the outside of a moving body with photographing means attached to the moving body;
 an analysis step of analyzing photographing data generated in the photographing step and generating attribute data of objects appearing in the photographing data;
 a registration step of registering the objects specified by the attribute data as index data of the photographing data;
 a request acquisition step of acquiring a request for photographing data in which an object meeting a predetermined condition appears;
 a determination step of determining whether an object meeting the predetermined condition is registered in the index data; and
 a transmission step of transmitting at least a part of the photographing data to an external apparatus when it is determined that an object meeting the predetermined condition is registered in the index data.
11. A program causing a computer to function as:
 photographing means attached to a moving body, for photographing the outside of the moving body;
 analysis means for analyzing photographing data generated by the photographing means and generating attribute data of objects appearing in the photographing data;
 registration means for registering the objects specified by the attribute data as index data of the photographing data;
 request acquisition means for acquiring a request for photographing data in which an object meeting a predetermined condition appears;
 determination means for determining whether an object meeting the predetermined condition is registered in the index data; and
 transmission means for transmitting at least a part of the photographing data to an external apparatus when it is determined that an object meeting the predetermined condition is registered in the index data.
 This application claims priority based on Japanese Patent Application No. 2018-161893 filed on August 30, 2018, the entire disclosure of which is incorporated herein.

Claims (11)

  1.  An image processing apparatus comprising:
     photographing means attached to a moving body, for photographing the outside of the moving body;
     analysis means for analyzing photographing data generated by the photographing means and generating attribute data of objects appearing in the photographing data;
     registration means for registering the objects specified by the attribute data as index data of the photographing data;
     request acquisition means for acquiring a request for photographing data in which an object meeting a predetermined condition appears;
     determination means for determining whether an object meeting the predetermined condition is registered in the index data; and
     transmission means for transmitting at least a part of the photographing data to an external apparatus when it is determined that an object meeting the predetermined condition is registered in the index data.
  2.  The image processing apparatus according to claim 1, wherein
     the photographing means includes position detection means for detecting a current position and date-and-time detection means for detecting a current date and time, and
     the registration means registers the photographing position and photographing date and time of the photographing data as index data of the photographing data.
  3.  The image processing apparatus according to claim 2, wherein the request acquisition means acquires a request for the photographing data that further specifies at least one of a photographing position and a photographing date and time.
  4.  The image processing apparatus according to any one of claims 1 to 3, wherein the registration means changes the type of the attribute data registered as index data of the photographing data according to the shooting environment at the time the photographing data was generated.
  5.  The image processing apparatus according to any one of claims 1 to 4, wherein the registration means changes the amount of the attribute data registered as index data of the photographing data according to the shooting environment at the time the photographing data was generated.
  6.  The image processing apparatus according to any one of claims 1 to 5, wherein the registration means changes the type of the attribute data registered as index data of the photographing data according to the objects appearing in the photographing data.
  7.  The image processing apparatus according to any one of claims 1 to 6, wherein the registration means changes the amount of the attribute data registered as index data of the photographing data according to the objects appearing in the photographing data.
  8.  The image processing apparatus according to any one of claims 1 to 7, wherein the attribute data includes at least one of a person's build, clothing features, the color of a moving body, a vehicle model, and information written on a license plate.
  9.  The image processing apparatus according to any one of claims 1 to 8, wherein the image processing apparatus is configured as one or both of a device mounted on a moving body and a portable device.
  10.  An image processing method in which a computer executes:
     a photographing step of photographing the outside of a moving body with photographing means attached to the moving body;
     an analysis step of analyzing photographing data generated in the photographing step and generating attribute data of objects appearing in the photographing data;
     a registration step of registering the objects specified by the attribute data as index data of the photographing data;
     a request acquisition step of acquiring a request for photographing data in which an object meeting a predetermined condition appears;
     a determination step of determining whether an object meeting the predetermined condition is registered in the index data; and
     a transmission step of transmitting at least a part of the photographing data to an external apparatus when it is determined that an object meeting the predetermined condition is registered in the index data.
  11.  A program causing a computer to function as:
     photographing means attached to a moving body, for photographing the outside of the moving body;
     analysis means for analyzing photographing data generated by the photographing means and generating attribute data of objects appearing in the photographing data;
     registration means for registering the objects specified by the attribute data as index data of the photographing data;
     request acquisition means for acquiring a request for photographing data in which an object meeting a predetermined condition appears;
     determination means for determining whether an object meeting the predetermined condition is registered in the index data; and
     transmission means for transmitting at least a part of the photographing data to an external apparatus when it is determined that an object meeting the predetermined condition is registered in the index data.
PCT/JP2019/015214 2018-08-30 2019-04-05 Image processing device, image processing method, and program WO2020044646A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020540045A JPWO2020044646A1 (en) 2018-08-30 2019-04-05 Image processing equipment, image processing methods and programs

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018161893 2018-08-30
JP2018-161893 2018-08-30

Publications (1)

Publication Number Publication Date
WO2020044646A1 true WO2020044646A1 (en) 2020-03-05

Family

ID=69645146

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/015214 WO2020044646A1 (en) 2018-08-30 2019-04-05 Image processing device, image processing method, and program

Country Status (2)

Country Link
JP (1) JPWO2020044646A1 (en)
WO (1) WO2020044646A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002288174A (en) * 2001-03-28 2002-10-04 Nec Corp Information obtaining method and server
JP2014106617A (en) * 2012-11-26 2014-06-09 Mitsubishi Electric Corp On-vehicle information providing device
CN105760533A (en) * 2016-03-08 2016-07-13 广东欧珀移动通信有限公司 Photo management method and photo management device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013161390A (en) * 2012-02-08 2013-08-19 Sony Corp Server, client terminal, system and program
JP6388532B2 (en) * 2014-11-28 2018-09-12 富士通株式会社 Image providing system and image providing method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002288174A (en) * 2001-03-28 2002-10-04 Nec Corp Information obtaining method and server
JP2014106617A (en) * 2012-11-26 2014-06-09 Mitsubishi Electric Corp On-vehicle information providing device
CN105760533A (en) * 2016-03-08 2016-07-13 广东欧珀移动通信有限公司 Photo management method and photo management device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
OAMI, RYOMA ET AL.: "Vehicle/Human Metadata Analysis Technology and Its Applications", NEC TECHNICAL JOURNAL, vol. 63, no. 3, 24 September 2010 (2010-09-24), pages 44 - 51 *

Also Published As

Publication number Publication date
JPWO2020044646A1 (en) 2021-11-11

Similar Documents

Publication Publication Date Title
KR102418446B1 (en) Picture-based vehicle damage assessment method and apparatus, and electronic device
KR102151365B1 (en) Image-based vehicle loss evaluation method, apparatus and system, and electronic device
US10152858B2 (en) Systems, apparatuses and methods for triggering actions based on data capture and characterization
CN106952303B (en) Vehicle distance detection method, device and system
KR20190060817A (en) Image based vehicle damage determination method and apparatus, and electronic device
WO2020024457A1 (en) Liability cognizance method and device of traffic accident and computer readable storage medium
WO2018081581A1 (en) Systems and methods for supplementing captured data
CN104011734B (en) System, the method and apparatus of information are obtained from the object for being attached to vehicle
EP3153976A1 (en) Information processing device, photographing device, image sharing system, information processing method, and program
JP7258595B2 (en) Investigation support system and investigation support method
US20230245462A1 (en) Systems and methods of legibly capturing vehicle markings
JP2017004527A (en) Vehicle number information recognition/correspondence system and correspondence method
JP2020518165A (en) Platform for managing and validating content such as video images, pictures, etc. generated by different devices
JP6437217B2 (en) Image output device, image management system, image processing method, and program
JP2024009115A (en) Information provision system
WO2020044646A1 (en) Image processing device, image processing method, and program
JP2023060081A (en) Processing device
US20230008356A1 (en) Video processing apparatus, method and computer program
JP6993750B2 (en) Information provision system
KR20120070888A (en) Method, electronic device and record medium for provoding information on wanted target
US11659273B2 (en) Information processing apparatus, information processing method, and non-transitory storage medium
CN112699798A (en) Traffic police action recognition method and device with vehicle-road cooperation
CN111209807A (en) Yolov 3-based video structuring method and system
JP7015604B1 (en) Information provision system
JP7015603B1 (en) Information provision system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 19854192; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 2020540045; Country of ref document: JP; Kind code of ref document: A
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 19854192; Country of ref document: EP; Kind code of ref document: A1