JP6390015B2 - Biological search system - Google Patents

Biological search system

Info

Publication number
JP6390015B2
Authority
JP
Japan
Prior art keywords
search
person
unmanned
living
moving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2018044840A
Other languages
Japanese (ja)
Other versions
JP2018121351A (en)
Inventor
和雄 市原
雅一 河野
Original Assignee
株式会社プロドローン
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社プロドローン filed Critical 株式会社プロドローン
Priority to JP2018044840A priority Critical patent/JP6390015B2/en
Publication of JP2018121351A publication Critical patent/JP2018121351A/en
Application granted granted Critical
Publication of JP6390015B2 publication Critical patent/JP6390015B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Description

  The present invention relates to a living body search system for searching for a specific person or the like within a predetermined range such as indoor or outdoor.

  Conventionally, for example, a lost child search system has been proposed as a human search system (see Patent Document 1).

  The lost child search system described above includes: ID tags, carried by persons entering a facility such as an amusement park, each of which transmits unique information registered in advance when it receives an interrogation signal; interrogators, placed at predetermined locations in the facility, each of which transmits the interrogation signal to an ID tag when a person carrying the tag passes, prompting the tag to transmit its unique information; camera devices, placed at predetermined locations in the facility, each of which captures an image of a person carrying an ID tag when that person passes; and a management device that creates identification data for each person from the images obtained by the camera devices and the unique information of each ID tag obtained by the interrogators.

  In a search using this lost child search system, each facility visitor is photographed by a camera, and the captured image is automatically recorded paired with the tag's ID. When a child goes missing, the child's approximate location is identified from the read histories of the interrogators that have interrogated the corresponding ID, the child's facial photograph is printed out, and staff members search the facility.

  Also, a monitoring system is known in which a plurality of monitoring camera devices cooperate to capture video of a moving object to be tracked (see Patent Document 2).

  In this monitoring system, each monitoring camera device has an image recognition function and is configured to transmit the captured image of the tracking target and its feature information to other monitoring camera devices via a network, which makes it possible to continue tracking the target.

Japanese Patent Laid-Open No. 10-301984
Japanese Patent Laid-Open No. 2003-324720

  In the lost child search system described in Patent Document 1, the camera is fixed, so the search target person cannot be tracked with the camera.

  Further, in the monitoring system described in Patent Document 2, a person to be monitored can be tracked by switching among a plurality of cameras. However, since the position of each camera is fixed, blind spots are unavoidable. Moreover, even when the monitoring target can be tracked by a camera, if the target moves far from the camera it appears only as a small figure in the image, and image recognition becomes difficult.

  An object of the present invention is to provide a living body search system that searches for a specific living body individual using an unmanned moving body.

  In order to solve the above problems, a living body search system of the present invention, for searching for a person who is a living individual as a search target, includes an unmanned moving body and a server connected to the unmanned moving body through a communication network. The unmanned moving body includes a camera for observing its surroundings and moving means for moving through the air or on the ground, and the server includes a database capable of recording individual identification information of the search target, and individual identification means for collating a characteristic part of a person photographed by the camera with the individual identification information and determining whether or not that person is the search target.

  The living body search system of the present invention may use an unmanned aircraft capable of autonomous flight as the unmanned moving body.

  Further, in order to solve the above problems, a living body search system of the present invention, for searching for a person who is a living individual as a search target, comprises an unmanned moving body having a camera for observing its surroundings, moving means for moving through the air or on the ground, a database capable of recording individual identification information of the search target, image data processing means for detecting a characteristic part of a person from an image photographed by the camera, and individual identification means for collating the characteristic part detected by the image data processing means with the individual identification information and determining whether or not the person is the search target.

  In the living body search system of the present invention, the characteristic part may be a human face, and the image data processing means may include person detection means for detecting a person from an image photographed by the camera, and face detection means for detecting a face from the image of the person detected by the person detection means.

  The living body search system of the present invention may be configured such that, when the person detection means detects a person in the image, the unmanned moving body automatically moves to a position from which the person's face can be detected.

  Moreover, the living body search system of the present invention may use a plurality of the unmanned aircraft as the unmanned moving body.

  Further, the living body search system of the present invention may be configured such that, when the search target is detected by the individual identification means, the unmanned moving body tracks the search target.

  In addition, the living body search system of the present invention may comprise a plurality of the unmanned moving bodies and be configured such that tracking of the search target can be taken over by an unmanned moving body different from the one that photographed the search target.

  In addition, the living body search system of the present invention may be configured such that notification destination information, specifying the destination to be notified when the search target is found, is registered in the database.

  Further, the living body search system of the present invention may be configured such that, when the search target is detected by the individual identification means, the unmanned moving body or the server notifies the notification destination of the position information of the search target.

  In the living body search system of the present invention, the unmanned moving body may have a plurality of the cameras.

  According to the living body search system of the present invention, it is possible to quickly find a living body individual to be searched.

FIG. 1 is an explanatory diagram showing the schematic configuration of an embodiment of the living body search system of the present invention. FIG. 2 is a block diagram showing the configuration of the unmanned aircraft of the living body search system of FIG. 1. FIG. 3 is a flowchart showing the search procedure of the living body search system of FIG. 1.

  Hereinafter, the living body search system of the present invention will be described in detail with reference to the drawings. FIG. 1 is an explanatory diagram showing the schematic configuration of an embodiment of the living body search system of the present invention. The embodiment shown in FIG. 1 is an example of a person search system for searching for a specific person S (the searchee) as the search target, using an unmanned aircraft (multicopter 30) as the unmanned moving body.

  The person search system in FIG. 1 searches, within a predetermined indoor or outdoor range, for a specific individual (living individual) requested by a client. The person search system 10 includes a multicopter 30 and a server 50. The multicopter 30 and the server 50 are connected by a communication network 90 and can transmit and receive data to and from each other. The server 50 is installed, for example, at a search center, where an operator performs data input and other operations.

  The communication network 90 may be either a public shared network or a dedicated network. The communication network 90 and the multicopter 30 are connected wirelessly, while the communication network 90 and the server 50 may be connected either wirelessly or by wire. As the shared network, an ordinary fixed telephone line, a mobile telephone line, or the like can be used.

  FIG. 2 is a block diagram showing the configuration of the unmanned moving body of the living body search system of FIG. 1. The person search system of FIG. 1 uses the multicopter 30 as the unmanned moving body. As shown in FIG. 2, the multicopter 30 includes moving means 300 that allow it to fly through the air and move at will. The moving means 300 comprise a plurality of rotor blades 310 that generate lift, a control unit 320 that controls flight operations, a battery 340 that supplies power to each component, and the like, and the multicopter 30 is formed so that it can move autonomously.

  In the present invention, an unmanned automobile or the like capable of automatic driving can also be used as the unmanned moving body, in addition to an unmanned aircraft. If an unmanned aircraft such as a multicopter is used, there is no need to push through people even in a dense crowd, and the aircraft can fly at a height out of people's reach, so contact accidents are less likely to occur.

  Moreover, although the multicopter described above can move autonomously, the unmanned moving body may instead be movable by remote operation.

  A DC motor 311 is coupled to each rotor blade 310 and connected to the control unit 320 via an ESC (Electronic Speed Controller) 312. The control unit 320 includes a CPU (central processing unit) 323, a RAM/ROM (storage device) 322, a PWM controller 324, and the like, and is further connected to a sensor group 325 including an acceleration sensor, a gyro sensor (angular velocity sensor), an atmospheric pressure sensor, and a geomagnetic sensor (electronic compass), as well as to a GPS receiver 326 and the like.

  The multicopter 30 is controlled by having the PWM controller 324 of the moving means 300 adjust the rotational speed of each DC motor 311 via its ESC 312. That is, the attitude and position of the multicopter 30 can be controlled by appropriately adjusting the rotation directions and the balance of rotational speeds of the plurality of rotor blades 310.
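The control principle just described, steering by adjusting the balance of rotor speeds, can be sketched as a motor mixer. The function below is an illustrative sketch for a generic X-configuration quadcopter, not the actual control law of the multicopter 30; the motor ordering and sign conventions are assumptions.

```python
def mix_quad_x(throttle, roll, pitch, yaw):
    """Map throttle (0..1) and roll/pitch/yaw commands (-1..1) to four
    normalized motor outputs for a hypothetical X-configuration quadcopter.

    Motor order assumed: front-left, front-right, rear-right, rear-left.
    Raising one diagonal pair and lowering the other shifts the balance
    of rotor speeds, which tilts or yaws the airframe.
    """
    m_fl = throttle + roll + pitch - yaw
    m_fr = throttle - roll + pitch + yaw
    m_rr = throttle - roll - pitch - yaw
    m_rl = throttle + roll - pitch + yaw
    # Clamp to the valid duty range [0, 1] before handing off to the ESCs.
    return [max(0.0, min(1.0, m)) for m in (m_fl, m_fr, m_rr, m_rl)]
```

With all command inputs at zero the four motors share the same speed (hover); a roll command raises the motors on one side and lowers the other, exactly the "balance of rotational speeds" the text refers to.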

  For example, the RAM/ROM 322 of the control unit 320 stores a flight control program implementing the flight control algorithm used during flight of the multicopter 30. The control unit 320 can control the attitude and position of the multicopter 30 with this flight control program, using information acquired from the sensor group 325 and the like. In this way, the multicopter 30 is configured so that the moving means 300 can fly it within a predetermined range and move it in search of the search target.

  The multicopter 30 includes a camera 350 for observing its surroundings, image data processing means 360 for capturing still-image data from the camera, and communication means 370 for transmitting the image data captured by the image data processing means 360 to the server 50 and receiving data from the server 50. A communication device capable of wireless transmission and reception is used as the communication means 370.

  The camera 350 is used for monitoring and observing the surroundings of the multicopter 30, and may be any camera that can capture still images as necessary. A visible light camera that forms images from visible light, an infrared camera that forms images from infrared light, or the like can be used; an image sensor of the kind used in surveillance cameras is also suitable.

  A plurality of cameras 350 may be provided on the multicopter 30; for example, four cameras 350 may be installed facing four different directions. Alternatively, a 360-degree camera may be attached to the bottom of the multicopter as the camera 350 so that the entire surroundings can be observed.

  The multicopter 30 treats the human face as the characteristic part of the search target, and includes image data processing means 360 configured so that, when a human face is detected in the image observed by the camera 350, still-image data of the observed image can be captured.

  Any means capable of taking image data into the moving body or the server can be used as the image data processing means 360. Specific examples include processing to record image data in a recording device, to temporarily store it in a storage device, and to transmit it to the server.

  The image data processing means 360 uses face detection means for detecting a human face (this is sometimes simply called face detection). The face detection means processes the monitored image in real time, performing pattern analysis, pattern identification, and the like, and a face is considered detected when a human face is recognized in the image.

  The image data processing means 360 also includes person detection means that determines that a person has been detected when the silhouette of a human body is recognized in the observed image. Like the face detection means, the person detection means performs image processing, pattern analysis, pattern recognition, and the like, and a person is considered detected when a human body silhouette is recognized in the image.

  In the image processing of the present invention, face detection refers to detecting the location of the part of the image that is a face, whereas face recognition is the process of detecting a face and then identifying the individual from the face's feature information.
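The distinction just drawn can be sketched as a two-stage pipeline: detection answers *where* a face is, recognition answers *whose* face it is. In the sketch below, `detect_faces` and `recognize` are hypothetical stand-ins for real models (e.g. a cascade face detector and a feature-based identifier); the patent does not specify concrete algorithms.

```python
def find_searchee_in_frame(frame, detect_faces, recognize):
    """Two-stage face pipeline: detection first, then recognition.

    detect_faces(frame) -> list of bounding boxes (x, y, w, h)  [detection]
    recognize(frame, box) -> identity label for that region      [recognition]
    Both callbacks are assumed implementations, not part of the patent.
    """
    results = []
    for box in detect_faces(frame):          # face detection: WHERE is a face?
        person_id = recognize(frame, box)    # face recognition: WHOSE face is it?
        results.append((box, person_id))
    return results
```

This mirrors the division of labor in the embodiment: detection runs onboard in the image data processing means 360, while identification belongs to the individual identification means 530 on the server.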

  The multicopter 30 includes a voice input device 380 such as a microphone, and an output device 390 for sound, images, and video. When the searchee is found, the voice input device 380 can pick up the searchee's voice and be used for conversation with a staff member of the search center. Examples of the output device 390 include audio output devices such as speakers, image display devices such as liquid crystal displays, and image projection devices such as projectors. The output device 390 is used to deliver (transmit) messages to the searchee and to let a staff member of the search center converse with the searchee.

  The server 50 includes a database 510 capable of recording search data such as the individual identification information of the searchee S and notification destination information such as the client's telephone number and e-mail address; notification means 520 for notifying the client when the searchee S is found; individual identification means 530 for determining whether a person is the search target by collating image data containing an image of that person, input from the camera, against the database; and an input device 540 for entering the search data into the database 510. Communication between the communication network 90 and the server 50 is performed via a control device 550.

  The search data registered in the database 510 includes the search range, the individual identification information of the searchee, and the client's contact information, and may additionally include, as supplementary information, whether tracking is required and voice or video messages from the client to the searchee.

  As the individual identification information of the searchee registered in the database 510, image data such as a facial photograph, information such as the color of the person's clothing, and data such as the person's height and weight are used, for example.
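The kinds of search data listed above can be pictured as a single record in the database 510. The schema below is purely illustrative: the patent names the categories of data (face images, clothing color, height and weight, notification destination, tracking flag, messages) but no concrete field layout, so every field name here is an assumption.

```python
from dataclasses import dataclass, field

@dataclass
class SearchRecord:
    """Hypothetical search-data record for the server database (510)."""
    searchee_id: str
    face_images: list                 # image data such as facial photographs
    clothing_color: str = ""          # e.g. "red jacket"
    height_cm: float = 0.0            # personal height data
    weight_kg: float = 0.0            # personal weight data
    notify_contact: str = ""          # client phone number / e-mail address
    tracking_required: bool = False   # supplementary info: track on discovery?
    messages: list = field(default_factory=list)  # voice/video messages, by reference
```

A record like this is created in the data registration step and later consulted by the individual identification means and the notification means.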

  The notification means 520 is a communication device capable of conveying voice, text, images, and the like; specific examples include a mobile phone, a personal computer, and a facsimile. Examples of the notification destination include a mobile phone, a management center, and the Internet. When the searchee S is found by the individual identification means 530, the control device 550 retrieves the contact address from the database 510, and the notification means 520 notifies that contact of the discovery. Voice, text, image data, or a combination of these can be used for the notification.

The search procedure of the person search system in FIG. 1 is described below. FIG. 3 is a flowchart showing the procedure. For example, the search can proceed through the following steps.
S110: A data registration step for registering search target data provided in advance by a client.
S120: A moving step in which the moving body moves within the search range while observing the surroundings with the camera.
S130: a person detection step of determining whether there is a person in the observation image of the camera and performing person detection.
S140: A face detection step of determining whether there is a face in the observation image of the camera and performing face detection.
S150: An image data processing step of determining that a target search object has been detected when a predetermined characteristic portion is detected in the observation image of the camera, and capturing the observation image of the camera as image data.
S160: An individual recognition step of performing individual recognition of the target search object by comparing the target search object of the image data with the individual identification information of the search object.
S170: A discovery notification step in which, when the target in the image data matches the individual identification information in the individual recognition step, it is determined that the search target has been detected and the notification means notifies the client of the discovery. Each of these steps is described below.
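The steps S120 through S170 above can be sketched as a control loop. Every callback argument below is a hypothetical stand-in for the corresponding means in the system (moving means 300, image data processing means 360, individual identification means 530, notification means 520); the patent specifies the flow, not an implementation.

```python
def run_search(move, detect_person, detect_face, capture, identify,
               notify, match_threshold=0.8, max_iterations=100):
    """Sketch of the S120-S170 search loop from the flowchart."""
    for _ in range(max_iterations):
        frame = move()                   # S120: move and observe with the camera
        if not detect_person(frame):     # S130: person detection (NO -> keep moving)
            continue
        if not detect_face(frame):       # S140: face detection (NO -> reposition,
            continue                     #        handled by the next move() call)
        image = capture(frame)           # S150: capture still-image data
        rate = identify(image)           # S160: compare with registered face info
        if rate >= match_threshold:      # match rate exceeded -> searchee found
            notify(image)                # S170: notify the client of the discovery
            return True
    return False                         # search range exhausted without a match
```

The loop structure makes the two NO-branches of the flowchart explicit: both person-detection failure and face-detection failure fall back to the moving step.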

  As shown in FIG. 3, first, in the data registration step S110, an operator at the search center registers the search data provided by the client in the database 510 using the input device 540 of the server 50.

  Next, in the moving step S120, a control signal is sent from the control device 550 of the server 50 to the multicopter 30 via the communication network 90, and the multicopter moves through the predetermined search range while observing its surroundings with the camera 350.

  Next, in the person detection step S130, it is determined whether a person is present in the observation image of the camera 350. If no person is detected (NO), the process returns to the moving step S120 and the multicopter 30 continues moving within the search range. If a person is detected (YES), the process proceeds to the face detection step S140. In the person detection step S130, a person is considered detected when the silhouette of a human body is recognized.

  In the face detection step S140, it is determined whether a face appears in the camera's observation image. If no face appears (NO), the process returns to the moving step S120, and the multicopter 30 is moved to a position, such as in front of the person's face, from which the face can be detected. If a face can be detected (YES), it is determined that a searchee candidate has been detected, and the process proceeds to the image data processing step S150.

  In the image data processing step S150, the image from the camera 350 is stored as image data by the image data processing means 360, and the control device sends the stored image data to the server 50 via the communication network 90 using the communication means 370.

  Next, the server 50 performs the individual recognition step S160. The image data sent to the server 50 is passed to the individual identification means 530 via the control device 550, and it is determined whether the person in the image data (the searchee candidate) is the search target by comparison with the facial information of the search target registered in the database as individual identification information. The determination compares the registered facial information, of which there may be one item or several, with the facial image data; when a predetermined matching rate is exceeded, the person is judged to match the search target and the process proceeds to the discovery notification step S170. If the matching rate is not reached, the result is treated as a mismatch and the process returns to the moving step S120 to move the multicopter 30.
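The matching-rate determination of step S160 can be sketched as follows. The unit-length feature vectors and cosine-style similarity are assumptions for illustration; the patent only specifies that single or plural registered face data are compared against a predetermined matching rate, not how the comparison is computed.

```python
def is_searchee(candidate_vec, registered_vecs, match_threshold=0.8):
    """Illustrative matching-rate check for the individual recognition step.

    candidate_vec: hypothetical unit-normalized face feature vector from the
                   captured image.
    registered_vecs: one or several registered feature vectors for the
                     searchee ("single or plural face information").
    Returns (matched?, best score); the best score across the registered
    vectors is compared with the predetermined matching rate.
    """
    best = max(
        sum(a * b for a, b in zip(candidate_vec, ref))  # cosine-like similarity
        for ref in registered_vecs
    )
    return best >= match_threshold, best
```

Taking the best score over several registered images makes the check robust to pose and lighting differences among the stored photographs.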

  In the discovery notification step S170, the searchee S is considered found, and the client is notified of the discovery using the notification means 520. The notification is sent to a notification destination registered in advance (a mobile phone, a management center, the Internet, etc.).

  In the discovery notification step S170, position information of the discovery location can be attached to the notification. Outdoors, the position information from the GPS receiver of the multicopter 30 can be used; indoors, surrounding video, the position information the multicopter uses for self-localization, and the like can be used.
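A discovery notification carrying the position information described above might be assembled as in the sketch below. The payload structure and field names are hypothetical; the patent specifies only that a GPS fix is used outdoors and self-localization data indoors.

```python
def discovery_notice(searchee_id, outdoors, gps_fix=None, indoor_pose=None):
    """Build an illustrative discovery notification (step S170).

    gps_fix: (lat, lon) from the multicopter's GPS receiver (outdoor case).
    indoor_pose: coordinates from the multicopter's own self-localization
                 (indoor case). All names here are assumptions.
    """
    notice = {"event": "searchee_found", "searchee_id": searchee_id}
    if outdoors and gps_fix is not None:
        notice["position"] = {"source": "gps",
                              "lat": gps_fix[0], "lon": gps_fix[1]}
    elif indoor_pose is not None:
        notice["position"] = {"source": "self_localization",
                              "xyz": indoor_pose}
    return notice
```

The notification means 520 would then deliver such a payload to the registered destination as voice, text, or image data, or a combination of these.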

  Next, the database 510 is queried for additional information in the search data. If there is none, the process ends; if there is, processing such as a message delivery step S190 and a tracking step S200 continues.

  In the message delivery step S190, a voice or video message from the client, registered in the database in advance, is delivered to the searchee S using the output device of the multicopter 30. Furthermore, by using the camera 350, a voice input device 380 such as a microphone, and the output device 390 for images and sound, a voice call or a call with video can be conducted between a staff member of the management center on the server 50 side and the searchee S, enabling communication with the discovered search target.

  In the tracking step S200, for a searchee whose search data in the database is flagged for tracking, the flight of the multicopter 30 is controlled so that it continues to track the searchee S, and monitoring is continued.

  While the single multicopter 30 is tracking and monitoring in the tracking step S200, it cannot simultaneously search the search range designated in advance. In that case, a spare multicopter can be prepared and activated at the start of tracking to take over the search of the designated range.

  Alternatively, the tracking of the tracking step S200 can be performed by dispatching another multicopter. In this case, dividing the roles between a multicopter for overall monitoring and a multicopter for tracking allows each to specialize in the functions it needs. For example, the overall monitoring multicopter can be large and capable of long operation, equipped with a 360-degree camera on the bottom of its structure or with four cameras shooting simultaneously in four directions, while the tracking multicopter can be a small, low-noise machine with only a single camera.

  When the overall monitoring multicopter and the tracking multicopter are used together and the tracking multicopter is sufficiently small, the system may be configured so that the tracking multicopter is carried on the overall monitoring multicopter and dispatched from it when tracking begins.
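The takeover arrangement described above, a spare multicopter resuming the search of the designated range when another switches to tracking, can be sketched as a simple role reassignment. The role names and fleet representation are assumptions for illustration.

```python
def start_tracking(fleet, tracker_id):
    """Switch one multicopter to tracking and activate a spare (if any)
    to take over the designated search range.

    fleet: hypothetical mapping of multicopter id -> role, where role is
           one of "searching", "spare", "tracking".
    Returns the id of the spare that took over, or None if no spare exists.
    """
    fleet[tracker_id] = "tracking"            # this unit now follows the searchee
    for mc_id, role in fleet.items():
        if role == "spare":
            fleet[mc_id] = "searching"        # spare resumes the search range
            return mc_id
    return None                               # no spare: search range uncovered
```

The same bookkeeping covers the handover variant in the claims, where tracking is taken over by an unmanned moving body different from the one that photographed the searchee.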

  Although the embodiments of the present invention have been described above, the present invention is not limited to the above-described embodiments, and various modifications can be made without departing from the gist of the present invention.

  In the present invention, the living individual to be searched for is not limited to the person described above; the invention can also be applied to various other living individuals such as dogs, cats, and other animals.

  In the embodiment above, the image data captured by the camera is sent to the server, and the individual recognition step is performed by the individual identification means provided in the server. However, if the performance of the multicopter's CPU is high enough for recognition, it is also possible to provide the individual identification means in the multicopter, receive only the facial data of the search target from the server, and perform the individual recognition step on the multicopter alone.

  However, the computation for face recognition requires a high-performance CPU and a large database, whereas computations such as face detection and person detection are comparatively light. For this reason, mounting a high-performance CPU on a multicopter or the like may not be realistic in terms of cost, and it is reasonable, as described in the embodiment, to send the image data to a server via the communication network and have the server perform face recognition.

  In the embodiment above, a visible light camera is used for individual recognition, and the searchee is detected by face recognition technology using the facial information in the captured image data; however, other characteristic information may also be used. For example, when an infrared camera is used as the camera, the temperature of the search target can be detected from the heat distribution image obtained by the camera, and temperature data such as body temperature can be used as the individual identification information of the living body to perform individual identification.

  In addition, the accuracy of individual recognition can be improved by using data other than the face together with face detection. Examples of such data include the searchee's weight and height and the color of their clothing. Such data are effective, for example, when the face is not turned toward the multicopter's camera.
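Combining the face score with auxiliary cues such as clothing color or estimated height, as suggested above, could be sketched as a weighted score fusion. The weights below are illustrative assumptions; the patent does not prescribe how the cues are combined.

```python
def fused_score(face_score, attribute_scores, weights=None):
    """Fuse the face-matching score with auxiliary cue scores (all in 0..1).

    attribute_scores: hypothetical per-cue match scores, e.g. clothing-color
    similarity and height agreement. By default the face is weighted highest,
    but the auxiliary cues still contribute when the face is turned away
    and its score is low.
    """
    scores = [face_score] + list(attribute_scores)
    if weights is None:
        weights = [2.0] + [1.0] * len(attribute_scores)  # face counts double
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)
```

The fused value can then be compared against the same predetermined matching rate used in the individual recognition step.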

S Searchee (person to be searched)
10 person search system (biological search system)
30 Multicopter
300 Moving means
350 Camera
360 Image data processing means
370 Communication means
390 Output device
50 Server
510 Database
520 Notification means
530 Individual identification means
540 Input device
90 Communication network

Claims (11)

  1. A biological search system for searching for a person who is a living individual as a search target,
    An unmanned moving body,
    A server connected to the unmanned mobile body via a communication network,
    The unmanned mobile body is
    A camera to observe the surroundings,
    Moving means for moving in space or on the ground,
    The server
    A database capable of recording individual identification information of the search object;
    A living body search system comprising: individual identification means for collating a characteristic part of a person photographed by the camera with the individual identification information and determining whether or not the person is the search target.
  2.   The living body search system according to claim 1, wherein an unmanned aircraft capable of autonomous flight is used as the unmanned moving body.
  3. A biological search system for searching for a person who is a living individual as a search target,
    It has an unmanned moving body that is an unmanned aerial vehicle capable of autonomous flight,
    The unmanned mobile body is
    A camera to observe the surroundings,
    Moving means for moving in space or on the ground;
    A database capable of recording individual identification information of the search object;
    Image data processing means for detecting a human feature from an image taken by the camera;
    A living body search system comprising: individual identification means for collating a characteristic portion detected by the image data processing means with the individual identification information and determining whether or not the person is the object to be searched.
  4. The characteristic part is a human face;
    The image data processing means includes person detection means for detecting a person from an image photographed by the camera, and face detection means for detecting a face from the image of the person detected by the person detection means. The living body search system according to claim 3.
  5.   The living body search system according to claim 4, wherein when the person detecting unit detects a person from the image, the unmanned moving body is automatically moved to a position where the person's face can be detected.
  6.   The biological search system according to any one of claims 2 to 5, wherein a plurality of the unmanned aircraft are used as the unmanned moving body.
  7.   The living body search system according to any one of claims 1 to 6, wherein, when the search target is detected by the individual identification means, the unmanned moving body tracks the search target.
  8. Comprising a plurality of the unmanned moving bodies,
    The living body search system according to claim 7, wherein tracking of the search target can be taken over by an unmanned moving body different from the unmanned moving body that photographs the search target.
  9.   The living body search system according to any one of claims 1 to 8, wherein notification destination information, specifying a notification destination for when the search target is found, is registered in the database.
  10.   The living body search system according to claim 9, wherein the unmanned moving body or the server notifies the notification destination of position information of the search target when the search target is detected by the individual identification means.
  11.   The living body search system according to any one of claims 1 to 10, wherein the unmanned moving body includes a plurality of the cameras.
JP2018044840A 2018-03-13 2018-03-13 Biological search system Active JP6390015B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018044840A JP6390015B2 (en) 2018-03-13 2018-03-13 Biological search system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2018044840A JP6390015B2 (en) 2018-03-13 2018-03-13 Biological search system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
JP2016049026 Division 2016-03-11

Publications (2)

Publication Number Publication Date
JP2018121351A JP2018121351A (en) 2018-08-02
JP6390015B2 true JP6390015B2 (en) 2018-09-19

Family

ID=63044022

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2018044840A Active JP6390015B2 (en) 2018-03-13 2018-03-13 Biological search system

Country Status (1)

Country Link
JP (1) JP6390015B2 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4506381B2 (en) * 2004-09-27 2010-07-21 沖電気工業株式会社 Single actor and group actor detection device
JP5674406B2 (en) * 2010-09-30 2015-02-25 綜合警備保障株式会社 Surveillance system, monitoring device, autonomous mobile body, monitoring method, and monitoring program using autonomous mobile body
JP6029446B2 (en) * 2012-12-13 2016-11-24 セコム株式会社 Autonomous flying robot
JP6195450B2 (en) * 2013-01-31 2017-09-13 セコム株式会社 Autonomous flying robot
JP6022627B2 (en) * 2014-03-27 2016-11-09 株式会社電通 Evacuation support system, evacuation support management program, evacuation support terminal application program, and evacuation support method
JP6469962B2 (en) * 2014-04-21 2019-02-13 薫 渡部 Monitoring system and monitoring method



Legal Events

Code Title Description
TRDD Decision of grant or rejection written
A975 Report on accelerated examination (JAPANESE INTERMEDIATE CODE: A971005; effective date: 2018-06-15)
A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01; effective date: 2018-06-26)
A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61; effective date: 2018-07-25)
R150 Certificate of patent or registration of utility model (ref document number: 6390015; country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150)
S531 Written request for registration of change of domicile (JAPANESE INTERMEDIATE CODE: R313531)
R350 Written notification of registration of transfer (JAPANESE INTERMEDIATE CODE: R350)