US20210027060A1 - Person search system - Google Patents
- Publication number: US20210027060A1 (U.S. application Ser. No. 17/036,967)
- Authority: United States (US)
- Prior art keywords: image, vehicle, further configured, camera, human
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06V 40/103 — Static body considered as a whole, e.g. static pedestrian or occupant recognition (under G06V 40/10, Human or animal bodies, e.g. vehicle occupants or pedestrians; G06V 40/00, Recognition of biometric, human-related or animal-related patterns in image or video data)
- G06V 40/16 — Human faces, e.g. facial parts, sketches or expressions
- G06V 20/10 — Terrestrial scenes (under G06V 20/00, Scenes; scene-specific elements)
- B64C 39/024 — Aircraft characterised by special use of the remote controlled vehicle type, i.e. RPV (under B64C 39/02, Aircraft not otherwise provided for, characterised by special use)
- G08B 21/0269 — System arrangements wherein the object is to detect the exact location of child or item using a navigation satellite system, e.g. GPS (under G08B 21/0202, Child monitoring systems using a transmitter-receiver system carried by the parent and the child)
- G06K 9/00664 and G06K 9/00369 (legacy image-recognition codes, since reclassified under G06V)
Definitions
- the person search system includes an autonomous mobile object that has a camera for capturing an image and a GPS receiver for acquiring positional information, and that is configured to move according to a specific operation command; and a controller configured to: create said operation command sent to said autonomous mobile object; detect a human image in said image; and make a determination as to whether or not said human image detected in said image is an image of a searched object that satisfies a specific search condition.
- the autonomous mobile object is a mobile object that moves autonomously on the basis of a specific operation command.
- the autonomous mobile object may be an autonomous vehicle.
- the operation command is information including, for example, information about a destination and/or a travel route and information about a service to be provided by the autonomous mobile object on the route.
- the operation command may be a command that causes the autonomous mobile object to transport passengers and/or goods along a predetermined route.
- the operation command may be a command that causes the autonomous mobile object to travel to a certain destination, and prepare the shop, facility, or equipment for service at that place.
- the present disclosure is characterized by using such an autonomous mobile object for the purpose of detecting a searched object.
- the person search system includes a controller for creating the aforementioned operation command, detecting a human image in a captured image, and determining whether or not the detected human image is an image of a searched object.
- the controller may be provided in a server apparatus that can communicate with the autonomous mobile object or in the autonomous mobile object.
- the system having the above configuration enables quick and accurate detection of a wanderer using images captured by the autonomous mobile object under operation.
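- a minimal sketch of this control flow is given below: create an operation command, collect an image together with a GPS fix, detect humans, and test the search condition. All names (OperationCommand, the vehicle interface, the helper functions) are illustrative assumptions, not the patented implementation.

```python
# Illustrative sketch only; all names are assumptions, not the patent's API.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class OperationCommand:
    destination: Tuple[float, float]   # (latitude, longitude)
    capture_images: bool = True        # also collect camera images en route

def detect_humans(image) -> List[object]:
    """Placeholder human detector; a concrete example appears later."""
    return []

def satisfies_search_condition(human_image, condition) -> bool:
    """Placeholder person determination; concrete examples appear later."""
    return False

def control_cycle(vehicle, condition) -> None:
    vehicle.send(OperationCommand(destination=(35.0, 135.0)))   # operation command
    image, position = vehicle.fetch_image_with_gps()            # image + GPS fix
    for human_image in detect_humans(image):                    # human detection
        if satisfies_search_condition(human_image, condition):  # determination
            print("searched object candidate at", position)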
- the aforementioned person search system may employ various methods to detect a person in an image.
- the person search system may further comprise a storage medium, and the controller may make the determination on the basis of information about said searched object stored in said storage medium in advance and said human image.
- said controller may detect said human image in each of a plurality of said images, and said controller may make said determination on the basis of a plurality of said human images that are images of the same person.
- the person search system may further comprise a user interface through which a user can input information, and said controller may make said determination on the basis of an input made through said user interface. As above, various detection methods are available; a method suitable for the actual configuration of the system may be employed, or two or more methods may be employed in combination to improve the accuracy of detection, as in the sketch below.
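- as an illustration of combining these routes, the sketch below treats an operator designation as decisive and otherwise requires the two automatic signals to agree; this particular rule is an invented example, not one prescribed by the disclosure.

```python
# Hedged example: one possible way to combine the determination routes.
def is_searched_object(registered_info_match: bool,
                       behavior_pattern_match: bool,
                       operator_designated: bool) -> bool:
    if operator_designated:          # designation through the user interface
        return True
    # Require agreement of the two automatic signals to reduce false positives.
    return registered_info_match and behavior_pattern_match
```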
- after detecting the searched object using an image(s) captured by the autonomous mobile object, the operation of the person search system shifts to discovery and protection of the searched object.
- when the searched object is to be protected, the person search system may let the searched object get on the autonomous mobile object.
- when the searched object is to be discovered, if said person search system includes a plurality of said autonomous mobile objects, the system may determine an area or route in or along which said autonomous mobile object is to travel to discover said person, on the basis of positional information of said plurality of autonomous mobile objects.
- according to the present disclosure, there can also be provided a person search system including at least one or more of the above-described means.
- according to another aspect of the present disclosure, there is provided a method carried out by the above-described person search system.
- the processing and means described above may be employed in any combinations, if technically feasible.
- the present disclosure can provide a technology that enables quick detection of a searched object using a system including a mobile object.
- FIG. 1 is a diagram showing the general configuration of a person search system according to a first embodiment.
- FIG. 2 is a block diagram showing components of the person search system.
- FIG. 3 shows the outer appearance of an autonomous vehicle.
- FIG. 4 is a diagram showing dataflow between the components of the system.
- FIG. 5 is a diagram showing an exemplary road network.
- FIG. 6 is a flow chart of a person search process according to the first embodiment.
- FIG. 7 is a flow chart of a process of protecting a searched object according to a second embodiment.
- the outline of a person search system according to a first embodiment will be described with reference to FIG. 1 .
- the person search system according to the first embodiment includes a plurality of autonomous vehicles 100A, 100B, 100C, . . . 100n that can run autonomously on the basis of commands given thereto and a server apparatus 200 that issues the commands.
- the autonomous vehicles will be collectively referred to as autonomous vehicles 100 , when it is not necessary to distinguish individual vehicles.
- the autonomous vehicles 100 are self-driving vehicles that provide predetermined services.
- the server apparatus 200 is an apparatus that performs management of the plurality of autonomous vehicles 100 .
- the autonomous vehicles 100 are multipurpose mobile objects that may have individually different functions. Typically the autonomous vehicles 100 are vehicles that can travel on the road autonomously without a human driver. Examples of the autonomous vehicles 100 include vehicles that travel along a predetermined route to pick up and drop off persons, on-demand taxis that operate on users' demand, and mobile shops that enable shop operation at a desired destination. In the case where the autonomous vehicles 100 are intended for transportation of passengers and/or goods, they may transport passengers and/or goods along a predetermined route. In the case where the autonomous vehicles 100 are intended for transportation of a shop, facility, or equipment, they may travel to a destination, and the shop, facility, or equipment may be prepared for operation at that place.
- the autonomous vehicles 100 may be vehicles that patrol on the road for the purpose of monitoring facilities and/or infrastructures or preventing crimes. In that case, the autonomous vehicles 100 may be configured to travel along a predetermined patrol route.
- the autonomous vehicles 100 that are powered by a battery are also called electric vehicle palettes (EV palettes).
- in the case shown in FIG. 1, the autonomous vehicle 100A is a monitoring vehicle that travels along a predetermined patrol route, the autonomous vehicle 100B is a vehicle used to open a mobile shop at a certain location, and the autonomous vehicle 100C is an on-demand taxi that picks up and drops off a user 70.
- a searched object 50 shown in FIG. 1 will be described later.
- the autonomous vehicles 100 are not required to be vehicles without humans. For example, a sales staff(s), a customer service attendant(s), or an operation monitoring crew(s) may be aboard the autonomous vehicle 100 .
- the autonomous vehicles 100 are not required to be vehicles that can run completely autonomously. For example, they may be vehicles that can be driven by a human driver or accept a human assistance in some circumstances.
- the autonomous vehicles 100 have the functions of receiving requests by users, responding to the users, performing appropriate processing in response to the users' requests, and reporting the result of processing to the users.
- the autonomous vehicles 100 may transfer the requests by users that they cannot fulfil by themselves to the server apparatus 200 so as to fulfil them in cooperation with the server apparatus 200 .
- the server apparatus 200 is an apparatus that directs the operation of the autonomous vehicles 100 .
- the server apparatus 200 receives from a user 70 a request specifying a location to which an autonomous vehicle 100 is to be dispatched and the user's destination. Then, the server apparatus 200 sends to an autonomous vehicle 100 that is running in the neighborhood of the requested location an operation command to the effect that the autonomous vehicle 100 is to transport a person(s) from the place of departure to the destination, together with positional information and information specifying the user 70.
- the server apparatus 200 may also send to the autonomous vehicle 100 A, which serves as a monitoring vehicle, an operation command to the effect that the autonomous vehicle 100 A is to monitor streets while travelling along a predetermined route.
- the server apparatus 200 may send to the autonomous vehicle 100 B, which is used as a mobile shop, a command to the effect that the autonomous vehicle 100 B is to travel to a specific destination to open a shop at that place.
- operation commands may specify operations to be done by autonomous vehicles 100 besides traveling.
- the person search system also has the function of collecting images by the autonomous vehicles 100 in order for the server apparatus 200 to detect a wanderer or a lost child.
- the autonomous vehicles 100 capture images of their environment by means of imaging means and send the images to the server apparatus 200 .
- positional information acquired by a GPS device may also be sent to the server apparatus 200 in association with the images.
- the server apparatus 200 detects a wanderer or a lost child using the images received from the autonomous vehicles 100 .
- FIG. 2 is a block diagram showing an example of the configuration of the autonomous vehicle 100 and the server apparatus 200 shown in FIG. 1 .
- the autonomous vehicle 100 is a vehicle that runs according to a command received from the server apparatus 200 . Specifically, the autonomous vehicle 100 creates a traveling route on the basis of the operation command received by wireless communication and travels on the road in an appropriate manner while sensing its environment.
- the autonomous vehicle 100 has a sensor 101 , a positional information acquisition unit 102 , a control unit 103 , a driving unit 104 , a communication unit 105 , and a camera 106 .
- the autonomous vehicle 100 operates by electrical power supplied by a battery, which is not shown in the drawings.
- the sensor 101 is means for sensing the environment of the vehicle, which typically includes a stereo camera, a laser scanner, LIDAR, radar, or the like. Information acquired by the sensor 101 is sent to the control unit 103 .
- the positional information acquisition unit 102 is means for acquiring the current position of the vehicle.
- the positional information acquisition unit 102 includes a GPS receiver, which receives GPS signals from GPS satellites to acquire positional information (e.g. latitude, longitude, and altitude) on the basis of time information.
- Information acquired by the positional information acquisition unit 102 is sent to the control unit 103 .
- the control unit 103 is a computer that controls the autonomous vehicle 100 on the basis of the information acquired through the sensor 101 .
- the control unit 103 is, for example, a microcomputer.
- the control unit 103 includes as functional modules an operation plan creation part 1031 , an environment perceiving part 1032 , and a travel control part 1033 . These functional modules may be implemented by executing programs stored in storage means, such as a read only memory (ROM), by a central processing unit (CPU), neither of which is shown in the drawings.
- the operation plan creation part 1031 receives an operation command from the server apparatus 200 and creates an operation plan of the vehicle.
- the operation plan is data that specifies a route along which the autonomous vehicle 100 is to travel and a task(s) to be done by the autonomous vehicle 100 in a part or the entirety of that route. Examples of data included in the operation plan are as follows.
- (1) data that specifies a route along which the vehicle is to travel by a set of road links: the route may be created automatically from a given place of departure and a given destination with reference to map data stored in the storage means (not shown). Alternatively, the route may be created using an external service.
- (2) data that specifies a task(s) to be done by the vehicle at a certain location(s) in the route: examples of the tasks include, but are not limited to, picking up and dropping off a person(s), loading and unloading goods, opening and closing a mobile shop, and collecting data.
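- such an operation plan can be pictured as a small data structure; the field names and link identifiers below are assumptions chosen to mirror the description (cf. the road network of FIG. 5).

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    location: str    # node or link ID at which the task is performed
    action: str      # e.g. "pick_up", "drop_off", "open_shop", "collect_data"

@dataclass
class OperationPlan:
    route_links: List[str] = field(default_factory=list)  # road links in travel order
    tasks: List[Task] = field(default_factory=list)

# Example: travel A -> B -> C -> D, picking up at node B and dropping off at node C.
plan = OperationPlan(route_links=["A-B", "B-C", "C-D"],
                     tasks=[Task("B", "pick_up"), Task("C", "drop_off")])
```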
- the operation plan created by the operation plan creation part 1031 is sent to the travel control part 1033 , which will be described later.
- the environment perceiving part 1032 perceives the environment around the vehicle using the data acquired by the sensor 101 . What is perceived includes, but is not limited to, the number and the position of lanes, the number and the position of other vehicles present around the vehicle, the number and the position of obstacles (e.g. pedestrians, bicycles, structures, and buildings) present around the vehicle, the structure of the road, and road signs. What is perceived may include anything that is useful for autonomous traveling.
- the environment perceiving part 1032 may track a perceived object(s). For example, the environment perceiving part 1032 may calculate the relative speed of the object from the difference between the coordinates of the object determined in a previous step and the current coordinates of the object.
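- for instance, the relative-speed calculation can be sketched as follows, assuming the tracked object's coordinates are expressed in metres in the vehicle frame.

```python
import math

def relative_speed(prev_xy, curr_xy, dt_seconds: float) -> float:
    """Speed of a tracked object relative to the vehicle (m/s), from the
    displacement between the previous and current perception steps."""
    dx, dy = curr_xy[0] - prev_xy[0], curr_xy[1] - prev_xy[1]
    return math.hypot(dx, dy) / dt_seconds

print(relative_speed((10.0, 2.0), (8.0, 2.0), 0.5))  # object closed 2 m in 0.5 s -> 4.0 m/s
```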
- the data relating to the environment acquired by the environment perceiving part 1032 is sent to the travel control part 1033 , which will be described below. This data will be hereinafter referred to as “environment data”.
- the travel control part 1033 controls the traveling of the vehicle on the basis of the operation plan created by the operation plan creation part 1031 , the environment data acquired by the environment perceiving part 1032 , and the positional information of the vehicle acquired by the positional information acquisition unit 102 .
- the travel control part 1033 causes the vehicle to travel along a specific route in such a way that obstacles will not enter a specific safety zone around the vehicle.
- a known autonomous driving method may be employed to drive the vehicle.
- the control of the vehicle may include locking and unlocking of the door and turning-on and turning-off of the engine.
- the driving unit 104 is means for driving the autonomous vehicle 100 according to a command issued by the travel control part 1033 .
- the driving unit 104 includes, for example, a motor and inverter for driving wheels, a brake, a steering system, and a secondary battery.
- the communication unit 105 serves as communication means for connecting the autonomous vehicle 100 to a network.
- the communication unit can communicate with other devices (e.g. the server apparatus 200 ) via a network using a mobile communication service based on e.g. 3G or LTE.
- the communication unit 105 may further include communication means for inter-vehicle communication with other autonomous vehicles 100 .
- the camera 106 is an on-vehicle camera provided on the body of the autonomous vehicle 100 .
- an imaging device using an image sensor such as a charge-coupled device (CCD), metal oxide semiconductor (MOS), or complementary metal oxide semiconductor (CMOS) sensor may be employed.
- FIG. 3 shows the outer appearance of the autonomous vehicle 100 .
- the autonomous vehicle 100 according to this embodiment has the on-vehicle camera 106 , which can capture images (still images or moving images).
- the camera 106 may be a visible light camera or an infrared camera. While FIG. 3 shows only one camera, the autonomous vehicle 100 may have a plurality of cameras 106 provided on different portions of the vehicle body. For example, cameras may be provided on the front, rear, and right and left sides of the vehicle body. Image capturing by the camera may be performed either in response to an image capturing command sent from the server apparatus 200 or periodically at regular intervals.
- the term “searched object” will be used hereinafter to refer to an object to be searched for by the person search system using the autonomous vehicles 100.
- the searched object includes a human being, such as a wanderer or a lost person.
- the searched object may be a creature, such as a pet in some cases.
- the server apparatus 200 is configured to manage the position of the running autonomous vehicles 100 and send operation commands to them. For example, when the server apparatus 200 receives a request for dispatching a taxi from a user, the server apparatus 200 is notified of the place of departure and the destination and sends an operation command to an autonomous vehicle 100 that is capable of serving as a taxi and running in the neighborhood of the place of departure.
- the server apparatus 200 has a communication unit 201 , a control unit 202 , a storage unit 203 , and a user interface 205 .
- the communication unit 201 is a communication interface for communication with autonomous vehicles 100 via a network, as with the communication unit 105 of the autonomous vehicle 100 .
- the control unit 202 is means for performing overall control of the server apparatus 200 .
- the control unit 202 is constituted by, for example, a CPU.
- the control unit 202 includes as functional modules a vehicle information management part 2021 , an operation command creation part 2022 , a human detection part 2023 , and a person determination part 2024 .
- These functional modules may be implemented by executing programs stored in storage means, such as a read only memory (ROM), by the CPU, neither of which is shown in the drawings.
- ROM read only memory
- the user interface 205 is input means through which a user can input information.
- the user interface 205 includes a display device that displays images, such as a liquid crystal display or an organic electro-luminescence display, and an input device, such as a mouse and/or a keyboard.
- the vehicle information management part 2021 manages a plurality of autonomous vehicles 100 that are under its management. Specifically, for example, the vehicle information management part 2021 receives positional information from the autonomous vehicles 100 at predetermined intervals and stores the information in association with the date and time in the storage unit 203 , which will be described later. Moreover, the vehicle information management part 2021 holds and updates data about features of the autonomous vehicles 100 , if necessary. This data will be hereinafter referred to as “vehicle information”. Examples of the vehicle information include, but are not limited to, the identification of each autonomous vehicle 100 , the service type, information about the location at which each vehicle is on standby (e.g. car shed or service office), the door type, the vehicle body size, the carrying capacity, the maximum number of passengers, the full charge driving range, the present (or remaining) driving range, and the present status (such as empty, occupied, running, or under operation etc.).
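- a sketch of one such vehicle-information record follows; the field names are assumptions that mirror the list above.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class VehicleInfo:
    vehicle_id: str                          # identification of the vehicle
    service_type: str                        # e.g. "taxi", "mobile_shop", "monitoring"
    standby_location: str                    # car shed or service office
    door_type: str
    body_size_m: Tuple[float, float, float]  # length, width, height
    carrying_capacity_kg: float
    max_passengers: int
    full_charge_range_km: float
    remaining_range_km: float
    status: str                              # "empty", "occupied", "running", ...
    last_position: Optional[Tuple[float, float]] = None  # latest GPS fix
    last_position_time: Optional[str] = None              # ISO 8601 timestamp
```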
- the operation command creation part 2022 determines the autonomous vehicle 100 to be dispatched and creates an operation command according to the request for operation. Examples of the request for operation include, but are not limited to, the following.
- the place to which an autonomous vehicle 100 is to be dispatched may be either a single place or multiple places. In the case of multiple places, a service may be provided at each of the places.
- Requests for operation are received from users via, for example, a network.
- the sender of a request for operation is not necessarily an ordinary user.
- the organization that provides the service with the autonomous vehicles 100 may send a request for operation.
- the autonomous vehicle 100 to which an operation command is to be sent is determined taking account of the positional information of the vehicles and the vehicle information (indicating what function each vehicle has) that the vehicle information management part 2021 has received.
- the human detection part 2023 analyzes images received from the autonomous vehicles 100 under operation according to operation commands to detect the presence of humans in the images.
- Human detection may be carried out using existing technologies.
- examples of the method employed include extracting specific shape patterns, such as front views and side views of faces, by template processing, detecting colors typical of human skin, detecting brightness edge information, and detecting motions typical of humans. Detecting the face portion is advantageous for identification of persons.
- Another method that can be employed is comparing a plurality of frames in a moving image (video) to extract a moving area therein and detecting a human image on the basis of the shape and motion of the moving area.
- other image processing technologies such as tracking and optical flow may be employed. Humans may be detected on the basis of portions of humans other than the face, clothes, or something that people carry.
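- as one concrete instance of these existing technologies, the sketch below uses OpenCV's stock HOG pedestrian detector; the library choice and parameters are illustrative, not part of the disclosure.

```python
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_humans(frame):
    """Return bounding boxes (x, y, w, h) of persons found in a BGR frame."""
    boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
    return [tuple(map(int, b)) for b in boxes]

frame = cv2.imread("camera_frame.jpg")         # a frame from the on-vehicle camera
if frame is not None:
    for (x, y, w, h) in detect_humans(frame):
        human_image = frame[y:y + h, x:x + w]  # crop passed on for determination
```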
- the human detection part 2023 stores detected human images in the storage unit 203 or sends them to the person determination part 2024 .
- the human image may be associated with positional information and a time stamp indicating the time of image capturing.
- the human images may be converted into features, which can reduce the data size and data transfer time.
- the person determination part 2024 determines whether or not the human image detected by the human detection part 2023 is an image of a searched object that satisfies a specific search condition.
- the search condition is a condition that is set to make a determination as to whether the person of the human image is a searched object. Examples of the search condition are as follows.
- (1) the server apparatus 200 receives information about a wanderer or a lost child from an external apparatus (not shown) and stores it in the storage unit 203 in advance.
- An example of the external apparatus is a server through which information about a missing person is distributed by an organization, such as the police, local government, hospital, school, lost child center, or news organization.
- the person determination part 2024 compares the information stored in the storage unit 203 and the information output from the human detection part 2023 , and if the degree of matching in features exceeds a predetermined threshold, the person determination part 2024 determines that the detected person is the searched object.
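- a minimal sketch of that comparison, assuming both the stored information and the detected human image have been converted into feature vectors; cosine similarity and the 0.8 threshold are illustrative choices.

```python
import numpy as np

def degree_of_matching(registered: np.ndarray, detected: np.ndarray) -> float:
    """Cosine similarity between two feature vectors, in [-1, 1]."""
    return float(np.dot(registered, detected) /
                 (np.linalg.norm(registered) * np.linalg.norm(detected)))

def matches_registered_information(registered, detected,
                                   threshold: float = 0.8) -> bool:
    return degree_of_matching(registered, detected) > threshold
```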
- (2) when the information received from the aforementioned external apparatus includes detailed information about a wanderer (e.g. information about patterns of his/her behavior and/or the address of his/her present or previous residence and/or workplace), the place where he or she tends to appear or a route that he or she tends to follow may be conjectured in some cases.
- (3) this search condition is used in searching for wanderers or lost children from a large amount of image data instead of searching for a specific person as in the above cases (1) and (2).
- This search condition can be used in the case where a moving image or a plurality of still images containing the same person captured at different times are available. Since wanderers or lost children show specific patterns of behavior in some cases, the person determination part 2024 is configured to determine that the detected person is a searched object if such a specific pattern of behavior is found.
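- one simple heuristic of this kind flags a person who keeps reappearing within a small area over a long period; the 30-minute window and 150 m radius below are invented thresholds, not values from the disclosure.

```python
from math import hypot

def shows_wandering_pattern(sightings,
                            min_minutes: float = 30.0,
                            max_radius_m: float = 150.0) -> bool:
    """sightings: time-ordered (t_minutes, x_m, y_m) tuples for one person."""
    if len(sightings) < 2:
        return False
    duration = sightings[-1][0] - sightings[0][0]
    cx = sum(s[1] for s in sightings) / len(sightings)   # centroid of positions
    cy = sum(s[2] for s in sightings) / len(sightings)
    radius = max(hypot(s[1] - cx, s[2] - cy) for s in sightings)
    return duration >= min_minutes and radius <= max_radius_m
```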
- the person determination part 2024 may determine that a person in an image is a searched object on the basis of a designation by a user through the user interface 205 .
- the user finds the searched object in an image displayed on the display device and designates him or her using a mouse or keyboard.
- the storage unit 203 is means for storing information, which is constituted by a storage medium, such as a RAM, a magnetic disc, or a flash memory.
- FIG. 4 is a diagram illustrating dataflow during a process in which the server apparatus 200 creates an operation command based on a request by a user and an autonomous vehicle 100 starts to operate.
- FIG. 4 shows a typical operation using an EV palette.
- a case in which the autonomous vehicle 100 runs in a road network shown in FIG. 5 will be described by way of example.
- the autonomous vehicle 100 sends positional information periodically to the server apparatus 200 .
- the autonomous vehicle 100 informs the server apparatus 200 of its location at node A, and the vehicle information management part 2021 stores the association of the autonomous vehicle 100 with node A as data in the storage unit 203 .
- the positional information is not necessarily positional information of a node itself.
- the positional information may be information that specifies a node or link.
- a link may be divided into a plurality of sections.
- the road network is not necessarily a network represented by nodes and links.
- the positional information is updated every time the autonomous vehicle 100 moves.
- the server apparatus 200 (specifically, the operation command creation part 2022) creates an operation command according to the request for operation (step S12).
- the operation command may designate a place of departure and a destination or only a destination. Alternatively, the operation command may designate a travel route.
- the operation command may include information about a task to be done or a service to be provided on the route.
- in step S13, the operation command creation part 2022 selects an autonomous vehicle 100 that is to provide the service. For example, the operation command creation part 2022 determines an autonomous vehicle 100 that can provide the requested service and can be delivered to the user within a predetermined time, with reference to the stored positional information and vehicle information of the autonomous vehicles 100.
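- a sketch of this selection step under simplifying assumptions: straight-line distance and a flat 25 km/h planning speed stand in for real routing and arrival-time estimation.

```python
from math import hypot
from typing import Iterable, Optional, Tuple

def select_vehicle(vehicles: Iterable[Tuple[str, str, Tuple[float, float]]],
                   service_type: str,
                   user_xy_km: Tuple[float, float],
                   max_minutes: float = 10.0) -> Optional[str]:
    """vehicles: (vehicle_id, service_type, (x_km, y_km)) triples."""
    best = None
    for vid, stype, pos in vehicles:
        if stype != service_type:                 # must offer the requested service
            continue
        dist_km = hypot(pos[0] - user_xy_km[0], pos[1] - user_xy_km[1])
        eta_min = dist_km / 25.0 * 60.0           # assumed 25 km/h planning speed
        if eta_min <= max_minutes and (best is None or eta_min < best[0]):
            best = (eta_min, vid)
    return best[1] if best else None
```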
- the server apparatus 200 sends an operation command to the selected autonomous vehicle 100 (step S14).
- in step S15, the autonomous vehicle 100 (specifically, the operation plan creation part 1031) creates an operation plan on the basis of the operation command it has received.
- the autonomous vehicle 100 creates an operation plan to the effect that the autonomous vehicle 100 is to travel along the route indicated by the solid arrows in FIG. 5 , pick up and drop off a person at node B and node C respectively, and return to node D.
- the operation plan thus created is sent to the travel control part 1033, and then the operation is started (step S16).
- Positional information is sent to the server apparatus 200 periodically during the operation also.
- while the operation command is created on the basis of a request for operation sent from an external source (i.e. a user) in the above-described case, the operation command does not necessarily have to be created on the basis of a request for operation sent from an external source.
- the server apparatus 200 may create an operation command autonomously.
- the creation of an operation plan does not necessarily have to be based on an operation command.
- the autonomous vehicle 100 may create an operation plan without receiving external instructions.
- an operation plan may be created by the server apparatus 200 .
- a person search process according to the first embodiment will now be described with reference to FIG. 6. This process relates to a case where a wanderer is searched for in response to an external request, and the aforementioned search condition (1) is applied.
- the server apparatus 200 creates an operation command in response to a request for operation sent from a user, though details of this process will not be described.
- an organization, such as the police or a local government, sends to the server apparatus 200 information about a wanderer (e.g. pictures of his/her face and clothes and/or features based on them) (step S21).
- the server apparatus 200, in response to a request for operation (not shown), creates a command to capture images of the vehicle's environment using the on-vehicle camera 106 as well as an operation command.
- the server apparatus 200 may narrow down the area in which the wanderer is expected to be located. The narrowing-down of the area may be performed using the information received from the aforementioned organization.
- if the wanderer carries a communication terminal, the positional information of that terminal may be used.
- the server apparatus 200 sends the operation command, including the command to capture images created as above, to an autonomous vehicle(s) 100 (step S24).
- the operation command may be sent to a plurality of autonomous vehicles 100 simultaneously.
- the command to capture images may specify the timing and cycle of image-capturing and designate either moving or still images.
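- for illustration, such a capture command could carry fields like the following; the names and defaults are assumptions.

```python
from dataclasses import dataclass

@dataclass
class CaptureCommand:
    mode: str = "still"        # "still" images or "moving" image (video)
    interval_s: float = 5.0    # capture cycle for still images
    start: str = "immediate"   # timing: e.g. "immediate" or "on_route_start"
```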
- the autonomous vehicle 100A or 100B, having received the operation command, creates an operation plan and starts the operation (steps S25 and S26), and captures images using the camera 106 (step S27). Then, the autonomous vehicle 100A or 100B sends the captured images to the server apparatus 200 with positional information.
- the server apparatus 200 performs human detection processing by the human detection part 2023 to detect a human image (step S28). Then, the server apparatus 200 performs person determination processing by the person determination part 2024 to determine whether or not the detected human image is an image of the searched object (step S29). If a person that matches the information about the wanderer sent from the organization is found, the image captured by the camera 106 is sent to the organization with positional information associated with the image (step S210).
- the system according to this embodiment can detect a searched object, such as a wanderer or lost child, using images captured by the camera 106 of the autonomous vehicles 100 under operation.
- the searched object can be found quickly. Since each autonomous vehicle 100 is required only to send images captured by the on-vehicle camera, the autonomous vehicle 100 does not need to suspend its operation according to an operation command.
- the system according to the second embodiment searches for a searched object and gives protection to the searched object on the basis of the result of person determination made by the server apparatus 200 .
- the control unit 202 of the server apparatus according to the second embodiment has discovery command creation means and protection command creation means, which will be described later.
- the discovery command creation means and the protection command creation means are functional modules similar to the vehicle information management part 2021 etc.
- FIG. 7 is a flow chart of an exemplary process according to the second embodiment. This process is executed by the server apparatus 200 after the end of the process according to the first embodiment. The process shown in FIG. 7 may be executed repeatedly at regular intervals.
- the searched object in the following description is assumed to be a wanderer for whom search is requested by an external organization.
- in step S31, the server apparatus 200 judges whether or not the wanderer has been found in images captured by the autonomous vehicles, on the basis of the result of the person determination processing (S29 in FIG. 6). If the wanderer has been found (Yes in S31), the server apparatus 200 determines the time at which the wanderer was detected last on the basis of the image capturing time of the images stored in the storage unit 203. Moreover, the server apparatus acquires positional information representing the location of capturing of the image in which the wanderer was detected last (step S32). If the wanderer has not been found (No in S31), the server apparatus 200 continues to judge whether or not the wanderer has been found in images captured by the autonomous vehicles.
- in step S33, the server apparatus 200 forecasts the area of activities of the wanderer on the basis of the time and location at which he or she was detected last. In cases where the same person has been detected multiple times, the forecast may be made taking account of the direction and the speed of movement of the wanderer. Then, the discovery command creation means sends a new operation command for discovering the wanderer to an autonomous vehicle(s) 100 located in or in the neighborhood of the forecast area of activities of the wanderer (step S34). When creating the command for discovery, the discovery command creation means may communicate with the autonomous vehicles 100 through the communication unit 201 to ask the autonomous vehicles 100 whether they can conduct the search, and assign search areas to the autonomous vehicles 100 that have answered in the affirmative according to their locations.
- the command for discovery specifies an area of movement or route of movement over or along which search is to be conducted.
- the command for discovery may include a command to make the frequency of image capturing by the camera 106 higher than usual. Consequently, image capturing will be conducted thoroughly over the forecast area of activities of the wanderer, leading to quick discovery.
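- a sketch of the activity-area forecast of step S33, assuming positions in kilometres: the search radius grows with the time since the last sighting at an assumed 4 km/h walking speed, and a known direction of movement shifts the centre of the area.

```python
from typing import Optional, Tuple

def forecast_activity_area(last_xy_km: Tuple[float, float],
                           minutes_since_seen: float,
                           walk_speed_kmh: float = 4.0,
                           velocity_kmh: Optional[Tuple[float, float]] = None):
    """Return (centre_xy_km, radius_km) of the area to be searched."""
    hours = minutes_since_seen / 60.0
    radius_km = walk_speed_kmh * hours
    cx, cy = last_xy_km
    if velocity_kmh is not None:        # bias the centre along the observed motion
        cx += velocity_kmh[0] * hours * 0.5
        cy += velocity_kmh[1] * hours * 0.5
    return (cx, cy), radius_km
```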
- in step S35, the human detection part 2023 and the person determination part 2024 of the server apparatus 200 determine whether or not the wanderer has been discovered on the basis of the images received from the autonomous vehicles 100. If it is determined that the wanderer has been discovered (Yes in S35), the process proceeds to step S36. If it is determined that the wanderer has not been discovered (No in S35), the process returns to step S33.
- in step S36, the protection command creation means commands the autonomous vehicle 100 that discovered the wanderer to move to the vicinity of the wanderer and to give protection to the wanderer.
- An example of the protection is inviting the wanderer to get on the autonomous vehicle 100 using a speaker or a display (not shown) of the autonomous vehicle 100 and transporting the wanderer to his or her home.
- the voice or image of a member of the wanderer's family or an acquaintance of the wanderer may be used in the aforementioned invitation.
- the autonomous vehicle 100 that discovered the wanderer may pursue the wanderer so as not to lose sight of him/her and/or communicate with another vehicle in the neighborhood to transfer the task of watching the wanderer to that vehicle.
- the autonomous vehicle 100 that discovered the wanderer may ask a policeman or the like in the vicinity of the wanderer for support.
- the system according to the second embodiment can give protection to the discovered wanderer by an autonomous vehicle 100 .
- a large number of autonomous vehicles 100 can be employed to search for or watch the wanderer. This enables quick discovery and protection of the wanderer.
- a plurality of autonomous vehicles cooperating with each other may replace the server apparatus 200 in the above-described embodiments to manage the system.
- one of the autonomous vehicles 100 may be configured to create operation commands.
- the autonomous vehicle 100 that serves as the director may be changed over according to the type of operation.
- the person search system according to the present disclosure can be constructed if at least one component of the system has the function of detecting a wanderer or the like on the basis of images captured by the autonomous vehicles 100 .
- Some of the functions of the server apparatus 200 in each embodiment may be provided by the autonomous vehicle 100 .
- detection of humans in images captured by the on-vehicle camera and person determination may be performed by the control unit of the autonomous vehicle 100 .
- the autonomous vehicle 100 can identify a wanderer without communicating with the server apparatus 200 . This enables quick discovery and leads to a reduction in the communication traffic.
Description
- This application is a continuation of U.S. patent application Ser. No. 16/225,622 filed on Dec. 19, 2018 which claims the benefit of Japanese Patent Application No. 2017-249853, filed on Dec. 26, 2017, which is hereby incorporated by reference herein in its entirety.
- The present disclosure relates to a server, an information processing method, and a vehicle.
- It is known in prior art to detect wanderers (e.g. persons with dementia) or lost children using images captured by surveillance cameras. For example, Patent Literature 1 in the following list discloses a technology of finding a lost child by detecting persons in images captured by surveillance cameras set in an amusement park or the like and comparing them with registered information of the lost child.
- Patent Literature 1: Japanese Patent Application Laid-Open No. 2015-102960
- Patent Literature 2: Japanese Patent Application Laid-Open No. 2015-092320
- In the system disclosed in Patent Literature 1, detection of persons relies on surveillance cameras set in an amusement park or the like. Therefore, it is not possible to detect a wanderer or a lost child to be found, if he or she is not present in the image capturing areas of the surveillance cameras.
- There are known autonomous driving traffic systems including autonomous driving mobile objects that communicate with each other via a network, as described in Patent Literature 2. However, such traffic systems have not been utilized to detect searched objects.
- The present disclosure has been made in view of the above circumstances. An object of the present disclosure is to provide a technology used to detect searched objects by using a system including mobile objects.
FIG. 7 is a flow chart of a process of protecting a searched object according to a second embodiment. - In the following, modes for carrying out the present disclosure will be specifically described as illustrative embodiments with reference to the drawings. The dimensions, materials, shapes, relative arrangements, and other features of the components that will be described in connection with the embodiments may be changed suitably depending on the structure of apparatuses to which the present disclosure is applied and other conditions. In other words, the following embodiments are not intended to limit the technical scope of the present disclosure.
- The outline of a person search system according to a first embodiment will be described with reference to
FIG. 1 . The person search system according to the first embodiment includes a plurality of autonomous vehicles 100A, 100B, 100C, . . . 100 n that can run autonomously on the basis of commands given thereto and aserver apparatus 200 that issues the commands. In the following, the autonomous vehicles will be collectively referred to asautonomous vehicles 100, when it is not necessary to distinguish individual vehicles. Theautonomous vehicles 100 are self-driving vehicles that provide predetermined services. Theserver apparatus 200 is an apparatus that performs management of the plurality ofautonomous vehicles 100. - The
autonomous vehicles 100 are multipurpose mobile objects that may have individually different functions. Typically theautonomous vehicles 100 are vehicles that can travel on the road autonomously without a human driver. Examples of theautonomous vehicles 100 include vehicles that travel along a predetermined route to pick up and drop off persons, on-demand taxis that operate on users' demand, and mobile shops that enable shop operation at a desired destination. In the case where theautonomous vehicles 100 are intended for transportation of passengers and/or goods, they may transport passengers and/or goods along a predetermined route. In the case where theautonomous vehicles 100 are intended for transportation of a shop, facility, or equipment, they may travel to a destination, and the shop, facility, or equipment may be prepared for operation at that place. Theautonomous vehicles 100 may be vehicles that patrol on the road for the purpose of monitoring facilities and/or infrastructures or preventing crimes. In that case, theautonomous vehicles 100 may be configured to travel along a predetermined patrol route. Theautonomous vehicles 100 that are powered by a battery are also called electric vehicle palettes (EV palettes). - In the case shown in
FIG. 1 , the autonomous vehicle 100A is a monitoring vehicle that travels along a predetermined patrol route, the autonomous vehicle 100B is a vehicle used to open a mobile shop at a certain location, and the autonomous vehicle 100C is an on-demand taxi that picks up and dropping off a user 70. A searchedobject 50 shown inFIG. 1 will be described later. - The
autonomous vehicles 100 are not required to be vehicles without humans. For example, a sales staff(s), a customer service attendant(s), or an operation monitoring crew(s) may be aboard theautonomous vehicle 100. Theautonomous vehicles 100 are not required to be vehicles that can run completely autonomously. For example, they may be vehicles that can be driven by a human driver or accept a human assistance in some circumstances. - Moreover, the
autonomous vehicles 100 have the functions of receiving requests by users, responding to the users, performing appropriate processing in response to the users' requests, and reporting the result of processing to the users. Theautonomous vehicles 100 may transfer the requests by users that they cannot fulfil by themselves to theserver apparatus 200 so as to fulfil them in cooperation with theserver apparatus 200. - The
server apparatus 200 is an apparatus that directs the operation of theautonomous vehicles 100. For example, in the case where an EV palette is used as an on-demand taxi, theserver apparatus 200 receives a request by a user 70 to get a location to which anautonomous vehicle 100 is to be dispatched and the user's destination. Then, theserver apparatus 200 sends to anautonomous vehicle 100 that is running in the neighborhood of the requested location an operation command to the effect that theautonomous vehicle 100 is to transport a person(s) from the place of departure to the destination with positional information and information specifying the user 70. Theserver apparatus 200 may also send to the autonomous vehicle 100A, which serves as a monitoring vehicle, an operation command to the effect that the autonomous vehicle 100A is to monitor streets while travelling along a predetermined route. Moreover, theserver apparatus 200 may send to the autonomous vehicle 100B, which is used as a mobile shop, a command to the effect that the autonomous vehicle 100B is to travel to a specific destination to open a shop at that place. As above, operation commands may specify operations to be done byautonomous vehicles 100 besides traveling. - The person search system according to the first embodiment also has the function of collecting images by the
autonomous vehicles 100 in order for theserver apparatus 200 to detect a wanderer or a lost child. For this purpose, theautonomous vehicles 100 capture images of their environment by means of imaging means and send the images to theserver apparatus 200. Positional information acquired by a GPS device associated with the images may be also sent to theserver apparatus 200. Theserver apparatus 200 detects a wanderer or a lost child using the images received from theautonomous vehicles 100. -
FIG. 2 is a block diagram showing an example of the configuration of theautonomous vehicle 100 and theserver apparatus 200 shown inFIG. 1 . - The
autonomous vehicle 100 is a vehicle that runs according to a command received from theserver apparatus 200. Specifically, theautonomous vehicle 100 creates a traveling route on the basis of the operation command received by wireless communication and travels on the road in an appropriate manner while sensing its environment. - The
autonomous vehicle 100 has asensor 101, a positional information acquisition unit 102, acontrol unit 103, a driving unit 104, acommunication unit 105, and acamera 106. Theautonomous vehicle 100 operates by electrical power supplied by a battery, which is not shown in the drawings. - The
sensor 101 is means for sensing the environment of the vehicle, which typically includes a stereo camera, a laser scanner, LIDAR, radar, or the like. Information acquired by thesensor 101 is sent to thecontrol unit 103. - The positional information acquisition unit 102 is means for acquiring the current position of the vehicle. The positional information acquisition unit 102 includes a GPS receiver, which receives a GPS signals from GPS satellites to acquire positional information (e.g. latitude, longitude, and altitude) on the basis of time information. Information acquired by the positional information acquisition unit 102 is sent to the
control unit 103. - The
control unit 103 is a computer that controls theautonomous vehicle 100 on the basis of the information acquired through thesensor 101. Thecontrol unit 103 is, for example, a microcomputer. Thecontrol unit 103 includes as functional modules an operation plan creation part 1031, an environment perceiving part 1032, and atravel control part 1033. These functional modules may be implemented by executing programs stored in storage means, such as a read only memory (ROM), by a central processing unit (CPU), neither of which is shown in the drawings. - The operation plan creation part 1031 receives an operation command from the
server apparatus 200 and creates an operation plan of the vehicle. In this embodiment, the operation plan is data that specifies a route along which the autonomous vehicle 100 is to travel and a task(s) to be done by the autonomous vehicle 100 in a part or the entirety of that route. Examples of data included in the operation plan are as follows.
- (1) Data that Specifies a Route Along which the Vehicle is to Travel by a Set of Road Links
- The route along which the vehicle is to travel may be created automatically from a given place of departure and a given destination with reference to map data stored in the storage means (not shown). Alternatively, the route may be created using an external service.
- (2) Data that Specifies a Task(s) to be Done by the Vehicle at a Certain Location(s) in the Route
- Examples of the tasks to be done by the vehicle include, but are not limited to, picking up and dropping off a person(s), loading and unloading goods, opening and closing a mobile shop, and collecting data.
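- For illustration only, the following sketch shows one way the operation plan described above could be represented as data: a route given as an ordered set of road links, plus tasks bound to locations on that route. The Task and OperationPlan names and their fields are assumptions made for this example, not structures defined by the present disclosure.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    """A task to be done at a certain location in the route (hypothetical fields)."""
    location: str   # node or link identifier, e.g. "node_B"
    action: str     # e.g. "pick_up", "drop_off", "open_shop", "collect_data"

@dataclass
class OperationPlan:
    """Operation plan: a route as a set of road links plus tasks along it."""
    route: List[str] = field(default_factory=list)   # ordered road-link IDs
    tasks: List[Task] = field(default_factory=list)

# Example: travel links L1..L3, picking up at node B and dropping off at node C.
plan = OperationPlan(
    route=["link_1", "link_2", "link_3"],
    tasks=[Task("node_B", "pick_up"), Task("node_C", "drop_off")],
)
```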
- The operation plan created by the operation plan creation part 1031 is sent to the
travel control part 1033, which will be described later. - The environment perceiving part 1032 perceives the environment around the vehicle using the data acquired by the
sensor 101. What is perceived includes, but is not limited to, the number and the position of lanes, the number and the position of other vehicles present around the vehicle, the number and the position of obstacles (e.g. pedestrians, bicycles, structures, and buildings) present around the vehicle, the structure of the road, and road signs. What is perceived may include anything that is useful for autonomous traveling. - The environment perceiving part 1032 may track a perceived object(s). For example, the environment perceiving part 1032 may calculate the relative speed of the object from the difference between the coordinates of the object determined in a previous step and the current coordinates of the object.
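- As a minimal sketch of the relative-speed calculation mentioned above, the following assumes planar coordinates in metres and a known time step between perception cycles; these units are illustrative assumptions.

```python
def relative_speed(prev_xy, curr_xy, dt_s):
    """Relative speed (m/s) of a tracked object from two coordinate samples.

    prev_xy: (x, y) position in metres from the previous perception step;
    curr_xy: (x, y) position from the current step; dt_s: seconds between steps.
    """
    dx = curr_xy[0] - prev_xy[0]
    dy = curr_xy[1] - prev_xy[1]
    return (dx ** 2 + dy ** 2) ** 0.5 / dt_s

# An object that moved 3 m east and 4 m north in 0.5 s is doing 10 m/s.
print(relative_speed((0.0, 0.0), (3.0, 4.0), 0.5))  # 10.0
```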
- The data relating to the environment acquired by the environment perceiving part 1032 is sent to the
travel control part 1033, which will be described below. This data will be hereinafter referred to as “environment data”. - The
travel control part 1033 controls the traveling of the vehicle on the basis of the operation plan created by the operation plan creation part 1031, the environment data acquired by the environment perceiving part 1032, and the positional information of the vehicle acquired by the positional information acquisition unit 102. For example, the travel control part 1033 causes the vehicle to travel along a specific route in such a way that obstacles will not enter a specific safety zone around the vehicle. A known autonomous driving method may be employed to drive the vehicle. The control of the vehicle may include locking and unlocking of the door and turning-on and turning-off of the engine.
- The driving unit 104 is means for driving the autonomous vehicle 100 according to a command issued by the travel control part 1033. The driving unit 104 includes, for example, a motor and inverter for driving wheels, a brake, a steering system, and a secondary battery.
- The communication unit 105 serves as communication means for connecting the autonomous vehicle 100 to a network. In this embodiment, the communication unit can communicate with other devices (e.g. the server apparatus 200) via a network using a mobile communication service based on e.g. 3G or LTE. The communication unit 105 may further include communication means for inter-vehicle communication with other autonomous vehicles 100.
- The camera 106 is an on-vehicle camera provided on the body of the autonomous vehicle 100. As the camera 106, an imaging device using an image sensor such as a charge-coupled device (CCD), metal oxide semiconductor (MOS), or complementary metal oxide semiconductor (CMOS) sensor may be employed.
- FIG. 3 shows the outer appearance of the autonomous vehicle 100. As shown in FIG. 3, the autonomous vehicle 100 according to this embodiment has the on-vehicle camera 106, which can capture images (still images or moving images). The camera 106 may be a visible light camera or an infrared camera. While FIG. 3 shows only one camera, the autonomous vehicle 100 may have a plurality of cameras 106 provided on different portions of the vehicle body. For example, cameras may be provided on the front, rear, and right and left sides of the vehicle body. Image capturing by the camera may be performed either in response to an image capturing command sent from the server apparatus 200 or periodically at regular intervals. In the following, the term “searched object” will be used to refer to an object to be searched for by the person search system using the autonomous vehicles 100. The searched object includes a human being, such as a wanderer or a lost person. The searched object may be a creature, such as a pet, in some cases.
- Next, the
server apparatus 200 will be described. - The
server apparatus 200 is configured to manage the position of the running autonomous vehicles 100 and send operation commands to them. For example, when the server apparatus 200 receives a request for dispatching a taxi from a user, the server apparatus 200 is notified of the place of departure and the destination and sends an operation command to an autonomous vehicle 100 that is capable of serving as a taxi and running in the neighborhood of the place of departure.
- The server apparatus 200 has a communication unit 201, a control unit 202, a storage unit 203, and a user interface 205.
- The communication unit 201 is a communication interface for communication with autonomous vehicles 100 via a network, as with the communication unit 105 of the autonomous vehicle 100.
- The control unit 202 is means for performing overall control of the server apparatus 200. The control unit 202 is constituted by, for example, a CPU.
- The
control unit 202 includes as functional modules a vehicle information management part 2021, an operation command creation part 2022, a human detection part 2023, and a person determination part 2024. These functional modules may be implemented by executing programs stored in storage means, such as a read only memory (ROM), by the CPU, neither of which is shown in the drawings.
- The user interface 205 is input means through which a user can input information. The user interface 205 includes a display device that displays images, such as a liquid crystal display or an organic electro-luminescence display, and an input device, such as a mouse and/or a keyboard.
- The vehicle information management part 2021 manages a plurality of autonomous vehicles 100. Specifically, for example, the vehicle information management part 2021 receives positional information from the autonomous vehicles 100 at predetermined intervals and stores the information in association with the date and time in the storage unit 203, which will be described later. Moreover, the vehicle information management part 2021 holds and updates data about features of the autonomous vehicles 100, if necessary. This data will be hereinafter referred to as “vehicle information”. Examples of the vehicle information include, but are not limited to, the identification of each autonomous vehicle 100, the service type, information about the location at which each vehicle is on standby (e.g. car shed or service office), the door type, the vehicle body size, the carrying capacity, the maximum number of passengers, the full charge driving range, the present (or remaining) driving range, and the present status (such as empty, occupied, running, or under operation).
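- A record like the following could hold the vehicle information listed above; this is a sketch for illustration, and the field names, types, and example values are assumptions rather than a format specified by the disclosure.

```python
from dataclasses import dataclass

@dataclass
class VehicleInfo:
    """Illustrative per-vehicle record held by the vehicle information management part 2021."""
    vehicle_id: str            # identification of the autonomous vehicle 100
    service_type: str          # e.g. "taxi", "mobile_shop", "patrol"
    standby_location: str      # e.g. car shed or service office
    door_type: str
    body_size_m: float
    carrying_capacity_kg: float
    max_passengers: int
    full_charge_range_km: float
    remaining_range_km: float
    status: str                # e.g. "empty", "occupied", "running"

info = VehicleInfo("EV-0001", "taxi", "service_office_A", "sliding",
                   4.8, 500.0, 4, 200.0, 150.0, "empty")
```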
- When a request for operation of an autonomous vehicle 100 is received from outside, the operation command creation part 2022 determines the autonomous vehicle 100 to be dispatched and creates an operation command according to the request for operation. Examples of the request for operation include, but are not limited to, the following.
- A request for transportation of a passenger(s) or goods, with designation of a place of departure and a destination or a route to be followed.
- A request for dispatch of an autonomous vehicle 100 that is capable of functioning as a shop (e.g. eating house, sales booth, or showcase), an office of a business entity (e.g. private office or service office), or a public facility (e.g. branch of a city office, library, or clinic). The place to which an autonomous vehicle 100 is to be dispatched may be either a single place or multiple places. In the case of multiple places, a service may be provided at each of the places.
- A request for patrol on the road for the purpose of monitoring facilities and/or infrastructures or preventing crimes.
- Requests for operation are received from users via, for example, a network. The sender of a request for operation is not necessarily an ordinary user. For example, the organization that provides the service with the
autonomous vehicles 100 may send a request for operation. - The
autonomous vehicle 100 to which an operation command is to be sent is determined taking account of the positional information of the vehicles and the vehicle information (indicating what function each vehicle has) that the vehicle information management part 2021 has received. - The human detection part 2023 analyzes images received from the
autonomous vehicles 100 under operation according to operation commands to detect the presence of humans in the images. Human detection may be carried out using existing technologies. For example, in the case where the face portion of humans is to be detected, examples of the method employed include extracting specific shape patterns such as front views and side views of faces by template processing, detecting colors typical of human skin, detecting brightness edge information, and detecting motions typical of human faces. Detecting the face portion is advantageous for identification of persons. Another method that can be employed is comparing a plurality of frames in a moving image (video) to extract a moving area therein and detecting a human image on the basis of the shape and motion of the moving area. Alternatively, other image processing technologies, such as tracking and optical flow, may be employed. Humans may also be detected on the basis of body parts other than the face, of clothes, or of something that people carry.
- The human detection part 2023 stores detected human images in the storage unit 203 or sends them to the person determination part 2024. The human image may be associated with positional information and a time stamp indicating the time of image capturing. The human images may be converted into features, which can reduce the data size and the data transfer time.
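- As one concrete instance of the existing technologies referred to above, a stock pedestrian detector can be run over each received frame. The sketch below uses OpenCV's pre-trained HOG person detector purely as a stand-in for whichever detection method the human detection part 2023 actually employs.

```python
import cv2  # OpenCV: pip install opencv-python

# Pre-trained HOG + linear SVM pedestrian detector shipped with OpenCV.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_humans(frame):
    """Return bounding boxes (x, y, w, h) of people found in a BGR image."""
    boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
    return list(boxes)

frame = cv2.imread("camera_frame.jpg")  # a frame received from a vehicle
if frame is not None:
    for (x, y, w, h) in detect_humans(frame):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
```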
- The person determination part 2024 determines whether or not the human image detected by the human detection part 2023 is an image of a searched object, i.e. whether it satisfies a specific search condition. The search condition is a condition set to determine whether the person in the human image is a searched object. Examples of the search condition are as follows.
- (1) Matching of Features of the Human Image with Features of a Searched Object Stored in the Storage Unit 203.
- In the case where this search condition is used, the
server apparatus 200 receives information about a wanderer or a lost child from an external apparatus (not shown) and stores it in the storage unit 203 in advance. An example of the external apparatus is a server through which information about a missing person is distributed by an organization, such as the police, local government, hospital, school, lost child center, or news organization. The person determination part 2024 compares the information stored in the storage unit 203 and the information output from the human detection part 2023, and if the degree of matching in features exceeds a predetermined threshold, the person determination part 2024 determines that the detected person is the searched object. - (2) Matching of the Behavior of the Detected Person with Features of a Searched Object Stored in the Storage Unit 203.
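- The comparison described above reduces to a similarity test against a threshold. The sketch below assumes the features are fixed-length numeric vectors and uses cosine similarity; both the measure and the threshold value are illustrative assumptions, since the disclosure only speaks of a predetermined threshold.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity of two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

MATCH_THRESHOLD = 0.85  # assumed value for illustration

def is_searched_object(stored_features, detected_features):
    """True if the detected person matches the stored wanderer features."""
    return cosine_similarity(stored_features, detected_features) > MATCH_THRESHOLD
```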
- In the case where the information received from the aforementioned external apparatus includes detailed information about a wanderer (e.g. information about patterns of his/her behavior and/or the address of his/her present or previous residence and/or workplace), the place where he or she tends to appear or a route along which he or she tends to follow may be conjectured in some cases.
- (3) Matching of the Behavior of the Detected Person with a Pattern of Behavior Typical of a Wanderer or the Like
- This search condition is used in searching for wanderers or lost children from a large amount of image data instead of searching for a specific person as in the above cases (1) and (2). This search condition can be used in the case where a moving image or a plurality of still images containing the same person captured at different times are available. Since wanderers or lost children show specific patterns of behavior in some cases, the person determination part 2024 is configured to determine that the detected person is a searched object if such a specific pattern of behavior is found.
- Specifically, if two images contain the same person, that person is extracted from other images using tracking or other methods to acquire a pattern of behavior of that person. Examples of such specific patterns of behavior are as follows:
-
- moving at low speed or staying at one place for a long time (reasonable staying at an appropriate place, such as a bus stop, is excluded);
- incomprehensible behavior, such as walking a mountain trail at night.
- The person determination part 2024 may determine that a person in an image is a searched object on the basis of a designation by a user through the user interface 205. In this case, the user finds the searched object in an image displayed on the display device and designates he or she using a mouse or keyboard.
- Two or more of the above conditions may be employed in combination. Conditions other than the above may also be employed.
- The storage unit 203 is means for storing information, which is constituted by a storage medium, such as a RAM, a magnetic disc, or a flash memory.
- Processing that is performed by each of the above-described components will now be described.
FIG. 4 is a diagram illustrating dataflow during a process in which the server apparatus 200 creates an operation command based on a request by a user and an autonomous vehicle 100 starts to operate. FIG. 4 shows a typical operation using an EV palette. Here, a case in which the autonomous vehicle 100 runs in the road network shown in FIG. 5 will be described by way of example.
- The autonomous vehicle 100 sends positional information periodically to the server apparatus 200. For example, in the case shown in FIG. 5, the autonomous vehicle 100 informs the server apparatus 200 of its location at node A, and the vehicle information management part 2021 stores the association of the autonomous vehicle 100 with node A as data in the storage unit 203. The positional information is not necessarily positional information of a node itself. For example, the positional information may be information that specifies a node or link. A link may be divided into a plurality of sections. The road network is not necessarily a network represented by nodes and links. The positional information is updated every time the autonomous vehicle 100 moves.
- If a user sends a request for operation to the
server apparatus 200 by communication means (not shown) (step S11), the server apparatus 200 (specifically, the operation command creation part 2022) creates an operation command according to the request for operation (step S12). The operation command may designate a place of departure and a destination, or only a destination. Alternatively, the operation command may designate a travel route. The operation command may include information about a task to be done or a service to be provided on the route. Here, a case in which a request for transportation of a person from node B to node C is made will be described.
- In step S13, the operation command creation part 2022 selects an autonomous vehicle 100 that is to provide the service. For example, the operation command creation part 2022 determines an autonomous vehicle 100 that can provide the requested service and can be delivered to the user within a predetermined time, with reference to the stored positional information and vehicle information of the autonomous vehicles 100. Here, let us assume that the vehicle located at node A in FIG. 5 is selected. Consequently, the server apparatus 200 sends an operation command to the selected autonomous vehicle 100 (step S14).
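- A minimal sketch of this selection step is shown below, reusing the illustrative VehicleInfo record from earlier; the straight-line distance and fixed average speed stand in for a real arrival-time estimate and are assumptions of this example.

```python
def select_vehicle(vehicles, positions, departure_xy, service_type,
                   max_wait_s=600.0, avg_speed_mps=8.0):
    """Pick the closest empty vehicle of the right type that can arrive in time.

    vehicles: list of VehicleInfo; positions: dict vehicle_id -> (x_m, y_m);
    departure_xy: (x_m, y_m) of the place of departure. Returns None if no
    vehicle qualifies.
    """
    def eta_s(v):
        x, y = positions[v.vehicle_id]
        dist = ((x - departure_xy[0]) ** 2 + (y - departure_xy[1]) ** 2) ** 0.5
        return dist / avg_speed_mps

    candidates = [v for v in vehicles
                  if v.service_type == service_type
                  and v.status == "empty"
                  and eta_s(v) <= max_wait_s]
    return min(candidates, key=eta_s, default=None)
```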
- In step S15, the autonomous vehicle 100 (specifically, the operation plan creation part 1031) creates an operation plan on the basis of the operation command it has received. In the case described here, for example, the autonomous vehicle 100 creates an operation plan to the effect that the autonomous vehicle 100 is to travel along the route indicated by the solid arrows in FIG. 5, pick up and drop off a person at node B and node C respectively, and return to node D. The operation plan thus created is sent to the travel control part 1033, and then the operation is started (step S16). Positional information is sent to the server apparatus 200 periodically during the operation as well.
- While in the above-described case the operation command is created on the basis of a request for operation sent from an external source (i.e. a user), the operation command does not necessarily have to be created that way. For example, the server apparatus 200 may create an operation command autonomously. Moreover, the creation of an operation plan does not necessarily have to be based on an operation command. For example, in cases where an autonomous vehicle 100 performs patrol for the purpose of surveying streets, the autonomous vehicle 100 may create an operation plan without receiving external instructions. Alternatively, an operation plan may be created by the server apparatus 200.
- In the following, an exemplary process specific to the system according to the present disclosure will be described with reference to
FIG. 6, at times in comparison with the process shown in FIG. 4. This process relates to a case where a wanderer is searched for in response to an external request, and the aforementioned search condition (1) is applied. As in the case shown in FIG. 4, the server apparatus 200 creates an operation command in response to a request for operation sent from a user, though details of this process will not be described.
- An organization, such as the police or a local government, sends to the server apparatus 200 information about a wanderer (e.g. pictures of his/her face and clothes and/or features based on them) (step S21). In step S22, in response to a request for operation (not shown), the server apparatus 200 creates a command to capture images of the vehicle's environment using the on-vehicle camera 106 as well as an operation command. Before determining a target vehicle(s) in step S23, the server apparatus 200 may narrow down the area in which the wanderer is expected to be located. The narrowing-down of the area may be performed using the information received from the aforementioned organization. In this connection, in the case where the wanderer carries a portable terminal (e.g. smartphone) that can transmit positional information, the positional information of that terminal may be used.
- Then, the server apparatus 200 sends the operation command, including the command to capture images created as above, to an autonomous vehicle(s) 100 (step S24). As shown in FIG. 6, the operation command may be sent to a plurality of autonomous vehicles 100 simultaneously. In the case where an autonomous vehicle 100 is already operating according to an operation command, only the command to capture images may be sent to it. The command to capture images may specify the timing and cycle of image capturing and designate either moving or still images.
- The autonomous vehicle 100A or 100B having received the operation command creates an operation plan to start the operation (steps S25 and S26) and captures images using the camera 106 (step S27). Then, the autonomous vehicle 100A or 100B sends the captured images to the
server apparatus 200 with positional information. - The
server apparatus 200 performs human detection processing by the human detection part 2023 to detect a human image (step S28). Then, the server apparatus 200 performs person determination processing by the person determination part 2024 to determine whether or not the detected human image is an image of the searched object (step S29). If a person that matches the information about the wanderer sent from the organization is found, the image captured by the camera 106 is sent to the organization with the positional information associated with the image (step S210).
- As above, the system according to this embodiment can detect a searched object, such as a wanderer or a lost child, using images captured by the cameras 106 of the autonomous vehicles 100 under operation. Thus, the searched object can be found quickly. Since each autonomous vehicle 100 is required only to send images captured by its on-vehicle camera, the autonomous vehicle 100 does not need to suspend its operation according to an operation command.
- The system according to the second embodiment searches for a searched object and gives protection to the searched object on the basis of the result of the person determination made by the server apparatus 200. In the following description, the same components as those in the first embodiment will be denoted by the same reference signs to simplify the description. The control unit 202 of the server apparatus according to the second embodiment has discovery command creation means and protection command creation means, which will be described later. The discovery command creation means and the protection command creation means are functional modules similar to the vehicle information management part 2021 and the like.
- FIG. 7 is a flow chart of an exemplary process according to the second embodiment. This process is executed by the server apparatus 200 after the end of the process according to the first embodiment. The process shown in FIG. 7 may be executed repeatedly at regular intervals. In the following description, the searched object is assumed to be a wanderer for whom a search is requested by an external organization.
- In step S31, the
server apparatus 200 judges whether or not the wanderer has been found in the images captured by the autonomous vehicles, on the basis of the result of the person determination processing (S29 in FIG. 6). If the wanderer has been found (Yes in S31), the server apparatus 200 determines the time at which the wanderer was detected last on the basis of the image capturing time of the images stored in the storage unit 203. Moreover, the server apparatus acquires positional information representing the location of capturing of the image in which the wanderer was detected last (step S32). If the wanderer has not been found (No in S31), the server apparatus 200 continues to judge whether or not the wanderer has been found in images captured by the autonomous vehicles.
- In step S33, the server apparatus 200 forecasts the area of activities of the wanderer on the basis of the time and location at which he or she was detected last. In cases where the same person has been detected multiple times, the forecast may be made taking account of the direction and the speed of movement of the wanderer. Then, the discovery command creation means sends a new operation command for discovering the wanderer to an autonomous vehicle(s) 100 located in or in the neighborhood of the forecast area of activities of the wanderer (step S34). When creating the command for discovery, the discovery command creation means may communicate with the autonomous vehicles 100 through the communication unit 201 to ask the autonomous vehicles 100 whether they can conduct the search, and assign search areas to the autonomous vehicles 100 that have answered in the affirmative according to their locations.
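- A minimal sketch of the forecast described above: grow a search disc around the last detection with elapsed time, optionally drifting its centre along the last observed direction of movement. The pedestrian walking speed and the velocity inputs are assumptions for illustration.

```python
import time

def forecast_area(last_xy, last_t, velocity=(0.0, 0.0),
                  walk_speed_mps=1.2, now=None):
    """Return (center_x, center_y, radius_m) of the area to search.

    last_xy: (x_m, y_m) of the last detection; last_t: its UNIX time;
    velocity: last observed (vx, vy) in m/s if the person was seen twice.
    """
    now = time.time() if now is None else now
    elapsed_s = max(0.0, now - last_t)
    # Drift the centre along the observed heading; bound the radius by how
    # far a pedestrian could plausibly have walked since the last sighting.
    center_x = last_xy[0] + velocity[0] * elapsed_s
    center_y = last_xy[1] + velocity[1] * elapsed_s
    return center_x, center_y, walk_speed_mps * elapsed_s
```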
- The command for discovery specifies an area of movement or a route of movement over or along which the search is to be conducted. The command for discovery may include a command to make the frequency of image capturing by the camera 106 higher than usual. Consequently, image capturing will be conducted thoroughly over the forecast area of activities of the wanderer, leading to quick discovery.
- In step S35, the human detection part 2023 and the person determination part 2024 of the server apparatus 200 determine whether or not the wanderer has been discovered on the basis of the images received from the autonomous vehicles 100. If it is determined that the wanderer has been discovered (Yes in S35), the process proceeds to step S36. If it is determined that the wanderer has not been discovered (No in S35), the process returns to step S33. In step S36, the protection command creation means commands the autonomous vehicle 100 that discovered the wanderer to move to the vicinity of the wanderer and to give protection to the wanderer. An example of the protection is inviting the wanderer to get on the autonomous vehicle 100 using a speaker or a display (not shown) of the autonomous vehicle 100 and transporting the wanderer to his or her home. In order not to arouse the wanderer's suspicion, the voice or image of a member of the wanderer's family or an acquaintance of the wanderer may be used in the invitation. The autonomous vehicle 100 that discovered the wanderer may pursue the wanderer so as not to lose sight of him/her and/or communicate with another vehicle in the neighborhood to transfer the task of watching the wanderer to that vehicle. The autonomous vehicle 100 that discovered the wanderer may also ask a policeman or the like in the vicinity of the wanderer for support.
- As above, the system according to the second embodiment can give protection to the discovered wanderer by an
autonomous vehicle 100. Moreover, a large number of autonomous vehicles 100 can be employed to search for or watch the wanderer. This enables quick discovery and protection of the wanderer.
- In cases where the
control unit 103 of the autonomous vehicle 100 has sufficient information processing capability, a plurality of autonomous vehicles cooperating with each other may replace the server apparatus 200 in the above-described embodiments to manage the system. In that case, one of the autonomous vehicles 100 may be configured to create operation commands. Alternatively, the autonomous vehicle 100 that serves as the director may be changed over according to the type of operation. In other words, the person search system according to the present disclosure can be constructed if at least one component of the system has the function of detecting a wanderer or the like on the basis of images captured by the autonomous vehicles 100.
- Some of the functions of the server apparatus 200 in each embodiment may be provided by the autonomous vehicle 100. For example, detection of humans in images captured by the on-vehicle camera and person determination may be performed by the control unit of the autonomous vehicle 100. In that case, the autonomous vehicle 100 can identify a wanderer without communicating with the server apparatus 200. This enables quick discovery and leads to a reduction in communication traffic.