US11882542B2 - Monitoring device, tracking method, and non-transitory computer-readable medium


Info

Publication number: US11882542B2
Application number: US17/432,206 (US201917432206A)
Other versions: US20220346055A1 (en)
Authority: US (United States)
Prior art keywords: tracked, information, position information, monitoring device, timing
Inventor: Shinichi Miyamoto
Original and current assignee: NEC Corporation
Legal status: Active, expires
Events: application filed by NEC Corp; assignment of assignors interest to NEC CORPORATION (assignor: MIYAMOTO, SHINICHI); publication of US20220346055A1; application granted; publication of US11882542B2

Classifications

    • G01S 13/937: Radar or analogous systems specially adapted for anti-collision purposes of marine craft
    • H04W 64/006: Locating users or terminals for network management purposes, with additional information processing, e.g. for direction or speed determination
    • B63B 49/00: Arrangements of nautical instruments or navigational aids
    • G01S 13/58: Velocity or trajectory determination systems; sense-of-movement determination systems
    • G01S 13/589: Velocity or trajectory determination systems measuring the velocity vector
    • G01S 13/72: Radar-tracking systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04W 4/029: Location-based management or tracking services
    • H04W 4/40: Services specially adapted for vehicles, e.g. vehicle-to-pedestrian [V2P]

Definitions

  • The present disclosure relates to monitoring devices, tracking methods, and programs.
  • Monitoring systems that monitor ships sailing the ocean are used to help those ships navigate safely. Information on the positions of the ships identified by the monitoring systems is transmitted to each ship, which can be expected to keep the ships from colliding with each other.
  • Patent Literature 1 indicates that the position of a ship is recognized by use of a radar. Moreover, in one typically employed method of recognizing the positions of ships, a ship is provided with an automatic identification system (AIS), and a monitoring device collects AIS information to recognize the position of the ship. This AIS information includes such information as the name of the ship, its destination, and its current position.
  • AIS: automatic identification system
  • The positions of ships are identified periodically by use of radars.
  • AIS information is likewise transmitted from the ships to a monitoring system periodically, so the positions of the ships are also identified only periodically when AIS information is used.
  • Because the positions of the ships can be identified only periodically, the monitoring system cannot recognize the path along which a given ship has moved from the point where its position was identified at one timing to the point where its position was identified at the next timing. This poses a problem: the monitoring system cannot track ships accurately when it tracks them based on position information collected by use of radars and AIS information.
  • The present disclosure is directed to providing a monitoring device, a tracking method, and a program that can improve the accuracy of tracking ships.
  • A monitoring device includes a position information acquiring unit configured to acquire position information of an object to be tracked; an estimating unit configured to estimate a position of the object to be tracked displayed in an image captured of a predetermined region; and an information generating unit configured to generate path information of the object to be tracked by supplementing, based on an estimated position of the object to be tracked, the position of the object to be tracked between the position indicated by the position information acquired at a first timing and the position indicated by the position information acquired at a second timing later than the first timing.
  • A tracking method includes acquiring position information of an object to be tracked; estimating a position of the object to be tracked displayed in an image captured of a predetermined region; and generating path information of the object to be tracked by supplementing, based on an estimated position of the object to be tracked, the position of the object to be tracked between the position indicated by the position information acquired at a first timing and the position indicated by the position information acquired at a second timing later than the first timing.
  • A program causes a computer to execute acquiring position information of an object to be tracked; estimating a position of the object to be tracked displayed in an image captured of a predetermined region; and generating path information of the object to be tracked by supplementing, based on an estimated position of the object to be tracked, the position of the object to be tracked between the position indicated by the position information acquired at a first timing and the position indicated by the position information acquired at a second timing later than the first timing.
  • The present disclosure can thus provide a monitoring device, a tracking method, and a program that can improve the accuracy of tracking ships.
  • FIG. 1 is a configuration diagram of a monitoring device according to a first example embodiment.
  • FIG. 2 is a configuration diagram of a monitoring device according to a second example embodiment.
  • FIG. 3A illustrates a path of a ship according to the second example embodiment.
  • FIG. 3B illustrates a path of a ship according to the second example embodiment.
  • FIG. 4 illustrates paths of ships according to the second example embodiment.
  • FIG. 5 illustrates a flow of a process of generating path information according to the second example embodiment.
  • FIG. 6 illustrates a flow of a process performed when a ship to be tracked has disappeared from a captured image according to the second example embodiment.
  • FIG. 7 is a configuration diagram of a monitoring device according to each of the example embodiments.
  • The monitoring device 10 may be a computer device that operates by having a processor execute a program stored in a memory.
  • The monitoring device 10 may be, for example, a server device.
  • The monitoring device 10 includes a position information acquiring unit 11, an estimating unit 12, and an information generating unit 13.
  • The components of the monitoring device 10 may each be software or a module whose processing is performed as the processor executes a program stored in the memory.
  • Alternatively, the components of the monitoring device 10 may each be hardware, such as a circuit or a chip.
  • The position information acquiring unit 11 acquires position information of an object to be tracked.
  • The object to be tracked may be, for example, a means of transportation, such as a ship, a vehicle, or an aircraft; a person; or an animal.
  • The position information may be, for example, information on the position of the object to be tracked measured by use of the global positioning system (GPS) or the like.
  • GPS: global positioning system
  • The position information acquiring unit 11 may receive AIS information including the position information from a ship provided with an AIS.
  • Alternatively, the position information acquiring unit 11 may irradiate an object to be tracked with radio waves by use of a radar and measure the position of the object.
  • The position information acquiring unit 11 may acquire the position information of an object to be tracked regularly, periodically, or at any given timing.
  • Position information included in AIS information is information on the position acquired by the GPS or the like provided in a ship and includes information on the time at which the GPS or the like acquired that position.
  • The estimating unit 12 estimates the position of an object to be tracked displayed in an image captured of a predetermined region.
  • The predetermined region is a region that the monitoring device 10 monitors and may be, for example, a partial region of an ocean or a partial region within a city.
  • The estimating unit 12 captures an image of the predetermined region by use of a camera, for example.
  • An image generated by use of a camera may be a still image or a moving image.
  • To estimate the position of an object to be tracked displayed in an image may mean to convert the image coordinates or the camera coordinates of the object displayed in the image to world coordinates.
  • The image coordinates or the camera coordinates may be indicated by the position of a pixel, for example.
  • The world coordinates may be indicated by latitude and longitude, for example.
  • The estimating unit 12 may hold table information representing the correspondence between the world coordinates and the image coordinates or the camera coordinates, for example.
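The table-based coordinate conversion described above can be sketched as follows. This is an illustrative approximation, not the implementation disclosed in the patent: it fits a planar (affine) pixel-to-world mapping from three entries of such a correspondence table, and the names `fit_affine` and `pixel_to_world` are hypothetical. A real deployment would more likely use a full homography or per-camera calibration.

```python
def fit_affine(pixels, world):
    """Fit x = a*u + b*v + c (and likewise y) from three pixel-to-world
    correspondences, e.g. three entries of the held correspondence table."""
    (u1, v1), (u2, v2), (u3, v3) = pixels
    det = u1 * (v2 - v3) - v1 * (u2 - u3) + (u2 * v3 - u3 * v2)

    def solve(w1, w2, w3):
        # Cramer's rule for the 3x3 system [[u, v, 1]] @ [a, b, c] = w.
        a = (w1 * (v2 - v3) - v1 * (w2 - w3) + (w2 * v3 - w3 * v2)) / det
        b = (u1 * (w2 - w3) - w1 * (u2 - u3) + (u2 * w3 - u3 * w2)) / det
        c = (u1 * (v2 * w3 - v3 * w2) - v1 * (u2 * w3 - u3 * w2)
             + w1 * (u2 * v3 - u3 * v2)) / det
        return a, b, c

    xs = solve(*(x for x, _ in world))
    ys = solve(*(y for _, y in world))
    return xs, ys

def pixel_to_world(coeffs, u, v):
    """Convert an image pixel (u, v) to world coordinates (x, y)."""
    (ax, bx, cx), (ay, by, cy) = coeffs
    return (ax * u + bx * v + cx, ay * u + by * v + cy)
```

An affine fit is exact only if the mapped surface is planar and the camera distortion is negligible; over a small monitored sea region this is often an acceptable first-order model.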
  • The information generating unit 13 supplements, based on an estimated position of an object to be tracked, the position of the object to be tracked between its position indicated by position information acquired at a first timing and its position indicated by position information acquired at a second timing that is later than the first timing.
  • In other words, the information generating unit 13 generates path information of an object to be tracked by use of pieces of position information of the object acquired at different timings and the position information of the object estimated based on an image.
  • The path information may be information indicated by a line segment or the like on a display, or information indicating the positions of a given ship at given times.
  • The second timing is a timing that comes a predetermined period after the first timing.
  • The predetermined period may be, for example, several seconds or several minutes and may be determined, for example, by the system that acquires the position information.
  • The information generating unit 13 combines the position of an object to be tracked estimated based on an image captured between the first timing and the second timing with the positions of the object at the first timing and the second timing.
  • In this manner, the monitoring device 10 estimates, based on an image captured of an object to be tracked, the position of the object at a timing when its position information cannot be acquired by use of an AIS or a radar, for example.
  • The use of the monitoring device 10 can thus reduce the period in which the position information of an object to be tracked cannot be identified, and the accuracy of tracking can be improved.
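The supplementing described above can be sketched as a merge of sparse timed fixes with denser image-based estimates. This is a minimal illustration under assumed data shapes (each entry is `(timestamp, latitude, longitude)`); `build_path` is a hypothetical name, not part of the disclosure.

```python
def build_path(ais_fixes, camera_estimates):
    """Merge sparse AIS fixes with denser camera-derived estimates into one
    time-ordered path. AIS fixes are kept as-is; camera estimates supplement
    only the gaps between the first and the last measured fix."""
    merged = [(t, lat, lon, "ais") for t, lat, lon in ais_fixes]
    first, last = ais_fixes[0][0], ais_fixes[-1][0]
    ais_times = {t for t, _, _ in ais_fixes}
    for t, lat, lon in camera_estimates:
        # Supplement only between measured fixes, skipping instants that
        # an AIS fix already covers.
        if first < t < last and t not in ais_times:
            merged.append((t, lat, lon, "camera"))
    return sorted(merged)
```

The result corresponds to the path information of the first embodiment: measured positions at the first and second timings, with estimated positions filling the interval between them.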
  • The monitoring device 20 includes a radar information acquiring unit 21, an AIS information acquiring unit 22, an imaging unit 23, an estimating unit 24, and an information generating unit 25.
  • The radar information acquiring unit 21 and the AIS information acquiring unit 22 correspond to the position information acquiring unit 11 illustrated in FIG. 1.
  • The estimating unit 24 corresponds to the estimating unit 12 illustrated in FIG. 1.
  • The information generating unit 25 corresponds to the information generating unit 13 illustrated in FIG. 1.
  • The monitoring device 20 is used to monitor ships sailing an ocean.
  • The radar information acquiring unit 21 periodically irradiates a ship sailing in a monitored region with radio waves by use of a radar and measures the position of the ship based on the reflected waves.
  • The AIS information acquiring unit 22 receives AIS information regularly from a ship provided with an AIS.
  • The AIS information includes position information of the ship.
  • The monitoring device 20 identifies the position of a ship provided with an AIS based on the position information included in its AIS information.
  • The monitoring device 20 identifies the position of a ship that is not provided with an AIS based on information obtained by use of a radar (hereinafter referred to as radar information).
  • The monitoring device 20 may also identify the position of a ship provided with an AIS based on its radar information.
  • The imaging unit 23 may be a camera that captures an image of a monitored region over an ocean.
  • The imaging unit 23 may capture an image of the monitored region by use of a plurality of cameras or by use of a single camera.
  • The imaging unit 23 outputs image data obtained by capturing an image to the estimating unit 24.
  • The estimating unit 24 estimates the position of a ship included in image data. Specifically, the estimating unit 24 converts the image coordinates or the camera coordinates of a ship to world coordinates. Moreover, the estimating unit 24 estimates or identifies whether ships included in images captured at different timings are the same ship. Images captured at different timings may be different frame images, for example.
  • The estimating unit 24 may track a ship by use of a particle filter, for example.
  • A ship being tracked may become overlaid with another ship or an obstruction and cease to be included in a frame image.
  • In that case, the estimating unit 24 may estimate the position of the ship being tracked within the frame image by use of a particle filter.
  • The estimating unit 24 outputs, to the information generating unit 25, the world coordinates of a ship included in image data, that is, information on the latitude and longitude of the ship.
  • The estimating unit 24 may also output, to the information generating unit 25, information on the latitude and longitude of a ship estimated by use of a particle filter.
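The particle-filter tracking mentioned above might be sketched as a generic bootstrap filter over 2-D positions. This is a textbook formulation, not the patent's specific filter; the function names and parameters (`noise`, `meas_sigma`) are illustrative assumptions. Passing `measurement=None` models the occlusion case, where only blind prediction is possible.

```python
import math
import random

def particle_filter_step(particles, weights, measurement,
                         motion=(0.0, 0.0), noise=0.5, meas_sigma=2.0):
    """One predict/update/resample cycle of a bootstrap particle filter.
    particles: list of (x, y) hypotheses for the ship's position.
    measurement: the observed (x, y), or None while the ship is occluded."""
    # Predict: propagate each particle with the motion model plus noise.
    moved = [(x + motion[0] + random.gauss(0.0, noise),
              y + motion[1] + random.gauss(0.0, noise))
             for x, y in particles]
    if measurement is None:
        return moved, weights  # no update possible; keep estimating blindly
    # Update: reweight each particle by its closeness to the measurement.
    mx, my = measurement
    new_w = [w * math.exp(-((x - mx) ** 2 + (y - my) ** 2)
                          / (2.0 * meas_sigma ** 2))
             for (x, y), w in zip(moved, weights)]
    total = sum(new_w) or 1.0
    new_w = [w / total for w in new_w]
    # Resample: draw a fresh cloud in proportion to the weights.
    resampled = random.choices(moved, weights=new_w, k=len(moved))
    return resampled, [1.0 / len(moved)] * len(moved)

def estimate_position(particles, weights):
    """Weighted mean of the cloud: the estimated ship position."""
    return (sum(w * x for (x, _), w in zip(particles, weights)),
            sum(w * y for (_, y), w in zip(particles, weights)))
```

Run repeatedly with successive frame measurements, the cloud concentrates around the ship; during occlusion, the predict-only branch keeps a spreading estimate alive until the ship reappears.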
  • The information generating unit 25 receives the position information of a ship provided with an AIS from the AIS information acquiring unit 22.
  • The information generating unit 25 generates path information of a ship by combining position information received from the AIS information acquiring unit 22 with position information received from the estimating unit 24.
  • Path information may be information that represents the path of a single ship by connecting two points indicated by respective pieces of position information with the same line on a GUI. Alternatively, path information may be information stored with two pieces of position information associated with each other by the same ship ID.
  • To do so, the information generating unit 25 needs to combine position information of a given ship received from the AIS information acquiring unit 22 with position information of the same ship received from the estimating unit 24.
  • The information generating unit 25 may compare position information received from the AIS information acquiring unit 22 against position information estimated based on image data captured at the same timing as the timing when that position information was measured. If these pieces of position information match each other, or if the difference between them is within a predetermined range, the information generating unit 25 may identify or estimate that they belong to the same ship. That the difference is within a predetermined range means that the pieces of position information differ only to an extent that can be regarded as an error.
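The same-ship decision above can be sketched with a concrete distance test. This is an assumption-laden illustration: `same_ship` and the 50 m tolerance are hypothetical, and the equirectangular distance is merely one reasonable choice for a small monitored region.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def same_ship(ais_pos, estimated_pos, tolerance_m=50.0):
    """Treat an AIS fix and a camera-derived estimate as belonging to the
    same ship when they differ by no more than an error-sized tolerance.
    Positions are (latitude, longitude) in degrees; the distance uses an
    equirectangular approximation, adequate over a small monitored region."""
    (lat1, lon1), (lat2, lon2) = ais_pos, estimated_pos
    mean_lat = math.radians((lat1 + lat2) / 2.0)
    dx = math.radians(lon2 - lon1) * math.cos(mean_lat) * EARTH_RADIUS_M
    dy = math.radians(lat2 - lat1) * EARTH_RADIUS_M
    return math.hypot(dx, dy) <= tolerance_m
```

A difference of roughly ten metres passes the test, while a kilometre-scale difference does not, matching the "error-sized range" criterion.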
  • The estimating unit 24 or the information generating unit 25 may analyze image data representing an image captured by the imaging unit 23 and identify the name of a ship included in the image data. If only a part of the name of a ship included in the image data can be identified, the estimating unit 24 or the information generating unit 25 may identify the full name by referring to a database or the like that manages the names of ships and extracting a ship whose name includes the identified letters. The database may manage the names of ships that may sail through the monitored region.
  • The information generating unit 25 may then combine the position information of the ship included in the AIS information with the position information estimated based on the image data. Specifically, the information generating unit 25 may identify the name and the type of a ship based on image data captured by use of a ship-name reader camera installed at the entrance of a harbor. Furthermore, the position information of the ship estimated based on an image captured by the ship-name reader camera, information on the time at which the image was captured, and the identified name and type of the ship may be displayed in an image captured by a wide-range monitoring camera used to track ships.
  • The information generating unit 25 may use, as correction information, the difference between position information received from the AIS information acquiring unit 22 and position information estimated based on image data captured at the same timing as the timing when that position information was measured.
  • The information generating unit 25 may then modify or correct position information received from the estimating unit 24 by use of this difference.
  • In other words, the information generating unit 25 may use the difference between the two pieces of position information as an offset value.
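The offset-value correction can be sketched directly. A minimal illustration, assuming positions are (latitude, longitude) pairs; the names `make_correction` and `apply_correction` are hypothetical.

```python
def make_correction(ais_pos, estimated_pos):
    """Correction information: the AIS-measured position minus the position
    estimated from an image captured at the same timing."""
    return (ais_pos[0] - estimated_pos[0], ais_pos[1] - estimated_pos[1])

def apply_correction(estimated_pos, offset):
    """Shift a later camera-only estimate by the stored offset."""
    return (estimated_pos[0] + offset[0], estimated_pos[1] + offset[1])
```

In terms of FIG. 3B, `make_correction` computes the (FF, ff) minus (EE, ee) offset, and `apply_correction` turns a later estimate (GG, gg) into the corrected position (HH, hh).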
  • FIG. 3A illustrates path information generated by the information generating unit 25.
  • The information generating unit 25 acquires, from the AIS information acquiring unit 22, the position information of a given ship that includes the latitude AA and the longitude aa, as well as the position information of the same ship that includes the latitude DD and the longitude dd.
  • This ship is moving from the point where the latitude is AA and the longitude is aa to the point where the latitude is DD and the longitude is dd.
  • The information generating unit 25 supplements position information received from the estimating unit 24 as the route or path along which the ship has moved in the period from the timing when the latitude AA and the longitude aa were measured to the timing when the latitude DD and the longitude dd were measured.
  • Specifically, the information generating unit 25 supplements the position information that includes the latitude BB and the longitude bb and the position information that includes the latitude CC and the longitude cc as the position information of the ship in that period.
  • FIG. 3A illustrates a screen image to be output to a display unit, such as a display. For example, as illustrated in FIG. 3A, the latitude and the longitude of a ship may be displayed in a balloon next to the ship.
  • FIG. 3B illustrates a case where position information estimated based on image data captured at the same timing as the timing when the position information that includes the latitude FF and the longitude ff included in the AIS information was measured indicates that the latitude is EE and the longitude is ee.
  • The ships indicated by the dashed lines represent the positions of that ship estimated based on image data.
  • In this case, the difference between the position where the latitude is FF and the longitude is ff and the position where the latitude is EE and the longitude is ee may be used as correction information or an offset value.
  • The ships indicated by the dashed lines need not be displayed on the display unit, such as a display; merely the ship located at the position where the latitude is FF and the longitude is ff may be displayed. Thereafter, when the ship is estimated, based on the image data, to be located at the position where the latitude is GG and the longitude is gg, the estimated position may be corrected, by use of the offset value, to the position where the latitude is HH and the longitude is hh, as illustrated in FIG. 3B.
  • In this case too, the ship located at the position where the latitude is GG and the longitude is gg need not be displayed on the display unit; merely the ship located at the corrected position where the latitude is HH and the longitude is hh may be displayed.
  • In FIG. 4, the pentagonal objects indicate ships mapped onto a map based on position information received from the AIS information acquiring unit 22.
  • The dashed lines between two ships each indicate the path between two pieces of position information received from the AIS information acquiring unit 22.
  • In other words, these dashed lines represent the pieces of position information received from the estimating unit 24.
  • In this manner, the information generating unit 25 may display position information received from the AIS information acquiring unit 22 by use of an object representing a ship and display position information received from the estimating unit 24 by use of a dashed line.
  • The information generating unit 25 outputs the image data illustrated in FIG. 3A, 3B, or 4 to the display unit, such as a display.
  • FIGS. 3A and 3B each illustrate an image captured by the imaging unit 23 overlaid with position information identified by use of AIS information and position information estimated by the estimating unit 24.
  • FIG. 4 illustrates a map image overlaid with position information identified by use of AIS information and position information estimated by the estimating unit 24.
  • The image in FIG. 4 may be referred to as an AIS screen, for example.
  • FIGS. 3A, 3B, and 4 each illustrate an image that the information generating unit 25 generates by use of position information received from the AIS information acquiring unit 22.
  • Alternatively, the information generating unit 25 may use position information received from the radar information acquiring unit 21.
  • For example, the information generating unit 25 may identify the position of a ship that is not provided with an AIS based on position information received from the radar information acquiring unit 21 and generate the image data illustrated in FIG. 3A, 3B, or 4.
  • In that case, the image in FIG. 4 may be referred to as a radar screen.
  • The imaging unit 23 captures an image of a monitored region that includes a ship to be tracked (S11).
  • The estimating unit 24 estimates the position of the ship to be tracked included in the captured image (S12).
  • For example, the estimating unit 24 may convert the image coordinates of the ship to world coordinates based on a predetermined conversion table of the image coordinates and the world coordinates.
  • The information generating unit 25 then determines whether it has received AIS information related to the same ship as the ship whose position was estimated at step S12 (S13).
  • The AIS information includes the position information of the ship.
  • The information generating unit 25 may regard the two ships as the same ship if the position information included in the AIS information matches the position estimated by the estimating unit 24 or if the difference between these two positions falls within a predetermined range.
  • Alternatively, the information generating unit 25 may regard the two ships as the same ship if the name of the ship included in the AIS information is the same as the name of the ship displayed in the image data.
  • The name of the ship displayed in the image data may be identified through an image analyzing process or the like.
  • If the information generating unit 25 determines that it has acquired AIS information related to the same ship as the ship whose position was estimated at step S12, the information generating unit 25 generates correction information indicating the difference between the position information included in the AIS information and the position estimated at step S12 (hereinafter referred to as the estimated position) (S14).
  • The correction information may be rephrased as an offset value.
  • If the information generating unit 25 determines at step S13 that it has not received such AIS information, it determines whether it has acquired radar information related to the same ship as the ship whose position was estimated (S16).
  • The radar information includes the position information of the ship. If the information generating unit 25 determines that it has acquired radar information related to the same ship as the ship whose position was estimated, it generates correction information indicating the difference between the position information included in the radar information and the estimated position (S14).
  • The information generating unit 25 then outputs the position information included in the AIS information to the display unit and causes the display unit to display that position information (S15).
  • If the information generating unit 25 determines at step S16 that it has not acquired such radar information, it determines whether correction information indicating the difference between the estimated position and the position information included in the AIS information or the radar information is present (S17).
  • The correction information is present if the information generating unit 25 generated it previously, at a timing when it acquired the position information included in the AIS information or the radar information.
  • If the information generating unit 25 determines that the correction information is present, it outputs an estimated position corrected by use of the correction information to the display unit and causes the display unit to display the corrected estimated position (S18).
  • To correct the estimated position by use of the correction information may mean, for example, to move the latitude and the longitude of the estimated position by adding the correction information to, or subtracting it from, the estimated position.
  • If the information generating unit 25 determines that the correction information is not present, it outputs the estimated position to the display unit and causes the display unit to display the estimated position (S19).
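The display decision of steps S13 through S19 can be sketched as a single function. This is a simplified model of the flow, not the disclosed implementation: `choose_display_position` is a hypothetical name, and the same-ship matching of S13/S16 is assumed to have happened before the call (the matched fix, if any, is passed in).

```python
def choose_display_position(estimated_pos, ais_pos=None, radar_pos=None,
                            correction=None):
    """Mirror steps S13-S19: prefer a matching AIS (then radar) fix and
    derive a fresh correction from it; otherwise display the image-based
    estimate, shifted by the stored correction when one exists.
    Returns (position_to_display, correction_to_keep)."""
    measured = ais_pos if ais_pos is not None else radar_pos  # S13 / S16
    if measured is not None:
        # S14: correction = measured position minus estimated position.
        new_corr = (measured[0] - estimated_pos[0],
                    measured[1] - estimated_pos[1])
        return measured, new_corr                             # S15
    if correction is not None:                                # S17
        corrected = (estimated_pos[0] + correction[0],
                     estimated_pos[1] + correction[1])
        return corrected, correction                          # S18
    return estimated_pos, None                                # S19
```

Called once per frame, the function keeps the last correction alive across camera-only frames and refreshes it whenever a measured fix arrives.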
  • A ship to be tracked may disappear from a captured image, for example, when the ship to be tracked becomes overlaid with and hidden behind another ship or an obstruction. In other words, a ship to be tracked may disappear from a captured image when an occlusion occurs.
  • The estimating unit 24 recognizes that a ship to be tracked has disappeared from a captured image (S 21 ). For example, the estimating unit 24 may recognize that the ship to be tracked has disappeared if the ship being tracked by use of a particle filter ceases to be displayed in the captured image.
  • The information generating unit 25 determines whether the information generating unit 25 holds AIS information or radar information that includes information on the position of the ship to be tracked held at substantially the same timing as the timing when the image in which the ship to be tracked no longer appears has been captured (S 22 ). Substantially the same timing means a timing within a predetermined period from the timing when the image in which the ship to be tracked no longer appears has been captured. In other words, substantially the same timing may deviate from the aforementioned timing by an amount that can be regarded as an error.
  • If the information generating unit 25 determines that the information generating unit 25 holds the AIS information or the radar information that includes the information on the position of the ship to be tracked held at substantially the same timing as the timing when the image in which the ship to be tracked no longer appears has been captured, the information generating unit 25 executes the process at step S 23 .
  • The information generating unit 25 generates particles to be used to track the ship based on the position information included in the AIS information or the radar information (S 23 ). To generate particles may be rephrased as to disperse particles.
  • If the information generating unit 25 determines that the information generating unit 25 does not hold the AIS information or the radar information that includes the information on the position of the ship to be tracked held at substantially the same timing as the timing when the image in which the ship to be tracked no longer appears has been captured, the information generating unit 25 executes the process at step S 24 .
  • The information generating unit 25 generates particles to be used to track the ship based on the path of the ship to be tracked held up to this point (S 24 ).
  • At step S 23 , the particles are generated based on the confirmed position information of the ship. Therefore, the particles generated at step S 23 are distributed within a smaller range than the particles generated at step S 24 , where the position information is not available. Since the position information of the ship is not available at step S 24 , the possible area where the ship may be located is wider than at step S 23 , where the position information of the ship is available. In other words, the accuracy in tracking the ship by use of the particles generated at step S 23 is higher than the accuracy in tracking the ship by use of the particles generated at step S 24 . To rephrase, if a ship to be tracked disappears from a captured image, the area in which particles are dispersed can be narrowed by use of position information included in AIS information or radar information related to the ship to be tracked.
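The two dispersal cases (S 23 versus S 24) can be sketched as follows. The Gaussian spreads, the constant-velocity extrapolation, and all names are illustrative assumptions, not the patent's implementation; the point is only that a confirmed position allows a narrower dispersal range than path extrapolation.

```python
# Sketch of particle dispersal after a ship disappears from the image:
# narrow around a confirmed AIS/radar position (S 23), wide around a
# position extrapolated from the held path (S 24).
import random

def disperse_particles(n, center, spread):
    """Draw n particles around center=(lat, lon) with std. dev. spread."""
    return [(random.gauss(center[0], spread), random.gauss(center[1], spread))
            for _ in range(n)]

def particles_after_disappearance(n, ais_or_radar_pos=None, last_path=None):
    if ais_or_radar_pos is not None:
        # S 23: a confirmed position is held, so disperse in a narrow range.
        return disperse_particles(n, ais_or_radar_pos, spread=0.001)
    # S 24: no confirmed position; extrapolate from the path held so far
    # and disperse over a wider range to cover the larger possible area.
    (lat0, lon0), (lat1, lon1) = last_path[-2], last_path[-1]
    predicted = (2 * lat1 - lat0, 2 * lon1 - lon0)  # constant-velocity guess
    return disperse_particles(n, predicted, spread=0.01)
```

With the narrower spread at S 23, the particles concentrate near the true position, which is why tracking accuracy is higher there than at S 24.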
  • The monitoring device 20 can display the path of a ship more accurately by combining information on the position of the ship obtained from an AIS or a radar with position information of the ship estimated based on a captured image.
  • The monitoring device 20 can generate correction information for correcting the position of a ship estimated based on an image, by use of AIS information or radar information acquired at substantially the same timing as the timing when the image has been captured. Specifically, the monitoring device 20 adopts the difference between the position of a ship estimated based on an image and the position included in AIS information or radar information as an offset value. The monitoring device 20 corrects the position of a ship estimated based on an image by use of the offset value. With this configuration, the monitoring device 20 can improve the accuracy of an estimated position used as the position of an object to be tracked held at a timing when position information of the object to be tracked cannot be acquired based on AIS information or radar information.
  • FIG. 7 is a block diagram illustrating a configuration example of the monitoring device 10 or 20 (hereinafter, referred to as the monitoring device 10 or the like).
  • The monitoring device 10 or the like includes a network interface 1201 , a processor 1202 , and a memory 1203 .
  • The network interface 1201 is used to communicate with another network node device constituting a communication system.
  • The network interface 1201 may be used to carry out wireless communication.
  • The network interface 1201 may be used to carry out wireless LAN communication defined in IEEE 802.11 series or mobile communication defined in the 3rd Generation Partnership Project (3GPP).
  • The network interface 1201 may include a network interface card (NIC) compliant with IEEE 802.3 series, for example.
  • The processor 1202 reads out software (computer program) from the memory 1203 and executes the software. Thus, the processor 1202 implements the processes of the monitoring device 10 or the like described with reference to the flowcharts or sequences according to the foregoing example embodiments.
  • The processor 1202 may be a microprocessor, a microprocessing unit (MPU), or a central processing unit (CPU), for example.
  • The processor 1202 may include a plurality of processors.
  • The memory 1203 is constituted by a combination of a volatile memory and a non-volatile memory.
  • The memory 1203 may include a storage provided apart from the processor 1202 .
  • The processor 1202 may access the memory 1203 via an I/O interface (not illustrated).
  • The memory 1203 is used to store a set of software modules.
  • The processor 1202 can read out the set of software modules from the memory 1203 and execute the set of software modules.
  • The processor 1202 can carry out the processes of the monitoring device 10 or the like described according to the foregoing example embodiments.
  • Each of the processors included in the monitoring device 10 or the like executes one or more programs including a set of instructions for causing a computer to execute the algorithms described with reference to the drawings.
  • The program or programs can be stored by use of various types of non-transitory computer-readable media and provided to the computer.
  • The non-transitory computer-readable media include various types of tangible storage media. Examples of the non-transitory computer-readable media include a magnetic recording medium, a magneto-optical recording medium (e.g., a magneto-optical disk), a CD-ROM (read-only memory), a CD-R, a CD-R/W, and a semiconductor memory.
  • The magnetic recording medium may be a flexible disk, a magnetic tape, or a hard-disk drive, for example.
  • The semiconductor memory may be a mask ROM, a programmable ROM (PROM), an erasable PROM (EPROM), a flash ROM, or a random-access memory (RAM), for example.
  • The program or programs may also be supplied to the computer in the form of various types of transitory computer-readable media. Examples of the transitory computer-readable media include an electric signal, an optical signal, and an electromagnetic wave.
  • The transitory computer-readable media can supply the program or programs to the computer via a wired communication line, such as an electric wire or an optical fiber, or via a wireless communication line.
  • A monitoring device comprising:
  • The monitoring device according to Supplementary note 1, wherein the position information acquiring unit is configured to acquire the position information of the object to be tracked included in AIS information received from the object to be tracked provided with an AIS.
  • The monitoring device according to Supplementary note 1 or 2, wherein the position information acquiring unit is configured to acquire the position information of the object to be tracked periodically by use of a radar.
  • The monitoring device according to any one of Supplementary notes 1 to 3, wherein the information generating unit is configured to correct the estimated position of the object to be tracked by use of a difference between the position of the object to be tracked indicated by the position information and the position of the object to be tracked estimated at a timing when the position information has been acquired.
  • The monitoring device according to any one of Supplementary notes 1 to 4, wherein the estimating unit is configured to estimate the position of the object to be tracked by use of a particle filter.
  • The monitoring device according to any one of Supplementary notes 1 to 5, wherein the estimating unit is configured to limit a region in which the position of the object to be tracked is estimated to fall by use of the position information of the object to be tracked if the object to be tracked fails to be displayed in the image for a predetermined period.
  • The monitoring device according to any one of Supplementary notes 1 to 6, wherein the estimating unit is configured to estimate the position of the object to be tracked by converting image coordinates of the object to be tracked to world coordinates.
  • The monitoring device according to any one of Supplementary notes 1 to 7, wherein the information generating unit is configured to cause the path information to be displayed in the image.
  • The monitoring device according to Supplementary note 2, wherein the information generating unit is configured to cause the path information to be displayed on an AIS screen generated based on the AIS information.
  • The monitoring device according to Supplementary note 3, wherein the information generating unit is configured to cause the path information to be displayed on a radar screen generated based on information acquired by use of the radar.
  • A tracking method to be executed in a monitoring device comprising:
  • A non-transitory computer-readable medium storing a program that causes a computer to execute:

Abstract

The present disclosure is directed to providing a monitoring device that can improve the accuracy in tracking ships. A monitoring device (10) according to the present disclosure includes a position information acquiring unit (11) configured to acquire position information of an object to be tracked; an estimating unit (12) configured to estimate a position of the object to be tracked displayed in an image captured of a predetermined region; and an information generating unit (13) configured to generate path information of the object to be tracked by supplementing, based on an estimated position of the object to be tracked, the position of the object to be tracked held between the position indicated by the position information acquired at a first timing and the position indicated by the position information acquired at a second timing later than the first timing.

Description

This application is a National Stage Entry of PCT/JP2019/007234 filed on Feb. 26, 2019, the contents of which are incorporated herein by reference in their entirety.
TECHNICAL FIELD
The present disclosure relates to monitoring devices, tracking methods, and programs.
BACKGROUND ART
Monitoring systems that monitor ships sailing over the ocean are used to navigate the ships safely over the ocean. Information on the positions of the ships identified by the monitoring systems is transmitted to each ship, and this can be expected to keep the ships from colliding with each other.
Patent Literature 1 indicates that the position of a ship is recognized by use of a radar. Moreover, in one typically employed method of recognizing the positions of ships, a ship is provided with an automatic identification system (AIS), and a monitoring device collects AIS information to recognize the position of the ship. This AIS information includes such information as the name of the ship, its destination, and its current position.
CITATION LIST Patent Literature
  • Patent Literature 1: Japanese Unexamined Patent Application Publication No. 2001-186504
SUMMARY OF INVENTION Technical Problem
The positions of ships are identified periodically by use of radars. In addition, AIS information is transmitted from the ships to a monitoring system periodically, so the positions of the ships are identified periodically by use of the AIS information as well. However, if the positions of the ships can be identified only periodically, the path along which a given ship has moved from the point where its position has been identified at one timing to the point where its position has been identified at the next timing cannot be recognized. This poses a problem in that the monitoring system cannot track ships accurately when tracking the ships based on their position information collected by use of radars and AIS information.
The present disclosure is directed to providing a monitoring device, a tracking method, and a program that can improve the accuracy in tracking ships.
Solution to Problem
A monitoring device according to a first aspect of the present disclosure includes a position information acquiring unit configured to acquire position information of an object to be tracked; an estimating unit configured to estimate a position of the object to be tracked displayed in an image captured of a predetermined region; and an information generating unit configured to generate path information of the object to be tracked by supplementing, based on an estimated position of the object to be tracked, the position of the object to be tracked held between the position indicated by the position information acquired at a first timing and the position indicated by the position information acquired at a second timing later than the first timing.
A tracking method according to a second aspect of the present disclosure includes acquiring position information of an object to be tracked; estimating a position of the object to be tracked displayed in an image captured of a predetermined region; and generating path information of the object to be tracked by supplementing, based on an estimated position of the object to be tracked, the position of the object to be tracked held between the position indicated by the position information acquired at a first timing and the position indicated by the position information acquired at a second timing later than the first timing.
A program according to a third aspect of the present disclosure causes a computer to execute acquiring position information of an object to be tracked; estimating a position of the object to be tracked displayed in an image captured of a predetermined region; and generating path information of the object to be tracked by supplementing, based on an estimated position of the object to be tracked, the position of the object to be tracked held between the position indicated by the position information acquired at a first timing and the position indicated by the position information acquired at a second timing later than the first timing.
Advantageous Effects of Invention
The present disclosure can provide a monitoring device, a tracking method, and a program that can improve the accuracy in tracking ships.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a configuration diagram of a monitoring device according to a first example embodiment.
FIG. 2 is a configuration diagram of a monitoring device according to a second example embodiment.
FIG. 3A illustrates a path of a ship according to the second example embodiment.
FIG. 3B illustrates a path of a ship according to the second example embodiment.
FIG. 4 illustrates paths of ships according to the second example embodiment.
FIG. 5 illustrates a flow of a process of generating path information according to the second example embodiment.
FIG. 6 illustrates a flow of a process performed when a ship to be tracked has disappeared from a captured image according to the second example embodiment.
FIG. 7 is a configuration diagram of a monitoring device according to each of the example embodiments.
DESCRIPTION OF EMBODIMENTS First Example Embodiment
Hereinafter, some example embodiments of the present disclosure will be described with reference to the drawings. With reference to FIG. 1 , a configuration example of a monitoring device 10 according to a first example embodiment will be described. The monitoring device 10 may be a computer device that operates as a processor executes a program stored in a memory. The monitoring device 10 may be, for example, a server device.
The monitoring device 10 includes a position information acquiring unit 11, an estimating unit 12, and an information generating unit 13. The components of the monitoring device 10, such as the position information acquiring unit 11, the estimating unit 12, and the information generating unit 13, may each be software or a module that is processed as the processor executes a program stored in the memory. Alternatively, the components of the monitoring device 10 may each be hardware, such as a circuit or a chip.
The position information acquiring unit 11 acquires position information of an object to be tracked. The object to be tracked may be, for example, a means of transportation, such as a ship, a vehicle, or an aircraft; a person; or an animal. The position information may be, for example, information on the position of the object to be tracked measured by use of the global positioning system (GPS) or the like. For example, the position information acquiring unit 11 may receive AIS information including the position information from a ship provided with an AIS. Alternatively, the position information acquiring unit 11 may irradiate an object to be tracked with radio waves by use of a radar and measure the position of the object to be tracked. The position information acquiring unit 11 may acquire the position information of an object to be tracked regularly, periodically, or at any given timing.
Position information included in AIS information is information on the position acquired by the GPS or the like provided in a ship and includes information on the time at which the GPS or the like has acquired that information on the position.
The estimating unit 12 estimates the position of an object to be tracked displayed in an image captured of a predetermined region. The predetermined region is a region that the monitoring device 10 monitors and may be, for example, a partial region over an ocean or a partial region within a city. The estimating unit 12 captures an image of the predetermined region by use of a camera, for example. An image generated by use of a camera may be a still image or a moving image. To estimate the position of an object to be tracked displayed in an image may mean to convert the image coordinates or the camera coordinates of the object to be tracked displayed in the image to the world coordinates. The image coordinates or the camera coordinates may be indicated by the position of a pixel, for example. The world coordinates may be indicated by the latitude and the longitude, for example. The estimating unit 12 may hold table information representing the correspondence between the world coordinates and the image coordinates or the camera coordinates, for example.
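A table-based conversion of the kind described above can be sketched as follows. The calibration entries and the nearest-entry lookup are illustrative assumptions; a real deployment would interpolate between entries or use a camera homography.

```python
# Sketch of table information mapping image coordinates (pixels) to world
# coordinates (latitude, longitude); the entries below are invented.
CALIBRATION = {
    (0, 0): (35.00, 139.00),
    (640, 0): (35.00, 139.10),
    (0, 480): (34.90, 139.00),
    (640, 480): (34.90, 139.10),
}

def image_to_world(px, py):
    """Convert image coordinates to world coordinates via the nearest
    calibration entry (a real system would interpolate or fit a homography)."""
    nearest = min(CALIBRATION, key=lambda p: (p[0] - px) ** 2 + (p[1] - py) ** 2)
    return CALIBRATION[nearest]
```

The pixel position of a detected ship thus yields an estimated latitude and longitude, which is what the estimating unit passes downstream.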
The information generating unit 13 supplements, based on an estimated position of an object to be tracked, the position of the object to be tracked held between its position indicated by position information acquired at a first timing and its position indicated by position information acquired at a second timing that is later than the first timing. The information generating unit 13 generates path information of an object to be tracked by use of pieces of position information of the object to be tracked acquired at different timings and the position information of the object to be tracked estimated based on an image. The path information may be information indicated by a line segment or the like on a display or information indicating the positions of a given ship held at given times.
The second timing is a timing that comes a predetermined period past the first timing. The predetermined period may be, for example, several seconds or several minutes and may be determined, for example, by a system that acquires position information. The information generating unit 13 combines the position of an object to be tracked estimated based on an image captured between the first timing and the second timing with the positions of the object to be tracked held at the first timing and the second timing.
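The supplementing described above can be sketched as a merge of the two position sources into one time-ordered path. The tuple layout, timestamps, and source labels are illustrative assumptions.

```python
# Sketch of path generation: confirmed AIS/radar fixes at the first and
# second timings, supplemented with image-based estimates between them.

def build_path(confirmed, estimated):
    """Merge (time, lat, lon) samples from both sources into one path.

    confirmed: AIS/radar fixes, e.g. at the first and second timings.
    estimated: image-based positions captured between those timings.
    """
    samples = [(t, lat, lon, "confirmed") for t, lat, lon in confirmed]
    samples += [(t, lat, lon, "estimated") for t, lat, lon in estimated]
    return sorted(samples)  # chronological path of the object to be tracked

path = build_path(
    confirmed=[(0, 35.00, 139.00), (60, 35.04, 139.04)],   # first/second timing
    estimated=[(20, 35.01, 139.01), (40, 35.03, 139.02)],  # from captured images
)
```

The resulting path fills the gap between the two confirmed fixes with the image-based positions, which is the supplementing the information generating unit performs.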
As described above, the monitoring device 10 estimates, based on an image captured of an object to be tracked, the position of the object to be tracked held at a timing when the position information of the object to be tracked cannot be acquired by use of an AIS or a radar, for example. The use of the monitoring device 10 can help reduce the period in which the position information of an object to be tracked cannot be identified, and thus the accuracy in tracking can be improved.
Second Example Embodiment
Now, with reference to FIG. 2 , a configuration example of a monitoring device 20 according to a second example embodiment will be described. The monitoring device 20 includes a radar information acquiring unit 21, an AIS information acquiring unit 22, an imaging unit 23, an estimating unit 24, and an information generating unit 25. The radar information acquiring unit 21 and the AIS information acquiring unit 22 correspond to the position information acquiring unit 11 illustrated in FIG. 1 . The estimating unit 24 corresponds to the estimating unit 12 illustrated in FIG. 1 . The information generating unit 25 corresponds to the information generating unit 13 illustrated in FIG. 1 . Of the functions and operations of the monitoring device 20, the functions and operation similar to those of the monitoring device 10 illustrated in FIG. 1 will not be described in detail. According to the second example embodiment, the monitoring device 20 is used to monitor ships sailing over an ocean.
The radar information acquiring unit 21 irradiates a ship sailing in a monitored region periodically with radio waves by use of a radar and measures the position of the ship based on the reflected waves of the irradiating radio waves. Meanwhile, the AIS information acquiring unit 22 receives AIS information regularly from a ship provided with an AIS. The AIS information includes position information of the ship. The monitoring device 20 identifies the position of a ship provided with an AIS based on the position information included in its AIS information. Moreover, the monitoring device 20 identifies the position of a ship that is not provided with an AIS based on information obtained by use of a radar (hereinafter, referred to as radar information). The monitoring device 20 may also identify the position of a ship provided with an AIS based on its radar information.
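The two kinds of position reports handled here can be sketched with a small record type. The fields are illustrative assumptions; real AIS messages carry more (name, destination, and so on, as described for AIS information above), and a radar echo carries no ship identifier.

```python
# Sketch of a position report from either source handled by the
# monitoring device 20; field names are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PositionReport:
    time: float
    lat: float
    lon: float
    source: str                 # "ais" or "radar"
    mmsi: Optional[int] = None  # AIS ship identifier; radar echoes have none

def identify_by(report: PositionReport) -> str:
    # A ship provided with an AIS is identified from its AIS information;
    # a ship without one is identified from radar information only.
    return "ais" if report.mmsi is not None else "radar"
```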
The imaging unit 23 may be a camera that captures an image of a monitored region over an ocean. The imaging unit 23 may capture an image of a monitored region over an ocean by use of a plurality of cameras or by use of a single camera. The imaging unit 23 outputs image data obtained by capturing an image to the estimating unit 24.
The estimating unit 24 estimates the position of a ship included in image data. Specifically, the estimating unit 24 converts the image coordinates or the camera coordinates of a ship to the world coordinates. Moreover, the estimating unit 24 estimates or identifies whether ships included in images captured at different timings are the same ship. Images captured at different timings may be different frame images, for example.
The estimating unit 24 may track a ship by use of a particle filter, for example. In one possible case, a ship being tracked may become overlaid with another ship or an obstruction and cease to be included in a frame image. In such a case, the estimating unit 24 may estimate the position of the ship being tracked within the frame image by use of a particle filter. The estimating unit 24 outputs, to the information generating unit 25, the world coordinates of a ship included in image data, that is, information on the latitude and the longitude of the ship. Furthermore, the estimating unit 24 may output, to the information generating unit 25, information on the latitude and the longitude of a ship estimated by use of a particle filter.
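A single particle-filter update of the kind the estimating unit 24 may apply can be sketched as follows. The random-walk motion model, the Gaussian likelihood, and the shared noise value are illustrative assumptions, not the patent's implementation; during an occlusion the measurement is simply absent and the prediction alone carries the track.

```python
# Minimal particle-filter step: predict, weight against the observed image
# position (if any), and return a weighted-mean position estimate.
import math
import random

def particle_filter_step(particles, weights, measurement=None, noise=0.002):
    # Predict: jitter each particle (simple random-walk motion model).
    particles = [(lat + random.gauss(0, noise), lon + random.gauss(0, noise))
                 for lat, lon in particles]
    if measurement is not None:
        # Update: weight particles by closeness to the observed position.
        weights = [math.exp(-((lat - measurement[0]) ** 2 +
                              (lon - measurement[1]) ** 2) / (2 * noise ** 2))
                   for lat, lon in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
    # Estimate: weighted mean of the particles. With measurement=None
    # (occlusion), the prior weights and the motion model alone are used.
    lat = sum(w * p[0] for w, p in zip(weights, particles))
    lon = sum(w * p[1] for w, p in zip(weights, particles))
    return particles, weights, (lat, lon)
```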
The information generating unit 25 receives the position information of a ship provided with an AIS from the AIS information acquiring unit 22. The information generating unit 25 generates path information of a ship by combining position information received from the AIS information acquiring unit 22 and position information received from the estimating unit 24. Path information may be information that represents the path of a single ship by connecting two points indicated by respective pieces of position information with the same line on a GUI. Alternatively, path information may be information stored with two pieces of position information associated with each other by the same ship ID. The information generating unit 25 needs to combine position information of a given ship received from the AIS information acquiring unit 22 with position information of the same ship received from the estimating unit 24. For example, the information generating unit 25 may compare position information received from the AIS information acquiring unit 22 against position information estimated based on image data captured at the same timing as the timing when the position information received from the AIS information acquiring unit 22 has been measured. If these pieces of position information match with each other or if the difference between these pieces of position information is within a predetermined range, the information generating unit 25 may identify or estimate that these pieces of position information belong to the same ship. That the difference between these pieces of position information is within a predetermined range means that these pieces of position information differ only to an extent that can be regarded as an error.
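The same-ship determination described above can be sketched as a distance check. The threshold value is an illustrative assumption standing in for the "predetermined range" that can be regarded as an error.

```python
# Sketch of the same-ship determination: a position from AIS information and
# a position estimated from an image captured at the same timing belong to
# the same ship if they differ only within a predetermined range.

def is_same_ship(ais_pos, estimated_pos, threshold_deg=0.005):
    d_lat = ais_pos[0] - estimated_pos[0]
    d_lon = ais_pos[1] - estimated_pos[1]
    # A difference small enough to be regarded as an error means same ship.
    return d_lat * d_lat + d_lon * d_lon <= threshold_deg ** 2
```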
Alternatively, the estimating unit 24 or the information generating unit 25 may analyze image data representing an image captured by the imaging unit 23 and identify the name of a ship included in the image data. If the estimating unit 24 or the information generating unit 25 cannot identify a part of the name of a ship included in image data, the estimating unit 24 or the information generating unit 25 may refer to a database or the like that manages the names of ships, extract a ship whose name includes the letters that have been identified, and thus identify the name of the ship. The database may manage the names of ships that may sail through the monitored region. If the name of a ship identified by use of image data matches the name of a ship included in AIS information, the information generating unit 25 may combine the position information of the ship included in the AIS information with the position information estimated based on the image data. Specifically, the information generating unit 25 may identify the name and the type of a ship based on image data captured by use of a ship-name reader camera installed at an entrance of a harbor. Furthermore, position information of a ship estimated based on an image captured by the ship-name reader camera, information on the time at which the image has been captured, and the identified name and type of the ship may be displayed in an image captured by a wide-range monitoring camera used to track ships.
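Completing a partially read name against such a database can be sketched as follows. The ship names and the fragment-containment rule are invented for illustration; a real system might use edit distance or positional matching instead.

```python
# Sketch of completing a partially readable ship name against a database of
# ships that may sail through the monitored region; names are invented.
KNOWN_SHIPS = ["SAKURA MARU", "ASUKA II", "HIKARI MARU"]

def complete_name(partial_letters):
    """Return database names containing every letter sequence that was read.

    partial_letters: the fragments the image analysis could identify,
    e.g. ["SAK", "MARU"] when the middle of the name was unreadable.
    """
    return [name for name in KNOWN_SHIPS
            if all(fragment in name for fragment in partial_letters)]
```

A unique match identifies the ship; multiple matches would leave the name ambiguous until more letters can be read.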
The information generating unit 25 may use, as correction information, the difference between position information received from the AIS information acquiring unit 22 and position information estimated based on image data captured at the same timing as the timing when the position information received from the AIS information acquiring unit 22 has been measured. The information generating unit 25 may modify or correct position information received from the estimating unit 24 by use of the difference between the aforementioned pieces of position information. To rephrase, the information generating unit 25 may use the difference between the two pieces of position information as an offset value.
Now, with reference to FIG. 3A, path information to be generated by the information generating unit 25 will be described. FIG. 3A illustrates path information generated by the information generating unit 25. For example, the information generating unit 25 acquires, from the AIS information acquiring unit 22, the position information of a given ship that includes the latitude of AA and the longitude of aa as well as the position information of the same ship that includes the latitude of DD and the longitude of dd. In other words, this ship is moving from the point where the latitude is AA and the longitude is aa to the point where the latitude is DD and the longitude is dd.
The information generating unit 25 supplements position information received from the estimating unit 24 as the route or the path along which the ship has moved in a period from the timing when the latitude of AA and the longitude of aa have been measured to the timing when the latitude of DD and the longitude of dd have been measured. In other words, the information generating unit 25 supplements the position information that includes the latitude of BB and the longitude of bb and the position information that includes the latitude of CC and the longitude of cc as the position information of the ship held in a period from the timing when the latitude of AA and the longitude of aa have been measured to the timing when the latitude of DD and the longitude of dd have been measured. FIG. 3A illustrates a screen image to be output to a display unit, such as a display. For example, as illustrated in FIG. 3A, the latitude and the longitude of a ship may be displayed in a balloon next to the ship.
Meanwhile, FIG. 3B illustrates a case where position information estimated based on image data captured at the same timing as the timing when position information that includes the latitude of FF and the longitude of ff included in the AIS information has been measured indicates that the latitude is EE and the longitude is ee. The ships indicated by the dashed lines represent the positions of that ship estimated based on image data. In this case, the difference between the position where the latitude is FF and the longitude is ff and the position where the latitude is EE and the longitude is ee may be used as correction information or an offset value. The ships indicated by the dashed lines may not be displayed on the display unit, such as a display, and merely the ship located at the position where the latitude is FF and the longitude is ff may be displayed on the display unit. Thereafter, when the ship is estimated to be located at the position where the latitude is GG and the longitude is gg based on the image data, the estimated position where the latitude is GG and the longitude is gg may be corrected to the position where the latitude is HH and the longitude is hh by use of the offset value, as illustrated in FIG. 3B. The ship located at the position where the latitude is GG and the longitude is gg may not be displayed on the display unit, such as a display, and merely the ship located at the corrected position where the latitude is HH and the longitude is hh may be displayed on the display unit.
Now, with reference to FIG. 4 , path information different from those illustrated in FIGS. 3A and 3B will be described. The pentagonal-shaped objects indicate ships mapped onto a map based on position information received from the AIS information acquiring unit 22. The dashed lines between two ships each indicate the path between two pieces of position information received from the AIS information acquiring unit 22. These dashed lines indicate the pieces of position information received from the estimating unit 24. In this manner, the information generating unit 25 may display position information received from the AIS information acquiring unit 22 by use of an object representing a ship and display position information received from the estimating unit 24 by use of a dashed line.
The information generating unit 25 outputs the image data illustrated in FIG. 3A, 3B, or 4 to the display unit, such as a display. For example, FIGS. 3A and 3B each illustrate an image captured by the imaging unit 23 overlaid with position information identified by use of AIS information and position information estimated by the estimating unit 24. FIG. 4 illustrates a map image overlaid with position information identified by use of AIS information and position information estimated by the estimating unit 24. The image in FIG. 4 may be referred to as an AIS screen, for example.
FIGS. 3A, 3B, and 4 each illustrate an image that the information generating unit 25 generates by use of position information received from the AIS information acquiring unit 22. Alternatively, the information generating unit 25 may use position information received from the radar information acquiring unit 21. In other words, the information generating unit 25 may identify the position of a ship that is not provided with an AIS based on position information received from the radar information acquiring unit 21 and generate the image data illustrated in FIG. 3A, 3B, or 4. When representing position information identified based on radar information, the image in FIG. 4 may be referred to as a radar screen.
Next, with reference to FIG. 5 , a flow of a process of generating path information according to the second example embodiment will be described. First, the imaging unit 23 captures an image of a monitored region that includes a ship to be tracked (S11). Next, the estimating unit 24 estimates the position of the ship to be tracked included in the captured image (S12). For example, the estimating unit 24 may convert the image coordinates of the ship to the world coordinates based on a predetermined conversion table of the image coordinates and the world coordinates.
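The conversion at step S12 can be sketched as follows. The patent only states that a predetermined conversion table between image coordinates and world coordinates is used; a planar homography is one common way to realize such a table, so the matrix and its values below are illustrative assumptions, not the patent's actual table.

```python
# Sketch of converting image coordinates (u, v) to world coordinates, using a
# precomputed planar homography as a stand-in for the patent's "conversion
# table". The matrix entries are hypothetical placeholders.

H = [
    [1.0e-5, 0.0,    35.000],   # contributes to the latitude component
    [0.0,    1.0e-5, 139.000],  # contributes to the longitude component
    [0.0,    0.0,    1.0],
]

def image_to_world(u, v, h=H):
    """Apply the homography to pixel (u, v) and dehomogenize."""
    x = h[0][0] * u + h[0][1] * v + h[0][2]
    y = h[1][0] * u + h[1][1] * v + h[1][2]
    w = h[2][0] * u + h[2][1] * v + h[2][2]
    return (x / w, y / w)

# Estimate the world position of a ship detected at a given pixel.
lat, lon = image_to_world(640, 360)
```

In practice the homography would be calibrated from known reference points visible to the camera, and the table could equally be a per-pixel lookup.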
Next, the information generating unit 25 determines whether the information generating unit 25 has received AIS information related to the same ship as the ship whose position has been estimated at step S12 (S13). The AIS information includes the position information of the ship. The information generating unit 25 may regard the two ships as the same ship if the position information included in the AIS information matches the position estimated by the estimating unit 24 or if the difference between these two positions falls within a predetermined range. Alternatively, the information generating unit 25 may regard the two ships as the same ship if the name of the ship included in the AIS information is the same as the name of the ship displayed in image data. The name of the ship displayed in the image data may be identified through an image analyzing process or the like.
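The two association criteria above (position agreement within a predetermined range, or matching ship names) can be sketched as one predicate. The distance threshold and argument names are illustrative assumptions.

```python
import math

def same_ship(ais_pos, est_pos, ais_name=None, est_name=None, max_deg=0.001):
    """Regard an AIS report and an image-based detection as the same ship
    when their positions agree within a threshold (here an illustrative
    0.001 degrees), or when the ship names match."""
    d = math.hypot(ais_pos[0] - est_pos[0], ais_pos[1] - est_pos[1])
    if d <= max_deg:
        return True
    return ais_name is not None and ais_name == est_name
```

A production system would compare distances in meters (accounting for latitude) rather than raw degrees; the degree threshold here just keeps the sketch short.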
If the information generating unit 25 determines that the information generating unit 25 has acquired the AIS information related to the same ship as the ship whose position has been estimated at step S12, the information generating unit 25 generates correction information indicating the difference between the position information included in the AIS information and the position estimated at step S12 (hereinafter, referred to as an estimated position) (S14). The correction information may be rephrased as an offset value.
Meanwhile, if the information generating unit 25 determines at step S13 that the information generating unit 25 has not acquired the AIS information related to the same ship as the ship whose position has been estimated, the information generating unit 25 determines whether the information generating unit 25 has acquired radar information related to the same ship as the ship whose position has been estimated (S16). The radar information includes the position information of the ship. If the information generating unit 25 determines that the information generating unit 25 has acquired the radar information related to the same ship as the ship whose position has been estimated, the information generating unit 25 generates correction information indicating the difference between the position information included in the radar information and the estimated position (S14).
After step S14, the information generating unit 25 outputs the position information included in the AIS information to the display unit and causes the display unit to display the position information (S15).
If the information generating unit 25 determines at step S16 that the information generating unit 25 has not acquired the radar information related to the same ship as the ship whose position has been estimated, the information generating unit 25 determines whether the correction information indicating the difference between the estimated position and the position information included in the AIS information or the radar information is present (S17). Herein, the correction information is present if the information generating unit 25 has generated the correction information previously at a timing when the information generating unit 25 has acquired the position information included in the AIS information or the radar information.
If the information generating unit 25 determines that the correction information is present, the information generating unit 25 outputs an estimated position corrected by use of the correction information to the display unit and causes the display unit to display the corrected estimated position information (S18). To correct the estimated position by use of the correction information may mean to move the latitude and the longitude of the estimated position by adding or subtracting the correction information to or from the estimated position, for example. If the information generating unit 25 determines that the correction information is not present, the information generating unit 25 outputs the estimated position to the display unit and causes the display unit to display the estimated position (S19).
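The decision flow of steps S13 through S19 can be summarized as a single function. This is an illustrative sketch of the branching, with hypothetical argument names; positions are (latitude, longitude) pairs.

```python
def position_to_display(est_pos, ais_pos=None, radar_pos=None, offset=None):
    """Mirror steps S13-S19: return (position_to_display, new_offset).

    When an AIS or radar position is available, it is displayed and a fresh
    offset is derived from it and the estimate (S14-S15). Otherwise, a
    previously stored offset, if any, corrects the estimate (S17-S18);
    failing that, the raw estimate is displayed (S19)."""
    measured = ais_pos if ais_pos is not None else radar_pos
    if measured is not None:                        # S13 / S16
        new_offset = (measured[0] - est_pos[0],
                      measured[1] - est_pos[1])     # S14
        return measured, new_offset                 # S15
    if offset is not None:                          # S17
        return (est_pos[0] + offset[0],
                est_pos[1] + offset[1]), offset     # S18
    return est_pos, None                            # S19
```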
Now, with reference to FIG. 6 , a flow of a process performed when a ship to be tracked has disappeared from a captured image will be described. A ship to be tracked may disappear from a captured image, for example, when the ship to be tracked is overlaid with another ship and becomes hidden behind another ship or an obstruction. In other words, a ship to be tracked may disappear from a captured image when an occlusion occurs.
First, the estimating unit 24 recognizes that a ship to be tracked has disappeared from a captured image (S21). For example, the estimating unit 24 may recognize that the ship to be tracked has disappeared if the ship being tracked by use of a particle filter ceases to be displayed in the captured image.
Next, the information generating unit 25 determines whether the information generating unit 25 holds AIS information or radar information that includes information on the position of the ship to be tracked held at substantially the same timing as the timing when the image in which the ship to be tracked no longer appears has been captured (S22). Substantially the same timing refers to a timing that falls within a predetermined period from the timing when the image in which the ship to be tracked no longer appears has been captured. In other words, substantially the same timing may deviate from that timing by an amount small enough to be regarded as an error.
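The "substantially the same timing" check can be expressed as a simple timestamp comparison. The one-second tolerance below is an illustrative stand-in for the predetermined period; the patent does not specify a value.

```python
def substantially_same_timing(t_image, t_position, tolerance_s=1.0):
    """True when a position measurement (AIS or radar) falls within a
    predetermined period of the image capture time. Times are seconds,
    e.g. UNIX timestamps; the 1-second tolerance is hypothetical."""
    return abs(t_image - t_position) <= tolerance_s
```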
If the information generating unit 25 determines that the information generating unit 25 holds the AIS information or the radar information that includes the information on the position of the ship to be tracked held at substantially the same timing as the timing when the image in which the ship to be tracked no longer appears has been captured, the information generating unit 25 executes the process at step S23.
At step S23, the information generating unit 25 generates particles to be used to track the ship based on the position information included in the AIS information or the radar information (S23). To generate particles may be rephrased as to disperse particles.
Meanwhile, if the information generating unit 25 determines that the information generating unit 25 does not hold the AIS information or the radar information that includes the information on the position of the ship to be tracked held at substantially the same timing as the timing when the image in which the ship to be tracked no longer appears has been captured, the information generating unit 25 executes the process at step S24.
At step S24, the information generating unit 25 generates particles to be used to track the ship based on the path of the ship to be tracked held up to this point (S24).
At step S23, the particles are generated based on the confirmed position information of the ship. Therefore, the particles generated at step S23 are distributed within a smaller range than the particles generated at step S24, where the position information is not available. Since the position information of the ship is not available at step S24, the possible area in which the ship may be located is wider than when the position information is available at step S23. In other words, the accuracy in tracking the ship by use of the particles generated at step S23 is higher than the accuracy in tracking the ship by use of the particles generated at step S24. To rephrase, if a ship to be tracked disappears from a captured image, the area in which particles are dispersed can be narrowed by use of position information included in AIS information or radar information related to the ship to be tracked.
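The contrast between steps S23 and S24 can be sketched as two calls to one dispersion routine with different spread widths. The particle count and spread values are illustrative assumptions, not taken from the patent.

```python
import random

def disperse_particles(center, n=100, spread=0.0005):
    """Scatter n particles uniformly around a center (lat, lon) position.
    A narrow spread corresponds to S23 (AIS/radar position known); a wide
    spread corresponds to S24 (only the past path known)."""
    return [(center[0] + random.uniform(-spread, spread),
             center[1] + random.uniform(-spread, spread)) for _ in range(n)]

# S23: AIS/radar position available -> tight dispersion around it.
tight = disperse_particles((35.010, 139.020), spread=0.0005)

# S24: position unknown -> predict from the held path (hypothetical
# extrapolation) and disperse over a wider area.
predicted = (35.011, 139.021)
wide = disperse_particles(predicted, spread=0.005)
```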
As described above, the monitoring device 20 according to the second example embodiment can display the path of a ship more accurately by combining information on the position of the ship obtained from an AIS or a radar with position information of the ship estimated based on a captured image.
Furthermore, the monitoring device 20 can generate correction information for correcting the position of a ship estimated based on an image, by use of AIS information or radar information acquired at substantially the same timing as the timing when the image has been captured. Specifically, the monitoring device 20 adopts the difference between the position of a ship estimated based on an image and the position included in AIS information or radar information as an offset value. The monitoring device 20 corrects the position of a ship estimated based on an image by use of an offset value. With this configuration, the monitoring device 20 can improve the accuracy of an estimated position used as the position of an object to be tracked held at a timing when position information of the object to be tracked cannot be acquired based on AIS information or radar information.
FIG. 7 is a block diagram illustrating a configuration example of the monitoring device 10 or 20 (hereinafter, referred to as the monitoring device 10 or the like). With reference to FIG. 7 , the monitoring device 10 or the like includes a network interface 1201, a processor 1202, and a memory 1203. The network interface 1201 is used to communicate with another network node device constituting a communication system. The network interface 1201 may be used to carry out wireless communication. For example, the network interface 1201 may be used to carry out wireless LAN communication defined in IEEE 802.11 series or mobile communication defined in the 3rd Generation Partnership Project (3GPP). Alternatively, the network interface 1201 may include a network interface card (NIC) compliant with IEEE 802.3 series, for example.
The processor 1202 reads out software (computer program) from the memory 1203 and executes the software. Thus, the processor 1202 implements the processes of the monitoring device 10 or the like described with reference to the flowcharts or sequences according to the foregoing example embodiments. The processor 1202 may be a microprocessor, a microprocessing unit (MPU), or a central processing unit (CPU), for example. The processor 1202 may include a plurality of processors.
The memory 1203 is constituted by a combination of a volatile memory and a non-volatile memory. The memory 1203 may include a storage provided apart from the processor 1202. In this case, the processor 1202 may access the memory 1203 via an I/O interface (not illustrated).
In the example illustrated in FIG. 7 , the memory 1203 is used to store a set of software modules. The processor 1202 can read out the set of software modules from the memory 1203 and execute the set of software modules. Thus, the processor 1202 can carry out the processes of the monitoring device 10 or the like described according to the foregoing example embodiments.
As described with reference to FIG. 7 , each of the processors included in the monitoring device 10 or the like executes one or more programs including a set of instructions for causing a computer to execute the algorithms described with reference to the drawings.
In the foregoing examples, the program or programs can be stored by use of various types of non-transitory computer-readable media and provided to the computer. The non-transitory computer-readable media include various types of tangible storage media. Examples of the non-transitory computer-readable media include a magnetic recording medium, a magneto-optical recording medium (e.g., a magneto-optical disk), a CD-ROM (read-only memory), a CD-R, a CD-R/W, and a semiconductor memory. The magnetic recording medium may be a flexible disk, a magnetic tape, or a hard-disk drive, for example. The semiconductor memory may be a mask ROM, a programmable ROM (PROM), an erasable PROM (EPROM), a flash ROM, or a random-access memory (RAM), for example. The program or programs may also be supplied to the computer in the form of various types of transitory computer-readable media. Examples of the transitory computer-readable media include an electric signal, an optical signal, and an electromagnetic wave. The transitory computer-readable media can supply the program or programs to the computer via a wired communication line, such as an electric wire or an optical fiber, or via a wireless communication line.
It is to be noted that the present disclosure is not limited to the foregoing example embodiments, and modifications can be made as appropriate within the scope that does not depart from the technical spirit of the present disclosure. Moreover, the present disclosure may be implemented by combining the example embodiments, as appropriate.
A part or the whole of the foregoing example embodiments can also be expressed as in the following supplementary notes, but the following are not limiting.
(Supplementary Note 1)
A monitoring device comprising:
    • a position information acquiring unit configured to acquire position information of an object to be tracked;
    • an estimating unit configured to estimate a position of the object to be tracked displayed in an image captured of a predetermined region; and
    • an information generating unit configured to generate path information of the object to be tracked by supplementing, based on an estimated position of the object to be tracked, the position of the object to be tracked held between the position indicated by the position information acquired at a first timing and the position indicated by the position information acquired at a second timing later than the first timing.

(Supplementary Note 2)
The monitoring device according to Supplementary note 1, wherein the position information acquiring unit is configured to acquire the position information of the object to be tracked included in AIS information received from the object to be tracked provided with an AIS.
(Supplementary Note 3)
The monitoring device according to Supplementary note 1 or 2, wherein the position information acquiring unit is configured to acquire the position information of the object to be tracked periodically by use of a radar.
(Supplementary Note 4)
The monitoring device according to any one of Supplementary notes 1 to 3, wherein the information generating unit is configured to correct the estimated position of the object to be tracked by use of a difference between the position of the object to be tracked indicated by the position information and the position of the object to be tracked estimated at a timing when the position information has been acquired.
(Supplementary Note 5)
The monitoring device according to any one of Supplementary notes 1 to 4, wherein the estimating unit is configured to estimate the position of the object to be tracked by use of a particle filter.
(Supplementary Note 6)
The monitoring device according to Supplementary note 5, wherein the estimating unit is configured to limit a region in which the position of the object to be tracked is estimated to fall by use of the position information of the object to be tracked if the object to be tracked fails to be displayed in the image for a predetermined period.
(Supplementary Note 7)
The monitoring device according to any one of Supplementary notes 1 to 6, wherein the estimating unit is configured to estimate the position of the object to be tracked by converting image coordinates of the object to be tracked to world coordinates.
(Supplementary Note 8)
The monitoring device according to any one of Supplementary notes 1 to 7, wherein the information generating unit is configured to cause the path information to be displayed in the image.
(Supplementary Note 9)
The monitoring device according to Supplementary note 2, wherein the information generating unit is configured to cause the path information to be displayed on an AIS screen generated based on the AIS information.
(Supplementary Note 10)
The monitoring device according to Supplementary note 3, wherein the information generating unit is configured to cause the path information to be displayed on a radar screen generated based on information acquired by use of the radar.
(Supplementary Note 11)
A tracking method to be executed in a monitoring device, the tracking method comprising:
    • acquiring position information of an object to be tracked;
    • estimating a position of the object to be tracked displayed in an image captured of a predetermined region; and
    • generating path information of the object to be tracked by supplementing, based on an estimated position of the object to be tracked, the position of the object to be tracked held between the position indicated by the position information acquired at a first timing and the position indicated by the position information acquired at a second timing later than the first timing.

(Supplementary Note 12)
A non-transitory computer-readable medium storing a program that causes a computer to execute:
    • acquiring position information of an object to be tracked;
    • estimating a position of the object to be tracked displayed in an image captured of a predetermined region; and
    • generating path information of the object to be tracked by supplementing, based on an estimated position of the object to be tracked, the position of the object to be tracked held between the position indicated by the position information acquired at a first timing and the position indicated by the position information acquired at a second timing later than the first timing.
REFERENCE SIGNS LIST
    • 10 monitoring device
    • 11 position information acquiring unit
    • 12 estimating unit
    • 13 information generating unit
    • 20 monitoring device
    • 21 radar information acquiring unit
    • 22 AIS information acquiring unit
    • 23 imaging unit
    • 24 estimating unit
    • 25 information generating unit

Claims (11)

What is claimed is:
1. A monitoring device comprising:
at least one memory storing instructions, and
at least one processor configured to execute the instructions to:
acquire position information of an object to be tracked, the position information including first position information acquired at a first timing and second position information acquired at a second timing later than the first timing, the first position information indicating a first position, and the second position information indicating a second position;
estimate a third position of the object to be tracked displayed in an image captured of a predetermined region;
generate path information of the object to be tracked by supplementing, based on the third position of the object to be tracked, a fourth position of the object to be tracked held between the first position and the second position; and
limit a region in which the third position of the object to be tracked is estimated to fall by use of the position information of the object to be tracked if the object to be tracked fails to be displayed in the image for a predetermined period.
2. The monitoring device according to claim 1, wherein the at least one processor is further configured to execute the instructions to acquire the position information of the object to be tracked included in AIS information received from the object to be tracked provided with an AIS.
3. The monitoring device according to claim 1, wherein the at least one processor is further configured to execute the instructions to acquire the position information of the object to be tracked periodically by use of a radar.
4. The monitoring device according to claim 1, wherein the at least one processor is further configured to execute the instructions to correct the third position of the object to be tracked by use of a difference between a position of the object to be tracked indicated by the position information and the third position of the object to be tracked estimated at a timing when the position information has been acquired.
5. The monitoring device according to claim 1, wherein the at least one processor is further configured to execute the instructions to estimate the third position of the object to be tracked by use of a particle filter.
6. The monitoring device according to claim 1, wherein the at least one processor is further configured to execute the instructions to estimate the third position of the object to be tracked by converting image coordinates of the object to be tracked to world coordinates.
7. The monitoring device according to claim 1, wherein the at least one processor is further configured to execute the instructions to cause the path information to be displayed in the image.
8. The monitoring device according to claim 2, wherein the at least one processor is further configured to execute the instructions to cause the path information to be displayed on an AIS screen generated based on the AIS information.
9. The monitoring device according to claim 3, wherein the at least one processor is further configured to execute the instructions to cause the path information to be displayed on a radar screen generated based on information acquired by use of the radar.
10. A tracking method to be executed in a monitoring device, the tracking method comprising:
acquiring position information of an object to be tracked, the position information including first position information acquired at a first timing and second position information acquired at a second timing later than the first timing, the first position information indicating a first position, and the second position information indicating a second position;
estimating a third position of the object to be tracked displayed in an image captured of a predetermined region;
generating path information of the object to be tracked by supplementing, based on the third position of the object to be tracked, a fourth position of the object to be tracked held between the first position and the second position; and
limiting a region in which the third position of the object to be tracked is estimated to fall by use of the position information of the object to be tracked if the object to be tracked fails to be displayed in the image for a predetermined period.
11. A non-transitory computer-readable medium storing a program that causes a computer to execute:
acquiring position information of an object to be tracked, the position information including first position information acquired at a first timing and second position information acquired at a second timing later than the first timing, the first position information indicating a first position, and the second position information indicating a second position;
estimating a third position of the object to be tracked displayed in an image captured of a predetermined region;
generating path information of the object to be tracked by supplementing, based on the third position of the object to be tracked, a fourth position of the object to be tracked held between the first position and the second position; and
limiting a region in which the third position of the object to be tracked is estimated to fall by use of the position information of the object to be tracked if the object to be tracked fails to be displayed in the image for a predetermined period.
US17/432,206 2019-02-26 2019-02-26 Monitoring device, tracking method, and non-transitory computer-readable medium Active 2039-09-10 US11882542B2 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/007234 WO2020174566A1 (en) 2019-02-26 2019-02-26 Monitoring device, tracking method, and non-transitory computer readable medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/007234 A-371-Of-International WO2020174566A1 (en) 2019-02-26 2019-02-26 Monitoring device, tracking method, and non-transitory computer readable medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/946,305 Continuation US20230015362A1 (en) 2019-02-26 2022-09-16 Monitoring device, tracking method, and non-transitory computer-readable medium

Publications (2)

Publication Number Publication Date
US20220346055A1 US20220346055A1 (en) 2022-10-27
US11882542B2 true US11882542B2 (en) 2024-01-23

Family

ID=72238833

Family Applications (2)

Application Number Title Priority Date Filing Date
US17/432,206 Active 2039-09-10 US11882542B2 (en) 2019-02-26 2019-02-26 Monitoring device, tracking method, and non-transitory computer-readable medium
US17/946,305 Pending US20230015362A1 (en) 2019-02-26 2022-09-16 Monitoring device, tracking method, and non-transitory computer-readable medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/946,305 Pending US20230015362A1 (en) 2019-02-26 2022-09-16 Monitoring device, tracking method, and non-transitory computer-readable medium

Country Status (3)

Country Link
US (2) US11882542B2 (en)
JP (1) JP7160174B2 (en)
WO (1) WO2020174566A1 (en)

Patent Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09307883A (en) 1996-05-10 1997-11-28 Nippon Telegr & Teleph Corp <Ntt> Object tracking method, device therefor and image display device using it
JP2000152220A (en) 1998-11-17 2000-05-30 Oki Electric Ind Co Ltd Method for controlling monitor itv camera
JP2001186504A (en) 1999-12-27 2001-07-06 Matsushita Electric Ind Co Ltd Ship's name read system and ship's name read method
US8411969B1 (en) * 2010-08-06 2013-04-02 The United States Of America As Represented By The Secretary Of The Navy Method for fusing overhead imagery with automatic vessel reporting systems
US20120274504A1 (en) * 2011-04-28 2012-11-01 Kubota Yugo Information display device, information display method, and radar apparatus
JP2014192700A (en) 2013-03-27 2014-10-06 Panasonic Corp Tracking processing apparatus, tracking processing system with the same, and tracking processing method
US20160063731A1 (en) * 2013-03-27 2016-03-03 Panasonic Intellectual Property Management Co., Ltd. Tracking processing device and tracking processing system provided with same, and tracking processing method
US20150241560A1 (en) * 2014-02-27 2015-08-27 Electronics And Telecommunications Research Institute Apparatus and method for providing traffic control service
US20160092736A1 (en) 2014-09-30 2016-03-31 C/O Canon Kabushiki Kaisha System and method for object re-identification
JP2016072964A (en) 2014-09-30 2016-05-09 キヤノン株式会社 System and method for subject re-identification
JP2016126624A (en) 2015-01-06 2016-07-11 Kddi株式会社 Device, program and method for tracking body using dedicated discrimination device on occlusion occurrence
US20180259339A1 (en) * 2015-11-13 2018-09-13 FLIR Belgium BVBA Video sensor fusion and model based virtual and augmented reality systems and methods
US11354683B1 (en) * 2015-12-30 2022-06-07 Videomining Corporation Method and system for creating anonymous shopper panel using multi-modal sensor fusion
US11151884B2 (en) * 2016-01-15 2021-10-19 David Belu SOLOMON Vessel systems and methods relating thereto
US20170285178A1 (en) * 2016-04-04 2017-10-05 Spire Global, Inc AIS Spoofing and Dark-Target Detection Methodology
JP2018019359A (en) 2016-07-29 2018-02-01 キヤノン株式会社 Ship monitoring device
US20190163984A1 (en) 2016-07-29 2019-05-30 Canon Kabushiki Kaisha Vessel monitoring apparatus
JP2018073129A (en) 2016-10-28 2018-05-10 株式会社リコー Image processing device, image processing system, image processing method and program
JP2018077807A (en) 2016-11-11 2018-05-17 Kddi株式会社 Device, program and method for tracing body while taking multiple candidates into consideration at change point
US20200025912A1 (en) * 2017-02-24 2020-01-23 Japan Aerospace Exploration Agency Flying body and program
WO2018216537A1 (en) 2017-05-24 2018-11-29 古野電気株式会社 Video generation device
US20200090367A1 (en) * 2017-05-24 2020-03-19 Furuno Electric Co., Ltd. Image generating device
US20200184828A1 (en) * 2018-12-05 2020-06-11 Windward Ltd. Risk event identification in maritime data and usage thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
International Search Report for PCT Application No. PCT/JP2019/007234, dated May 21, 2019.

Also Published As

Publication number Publication date
WO2020174566A1 (en) 2020-09-03
JP7160174B2 (en) 2022-10-25
US20220346055A1 (en) 2022-10-27
JPWO2020174566A1 (en) 2021-12-16
US20230015362A1 (en) 2023-01-19

Similar Documents

Publication Publication Date Title
US11361547B2 (en) Object detection apparatus, prediction model generation apparatus, object detection method, and program
US20210110188A1 (en) Stereo imaging device
US11914055B2 (en) Position-window extension for GNSS and visual-inertial-odometry (VIO) fusion
JP2019004329A (en) Abnormality detection device and vehicle system
US20140037212A1 (en) Image processing method and device
US11061102B2 (en) Position estimating apparatus, position estimating method, and terminal apparatus
KR20200095888A (en) Method for context awareness of unmanned ship system and apparatus for the same
US20180188381A1 (en) Motion propagated position for positioning fusion
JP2019032218A (en) Location information recording method and device
US11802772B2 (en) Error estimation device, error estimation method, and error estimation program
JP5369873B2 (en) Judgment program and calibration device
US11882542B2 (en) Monitoring device, tracking method, and non-transitory computer-readable medium
JP2012129709A (en) Information processor, information processing method, and program
JP6075377B2 (en) COMMUNICATION DEVICE, COMMUNICATION SYSTEM, COMMUNICATION METHOD, AND PROGRAM
WO2022113344A1 (en) Information processing device, three-dimensional position estimation method, and non-transitory computer-readable medium having program stored thereon
EP3859275B1 (en) Navigation apparatus, navigation parameter calculation method, and program
US20190286876A1 (en) On-Demand Outdoor Image Based Location Tracking Platform
US20230049796A1 (en) Information processing apparatus, information processing method, and program
CN115769513A (en) Inter-satellite link acquisition supported by machine vision
US20190278989A1 (en) Control device for image capturing apparatus, image capturing apparatus, and method of controlling image capturing apparatus
US20240127474A1 (en) Information processing apparatus, position estimation method, and non-transitory computer-readable medium
US20230336702A1 (en) Processing device, processing system, and processing method
KR102439142B1 (en) Method and apparatus for acquiring image of object
US20230386165A1 (en) Information processing device, recording medium, and information processing method
US20240005554A1 (en) Non-transitory computer-readable recording medium, verification method, and information processing apparatus

Legal Events

Code Title Description

FEPP (Fee payment procedure): ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP (Information on status: patent application and granting procedure in general): DOCKETED NEW CASE - READY FOR EXAMINATION

AS (Assignment): Owner name: NEC CORPORATION, JAPAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MIYAMOTO, SHINICHI;REEL/FRAME:061467/0138; Effective date: 20211105

STPP (Information on status: patent application and granting procedure in general): NON FINAL ACTION MAILED

STPP (Information on status: patent application and granting procedure in general): RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP (Information on status: patent application and granting procedure in general): NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP (Information on status: patent application and granting procedure in general): PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF (Information on status: patent grant): PATENTED CASE