WO2023008444A1 - Information processing device, program, and positioning method - Google Patents

Information processing device, program, and positioning method

Info

Publication number
WO2023008444A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
imaging device
image
position information
positioning
Application number
PCT/JP2022/028823
Other languages
French (fr)
Japanese (ja)
Inventor
大 飯沼
Original Assignee
Ultimatrust株式会社
Application filed by Ultimatrust株式会社
Publication of WO2023008444A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/51 Relative positioning

Definitions

  • the present invention relates to an information processing device, a program, and a positioning method.
  • Conventionally, techniques are known for specifying the position of a mobile object based on an image of that object. Patent Document 1 describes this type of technology.
  • Patent Document 1 relates to a positioning device for a mobile object using an external surveillance camera, which is an essential element when monitoring and following a mobile object.
  • Patent Document 1 (Japanese Patent Application Laid-Open No. 2016-1875) describes a mobile body having an acceleration sensor, a gyro, a magnetic sensor, an atmospheric pressure sensor, and the like, and describes outputting the position coordinates of the mobile body from the feet of the mobile body as predicted from the image of the surveillance camera.
  • Although the conventional technology described in Patent Document 1 can correct the predicted position of the moving object based on the sensors of the moving object, there is still room for improvement in terms of identifying the position of the moving object more accurately.
  • the present invention was made to solve the above problems, and its purpose is to accurately identify the position of a moving object such as an automobile in an image.
  • To achieve the above object, an information processing apparatus according to one aspect of the present invention includes a relative position information acquisition unit that acquires relative position information indicating the positional relationship between an imaging device and a target point based on first positioning information indicating the position of the imaging device and second positioning information indicating the position of the target point within the imaging range of the imaging device, and an image position setting unit that associates the relative position information with the image position corresponding to the target point in the image captured by the imaging device.
  • FIG. 1 is a schematic diagram showing an imaging device, a mobile object, a server, and positioning satellites according to one embodiment of the present invention.
  • FIG. 2 is a block diagram showing the hardware configuration of an information processing apparatus according to one embodiment of the present invention.
  • FIG. 3 is a block diagram showing functional configurations of an imaging device, a server, and a moving body according to an embodiment of the present invention.
  • FIG. 4 is a schematic plan view showing the positional relationship between an imaging device and a moving object according to an embodiment of the present invention.
  • FIG. 5 is a diagram showing an example of an image in which position information is superimposed on an image captured by an imaging device according to an embodiment of the present invention.
  • FIG. 6 is a schematic plan view showing the positional relationship between the imaging device according to the embodiment of the present invention and a moving body, showing an example in which a plurality of positions of the moving body are measured.
  • FIG. 7 is a diagram showing an example of an image in which position information is superimposed on an image captured by the imaging device according to the embodiment of the present invention, in particular an example in which a plurality of positions of a moving body are measured.
  • FIG. 8 is the first half of a flowchart for displaying position information on an image captured by an imaging device according to an embodiment of the present invention.
  • FIG. 9 is the second half of the flowchart for displaying position information on an image captured by the imaging device according to the embodiment of the present invention.
  • FIG. 10 is a diagram showing an example of an image from which position information at arbitrary coordinates is obtained in an image captured by the imaging device according to the embodiment of the present invention.
  • As shown in FIG. 1, the information processing apparatus 10 of the present embodiment exchanges electronic information with the imaging device 2 and with the moving body 3 moving around the imaging device 2, for example.
  • The information processing device 10 has an information processing device communication device 18, the imaging device 2 has an imaging device communication device 28, and the moving body 3 has a moving body communication device 38, as appropriate.
  • Electronic information is exchanged mainly through the communication line 121.
  • In this example, the communication line 121 is a wireless connection.
  • However, the communication line 121 is not limited to wireless communication.
  • For example, the communication line 121 may use wired communication.
  • Reference position information, first positioning information, and second positioning information are obtained as position information.
  • the reference position information is position information obtained by accurately measuring the position of the imaging device 2 .
  • the first positioning information is the position information of the current position measured by GNSS representing the position of the imaging device 2, and the position accuracy of the first positioning information alone is lower than that of the reference position information.
  • the second positioning information is position information of the current position measured by GNSS representing the position of the mobile object 3, which is the target point, and the position accuracy of the second positioning information alone is lower than that of the reference position information.
  • The first positioning information and the second positioning information are acquired based on positioning signals of satellites synchronized in time. Here, synchronization of the times does not mean only that the times match exactly.
  • It is sufficient that the acquisition time of the first positioning information and the acquisition time of the second positioning information can be associated with each other. More specifically, for example, the difference between the acquisition times falls within a time difference over which the GNSS radio wave reception conditions at the acquisition time of the first positioning information and at the acquisition time of the second positioning information can cancel each other out; a simple way such pairing could be realized is sketched below.
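The patent does not prescribe how such loosely synchronized epochs are paired. As a minimal illustration only, the sketch below matches first and second positioning fixes whose timestamps fall within an assumed tolerance; the data layout and the tolerance value are assumptions, not part of the disclosure.

```python
from bisect import bisect_left

def pair_epochs(first_fixes, second_fixes, tolerance_s=0.2):
    """Pair GNSS fixes of the imaging device (first) and the moving body (second)
    whose timestamps differ by at most `tolerance_s` seconds.

    Each fix is a dict such as {"t": unix_time, "lat": ..., "lon": ..., "h": ...};
    `second_fixes` is assumed to be sorted by time.
    """
    second_times = [f["t"] for f in second_fixes]
    pairs = []
    for f1 in first_fixes:
        i = bisect_left(second_times, f1["t"])
        # Check the neighbours around the insertion point for a close-enough epoch.
        for j in (i - 1, i):
            if 0 <= j < len(second_fixes) and abs(second_fixes[j]["t"] - f1["t"]) <= tolerance_s:
                pairs.append((f1, second_fixes[j]))
                break
    return pairs
```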
  • the reference position information of the imaging device 2 is obtained when the reference position antenna 20 receives the positioning signal from the reference position satellite 41 .
  • a reference position satellite positioning signal reception unit 401 receives an electrical signal from the reference position antenna 20, and a reference position positioning calculation unit 402 (not shown in FIG. 1) calculates and derives reference position information.
  • the first positioning information of the imaging device 2 is obtained when the first positioning antenna 21 of the imaging device 2 receives positioning signals from the first positioning satellites 42 .
  • A first positioning satellite positioning signal receiving unit 403 (not shown in FIG. 1) receives the electrical signal from the first positioning antenna 21, and a first positioning calculation unit 404 (not shown) calculates and derives the first positioning information.
  • the moving object 3 has a second position antenna 31.
  • the second positioning antenna 31 receives positioning signals from the second positioning satellite 43 .
  • a second positioning satellite positioning signal receiving unit 405 (not shown) receives the electrical signal from the second positioning antenna 31, and a second positioning calculating unit 406 (not shown) calculates second positioning information.
  • For each of the first positioning satellites 42 and the second positioning satellites 43, four satellites are required to calculate position coordinates, because the position coordinates and the time are the unknowns (a minimal illustration is sketched below).
  • FIG. 1 shows an example in which five satellites are applied as the first positioning satellites 42, five satellites are applied as the second positioning satellites 43, and four satellites among them are common.
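The four-unknown argument (three position coordinates plus the receiver clock offset) can be illustrated with a conventional iterative least-squares pseudorange solution. This is a generic GNSS textbook computation, not code taken from the patent; it assumes idealized pseudoranges already corrected for satellite clock and atmospheric effects.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def solve_position(sat_pos, pseudoranges, iterations=10):
    """Estimate receiver position (x, y, z) and clock bias from >= 4 satellites.

    sat_pos: (N, 3) ECEF satellite positions in metres.
    pseudoranges: (N,) measured pseudoranges in metres.
    """
    x = np.zeros(3)   # receiver position estimate (ECEF, metres)
    b = 0.0           # receiver clock bias expressed in metres (c * dt)
    for _ in range(iterations):
        ranges = np.linalg.norm(sat_pos - x, axis=1)
        residuals = pseudoranges - (ranges + b)
        # Jacobian: unit vectors from satellites to receiver, plus a clock column.
        G = np.hstack([(x - sat_pos) / ranges[:, None], np.ones((len(ranges), 1))])
        dx, *_ = np.linalg.lstsq(G, residuals, rcond=None)
        x += dx[:3]
        b += dx[3]
    return x, b / C  # position in metres, clock bias in seconds
```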
  • the information processing device 10 identifies the position of the moving object 3, which is the target point, based on the reference position information, the first positioning information, and the second positioning information.
  • As shown in FIG. 2, the information processing apparatus 10 includes a processor 13, a ROM (Read Only Memory) 14, a RAM (Random Access Memory) 15, an input/output unit 11, a communication unit 12, and an input/output interface 17.
  • the processor 13 performs various calculations and processes.
  • The processor 13 is, for example, a CPU (central processing unit), an MPU (micro processing unit), an SoC (system on a chip), a DSP (digital signal processor), a GPU (graphics processing unit), an ASIC (application specific integrated circuit), a PLD (programmable logic device), or an FPGA (field-programmable gate array).
  • Alternatively, the processor 13 may be a combination of several of these.
  • The processor 13 may also be a combination of these with a hardware accelerator or the like.
  • the processor 13 , ROM 14 and RAM 15 are interconnected via a bus 16 .
  • the processor 13 executes various processes according to programs recorded in the ROM 14 or programs loaded in the RAM 15 . Part or all of the program may be incorporated within the circuitry of processor 13 .
  • the bus 16 is also connected to the input/output interface 17 .
  • the input/output unit 11 and the communication unit 12 are connected to the input/output interface 17 .
  • the input/output unit 11 is electrically connected to the input/output interface 17 by wire or wirelessly.
  • the input/output unit 11 includes, for example, an input unit such as a keyboard and a mouse, and an output unit such as a display for displaying images and a speaker for amplifying sounds. Note that the input/output unit 11 may have a configuration in which the display function and the input function are integrated like a touch panel.
  • The communication unit 12 is a device through which the processor 13 communicates with other devices (for example, the imaging device 2 and the moving body 3) via the communication line 121.
  • the hardware configuration shown here is just an example, and it is not particularly limited to this configuration.
  • Besides a single processing device such as a single processor, a multiprocessor, or a multi-core processor, a combination of such processing devices with processing circuits such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array) may be employed as the processor that realizes the functional configuration.
  • FIG. 3 shows the functional configuration of the information processing device 10 of the present embodiment, and the imaging device 2, mobile object 3, etc. with which the information processing device 10 exchanges signals.
  • As for the moving body 3, in some cases one moving body 3 moves to a plurality of points and is positioned at each point, and in other cases there are a plurality of moving bodies 3, each positioned at its own point; in this specification both cases are referred to as the moving body 3.
  • a functional configuration of the information processing device 10 is implemented by the processor 13 .
  • the functional configuration of the imaging device 2 is realized by electronic components mounted on the imaging device 2 and a processor of an information processing device such as a computer.
  • Similarly, the functional configuration of the moving body 3 is realized by electronic components mounted on the moving body 3 and an information processing device such as a computer.
  • the information processing device 10 here is separate from the imaging device 2 and the mobile object 3, and the server 1 includes all of the functional configurations.
  • the information processing device 10 can also be configured to be partially or wholly incorporated in the imaging device 2 and the moving body 3 .
  • the reference position satellite positioning signal reception unit 401 receives satellite signals via the reference position antenna 20 in order to position the reference position of the imaging device 2 .
  • the reference position positioning calculator 402 receives the signal from the reference position satellite positioning signal receiver 401 and calculates the reference position of the imaging device 2 .
  • As the reference position satellite 41, a satellite that provides an accurate position, such as a quasi-zenith satellite, is selected, for example, and an accurate position is provided as the reference position.
  • the reference position antenna 20 for receiving radio waves from the reference position satellite 41 is generally relatively large, and it is preferable that the reference position antenna 20 be removed after the position is measured.
  • the measurement device including the reference position antenna 20 and the reference position information acquisition unit 103 is detachably connected to the imaging device 2 . Accordingly, after the reference position is determined, the reference position antenna 20 and the reference position information acquisition unit 103 can be removed from the imaging device 2 .
  • the installation position of the imaging device 2 may be plotted on a map and the position coordinates may be specified from the map.
  • the position of the imaging device 2 is derived by a positioning system similar to the positioning system that the mobile body 3 can have, in addition to positioning based on the reference position satellite 41 described above.
  • the current position of imaging device 2 is referred to herein as the first position.
  • a system similar to the positioning system that the moving body 3 can have has a first positioning antenna 21 , a first positioning satellite positioning signal receiving section 403 and a first positioning calculation section 404 .
  • The first positioning satellite positioning signal receiving unit 403 receives the positioning signal via the first positioning antenna 21, and the first positioning calculation unit 404 calculates the first position of the imaging device 2.
  • the first positioning satellite positioning signal reception unit 403 and the first positioning calculation unit 404 may be provided in the imaging device 2 .
  • the reference position of the imaging device 2 may be obtained by long-term positioning or static positioning using the first positioning antenna 21, the first positioning satellite positioning signal receiving unit 403, and the first positioning calculation unit 404.
  • the image capturing device 2 has an image capturing unit 201, and captures, for example, the moving body 3 moving in the parking lot.
  • the imaging device 2 has an attitude control device 22 .
  • the posture control unit 202 controls the shooting direction of the imaging device 2 via the posture control device 22 .
  • The imaging device 2 has an orientation/position information detection unit 203, which detects the shooting direction of the imaging device 2 controlled by the orientation control unit 202, and the detection result is sent to the image information management unit 101 of the information processing device 10.
  • The information processing apparatus 10 has an image information management unit 101, an information recording unit 102, a reference position information acquisition unit 103, a first positioning information acquisition unit 104, a second positioning information acquisition unit 105, and a relative position information acquisition unit 106.
  • the moving body 3 is, for example, an automobile moving in a parking lot.
  • the current position of the mobile 3 is referred to herein as the second position.
  • The moving body 3 has a second positioning antenna 31 and, as shown in FIG. 3, further has a second positioning satellite positioning signal receiving unit 405 and a second positioning calculation unit 406. There are cases where a plurality of moving bodies 3 are used and their respective positions are measured, and cases where a single moving body 3 is used and each position it moves to is measured. In this specification, unless otherwise specified, these are referred to as the moving body 3 without distinction.
  • the second positioning satellite positioning signal receiving unit 405 receives the positioning signal from the second positioning antenna 31, and the second positioning calculation unit 406 calculates the second position.
  • The first position measured by the first positioning calculation unit 404 via the first positioning antenna 21 and the second position measured by the second positioning calculation unit 406 via the second positioning antenna 31 are positioning information indicating the respective current positions.
  • the reference position information acquisition unit 103 acquires the reference position calculated by the reference position positioning calculation unit 402 .
  • the first positioning information acquisition unit 104 acquires the first positioning information calculated and derived by the first positioning calculation unit 404 .
  • the second positioning information acquisition unit 105 acquires the second positioning information derived by the second positioning calculation unit 406 of the moving body 3 .
  • The relative position information acquisition unit 106 derives the relative position between the first position of the imaging device 2 and the second position of the moving body 3 based on the information from the first positioning information acquisition unit 104 and the second positioning information acquisition unit 105.
  • the relative position information includes, for example, information such as distance, direction, and height.
  • The first positioning satellite positioning signal receiving unit 403 and the second positioning satellite positioning signal receiving unit 405 use the same satellites for measurement.
  • The relative positional relationship between the two points is obtained by measuring the difference in the times at which the radio signals from the satellites arrive at the respective receivers.
  • Each observation point receives radio waves from the same satellites, and the radio waves emitted from a satellite pass through similar weather conditions, so satellite position errors and delays in the troposphere and ionosphere are cancelled out. This provides an accurate relative position.
  • The first positioning information and the second positioning information are derived based on the positioning signals from the satellites, and the difference between these positions, or the distance indicating that difference, is obtained as relative position information.
  • The image information management unit 101 obtains the reference position from the reference position information acquisition unit 103 and the relative position from the relative position information acquisition unit 106. Then, by adding the relative position to the reference position, the accurate position, that is, the absolute position coordinates, of the moving body 3 is derived; a simple sketch of both steps is given below.
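As one way to picture the two operations just described — taking the difference between the two GNSS fixes as the relative position, and adding that relative position to the accurately surveyed reference position — the sketch below uses a flat-earth east/north/up approximation, which is adequate over a parking-lot-sized area. The coordinate values and the approximation itself are illustrative assumptions, not the method fixed by the patent.

```python
import math

EARTH_R = 6_378_137.0  # WGS-84 equatorial radius in metres

def enu_offset(lat1, lon1, h1, lat2, lon2, h2):
    """Approximate east/north/up offset (metres) from fix 1 to fix 2
    (flat-earth approximation, adequate over short baselines)."""
    lat0 = math.radians((lat1 + lat2) / 2.0)
    east = math.radians(lon2 - lon1) * EARTH_R * math.cos(lat0)
    north = math.radians(lat2 - lat1) * EARTH_R
    return east, north, h2 - h1

def add_offset_to_reference(ref_lat, ref_lon, ref_h, east, north, up):
    """Add a relative ENU offset to the accurately surveyed reference position."""
    lat = ref_lat + math.degrees(north / EARTH_R)
    lon = ref_lon + math.degrees(east / (EARTH_R * math.cos(math.radians(ref_lat))))
    return lat, lon, ref_h + up

# Example values (placeholders): GNSS fix of the imaging device and of the moving body.
e, n, u = enu_offset(35.68010, 139.76710, 5.0,   # first positioning information
                     35.68018, 139.76722, 5.3)   # second positioning information
distance = math.hypot(e, n)                      # horizontal distance, metres
bearing = math.degrees(math.atan2(e, n)) % 360   # angle, clockwise from north
# Absolute coordinates of the moving body: surveyed reference position + relative offset.
mb_lat, mb_lon, mb_h = add_offset_to_reference(35.680095, 139.767105, 4.9, e, n, u)
```

Because the two fixes share the same satellites and similar propagation conditions, the common error terms largely cancel in the differencing step, which is why the relative offset is more accurate than either fix alone.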
  • The image information management unit 101 receives the reference position from the reference position information acquisition unit 103, the image from the imaging unit 201 of the imaging device 2, the orientation of the imaging device 2 from the orientation/position information detection unit 203 of the imaging device 2, and the relative position information of the moving body 3 from the relative position information acquisition unit 106.
  • the image information management unit 101 has an imaging device orientation position information acquisition unit 107 , an image position correction unit 108 , an image position setting unit 109 , an interpolation/extrapolation processing unit 110 and a moving object identification unit 111 .
  • the imaging device posture/position information acquisition unit 107 receives posture information of the imaging device 2 from the posture/position information detection unit 203 .
  • An image position correction unit 108 corrects the image position of the captured image obtained from the imaging unit 201 . For example, an image captured when the imaging device 2 is oriented at 90° and an image captured when the imaging device 2 is oriented at 45° are corrected and joined together. Then, an image with a wider angle of view than the angle of view of the imaging device itself is realized.
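The patent does not detail how views taken at different orientations are joined. As a generic stand-in, the sketch below stitches two overlapping views with feature matching and a homography using OpenCV; in the actual device the known pan angles from the attitude control could be used instead. The file handling, parameter values, and the feature-based approach itself are assumptions for illustration.

```python
import numpy as np
import cv2

def stitch_pair(img_ref, img_other):
    """Warp img_other into the frame of img_ref using matched ORB features,
    producing a canvas wider than either single view (blending omitted)."""
    g1 = cv2.cvtColor(img_ref, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(img_other, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(g1, None)
    k2, d2 = orb.detectAndCompute(g2, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d2, d1), key=lambda m: m.distance)[:200]
    src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = img_ref.shape[:2]
    canvas = cv2.warpPerspective(img_other, H, (2 * w, h))
    canvas[:h, :w] = img_ref   # crude overwrite of the reference region
    return canvas
```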
  • the posture position information detection unit 203 detects, for example, the output of the three-axis servomotor of the tripod on which the imaging device 2 is installed.
  • Estimation of the orientation of the imaging device 2 is not limited to sensors fixed directly to the imaging device 2, such as acceleration sensors and geomagnetic sensors.
  • the posture of the imaging device 2 may be estimated from a camera image that overlooks the imaging device or from a posture estimation device externally attached to the imaging device. When a change in posture of the imaging device is detected, the relative position information is updated according to the amount of change.
  • The imaging device orientation/position information acquisition unit 107 acquires position information and orientation information from the reference position information acquisition unit 103. Suppose the position or attitude of the imaging device 2 has been changed; when these changes are detected by the posture/position information detector 23, the image position correction unit 108 corrects the relationship between the positions in the image and the relative position information based on these changes.
  • The image position setting unit 109 incorporates the correction of the relationship between the positions in the image and the relative position information by the image position correction unit 108, and associates the relative position information with the image positions in the image captured by the imaging device 2.
  • The captured image 240 captured by the imaging device 2 is given the reference position information, or the moving body position information is given to an image including the moving body 3. This information is then recorded in the information recording unit 102.
  • The interpolation/extrapolation processing unit 110 obtains, by interpolation and extrapolation, the relative position information corresponding to arbitrary coordinates in the captured image 240, based on the coordinates in the captured image 240 already derived for the plurality of second positions and their relative position information.
  • the moving body identifying unit 111 identifies the moving body 3 in the captured image 240 by image recognition. Then, the moving body identification unit 111 derives coordinates on the image corresponding to the identified position of the moving body 3 .
  • FIG. 4 is a schematic plan view showing the positional relationship between the moving body 3 moving in the parking lot and the imaging device 2.
  • FIG. 5 is an example of a captured image 240 captured by the imaging device 2 .
  • In the example of FIG. 5, the distance, the angle, and the height are superimposed on the screen as device position information relative to the imaging device 2: the distance from the imaging device 2 is 13.8 m, the angle is 123°, and the height is 1.3 m.
  • The moving body identification unit 111 recognizes the moving body 3 in the captured image 240 by image recognition technology and identifies its coordinates on the image 240. Based on the first positioning information obtained by the first positioning information acquisition unit 104 and the second positioning information obtained by the second positioning information acquisition unit 105, the relative position information acquisition unit 106 acquires the relative position information at the time when the image of the moving body 3 was captured. Then, the image position setting unit 109 associates the position coordinates of the moving body 3 on the image 240 with this device position information, as sketched below.
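Once the pixel coordinates of the recognized moving body and its relative position at the capture time are known, associating and displaying them (as in the screen of FIG. 5) is straightforward. The sketch below simply overlays the numbers on the frame with OpenCV; the file name, pixel coordinates, and colours are placeholders, not values from the patent.

```python
import cv2

def annotate_position(image, pixel_xy, distance_m, angle_deg, height_m):
    """Draw the relative position (distance, angle, height) next to the moving
    body's position in the captured image, as in the screen examples of FIGS. 5 and 7."""
    x, y = pixel_xy
    label = f"{distance_m:.1f} m, {angle_deg:.1f} deg, {height_m:.1f} m"
    cv2.circle(image, (x, y), 6, (0, 0, 255), -1)
    cv2.putText(image, label, (x + 10, y - 10),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 0, 255), 2)
    return image

frame = cv2.imread("captured_image_240.jpg")        # hypothetical file name
frame = annotate_position(frame, (812, 530), 13.8, 123.0, 1.3)
cv2.imwrite("captured_image_240_annotated.jpg", frame)
```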
  • FIG. 6 shows a plan view
  • FIG. 7 shows a screen display example.
  • an image file and an information file containing data such as position information and orientation information are saved as separate files, and whether or not to superimpose information based on the information file on the image is arbitrarily selected at the time of reproduction.
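A minimal way to keep the image file and the information file separate, so that superimposition can be chosen freely at playback time, is sketched below; the JSON field names are assumptions for illustration, not a format defined by the patent.

```python
import json
import cv2

def save_with_sidecar(frame, points, image_path="frame_0001.jpg"):
    """Save the captured frame and a separate information file.

    `points` is a list of dicts such as
    {"u": 812, "v": 530, "distance_m": 13.8, "angle_deg": 123.0, "height_m": 1.3}.
    """
    cv2.imwrite(image_path, frame)
    with open(image_path.rsplit(".", 1)[0] + ".json", "w") as f:
        json.dump({"image": image_path, "points": points}, f, indent=2)
```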
  • The second positioning information acquisition unit 105 acquires the second positioning information at the point to which the moving body 3 has moved in the parking lot. Further, as shown in FIG. 6, the moving body 3 acquires second positioning information at a plurality of points in the parking lot. Then, relative position information with respect to the imaging device 2 is obtained by the relative position information acquisition unit 106 at the five points in FIG. 6. In FIG. 6, relative position information such as (13.8 m, 12.3°, 1.3 m) and (11.3 m, 57°, 2.2 m), where the combination of distance, angle, and height is written as (distance, angle, height), is obtained.
  • Relative position information with respect to the imaging device 2 is associated with each point in the image through the series of processes described above.
  • The position of a subject in the image 240 captured by the imaging device 2 can therefore be calculated based on the relative position information of each point.
  • The interpolation/extrapolation processing unit 110 obtains, by interpolation and extrapolation, the relative position information corresponding to arbitrary coordinates in the captured image 240, based on the coordinates in the captured image 240 already derived for the plurality of second positions and their relative position information. That is, the image position setting unit 109 can accurately derive the position of a moving body at a target point in the image from the image alone, without acquiring position information from the moving body; one way such a mapping could be fitted is sketched below.
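The patent does not fix a particular interpolation/extrapolation method. As one illustrative possibility, the sketch below fits a smooth least-squares model that maps image coordinates (u, v) to (distance, angle, height) from the already-measured points, and then evaluates it at arbitrary pixels. The pixel coordinates and the last three measurement tuples are placeholders, and angle wrap-around is ignored for simplicity.

```python
import numpy as np

def fit_position_model(uv, dah):
    """Least-squares fit of a quadratic surface mapping pixel (u, v) to
    (distance, angle, height), using points where both are already known."""
    u, v = uv[:, 0], uv[:, 1]
    A = np.column_stack([np.ones_like(u), u, v, u**2, v**2])
    coeffs, *_ = np.linalg.lstsq(A, dah, rcond=None)
    return coeffs

def predict_position(coeffs, u, v):
    """Interpolate/extrapolate (distance, angle, height) for an arbitrary pixel."""
    return np.array([1.0, u, v, u**2, v**2]) @ coeffs

# Five measured points, roughly as in FIG. 6 (pixel coordinates are placeholders;
# only the first two measurement tuples come from the text).
uv = np.array([[410, 620], [780, 540], [1150, 560], [520, 380], [990, 350]], float)
dah = np.array([[13.8, 12.3, 1.3], [11.3, 57.0, 2.2], [12.5, 95.0, 1.8],
                [18.2, 30.0, 1.1], [17.0, 70.0, 1.5]], float)
coeffs = fit_position_model(uv, dah)
print(predict_position(coeffs, 860, 480))  # -> (distance, angle, height) estimate
```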
  • Abnormal values may be mixed into the acquired relative position information. This is assumed to be caused by signal noise entering the device or by deterioration in the reception state of the positioning radio waves.
  • Such outliers are corrected based on the other relative position information. For example, it is useful to remove the outlier points and take the mean of the remaining values, as sketched below.
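The correction of such abnormal values from the other relative position information can be done with a robust statistic; one common choice (an assumption here, not mandated by the patent) is to discard values far from the median before averaging.

```python
import numpy as np

def mean_without_outliers(values, k=3.0):
    """Average a set of repeated measurements (e.g. distances to the same point),
    discarding values farther than about k standard deviations from the median,
    with the spread estimated robustly from the median absolute deviation."""
    values = np.asarray(values, float)
    med = np.median(values)
    mad = np.median(np.abs(values - med)) or 1e-9  # avoid division by zero
    kept = values[np.abs(values - med) <= k * 1.4826 * mad]
    return kept.mean()

print(mean_without_outliers([13.8, 13.7, 13.9, 21.4, 13.8]))  # -> about 13.8
```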
  • FIGS. 8 and 9 are flowcharts showing the flow of processing according to the functional configuration.
  • FIG. 8 shows the first half of the flowchart
  • FIG. 9 shows the second half of the flowchart.
  • To show the connection between the two halves, a connector step labeled A is drawn as a circle.
  • the information processing apparatus 10 acquires reference position information (step S501).
  • the imaging device 2 acquires posture information (step S502). If the posture of the imaging device 2 has changed (step S503: Yes), the process returns to step S501. If the posture of the imaging device 2 has not changed (step S503: No), the imaging device 2 acquires image information (step S504). Next, the imaging device 2 acquires first positioning information (step S505).
  • If the moving body 3 has acquired the second positioning information (step S506: Yes), the relative position information of the moving body 3 is acquired (step S507). If the moving body 3 has not acquired the second positioning information (step S506: No), the coordinates of the moving body in the image are calculated, and then the relative position information of the moving body 3 is acquired (step S507).
  • If the absolute position coordinates of the moving body 3 are to be acquired (step S509: Yes), the relative position information of the moving body 3 is added to the reference position information of the imaging device 2 to generate the absolute position information of the moving body (step S511), and the position information of the moving body 3 is associated with the image information (step S510).
  • Absolute position information includes, for example, latitude and longitude.
  • If the absolute position coordinates of the moving body 3 are not to be acquired (step S509: No), the position information of the moving body 3 is associated with the image information (step S510).
  • the finally obtained information is recorded in the information recording unit 102, and the process ends (END).
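The overall flow of FIGS. 8 and 9 can be summarized in code form. The outline below uses injected callables so that it stays self-contained; every parameter name is a placeholder rather than an API defined by the patent, and only the branching of steps S501 to S511 as described above is reproduced.

```python
def positioning_cycle(get_reference, attitude_changed, get_image,
                      get_first_fix, get_second_fix, relative_from_fixes,
                      relative_from_image, add_offset, record,
                      want_absolute=True):
    """One pass over the flow of FIGS. 8 and 9 (steps S501-S511),
    expressed with hypothetical injected callables."""
    reference = get_reference()                      # S501: reference position
    while attitude_changed():                        # S502/S503: Yes -> reacquire
        reference = get_reference()                  # back to S501
    image = get_image()                              # S504: image information
    first_fix = get_first_fix()                      # S505: first positioning info
    second_fix = get_second_fix()                    # None if unavailable (S506: No)
    if second_fix is not None:                       # S506: Yes
        relative = relative_from_fixes(first_fix, second_fix)   # S507
    else:                                            # S506: No
        relative = relative_from_image(image)        # coordinates in image -> S507
    if want_absolute:                                # S509: Yes
        absolute = add_offset(reference, relative)   # S511: absolute position
        record(image, relative, absolute)            # S510, then record (END)
    else:                                            # S509: No
        record(image, relative, None)                # S510, then record (END)
    return relative
```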
  • the reference position information of the imaging device 2 is acquired prior to the first positioning information.
  • the order is not limited to this, and the order can be changed.
  • the reference position information can be acquired after the relative position information 230 between the imaging device 2 and the moving object 3 is acquired or in advance.
  • the information processing device 10 is independently incorporated in the server 1.
  • the information processing device 10 is not limited to this, and a part or all of the information processing device 10 can be incorporated in the imaging device 2 or the moving body 3 .
  • The imaging device 2 and the moving body 3 may be connected by a communication line 121 such as WiFi, in which case the following benefits arise.
  • A configuration is also possible in which the image position correction unit 108, the image position setting unit 109, the interpolation/extrapolation processing unit 110, the moving object identification unit 111, the relative position information acquisition unit 106, and so on are provided in the server 1, while the other components are provided in the imaging device 2.
  • the central processing unit can be provided only in the server 1, and there is no need to provide the imaging device 2 with a countermeasure against heat dissipation, a large-capacity uninterruptible power supply, or the like.
  • FIG. 10 shows equidistant lines and bearings drawn on the image 240 based on the result of interpolation and extrapolation. Although the height is omitted here, it can be obtained in the same way; if the ground is perfectly flat, the equidistant lines become ellipses, but the ground is not limited to being flat.
  • The moving object identification unit 111 can obtain the relative position information between the moving object 3 and the imaging device 2 by identifying the moving object's position or coordinates on the image 240.
  • Therefore, the moving body 3 does not necessarily have to have the second positioning information.
  • The moving body 3 is recognized in the image 240 by the moving body identification unit 111.
  • the contour of the mobile object 3 is recognized by the mobile object identification unit 111, and the contact points with the ground are identified.
  • a contact point is identified as a position or coordinates on the image 240 of the moving body 3 .
  • the distance, angle, and height of the moving body 3 from the imaging device 2 are derived by the moving body identifying unit 111 through the interpolation extrapolation described above.
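Picking the contact point with the ground from the recognized contour can be done, for example, by taking the lowest contour point of the moving body. The sketch below assumes a binary mask of the moving body produced by some detector (the detector itself is outside the sketch); the returned pixel could then be fed to a fitted image-to-position model, such as the one sketched earlier, to obtain distance, angle, and height.

```python
import numpy as np
import cv2

def ground_contact_point(mask):
    """Return the pixel (u, v) of the lowest contour point of the moving body,
    used as its contact point with the ground.

    mask: 8-bit single-channel image where the moving body's pixels are non-zero.
    """
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    pts = largest.reshape(-1, 2)           # columns are (u, v)
    u, v = pts[np.argmax(pts[:, 1])]       # largest v = lowest point in the image
    return int(u), int(v)
```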
  • The information recording unit 102 stores not only the relative information acquired by the relative position information acquisition unit 106, that is, the distance, angle, and height from the imaging device 2 to the moving object 3, but also image information and information obtained from the image (depth, position, and height). In order to reduce the communication load, it includes not only the position information of the subject but also the position information of roads, buildings, and the like in the background.
  • (Modification 3) When obtaining only relative position information, the reference position information may not be used. If the reference position information is not used, the reference position information acquisition unit 103 can be omitted. In this way, the functional configuration of this embodiment can be omitted as appropriate, or another functional configuration can be added.
  • In the embodiment above the mobile object is an automobile, but it may be a vehicle, a ship, a bicycle, an aircraft, a person, an animal, a mobile terminal, or the like. So far, the use of the function of receiving positioning radio waves from positioning satellites has been taken as the example for positioning the mobile object, but a configuration in which the position is estimated by other means may also be used. Position information predicted from travel history, positioning by beacons, distance measuring devices such as LiDAR, and position information based on the analysis results of images acquired by imaging devices may also be used. The moving body itself does not have to have means for acquiring position information.
  • the series of processes described above can be executed by hardware or by software.
  • a program that constitutes the software is installed in a computer or the like from a network and a recording medium.
  • A recording medium containing such a program is composed not only of removable media distributed separately from the device main body in order to provide the program to the user, but also of a recording medium or the like provided to the user in a state of being pre-installed in the device main body.
  • Removable media include, for example, magnetic disks (including floppy disks), optical disks, or magneto-optical disks.
  • Optical discs are composed of, for example, CD-ROMs (Compact Disk-Read Only Memory), DVDs (Digital Versatile Disks), Blu-ray (registered trademark) Discs (Blu-ray Discs), and the like.
  • the magneto-optical disk is composed of an MD (Mini-Disk) or the like.
  • The recording medium provided to the user in a state of being pre-installed in the apparatus main body is composed of, for example, a program memory or a hard disk in which the program is recorded.
  • As described above, the information processing device 10 includes a relative position information acquisition unit that acquires relative position information indicating the positional relationship between the imaging device and the target point based on first positioning information indicating the position of the imaging device 2 and second positioning information indicating the position of the target point within the imaging range of the imaging device 2, and an image position setting unit that associates the relative position information with the image position corresponding to the target point in the image captured by the imaging device.
  • the position of the mobile object 3 such as a car can be accurately specified on the image position.
  • the relative position information includes distance information indicating the distance from the position of the imaging device 2 to the target point, and angle information indicating the imaging direction of the imaging device.
  • the position of a moving body 3 such as an automobile can be accurately specified using distance information from the imaging device 2 and angle information indicating the direction.
  • The relative position information acquisition unit 106 included in the information processing apparatus 10 acquires relative position information for each of a plurality of target points at different locations, and the image position setting unit 109 associates the image position corresponding to each of the plurality of target points with the corresponding relative position information.
  • each piece of relative position information in the image associated with each of the plurality of second points is corrected based on the relative position information of other second points.
  • More accurate information can be obtained as each relative position information in the image.
  • the information processing apparatus 10 includes an imaging device orientation/position information acquisition unit 107 that acquires imaging device orientation/position information regarding the orientation and/or position of the imaging device 2 .
  • When a change in the orientation and/or position of the imaging device 2 is detected based on the imaging device orientation/position information acquired by the imaging device orientation/position information acquisition unit 107, an image position correction unit 108 corrects the relationship between the image positions in the image and the relative position information based on that change.
  • the accuracy of the relative position information in the image is improved.
  • The information processing device has reference position information, which is absolute position information of the imaging device that is more accurate than the first positioning information, and includes an image position setting unit 109 that obtains the current position information of the target point based on the reference position information and the relative position information.
  • the reference position information provides an accurate absolute position of the imaging device. An error is canceled between the positioning information of the imaging device 2 based on the positioning signal and the positioning information of the moving body 3 at the target point. As a result, accurate relative position information between the moving body 3 at the target point and the imaging device can be obtained. Together with the reference position information, accurate current position information of the moving object 3, which is the target point, can be obtained.
  • the second positioning information is positioning information indicating the current position based on the positioning signal of the moving body 3 passing through the target point.
  • Accurate relative position information between the moving body 3 and the first position can be obtained.
  • Accurate position information of the moving body 3 is obtained together with the reference position information.
  • the information processing device 10 includes a moving object identification unit that identifies, by image recognition, the moving object 3 passing through the target position in the image captured by the imaging device.
  • the position of the mobile object 3 can be identified. Then, the moving body 3 can be specified by image recognition.
  • the information processing device 10 includes an interpolation/extrapolation processing unit that associates position information with respect to an arbitrary position in the image from the correspondence between a plurality of pieces of second positioning information and positions in the image.
  • the relative position information of any position in the image can be obtained.
  • The program provides a relative position information acquisition function for acquiring relative position information indicating the positional relationship between the imaging device and the target point based on first positioning information indicating the position of the imaging device and second positioning information indicating the position of the target point within the imaging range of the imaging device, and an image position setting function for associating the relative position information with the image position corresponding to the target point in the image captured by the imaging device.
  • the position of the mobile object 3 such as a car can be accurately specified on the image position.
  • The positioning method includes a relative position information acquisition step of acquiring relative position information indicating the positional relationship between the imaging device and the target point based on first positioning information indicating the position of the imaging device and second positioning information indicating the position of the target point within the imaging range of the imaging device, and an image position setting step of associating the relative position information with the image position corresponding to the target point in the image captured by the imaging device.
  • the position of the mobile object 3 such as a car can be accurately specified on the image position.
  • the present invention is not limited to the above-described embodiments, and can be modified as appropriate.
  • 1 Server, 2 Imaging device, 3 Moving body, 10 Information processing device, 101 Image information management unit, 103 Reference position information acquisition unit, 104 First positioning information acquisition unit, 105 Second positioning information acquisition unit, 106 Relative position information acquisition unit, 107 Imaging device attitude/position information acquisition unit, 108 Image position correction unit, 109 Image position setting unit

Abstract

An information processing device 10 includes: a relative position information acquisition unit 106 that acquires, based on first positioning information showing the position of an imaging device 2 and second positioning information showing the position of a target point in an imaging range of the imaging device 2, relative position information showing a positional relationship between the imaging device and the target point; and an image position setting unit 109 that associates the relative position information with an image position corresponding to the target point in an image captured by the imaging device.

Description

Information processing device, program, and positioning method
The present invention relates to an information processing device, a program, and a positioning method.
Conventionally, there are known techniques for specifying the position of a mobile object based on an image of the mobile object. For example, Patent Document 1 describes this type of technology.
Patent Document 1 relates to a positioning device for a mobile object using an external surveillance camera, which is an essential element when monitoring and following a mobile object. Patent Document 1 describes a mobile body having an acceleration sensor, a gyro, a magnetic sensor, an atmospheric pressure sensor, and the like, and describes outputting the position coordinates of the mobile body from the feet of the mobile body as predicted from the image of the surveillance camera.
Patent Document 1: Japanese Patent Application Laid-Open No. 2016-1875
Although the conventional technology described in Patent Document 1 can correct the predicted position of the moving object based on the sensors of the moving object, there is still room for improvement in terms of identifying the position of the moving object more accurately.
The present invention was made to solve the above problems, and its purpose is to accurately identify the position of a moving object such as an automobile in an image.
To achieve the above object, an information processing apparatus according to one aspect of the present invention includes: a relative position information acquisition unit that acquires relative position information indicating the positional relationship between an imaging device and a target point based on first positioning information indicating the position of the imaging device and second positioning information indicating the position of the target point within the imaging range of the imaging device; and an image position setting unit that associates the relative position information with the image position corresponding to the target point in the image captured by the imaging device.
According to the present invention, it is possible to accurately identify the position of a moving object such as an automobile in an image.
FIG. 1 is a schematic diagram showing an imaging device, a mobile object, a server, and positioning satellites according to one embodiment of the present invention. FIG. 2 is a block diagram showing the hardware configuration of an information processing apparatus according to one embodiment of the present invention. FIG. 3 is a block diagram showing functional configurations of an imaging device, a server, and a moving body according to an embodiment of the present invention. FIG. 4 is a schematic plan view showing the positional relationship between an imaging device and a moving object according to an embodiment of the present invention. FIG. 5 is a diagram showing an example of an image in which position information is superimposed on an image captured by an imaging device according to an embodiment of the present invention. FIG. 6 is a schematic plan view showing the positional relationship between the imaging device according to the embodiment of the present invention and a moving body, showing an example in which a plurality of positions of the moving body are measured. FIG. 7 is a diagram showing an example of an image in which position information is superimposed on an image captured by the imaging device according to the embodiment of the present invention, particularly showing an example in which a plurality of positions of a moving body are measured. FIG. 8 is the first half of a flowchart for displaying position information on an image captured by an imaging device according to an embodiment of the present invention. FIG. 9 is the second half of the flowchart for displaying position information on an image captured by the imaging device according to the embodiment of the present invention. FIG. 10 is a diagram showing an example of an image from which position information at arbitrary coordinates is obtained in an image captured by the imaging device according to the embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
As shown in FIG. 1, the information processing apparatus 10 of the present embodiment exchanges electronic information with the imaging device 2 and with the moving body 3 moving around the imaging device 2, for example. The information processing device 10 has an information processing device communication device 18, the imaging device 2 has an imaging device communication device 28, and the moving body 3 has a moving body communication device 38, as appropriate. Electronic information is exchanged mainly through the communication line 121. In this example, the communication line 121 is a wireless connection; however, the communication line 121 is not limited to wireless communication, and wired communication may be used, for example.
Hereinafter, reference position information, first positioning information, and second positioning information are acquired as position information. The reference position information is position information obtained by accurately measuring the position of the imaging device 2. The first positioning information is position information of the current position of the imaging device 2 measured by GNSS, and by itself its positional accuracy is lower than that of the reference position information. The second positioning information is position information of the current position of the moving body 3, which is the target point, measured by GNSS, and by itself its positional accuracy is lower than that of the reference position information. The first positioning information and the second positioning information are acquired based on positioning signals of satellites synchronized in time. Here, synchronization of the times does not mean only that the times match exactly; it is sufficient that the acquisition time of the first positioning information and the acquisition time of the second positioning information can be associated with each other. More specifically, for example, the difference between the acquisition times falls within a time difference over which the GNSS radio wave reception conditions at the acquisition time of the first positioning information and at the acquisition time of the second positioning information can cancel each other out.
The reference position information of the imaging device 2 is obtained when the reference position antenna 20 receives the positioning signal from the reference position satellite 41. A reference position satellite positioning signal reception unit 401 (not shown in FIG. 1) receives the electrical signal from the reference position antenna 20, and a reference position positioning calculation unit 402 (not shown in FIG. 1) calculates and derives the reference position information. The first positioning information of the imaging device 2 is obtained when the first positioning antenna 21 of the imaging device 2 receives positioning signals from the first positioning satellites 42. A first positioning satellite positioning signal receiving unit 403 (not shown in FIG. 1) receives the electrical signal from the first positioning antenna 21, and a first positioning calculation unit 404 (not shown) calculates and derives the first positioning information.
The moving body 3 has a second positioning antenna 31. The second positioning antenna 31 receives positioning signals from the second positioning satellites 43. A second positioning satellite positioning signal receiving unit 405 (not shown) receives the electrical signal from the second positioning antenna 31, and a second positioning calculation unit 406 (not shown) calculates and derives the second positioning information.
For each of the first positioning satellites 42 and the second positioning satellites 43, four satellites are required to calculate position coordinates, because the position coordinates and the time are the variables. FIG. 1 shows an example in which five satellites are used as the first positioning satellites 42 and five satellites as the second positioning satellites 43, four of which are common to both.
Although the details will be described later, the information processing device 10 identifies the position of the moving object 3, which is the target point, based on the reference position information, the first positioning information, and the second positioning information.
As shown in FIG. 2, the information processing apparatus 10 includes a processor 13, a ROM (Read Only Memory) 14, a RAM (Random Access Memory) 15, an input/output unit 11, a communication unit 12, and an input/output interface 17.
The processor 13 performs various calculations and processes. The processor 13 is, for example, a CPU (central processing unit), an MPU (micro processing unit), an SoC (system on a chip), a DSP (digital signal processor), a GPU (graphics processing unit), an ASIC (application specific integrated circuit), a PLD (programmable logic device), or an FPGA (field-programmable gate array). Alternatively, the processor 13 may be a combination of several of these. The processor 13 may also be a combination of these with a hardware accelerator or the like.
The processor 13, the ROM 14, and the RAM 15 are interconnected via a bus 16. The processor 13 executes various processes according to programs recorded in the ROM 14 or programs loaded into the RAM 15. Part or all of a program may be incorporated within the circuitry of the processor 13.
The bus 16 is also connected to the input/output interface 17. The input/output unit 11 and the communication unit 12 are connected to the input/output interface 17.
The input/output unit 11 is electrically connected to the input/output interface 17 by wire or wirelessly. The input/output unit 11 includes, for example, an input unit such as a keyboard and a mouse, and an output unit such as a display for displaying images and a speaker for amplifying sounds. Note that the input/output unit 11 may have a configuration in which the display function and the input function are integrated, like a touch panel.
The communication unit 12 is a device through which the processor 13 communicates with other devices (for example, the imaging device 2 and the moving body 3) via the communication line 121.
The hardware configuration shown here is merely an example and is not particularly limited to this configuration. Besides a single processing device such as a single processor, a multiprocessor, or a multi-core processor, a combination of such processing devices with processing circuits such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array) may be employed as the processor that realizes the functional configuration.
FIG. 3 shows the functional configuration of the information processing device 10 of the present embodiment, and of the imaging device 2, the moving body 3, and the other components with which the information processing device 10 exchanges signals. As for the moving body 3, in some cases one moving body 3 moves to a plurality of points and is positioned at each point; in other cases there are a plurality of moving bodies 3 and positioning is performed at each of a plurality of points. Either case is referred to as the moving body 3 in this specification.
 情報処理装置10の機能的構成は、プロセッサ13によって実現される。撮像装置2の機能的構成は、撮像装置2に搭載される電子部品やコンピュータ等の情報処理装置のプロセッサによって実現される。同様に、移動体3の機能的構成も、移動体3に搭載される電子部品やコンピュータ等の情報処理装置によって実現される。 The functional configuration of the information processing device 10 is implemented by the processor 13. The functional configuration of the imaging device 2 is implemented by electronic components mounted on the imaging device 2 and by the processor of an information processing device such as a computer. Similarly, the functional configuration of the moving body 3 is implemented by electronic components and an information processing device such as a computer mounted on the moving body 3.
 情報処理装置10は、ここでは撮像装置2及び移動体3とは別体であり、サーバー1に機能的構成の全てが含まれている。情報処理装置10は、撮像装置2、移動体3にその一部或いは全部が組み込まれる構成も可能である。 The information processing device 10 here is separate from the imaging device 2 and the mobile object 3, and the server 1 includes all of the functional configurations. The information processing device 10 can also be configured to be partially or wholly incorporated in the imaging device 2 and the moving body 3 .
 以下、図1から3を適宜参照して、これら機能的構成について説明する。参照位置用衛星測位信号受信部401は、撮像装置2の参照位置を測位するために参照位置用アンテナ20を介して衛星信号を受信する。参照位置測位計算部402は、参照位置用衛星測位信号受信部401からの信号を受けて、撮像装置2の参照位置を計算する。ここで、参照位置用衛星41としては、例えば、準天頂衛星のような正確な位置を提供する衛星が選ばれ、参照位置として正確な位置が提供される。この参照位置用衛星41からの電波を受信する参照位置用アンテナ20は一般的に比較的大きく、位置の測定がなされた後は、参照位置用アンテナ20が取り外されることが好ましい。参照位置用アンテナ20及び参照位置情報取得部103等を有する測定装置は撮像装置2に着脱可能に接続されていることが好ましい。これにより、参照位置が測位された後は、参照位置用アンテナ20及び参照位置情報取得部103を撮像装置2から取り外すことができる。 These functional configurations will be described below with reference to FIGS. 1 to 3 as appropriate. The reference position satellite positioning signal reception unit 401 receives satellite signals via the reference position antenna 20 in order to position the reference position of the imaging device 2 . The reference position positioning calculator 402 receives the signal from the reference position satellite positioning signal receiver 401 and calculates the reference position of the imaging device 2 . Here, as the reference position satellite 41, for example, a satellite that provides an accurate position such as a quasi-zenith satellite is selected, and an accurate position is provided as the reference position. The reference position antenna 20 for receiving radio waves from the reference position satellite 41 is generally relatively large, and it is preferable that the reference position antenna 20 be removed after the position is measured. It is preferable that the measurement device including the reference position antenna 20 and the reference position information acquisition unit 103 is detachably connected to the imaging device 2 . Accordingly, after the reference position is determined, the reference position antenna 20 and the reference position information acquisition unit 103 can be removed from the imaging device 2 .
 撮像装置2の参照位置の特定においては、準天頂衛星を使う例の他に、地図上に撮像装置2の設置位置をプロットし、地図から位置座標が特定されてもよい。 In specifying the reference position of the imaging device 2, in addition to the example of using a quasi-zenith satellite, the installation position of the imaging device 2 may be plotted on a map and the position coordinates may be specified from the map.
 撮像装置2の位置は、上記の参照位置用衛星41を基とする測位に加えて、移動体3が具備し得る測位システムと同様の測位システムにより導出される。本明細書では撮像装置2の現在の位置は第1位置と称される。この移動体3が具備し得る測位システムと同様のシステムは、第1位置用アンテナ21と第1位置用衛星測位信号受信部403と第1位置測位計算部404を有する。第1位置用衛星測位信号受信部403は第1位置用アンテナ21により測位信号を受信し、第1位置測位計算部404が撮像装置2の第1位置を計算する。この第1位置用衛星測位信号受信部403と第1位置測位計算部404とは撮像装置2に備えられてもよい。 In addition to the positioning based on the reference position satellite 41 described above, the position of the imaging device 2 is derived by a positioning system similar to the one the moving body 3 may have. In this specification, the current position of the imaging device 2 is referred to as the first position. This system, similar to the positioning system the moving body 3 may have, includes a first position antenna 21, a first position satellite positioning signal receiving unit 403, and a first position positioning calculation unit 404. The first position satellite positioning signal receiving unit 403 receives positioning signals via the first position antenna 21, and the first position positioning calculation unit 404 calculates the first position of the imaging device 2. The first position satellite positioning signal receiving unit 403 and the first position positioning calculation unit 404 may be provided in the imaging device 2.
 撮像装置2の参照位置は、第1位置用アンテナ21と第1位置用衛星測位信号受信部403と第1位置測位計算部404を用いて、長時間測位あるいはスタティック測位により求められてもよい。 The reference position of the imaging device 2 may be obtained by long-term positioning or static positioning using the first positioning antenna 21, the first positioning satellite positioning signal receiving unit 403, and the first positioning calculation unit 404.
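 For illustration only, and not as part of the disclosed configuration, the following Python sketch shows one very simplified way in which a stable reference position could be obtained from the first-position receiver by averaging many single-epoch fixes collected while the imaging device 2 is stationary, in the spirit of the long-duration or static positioning mentioned above; the fix format and the function name are assumptions introduced here, and a true static survey would process carrier-phase observations instead.

```python
def average_fixes(fixes):
    """Average a series of (lat_deg, lon_deg, height_m) single-epoch fixes.

    A crude stand-in for long-duration/static positioning: random errors in
    the individual fixes shrink as more epochs are averaged.  Assumes the
    antenna does not move during the observation window.
    """
    n = len(fixes)
    lat = sum(f[0] for f in fixes) / n
    lon = sum(f[1] for f in fixes) / n
    hgt = sum(f[2] for f in fixes) / n
    return lat, lon, hgt

# Hypothetical fixes logged over time at the fixed camera location.
fixes = [(35.681230, 139.767120, 12.30),
         (35.681233, 139.767118, 12.34),
         (35.681229, 139.767123, 12.27)]
print(average_fixes(fixes))   # -> a smoothed reference position
```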
 撮像装置2は撮像部201を有し、例えば駐車場内を移動する移動体3等を撮影する。撮像装置2は姿勢制御装置22を有する。姿勢制御部202は、撮像装置2の撮影方向を姿勢制御装置22を介して制御する。撮像装置2は、姿勢位置情報検出部203を有し、姿勢制御部202により制御されている撮像装置2の撮影方向を検出し、検出結果は情報処理装置10の有する画像情報管理部101に送られる。 The imaging device 2 has an imaging unit 201 and photographs, for example, the moving body 3 moving in a parking lot. The imaging device 2 has an attitude control device 22. The attitude control unit 202 controls the shooting direction of the imaging device 2 via the attitude control device 22. The imaging device 2 also has an attitude/position information detection unit 203, which detects the shooting direction of the imaging device 2 controlled by the attitude control unit 202, and the detection result is sent to the image information management unit 101 of the information processing device 10.
 情報処理装置10は、画像情報管理部101、情報記録部102、参照位置情報取得部103、第1位置測位情報取得部104、第2位置測位情報取得部105、相対位置情報取得部106を有する。 The information processing apparatus 10 has an image information management unit 101, an information recording unit 102, a reference position information acquisition unit 103, a first positioning information acquisition unit 104, a second positioning information acquisition unit 105, and a relative position information acquisition unit 106. .
 説明の都合上、先に移動体3の有する機能的構成について説明する。移動体3は、例えば駐車場を移動する自動車である。移動体3の現在の位置は、本明細書中では、第2位置と称される。移動体3は、図1に示すように、第2位置用アンテナ31を有し、更に、図3に示すように、第2位置用衛星測位信号受信部405と第2位置測位計算部406とを有する。移動体3は複数が用いられて各々の位置が測位される場合と、1つの移動体3が用いられて、移動体3が移動し、移動した各位置が測位される場合がある。本明細書では特に断らない限り、これらは区別されず、移動体3と称される。 For convenience of explanation, the functional configuration of the moving body 3 will be described first. The moving body 3 is, for example, an automobile moving in a parking lot. The current position of the moving body 3 is referred to herein as the second position. As shown in FIG. 1, the moving body 3 has a second position antenna 31, and further, as shown in FIG. 3, a second position satellite positioning signal receiving unit 405 and a second position positioning calculation unit 406. In some cases a plurality of moving bodies 3 are used and the position of each is measured, and in other cases a single moving body 3 is used, moves, and is positioned at each point to which it has moved. In this specification, unless otherwise specified, these cases are not distinguished and are both referred to as the moving body 3.
 第2位置用アンテナ31により第2位置用衛星測位信号受信部405が測位信号を受信し、第2位置測位計算部406が第2位置を計算する。 The second positioning satellite positioning signal receiving unit 405 receives the positioning signal from the second positioning antenna 31, and the second positioning calculation unit 406 calculates the second position.
 第1位置用アンテナ21を介して第1位置測位計算部404が測位する第1位置と、第2位置用アンテナ31を介して第2位置測位計算部406が測位する第2位置とは現在位置を示す測位情報である。 The first position determined by the first position positioning calculation unit 404 via the first position antenna 21 and the second position determined by the second position positioning calculation unit 406 via the second position antenna 31 are both positioning information indicating a current position.
 情報処理装置10の各機能的構成の機能について詳細を図3により説明する。信号の流れに沿って説明するため、説明する機能的構成の図3中の位置は左右上下に前後する。 The function of each functional configuration of the information processing device 10 will be described in detail with reference to FIG. 3. Because the description follows the flow of signals, the positions in FIG. 3 of the functional components being described move back and forth left, right, up, and down.
 参照位置情報取得部103は、参照位置測位計算部402が計算した参照位置を取得する。第1位置測位情報取得部104は、第1位置測位計算部404が計算して導出した第1位置測位情報を取得する。第2位置測位情報取得部105は、移動体3の有する第2位置測位計算部406の導出した第2位置測位情報を取得する。 The reference position information acquisition unit 103 acquires the reference position calculated by the reference position positioning calculation unit 402 . The first positioning information acquisition unit 104 acquires the first positioning information calculated and derived by the first positioning calculation unit 404 . The second positioning information acquisition unit 105 acquires the second positioning information derived by the second positioning calculation unit 406 of the moving body 3 .
 相対位置情報取得部106は、第1位置測位情報取得部104と第2位置測位情報取得部105とからの情報を基に、撮像装置2の第1位置と移動体3の第2位置との相対位置を導出する。相対位置情報には、例えば、距離、方向、高さ等の情報が含まれる。 The relative position information acquisition unit 106 derives the relative position between the first position of the imaging device 2 and the second position of the moving body 3 based on the information from the first positioning information acquisition unit 104 and the second positioning information acquisition unit 105. The relative position information includes, for example, information such as distance, direction, and height.
 ここで、相対位置の導出においては、第1位置用衛星測位信号受信部403と第2位置用衛星測位信号受信部405とは、同一の衛星を用いて測定することが好ましい。衛星からの電波信号がそれぞれの受信部に到達する時間差を測定して、2地点の相対的な位置関係を求める。各観測点で同じ衛星の電波を受信しており、衛星から放出された電波が同様の気象条件の中を通過してくるため、2点の観測値の差を取ることにより、観測値に含まれる衛星の位置誤差や対流圏及び電離層での遅延量が消去される。これにより正確な相対位置が求められる。 Here, in deriving the relative position, it is preferable that the first position satellite positioning signal receiving unit 403 and the second position satellite positioning signal receiving unit 405 measure using the same satellites. The relative positional relationship between the two points is obtained by measuring the difference in the time at which the radio signals from the satellites reach the respective receiving units. Since each observation point receives radio waves from the same satellites, and the radio waves emitted from a satellite pass through similar atmospheric conditions, taking the difference between the observations at the two points cancels the satellite position errors and the tropospheric and ionospheric delays contained in the observations. An accurate relative position is thereby obtained.
 衛星からの測位信号に基づいて、第1位置測位情報と第2位置測位情報とが導出される。そしてこれらの位置の差分或いは差分を示す距離が相対位置情報として取得される。 The first positioning information and the second positioning information are derived based on the positioning signals from the satellites. Then, the difference between these positions or the distance indicating the difference is obtained as relative position information.
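 As a hedged illustration of how the difference between the two positions could be turned into distance, direction, and height information, the sketch below uses a simple local-tangent-plane (flat-earth) approximation, which is adequate over the short baselines of a parking lot; the real embodiment may instead difference the raw satellite observations, and the function name is an assumption introduced here.

```python
import math

R_EARTH = 6_378_137.0  # WGS-84 equatorial radius [m]

def relative_position(first, second):
    """Relative position of `second` (moving body) as seen from `first` (camera).

    Both arguments are (lat_deg, lon_deg, height_m).  Uses a flat-earth
    approximation, so it is only meaningful for short baselines.
    Returns (horizontal_distance_m, azimuth_deg, height_difference_m),
    with azimuth measured clockwise from north.
    """
    lat1, lon1, h1 = first
    lat2, lon2, h2 = second
    north = math.radians(lat2 - lat1) * R_EARTH
    east = math.radians(lon2 - lon1) * R_EARTH * math.cos(math.radians(lat1))
    distance = math.hypot(east, north)
    azimuth = math.degrees(math.atan2(east, north)) % 360.0
    return distance, azimuth, h2 - h1
```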
 画像情報管理部101は、参照位置情報取得部103から参照位置を得、相対位置情報取得部106から相対位置を得る。そして、参照位置に対して相対位置を加えることにより移動体3の正確な位置あるいは絶対位置座標が導出される。 The image information management unit 101 obtains the reference position from the reference position information acquisition unit 103 and the relative position from the relative position information acquisition unit 106 . Then, by adding the relative position to the reference position, the accurate position or absolute position coordinates of the moving body 3 are derived.
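 The reverse step, adding the relative offset back onto the accurate reference position to recover the absolute coordinates of the moving body 3, could look like the following sketch under the same flat-earth assumption; it is a simplified stand-in, not the prescribed computation.

```python
import math

R_EARTH = 6_378_137.0  # WGS-84 equatorial radius [m]

def absolute_position(reference, distance_m, azimuth_deg, dheight_m):
    """Add a (distance, azimuth, height-difference) offset to the reference
    position of the imaging device to obtain absolute coordinates of the
    target point.  Flat-earth approximation; azimuth is clockwise from north.
    """
    lat0, lon0, h0 = reference
    north = distance_m * math.cos(math.radians(azimuth_deg))
    east = distance_m * math.sin(math.radians(azimuth_deg))
    lat = lat0 + math.degrees(north / R_EARTH)
    lon = lon0 + math.degrees(east / (R_EARTH * math.cos(math.radians(lat0))))
    return lat, lon, h0 + dheight_m
```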
 画像情報管理部101は、参照位置情報取得部103から参照位置を、撮像装置2の有する撮像部201から画像を、撮像装置2の有する姿勢位置情報検出部203から撮像装置2の姿勢を、相対位置情報取得部106から移動体3の相対位置情報をそれぞれ受け取る。 The image information management unit 101 receives the reference position from the reference position information acquisition unit 103, the image from the imaging unit 201 of the imaging device 2, the attitude of the imaging device 2 from the attitude/position information detection unit 203 of the imaging device 2, and the relative position information of the moving body 3 from the relative position information acquisition unit 106.
 画像情報管理部101は、撮像装置姿勢位置情報取得部107、画像位置補正部108、画像位置設定部109、補間補外処理部110、移動体特定部111を有する。 The image information management unit 101 has an imaging device orientation position information acquisition unit 107 , an image position correction unit 108 , an image position setting unit 109 , an interpolation/extrapolation processing unit 110 and a moving object identification unit 111 .
 撮像装置姿勢位置情報取得部107は姿勢位置情報検出部203から撮像装置2の姿勢情報を受け取る。画像位置補正部108は、撮像部201から得られた撮像画像の画像位置を補正する。例えば、撮像装置2が90°方向を向いている時に撮影された画像と、撮像装置2が45°方向を向いている時に撮影された画像とを補正してつなぎ合わせる。そして、撮像装置自体が有する画角よりも広い画角の画像を実現する。 The imaging device posture/position information acquisition unit 107 receives posture information of the imaging device 2 from the posture/position information detection unit 203 . An image position correction unit 108 corrects the image position of the captured image obtained from the imaging unit 201 . For example, an image captured when the imaging device 2 is oriented at 90° and an image captured when the imaging device 2 is oriented at 45° are corrected and joined together. Then, an image with a wider angle of view than the angle of view of the imaging device itself is realized.
 姿勢位置情報検出部203は、例えば、撮像装置2が敷設されている三脚の三軸サーボモータの出力を検出する。但し、撮像装置2の姿勢の推定は、加速度センサ、地磁気センサなど、撮像装置2に直接固定されたものに限られない。撮像装置を俯瞰するカメラ画像、あるいは撮像装置に外付けした姿勢推定デバイスから、撮像装置2の姿勢を推定してもよい。撮像装置の姿勢の変化を検知した際は、その変化量に応じて相対位置情報を更新する。 The posture position information detection unit 203 detects, for example, the output of the three-axis servomotor of the tripod on which the imaging device 2 is installed. However, estimation of the orientation of the imaging device 2 is not limited to those directly fixed to the imaging device 2, such as acceleration sensors and geomagnetic sensors. The posture of the imaging device 2 may be estimated from a camera image that overlooks the imaging device or from a posture estimation device externally attached to the imaging device. When a change in posture of the imaging device is detected, the relative position information is updated according to the amount of change.
 撮像装置姿勢位置情報取得部107は、参照位置情報取得部103から位置情報及び姿勢情報を得る。撮像装置2の位置或いは/且つ姿勢が変えられている場合が想定される。これらの変化が姿勢位置情報検出器23により検出されると、この変化に基づいて、画像位置補正部108は、画像中の位置と相対位置情報との関係を補正する。 The imaging device attitude/position information acquisition unit 107 obtains position information and attitude information from the reference position information acquisition unit 103. It is assumed that the position and/or attitude of the imaging device 2 may have been changed. When such changes are detected by the attitude/position information detector 23, the image position correction unit 108 corrects the relationship between positions in the image and the relative position information based on the changes.
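 Purely as an illustrative sketch of the kind of correction the image position correction unit 108 might apply, the function below shifts a stored horizontal pixel coordinate when a pan change of the imaging device 2 is detected; it assumes a roughly linear pixel-to-azimuth relation, which holds only approximately for modest fields of view, and a real implementation would reproject through the full camera model.

```python
def corrected_pixel_u(u, pan_delta_deg, image_width_px, hfov_deg):
    """Shift a horizontal pixel coordinate after a detected pan change.

    Assumes pixels map approximately linearly to azimuth across the
    horizontal field of view.  A positive pan_delta (camera rotated
    toward larger azimuth) moves a fixed ground point the other way
    in the image.
    """
    px_per_deg = image_width_px / hfov_deg
    return u - pan_delta_deg * px_per_deg

# Example: a 1920-px-wide image with a 60 deg field of view,
# after the camera panned 5 deg.
print(corrected_pixel_u(960.0, 5.0, 1920, 60.0))   # -> 800.0
```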
 画像位置設定部109は、画像位置補正部108による画像中の位置と相対位置情報との関係の補正を取り入れ、撮像装置2によって撮像される画像中の画像位置に対して、相対位置情報を関連付ける。 The image position setting unit 109 incorporates the correction of the relationship between the position in the image and the relative position information by the image position correcting unit 108, and associates the relative position information with the image position in the image captured by the imaging device 2. .
 撮像装置2によって撮像される撮像画像240に参照位置情報が付与される、或いは、移動体3を含む画像に移動体位置情報が付与される。そして、この情報は情報記録部102に記録される。 The captured image 240 captured by the imaging device 2 is given reference position information, or the moving body position information is given to an image including the moving body 3 . This information is then recorded in the information recording unit 102 .
 補間補外処理部110は、複数の第2位置に対応して既に導出されている撮像画像240中の座標と相対位置情報とを基に、撮像画像240中の任意の座標に対応する相対位置情報を補間および補外により求める。 The interpolation/extrapolation processing unit 110 obtains, by interpolation and extrapolation, the relative position information corresponding to arbitrary coordinates in the captured image 240, based on the coordinates in the captured image 240 that have already been derived for the plurality of second positions and on the associated relative position information.
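 One possible, non-limiting way to realise this interpolation and extrapolation is inverse-distance weighting over the calibrated pixel positions, sketched below with NumPy; the embodiment does not prescribe the scheme, and azimuth values near the 0°/360° wrap would need circular averaging that is omitted here for brevity.

```python
import numpy as np

def interpolate_relative(pixel_uv, known_uv, known_rel, power=2.0, eps=1e-9):
    """Estimate (distance, azimuth, height) at an arbitrary pixel position.

    known_uv  : (N, 2) pixel coordinates already associated with data
    known_rel : (N, 3) corresponding (distance_m, azimuth_deg, height_m)
    Returns the inverse-distance-weighted estimate at pixel_uv.  Points
    outside the convex hull of the calibration points are also handled,
    which loosely corresponds to the extrapolation described in the text.
    """
    known_uv = np.asarray(known_uv, dtype=float)
    known_rel = np.asarray(known_rel, dtype=float)
    d = np.linalg.norm(known_uv - np.asarray(pixel_uv, dtype=float), axis=1)
    if np.any(d < eps):                     # exact hit on a calibrated pixel
        return known_rel[np.argmin(d)]
    w = 1.0 / d ** power
    return (w[:, None] * known_rel).sum(axis=0) / w.sum()
```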
 移動体特定部111は、撮像画像240中の移動体3を画像認識により特定する。そして、移動体特定部111は、特定した移動体3の位置に対応する画像上の座標を導出する。 The moving body identifying unit 111 identifies the moving body 3 in the captured image 240 by image recognition. Then, the moving body identification unit 111 derives coordinates on the image corresponding to the identified position of the moving body 3 .
 次に、撮像装置2によって撮像される画像中の画像位置に対して、相対位置情報を関連付ける処理の例について図4及び図5を参照して説明する。図4は、駐車場を移動する移動体3と撮像装置2との位置関係を示す模式平面図である。図5は撮像装置2の撮影する撮像画像240の例である。 Next, an example of processing for associating relative position information with an image position in an image captured by the imaging device 2 will be described with reference to FIGS. 4 and 5. FIG. 4 is a schematic plan view showing the positional relationship between the moving body 3 moving in the parking lot and the imaging device 2. FIG. 5 is an example of a captured image 240 captured by the imaging device 2.
 図5においては、撮像装置2の装置位置情報として、距離と角度と高さとが画面に付与されている。図5の例では、撮像装置2からの距離が13.8m、角度が123°、高さが1.3mである。 In FIG. 5, the distance, angle, and height are given on the screen as the device position information of the imaging device 2. In the example of FIG. 5, the distance from the imaging device 2 is 13.8 m, the angle is 123°, and the height is 1.3 m.
 移動体特定部111が、撮像画像240中の移動体3を画像認識技術により認識し、その画像240上の座標を特定する。第1位置測位情報取得部104が取得する第1位置測位情報と第2位置測位情報取得部105が取得する第2位置測位情報とに基づいて、相対位置情報取得部106が移動体3の撮像時における相対位置情報を取得する。そして、画像240上の移動体3の位置座標と装置位置情報とが画像位置設定部109により関連付けられる。 The moving body identification unit 111 recognizes the moving body 3 in the captured image 240 by image recognition technology and identifies its coordinates on the image 240. Based on the first positioning information acquired by the first positioning information acquisition unit 104 and the second positioning information acquired by the second positioning information acquisition unit 105, the relative position information acquisition unit 106 acquires the relative position information at the time the moving body 3 was imaged. Then, the position coordinates of the moving body 3 on the image 240 and the device position information are associated with each other by the image position setting unit 109.
 上記処理は、画像中の各地点で繰り返される。図6及び図7を参照して説明する。図6は平面図を、図7は画面表示例を示す。例えば、画像ファイルと位置情報および姿勢情報等のデータが含まれる情報ファイルとは別ファイルとして保存され、画像に情報ファイルに基づく情報を重ねるか否かは再生時に任意に選択される。 The above process is repeated at each point in the image. Description will be made with reference to FIGS. 6 and 7. FIG. 6 shows a plan view, and FIG. 7 shows a screen display example. For example, the image file and an information file containing data such as position information and attitude information are saved as separate files, and whether or not to superimpose information based on the information file on the image is arbitrarily selected at the time of reproduction.
 移動体3が駐車場にて移動した地点で第2位置測位情報が第2位置測位情報取得部105により取得される。また、図6に示すように、移動体3において、駐車場の中の複数の地点において第2位置測位情報が取得される。そして、撮像装置2との間における相対位置情報が、図6では5点において、相対位置情報取得部106により取得される。図6においては、距離と角度と高さの組み合わせを(距離、角度、高さ)として、(13.8m、12.3°、1.3m)、(11.3m、57°、2.2m)等の相対位置情報が得られている。 At each point to which the moving body 3 has moved in the parking lot, the second positioning information is acquired by the second positioning information acquisition unit 105. As shown in FIG. 6, the second positioning information is acquired for the moving body 3 at a plurality of points in the parking lot. Relative position information with respect to the imaging device 2 is then acquired by the relative position information acquisition unit 106, at five points in the case of FIG. 6. In FIG. 6, writing the combination of distance, angle, and height as (distance, angle, height), relative position information such as (13.8 m, 12.3°, 1.3 m) and (11.3 m, 57°, 2.2 m) is obtained.
 以上、一連の処理により画像中の各地点に撮像装置2からの相対位置情報が関連付けられる。以降の処理では、撮像装置2が撮像した画像240中の被写体の位置を各地点の相対位置情報に基づいて算出することができる。補間補外処理部110が、複数の第2位置に対応して既に導出されている撮像画像240中の座標と相対位置情報とを基に、撮像画像240中の任意の座標に対応する相対位置情報を補間および補外により求める。即ち、移動体から位置情報を取得しなくても画像だけで正確に画像中の対象地点にある移動体の位置を画像位置設定部109により導出することができる。 Through the series of processes described above, relative position information from the imaging device 2 is associated with each point in the image. In subsequent processing, the position of a subject in the image 240 captured by the imaging device 2 can be calculated based on the relative position information of each point. The interpolation/extrapolation processing unit 110 obtains, by interpolation and extrapolation, the relative position information corresponding to arbitrary coordinates in the captured image 240, based on the coordinates in the captured image 240 already derived for the plurality of second positions and on the associated relative position information. In other words, the image position setting unit 109 can accurately derive the position of a moving body at a target point in the image from the image alone, without acquiring position information from the moving body.
 ところで、移動体3の相対位置情報を複数の点について取得した場合、異常値が混在することがある。これは、装置への信号ノイズが入ること、或いは、測位用電波の受信状態が悪化したような場合が想定される。このような異常値は、他の相対位置情報に基づいて補正される。例えば、その異常値を伴う地点を削除すること、あるいは平均値を採用することが有効である。 When the relative position information of the moving body 3 is acquired for a plurality of points, abnormal values may be mixed in. This can happen, for example, when signal noise enters the device or when the reception condition of the positioning radio waves deteriorates. Such abnormal values are corrected based on the other relative position information. For example, it is effective to delete the point associated with the abnormal value, or to adopt an average value.
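 As one concrete, hedged example of such a correction, the sketch below flags anomalous distance values with a median/MAD test and then averages the remaining points; the embodiment only requires that abnormal values be corrected using the other relative position information, so this particular statistic is an assumption.

```python
import numpy as np

def filter_outliers(values, k=3.5):
    """Return (kept_values, mean_of_kept) after a modified z-score test.

    values : 1-D array of, e.g., measured distances for several points.
    Points whose modified z-score exceeds k are treated as abnormal and
    dropped; the mean of the remaining points can be adopted instead.
    """
    x = np.asarray(values, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med)) + 1e-12     # avoid division by zero
    z = 0.6745 * (x - med) / mad
    kept = x[np.abs(z) <= k]
    return kept, kept.mean()

print(filter_outliers([13.8, 13.9, 14.0, 13.7, 55.0]))   # drops the 55.0 m outlier
```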
 図8および図9は機能的構成による処理の流れを示すフローチャートである。図8はフローチャートの前半を、図9はフローチャートの後半を示す。つながりを示すために丸にAで示すダミーのステップが記載されている。処理がスタートすると、情報処理装置10が参照位置情報を取得する(ステップS501)。次に、撮像装置2は姿勢情報を取得する(ステップS502)。撮像装置2の姿勢が変化している場合(ステップS503:Yes)には、ステップS501に戻る。撮像装置2の姿勢が変化していない場合(ステップS503:No)には、撮像装置2が画像情報を取得する(ステップS504)。次に、撮像装置2は第1位置測位情報を取得する(ステップS505)。移動体3が第2位置測位情報を取得している場合(ステップS506:Yes)には、移動体3の相対位置情報を取得する(ステップS507)。移動体3が第2位置測位情報を取得していない場合(ステップS506:No)には、画像中の移動体の座標を算出する。そして、移動体3の相対位置情報が取得される(ステップS507)。移動体3の絶対位置座標が取得される場合(ステップS509:Yes)には、撮像装置2の参照位置情報に移動体3の相対位置情報が加味され、移動体の絶対位置情報が生成され(ステップS511)、画像情報に移動体3の位置情報が関連付けられる(ステップS510)。絶対位置情報としては、例えば、緯度、経度が挙げられる。移動体3の絶対位置座標が取得されない場合(ステップS509:No)には、画像情報に移動体3の位置情報が関連付けられる。最後に、得られた情報は情報記録部102に記録され、処理は終了する(END)。 FIGS. 8 and 9 are flowcharts showing the flow of processing by the functional configuration. FIG. 8 shows the first half of the flowchart, and FIG. 9 shows the second half. A dummy step marked with a circled A is included to show how the two halves connect. When the processing starts, the information processing device 10 acquires the reference position information (step S501). Next, the imaging device 2 acquires attitude information (step S502). If the attitude of the imaging device 2 has changed (step S503: Yes), the processing returns to step S501. If the attitude of the imaging device 2 has not changed (step S503: No), the imaging device 2 acquires image information (step S504). Next, the imaging device 2 acquires the first positioning information (step S505). If the moving body 3 has acquired the second positioning information (step S506: Yes), the relative position information of the moving body 3 is acquired (step S507). If the moving body 3 has not acquired the second positioning information (step S506: No), the coordinates of the moving body in the image are calculated, and the relative position information of the moving body 3 is then acquired (step S507). If the absolute position coordinates of the moving body 3 are to be obtained (step S509: Yes), the relative position information of the moving body 3 is added to the reference position information of the imaging device 2 to generate the absolute position information of the moving body (step S511), and the position information of the moving body 3 is associated with the image information (step S510). The absolute position information is, for example, latitude and longitude. If the absolute position coordinates of the moving body 3 are not to be obtained (step S509: No), the position information of the moving body 3 is associated with the image information. Finally, the obtained information is recorded in the information recording unit 102, and the processing ends (END).
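 The following Python-style sketch restates the flow of FIGS. 8 and 9 in code form; the object interfaces (`system`, `camera`, `mover`) and their method names are placeholders introduced purely for illustration and do not appear in the embodiment.

```python
def positioning_cycle(system, camera, mover, want_absolute=True):
    """One pass through the flow of FIGS. 8 and 9 (illustrative sketch only)."""
    while True:
        reference = system.get_reference_position()        # S501
        pose = camera.get_pose()                            # S502
        if not camera.pose_changed(pose):                   # S503: No
            break                                           # S503: Yes -> redo S501
    image = camera.capture()                                # S504
    first_fix = camera.get_first_position()                 # S505
    if mover.has_second_position():                         # S506: Yes
        relative = system.relative_position(
            first_fix, mover.get_second_position())         # S507
    else:                                                   # S506: No
        uv = system.locate_mover_in_image(image)            # image coordinates
        relative = system.relative_from_image(uv)           # S507 via interpolation
    if want_absolute:                                       # S509: Yes
        absolute = system.add_offset(reference, relative)   # S511
        system.attach_to_image(image, absolute)             # S510
    else:                                                   # S509: No
        system.attach_to_image(image, relative)
    system.record(image)                                    # record and END
```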
 図8においては、撮像装置2の参照位置情報が第1位置測位情報に先んじて取得されている。これに限られることはなく、順番を入れ替えることも可能である。更には、参照位置情報は、撮像装置2と移動体3との相対位置情報230が取得された後或いは予め取得することも可能である。 In FIG. 8, the reference position information of the imaging device 2 is acquired prior to the first positioning information. The order is not limited to this, and the order can be changed. Furthermore, the reference position information can be acquired after the relative position information 230 between the imaging device 2 and the moving object 3 is acquired or in advance.
 ここまでの説明においては、情報処理装置10が独立してサーバー1に組み込まれている。これに限られる訳ではなく、情報処理装置10は、撮像装置2或いは移動体3にその一部或いは全部が組み込まれることが可能である。例えば、情報処理装置10の全部が撮像装置2に組み込まれた場合には、ハード的には撮像装置2と移動体3とが、WiFiを一例とする通信回線121により接続されていれば良いという利点が生まれる。画像位置補正部108、画像位置設定部109、補間補外処理部110、移動体特定部111、相対位置情報取得部106等に代表される演算を伴う構成部をサーバー1に設け、その他を撮像装置2に設ける構成も可能である。これにより、中央演算装置をサーバー1にのみ設けることが出来、放熱対策、大容量無停電電源等を撮像装置2に備える必要がなくなる。 In the description so far, the information processing device 10 is independently incorporated in the server 1. However, this is not a limitation, and part or all of the information processing device 10 may be incorporated in the imaging device 2 or the moving body 3. For example, when the entire information processing device 10 is incorporated in the imaging device 2, there is the advantage that, in terms of hardware, it is sufficient for the imaging device 2 and the moving body 3 to be connected by the communication line 121, for example WiFi. It is also possible to provide the components that involve computation, typified by the image position correction unit 108, the image position setting unit 109, the interpolation/extrapolation processing unit 110, the moving body identification unit 111, and the relative position information acquisition unit 106, in the server 1, and to provide the other components in the imaging device 2. In this way, the central processing unit can be provided only in the server 1, and there is no need to equip the imaging device 2 with heat dissipation measures, a large-capacity uninterruptible power supply, and the like.
(変形例1)
 図7における画像240上の各地点の座標と、撮像装置2から図6および図7に示された各地点までの距離、角度、高さという相対位置情報に基づいて、補間及び補外の処理が行なわれる。画像240上の任意の点について、撮像装置2からの距離と角度とが求められる。図10は補間補外の結果を基に描画された画像240上の等距離線と方位とを示す。高さについては省略しているが同様に求められる。完全に平坦な土地についての測位であれば等距離線は楕円となるが、実際には土地には凹凸があるので、等距離線は楕円とは限らない。
(Modification 1)
Interpolation and extrapolation processing is performed based on the coordinates of each point on the image 240 in FIG. 7 and the relative position information, that is, the distance, angle, and height from the imaging device 2 to each point shown in FIGS. 6 and 7. The distance and angle from the imaging device 2 can then be obtained for an arbitrary point on the image 240. FIG. 10 shows equidistant lines and bearings drawn on the image 240 based on the result of the interpolation and extrapolation. The height is omitted from the figure but is obtained in the same way. If the ground were perfectly flat, the equidistant lines would be ellipses, but in practice the ground is uneven, so the equidistant lines are not necessarily ellipses.
 これにより、移動体3が画像240上に写っている場合、画像240上の位置あるいは座標を同定することにより、移動体3と撮像装置2との相対位置情報が移動体特定部111により求められる。移動体3は必ずしも第2位置測位情報を有する必要はない。まず、移動体3が移動体特定部111により画像240中で認識される。移動体3の輪郭が移動体特定部111により認識され、地面との接触点が同定される。接触点が移動体3の画像240上の位置あるいは座標として特定される。そして画像240上の座標より、上記した補間補外を通して、移動体3の撮像装置2からの距離、角度、高さが移動体特定部111により導出される。 As a result, when the moving body 3 appears in the image 240, the moving body identification unit 111 can obtain the relative position information between the moving body 3 and the imaging device 2 by identifying its position or coordinates on the image 240. The moving body 3 does not necessarily need to have the second positioning information. First, the moving body 3 is recognized in the image 240 by the moving body identification unit 111. The contour of the moving body 3 is recognized by the moving body identification unit 111, and its contact point with the ground is identified. The contact point is specified as the position or coordinates of the moving body 3 on the image 240. Then, from the coordinates on the image 240, the distance, angle, and height of the moving body 3 from the imaging device 2 are derived by the moving body identification unit 111 through the interpolation and extrapolation described above.
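 A minimal sketch of this image-only localisation, assuming a generic object detector that returns a bounding box and reusing an interpolation routine like the one shown earlier, might look as follows; approximating the ground-contact point by the bottom centre of the bounding box is a simplification of the contour-based contact point described in the text.

```python
def locate_from_image(detection_bbox, relative_lookup):
    """Estimate the relative position of a vehicle that carries no receiver.

    detection_bbox  : (u_min, v_min, u_max, v_max) from an object detector
    relative_lookup : callable mapping a pixel (u, v) to
                      (distance_m, azimuth_deg, height_m), e.g. built from
                      the calibrated points by interpolation/extrapolation
    """
    u_min, v_min, u_max, v_max = detection_bbox
    contact_point = ((u_min + u_max) / 2.0, v_max)   # bottom centre of the box
    return relative_lookup(contact_point)
```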
(変形例2)
 情報記録部102は、相対位置情報取得部106が取得した相対位置情報、即ち撮像装置2から移動体3までの距離、角度、高さだけではなく、画像情報、および、画像から得られた情報(奥行、位置、高さ)を記録する。通信負荷を減らすため、被写体の位置情報だけでなく、背景などに写る道路、建物などの位置情報を含む。
(Modification 2)
The information recording unit 102 stores not only the relative information acquired by the relative position information acquisition unit 106, that is, the distance, angle, and height from the imaging device 2 to the moving object 3, but also image information and information obtained from the image (depth , position and height). In order to reduce the communication load, it includes not only the position information of the subject, but also the position information of roads, buildings, etc. in the background.
(変形例3)
 相対位置情報のみを求める場合は、参照位置情報を用いなくともよい。参照位置情報を使用しない場合は、参照位置情報取得部103を省略することもできる。このように、本実施形態の機能的構成は適宜省略することもできるし、別の機能的構成を追加することができる。
(Modification 3)
When obtaining only relative position information, reference position information may not be used. If the reference position information is not used, the reference position information acquisition unit 103 can be omitted. In this way, the functional configuration of this embodiment can be omitted as appropriate, and another functional configuration can be added.
(変形例4)
 以上の実施の形態の説明においては、移動体は自動車であったが、車両、船舶、自転車、飛行体、人物、動物、携帯端末等であってもよい。ここまで、移動体の位置の測位として測位衛星の測位用電波の受信機能の利用を例に挙げたが、このような測位機器に限定されることはなく、俯瞰カメラあるいは測距デバイス等により位置が推定される構成であってもよい。また、走行履歴から予測された位置情報、ビーコンによる測位、LiDARなどの測距デバイス、撮像装置により取得された画像の解析結果などによる位置情報であってもよい。移動体自体が位置情報を取得する手段を有さなくてもよい。
(Modification 4)
In the above description of the embodiment, the moving body is an automobile, but it may be a vehicle, a ship, a bicycle, a flying object, a person, an animal, a mobile terminal, or the like. Up to this point, the use of a receiver for the positioning radio waves of positioning satellites has been given as the example of positioning the moving body, but the positioning is not limited to such positioning equipment, and the position may instead be estimated by an overhead camera, a ranging device, or the like. The position information may also be position information predicted from a travel history, positioning by beacons, a ranging device such as LiDAR, or the result of analyzing images acquired by an imaging device. The moving body itself does not need to have means for acquiring position information.
 上述した一連の処理は、ハードウェアにより実行させることもできるし、ソフトウェアにより実行させることもできる。一連の処理をソフトウェアにより実行させる場合には、そのソフトウェアを構成するプログラムが、コンピュータ等にネットワーク及び記録媒体からインストールされる。このようなプログラムを含む記録媒体は、ユーザにプログラムを提供するために装置本体とは別に配布されるリムーバブルメディアにより構成されるだけでなく、装置本体に予め組み込まれた状態でユーザに提供される記録媒体等で構成される。リムーバブルメディアは、例えば、磁気ディスク(フロッピディスクを含む)、光ディスク、又は光磁気ディスク等により構成される。光ディスクは、例えば、CD-ROM(Compact Disk-Read Only Memory)、DVD(Digital Versatile Disk)、Blu-ray(登録商標) Disc(ブルーレイディスク)等により構成される。光磁気ディスクは、MD(Mini-Disk)等により構成される。又、装置本体に予め組み込まれた状態でユーザに提供される記録媒体は、例えば、プログラムが記録されているプログラムメモリ及びハードディスク等で構成される。 The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, the program constituting the software is installed on a computer or the like from a network or a recording medium. The recording medium containing such a program is constituted not only by removable media distributed separately from the device body in order to provide the program to the user, but also by a recording medium or the like provided to the user in a state of being pre-installed in the device body. The removable media are constituted by, for example, magnetic disks (including floppy disks), optical disks, or magneto-optical disks. The optical disks are constituted by, for example, CD-ROMs (Compact Disk-Read Only Memory), DVDs (Digital Versatile Disks), Blu-ray (registered trademark) Discs, and the like. The magneto-optical disks are constituted by MDs (Mini-Disks) and the like. The recording medium provided to the user in a state of being pre-installed in the device body is constituted by, for example, a program memory and a hard disk on which the program is recorded.
 以上説明した実施形態に係る情報処理装置10、プログラム及び測位方法によれば以下のような効果が奏される。 According to the information processing device 10, the program, and the positioning method according to the embodiment described above, the following effects are achieved.
 情報処理装置10は、撮像装置2の位置を示す第1位置測位情報と、撮像装置2の撮像範囲の中の対象地点の位置を示す第2位置測位情報と、に基づいて撮像装置と対象地点の位置関係を示す相対位置情報を取得する相対位置情報取得部と、撮像装置によって撮像される画像中の対象位置に対応する画像位置に対して相対位置情報を関連づける画像位置設定部と、を備える。 The information processing device 10 includes a relative position information acquisition unit that acquires relative position information indicating the positional relationship between the imaging device and a target point based on first positioning information indicating the position of the imaging device 2 and second positioning information indicating the position of the target point within the imaging range of the imaging device 2, and an image position setting unit that associates the relative position information with an image position corresponding to the target position in an image captured by the imaging device.
 自動車等の移動体3の位置を画像位置上に正確に特定することが出来る。 The position of the mobile object 3 such as a car can be accurately specified on the image position.
 情報処理装置10において、相対位置情報には、撮像装置2の位置から対象地点までの距離を示す距離情報と、撮像装置の撮像方向を示す角度情報と、が含まれる。 In the information processing device 10, the relative position information includes distance information indicating the distance from the position of the imaging device 2 to the target point, and angle information indicating the imaging direction of the imaging device.
 監視カメラのような撮像装置2が設置されている状況で、自動車等の移動体3の位置を撮像装置2からの距離情報と方向を示す角度情報とを利用して正確に特定できる。 In a situation where an imaging device 2 such as a surveillance camera is installed, the position of a moving body 3 such as an automobile can be accurately specified using distance information from the imaging device 2 and angle information indicating the direction.
 情報処理装置10の有する相対位置情報取得部106は、異なる地点の複数の対象地点毎に相対位置情報を取得し、画像位置設定部109は、複数の対象地点のそれぞれに対応する画像位置に対して対応する相対位置情報を関連づける。 The relative position information acquisition unit 106 of the information processing device 10 acquires relative position information for each of a plurality of target points at different locations, and the image position setting unit 109 associates the corresponding relative position information with the image position corresponding to each of the plurality of target points.
 複数の対象地点の相対位置情報が画像上で設定されることにより、画像における情報量が増える。 By setting the relative position information of multiple target points on the image, the amount of information in the image increases.
 情報処理装置10において、複数の第2の地点の各々に対応づけられた画像中の各相対位置情報が、他の第2の地点の相対位置情報に基づいて補正される。 In the information processing device 10, each piece of relative position information in the image associated with each of the plurality of second points is corrected based on the relative position information of other second points.
 画像中の各相対位置情報としてより正確な情報が得られる。  More accurate information can be obtained as each relative position information in the image.
 情報処理装置10は、撮像装置2の姿勢或いは/且つ位置に関する撮像装置姿勢位置情報を取得する撮像装置姿勢位置情報取得部107を備える。 The information processing apparatus 10 includes an imaging device orientation/position information acquisition unit 107 that acquires imaging device orientation/position information regarding the orientation and/or position of the imaging device 2 .
 撮像装置2の姿勢或いは/且つ位置を変化させてより画角の広い画像に対しての位置情報を取得することが出来る。 By changing the posture and/or position of the imaging device 2, it is possible to acquire position information for an image with a wider angle of view.
 情報処理装置は、撮像装置姿勢位置情報取得部107が取得した撮像装置姿勢位置情報に基づいて撮像装置2の姿勢或いは/且つ位置の変化を検出すると、当該変化に基づいて画像中の画像位置と相対位置情報との関係を補正する画像位置補正部108を備える。 The information processing device includes an image position correction unit 108 that, when a change in the attitude and/or position of the imaging device 2 is detected based on the imaging device attitude/position information acquired by the imaging device attitude/position information acquisition unit 107, corrects the relationship between the image positions in the image and the relative position information based on the change.
 画像中の相対位置情報の正確性が向上する。  The accuracy of the relative position information in the image is improved.
 情報処理装置は、第1位置測位情報より正確な撮像装置の絶対位置情報である参照位置情報を有し、前記参照位置情報と前記相対位置情報とに基づいて前記対象地点の現在位置情報を得る画像位置設定部109を備える。 The information processing device has reference position information, which is absolute position information of the imaging device that is more accurate than the first positioning information, and includes an image position setting unit 109 that obtains the current position information of the target point based on the reference position information and the relative position information.
 参照位置情報により撮像装置の正確な絶対位置がもたらされる。測位信号に基づく撮像装置2の測位情報と対象地点の移動体3の測位情報との間で誤差が相殺される。これにより対象地点の移動体3と撮像装置との間の正確な相対位置情報が得られる。参照位置情報と合わせて、対象地点である移動体3の正確な現在位置情報が得られる。 The reference position information provides an accurate absolute position of the imaging device. An error is canceled between the positioning information of the imaging device 2 based on the positioning signal and the positioning information of the moving body 3 at the target point. As a result, accurate relative position information between the moving body 3 at the target point and the imaging device can be obtained. Together with the reference position information, accurate current position information of the moving object 3, which is the target point, can be obtained.
 情報処理装置10において、第2位置測位情報は、対象地点を通過する移動体3の測位信号に基づく現在位置を示す測位情報である。 In the information processing device 10, the second positioning information is positioning information indicating the current position based on the positioning signal of the moving body 3 passing through the target point.
 移動体3と第1の地点との間の正確な相対位置情報が得られる。参照位置情報と合わせて、移動体3の正確な位置情報が得られる。 Accurate relative position information between the mobile object 3 and the first point can be obtained. Accurate position information of the moving body 3 is obtained together with the reference position information.
 情報処理装置10は、撮像装置によって撮像される画像中の対象位置を通過する移動体3を画像認識により特定する移動体特定部を備える。 The information processing device 10 includes a moving object identification unit that identifies, by image recognition, the moving object 3 passing through the target position in the image captured by the imaging device.
 移動体3が測位情報を取得するアンテナやシステムを有さない場合においても、移動体3の位置を同定することが出来る。そして、画像認識により移動体3を特定することが出来る。 Even if the mobile object 3 does not have an antenna or system for acquiring positioning information, the position of the mobile object 3 can be identified. Then, the moving body 3 can be specified by image recognition.
 情報処理装置10は、複数の第2位置測位情報と画像中の位置との対応付けから、画像中の任意の位置について位置情報を対応付ける補間補外処理部を備える。 The information processing device 10 includes an interpolation/extrapolation processing unit that associates position information with respect to an arbitrary position in the image from the correspondence between a plurality of pieces of second positioning information and positions in the image.
 画像中の任意の位置の相対位置情報が得られる。  The relative position information of any position in the image can be obtained.
 プログラムは、撮像装置の位置を示す第1位置測位情報と、撮像装置の撮像範囲の中の対象地点の位置を示す第2位置測位情報と、に基づいて撮像装置と対象地点の位置関係を示す相対位置情報を取得する相対位置情報取得機能と、撮像装置によって撮像される画像中の対象位置に対応する画像位置に対して相対位置情報を関連づける画像位置設定機能と、を備える。 The program provides a relative position information acquisition function that acquires relative position information indicating the positional relationship between the imaging device and a target point based on first positioning information indicating the position of the imaging device and second positioning information indicating the position of the target point within the imaging range of the imaging device, and an image position setting function that associates the relative position information with an image position corresponding to the target position in an image captured by the imaging device.
 自動車等の移動体3の位置を画像位置上に正確に特定することが出来る。 The position of the mobile object 3 such as a car can be accurately specified on the image position.
 位置特定方法は、撮像装置の位置を示す第1位置測位情報と、撮像装置の撮像範囲の中の対象地点の位置を示す第2位置測位情報と、に基づいて撮像装置と対象地点の位置関係を示す相対位置情報を取得する相対位置情報取得ステップと、撮像装置によって撮像される画像中の対象位置に対応する画像位置に対して相対位置情報を関連づける画像位置設定ステップと、を備える。 The position specifying method includes a relative position information acquisition step of acquiring relative position information indicating the positional relationship between the imaging device and a target point based on first positioning information indicating the position of the imaging device and second positioning information indicating the position of the target point within the imaging range of the imaging device, and an image position setting step of associating the relative position information with an image position corresponding to the target position in an image captured by the imaging device.
 自動車等の移動体3の位置を画像位置上に正確に特定することが出来る。 The position of the mobile object 3 such as a car can be accurately specified on the image position.
 以上、本発明の実施形態について説明したが、本発明は、上述の実施形態に制限されるものではなく、適宜変更が可能である。例えば、上述の各実施形態の一部の機能的構成を省略したり、別の機能的構成を組み合わせたりすることもできる。 Although the embodiments of the present invention have been described above, the present invention is not limited to the above-described embodiments, and can be modified as appropriate. For example, it is possible to omit some of the functional configurations of the respective embodiments described above, or to combine other functional configurations.
 1 サーバー、2 撮像装置、3 移動体、10 情報処理装置、101 画像情報管理部、103 参照位置情報取得部、104 第1位置測位情報取得部、105 第2位置測位情報取得部、106 相対位置情報取得部、107 撮像装置姿勢位置情報取得部、108 画像位置補正部、109 画像位置設定部 1 Server, 2 Imaging device, 3 Moving body, 10 Information processing device, 101 Image information management unit, 103 Reference position information acquisition unit, 104 First positioning information acquisition unit, 105 Second positioning information acquisition unit, 106 Relative position information acquisition unit 107 imaging apparatus attitude position information acquisition unit 108 image position correction unit 109 image position setting unit

Claims (12)

  1.  撮像装置の位置を示す第1位置測位情報と、前記撮像装置の撮像範囲の中の対象地点の位置を示す第2位置測位情報と、に基づいて前記撮像装置と前記対象地点の位置関係を示す相対位置情報を取得する相対位置情報取得部と、
     前記撮像装置によって撮像される画像中の前記対象地点に対応する画像位置に対して前記相対位置情報を関連づける画像位置設定部と、
    を備える情報処理装置。
    An information processing device comprising: a relative position information acquisition unit that acquires relative position information indicating a positional relationship between an imaging device and a target point, based on first positioning information indicating a position of the imaging device and second positioning information indicating a position of the target point within an imaging range of the imaging device; and
    an image position setting unit that associates the relative position information with an image position corresponding to the target point in an image captured by the imaging device.
  2.  前記相対位置情報には、前記撮像装置の位置から前記対象地点までの距離を示す距離情報と、前記撮像装置の撮像方向を示す角度情報と、が含まれる
     請求項1に記載の情報処理装置。
    The information processing apparatus according to claim 1, wherein the relative position information includes distance information indicating the distance from the position of the imaging device to the target point, and angle information indicating the imaging direction of the imaging device.
  3.  前記相対位置情報取得部は、異なる地点の複数の前記対象地点毎に前記相対位置情報を取得し、
     前記画像位置設定部は、複数の前記対象地点のそれぞれに対応する前記画像位置に対して対応する前記相対位置情報を関連づける
     請求項1又は請求項2に記載の情報処理装置。
    The information processing apparatus according to claim 1 or claim 2, wherein the relative position information acquisition unit acquires the relative position information for each of a plurality of the target points at different locations, and
    the image position setting unit associates the corresponding relative position information with the image position corresponding to each of the plurality of target points.
  4.  複数の第2の地点の各々に対応づけられた前記画像中の各前記相対位置情報が、他の第2の地点の前記相対位置情報に基づいて補正される
     請求項3に記載の情報処理装置。
    The information processing apparatus according to claim 3, wherein each of the relative position information in the image associated with each of the plurality of second points is corrected based on the relative position information of other second points. .
  5.  前記撮像装置の姿勢或いは/且つ位置に関する撮像装置姿勢位置情報を取得する撮像装置姿勢位置情報取得部を備える
     請求項1から4の何れか1項に記載の情報処理装置。
    5. The information processing apparatus according to any one of claims 1 to 4, further comprising an imaging device attitude/position information acquisition unit that acquires imaging device attitude/position information relating to the attitude and/or position of the imaging device.
  6.  前記撮像装置姿勢位置情報取得部が取得した前記撮像装置姿勢位置情報に基づいて前記撮像装置の姿勢或いは/且つ位置の変化を検出すると、当該変化に基づいて前記画像中の前記画像位置と前記相対位置情報関係を補正する画像位置補正部を備える
     請求項1から5の何れか1項に記載の情報処理装置。
    The information processing apparatus according to any one of claims 1 to 5, further comprising an image position correction unit that, when a change in the attitude and/or position of the imaging device is detected based on the imaging device attitude/position information acquired by the imaging device attitude/position information acquisition unit, corrects the relationship between the image position in the image and the relative position information based on the change.
  7.  前記第1位置測位情報より正確な撮像装置の絶対位置情報である参照位置情報を有し、前記参照位置情報と前記相対位置情報とに基づいて前記対象地点の現在位置情報を得る画像位置設定部を備える
     請求項1から6の何れか1項に記載の情報処理装置。
    The information processing apparatus according to any one of claims 1 to 6, comprising an image position setting unit that has reference position information, which is absolute position information of the imaging device more accurate than the first positioning information, and that obtains current position information of the target point based on the reference position information and the relative position information.
  8.  前記第2位置測位情報は、前記対象地点を通過する前記移動体の測位信号に基づく現在位置を示す測位情報である
     請求項1から7の何れか1項に記載の情報処理装置。
    The information processing apparatus according to any one of claims 1 to 7, wherein the second positioning information is positioning information indicating a current position based on a positioning signal of the moving body passing through the target point.
  9.  前記撮像装置によって撮像される前記画像中の前記対象地点を通過する移動体を画像認識により特定する移動体特定部を備える
     請求項1から8の何れか1項に記載の情報処理装置。
    The information processing apparatus according to any one of claims 1 to 8, further comprising a moving body identifying unit that identifies a moving body passing through the target point in the image captured by the imaging device by image recognition.
  10.  複数の第2位置測位情報と前記画像中の位置との対応付けから、前記画像中の任意の位置について位置情報を対応付ける補間補外処理部を備える
     請求項3から9の何れか1項に記載の情報処理装置。
    10. The information processing apparatus according to any one of claims 3 to 9, further comprising an interpolation/extrapolation processing unit that associates position information with an arbitrary position in the image, based on the association between the plurality of pieces of second positioning information and positions in the image.
  11.  撮像装置の位置を示す第1位置測位情報と、前記撮像装置の撮像範囲の中の対象地点の位置を示す第2位置測位情報と、に基づいて前記撮像装置と前記対象地点の位置関係を示す相対位置情報を取得する相対位置情報取得機能と、
     前記撮像装置によって撮像される画像中の前記対象地点に対応する画像位置に対して前記相対位置情報を関連づける画像位置設定機能と、
     を備えるプログラム。
    A program comprising: a relative position information acquisition function of acquiring relative position information indicating a positional relationship between an imaging device and a target point, based on first positioning information indicating a position of the imaging device and second positioning information indicating a position of the target point within an imaging range of the imaging device; and
    an image position setting function of associating the relative position information with an image position corresponding to the target point in an image captured by the imaging device.
  12.  撮像装置の位置を示す第1位置測位情報と、前記撮像装置の撮像範囲の中の対象地点の位置を示す第2位置測位情報と、に基づいて前記撮像装置と前記対象地点の位置関係を示す相対位置情報を取得する相対位置情報取得ステップと、
     前記撮像装置によって撮像される画像中の前記対象地点に対応する画像位置に対して前記相対位置情報を関連づける画像位置設定ステップと、
     を備える測位方法。
    A positioning method comprising: a relative position information acquisition step of acquiring relative position information indicating a positional relationship between an imaging device and a target point, based on first positioning information indicating a position of the imaging device and second positioning information indicating a position of the target point within an imaging range of the imaging device; and
    an image position setting step of associating the relative position information with an image position corresponding to the target point in an image captured by the imaging device.
PCT/JP2022/028823 2021-07-27 2022-07-26 Information processing device, program, and positioning method WO2023008444A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021122737 2021-07-27
JP2021-122737 2021-07-27

Publications (1)

Publication Number Publication Date
WO2023008444A1 true WO2023008444A1 (en) 2023-02-02

Family

ID=85087659

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/028823 WO2023008444A1 (en) 2021-07-27 2022-07-26 Information processing device, program, and positioning method

Country Status (1)

Country Link
WO (1) WO2023008444A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005227086A (en) * 2004-02-12 2005-08-25 Denso Corp Vehicle specific position computing device and vehicle direction specifying device
JP2009128356A (en) * 2007-11-26 2009-06-11 Korea Electronics Telecommun Car navigation system and method
JP2012227717A (en) * 2011-04-19 2012-11-15 Olympus Imaging Corp Display device, display program, and display method
JP2015109641A (en) * 2013-11-29 2015-06-11 アクシス アーベー System for following object marked by tag device with camera
CN109099889A (en) * 2018-07-10 2018-12-28 广州市中海达测绘仪器有限公司 Close range photogrammetric system and method
WO2020137312A1 (en) * 2018-12-28 2020-07-02 パナソニックIpマネジメント株式会社 Positioning device and mobile body
JP2021050969A (en) * 2019-09-24 2021-04-01 Kddi株式会社 Information terminal device, method, and program


Similar Documents

Publication Publication Date Title
US10408918B2 (en) Sky polarization and sun sensor system and method
EP3469306B1 (en) Geometric matching in visual navigation systems
KR101639029B1 (en) Sensor calibration and position estimation based on vanishing point determination
WO2019108579A2 (en) Radar aided visual inertial odometry initialization
US9329036B2 (en) Mobile device positioning based on independently obtained barometric pressure measurements
US11248908B2 (en) Precise altitude estimation for indoor positioning
US9528837B2 (en) Mobile device position uncertainty based on a measure of potential hindrance of an estimated trajectory
US10768267B2 (en) Position estimating apparatus, position estimating method, and terminal apparatus
JP6759175B2 (en) Information processing equipment and information processing system
TW201711011A (en) Positioning and directing data analysis system and method thereof
WO2020045100A1 (en) Positioning device and positioning method
JP2001264076A (en) Car navigation system
US9595109B1 (en) Digital camera with orientation sensor for optical tracking of objects
JP2006242731A (en) Positioning device and positioning method
JP2013068482A (en) Azimuth direction correction system, terminal device, server device, azimuth direction correction method and program
WO2023008444A1 (en) Information processing device, program, and positioning method
JP2021143861A (en) Information processor, information processing method, and information processing system
WO2015144055A1 (en) Locating method and device
WO2023007588A1 (en) Information processing device, program, and positioning method
KR20220167817A (en) Imu sensor convergence gnss signal processing apparatus
JP2012177681A (en) Mobile object positioning method, mobile object positioning device and mobile object positioning program by gps signal
JPH11295411A (en) Dgps position locating system
US20210240196A1 (en) Positioning apparatus, recording medium, and positioning method
JP2017182612A (en) Image arrangement method and image arrangement computer program
JP6929492B2 (en) Locator device and its accuracy evaluation system and positioning method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22849505

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE