WO2019234936A1 - Terminal mobile, système d'estimation de position de caméra, procédé d'estimation de position de caméra et panneau - Google Patents

Mobile terminal, camera position estimation system, camera position estimation method, and sign board (Terminal mobile, système d'estimation de position de caméra, procédé d'estimation de position de caméra et panneau)

Info

Publication number
WO2019234936A1
Authority
WO
WIPO (PCT)
Prior art keywords
sign
point
image
camera
points
Prior art date
Application number
PCT/JP2018/022120
Other languages
English (en)
Japanese (ja)
Inventor
橋本 康宣
小野 裕明
Original Assignee
マクセル株式会社 (Maxell, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by マクセル株式会社 (Maxell, Ltd.)
Priority to PCT/JP2018/022120 priority Critical patent/WO2019234936A1/fr
Priority to JP2020523971A priority patent/JP7041262B2/ja
Publication of WO2019234936A1 publication Critical patent/WO2019234936A1/fr
Priority to JP2022037091A priority patent/JP7413421B2/ja
Priority to JP2023220146A priority patent/JP2024026547A/ja

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments

Definitions

  • the present invention relates to position and orientation information estimation technology, and more particularly, to position and orientation information estimation technology in places where radio waves from artificial satellites and the like do not reach.
  • Patent Document 1 discloses "a plurality of position information display devices that display latitude/longitude information on a plurality of buildings, and a position information reading device that reads the latitude/longitude information described on the position information display devices; current position information is obtained by using the position information reading device to read the latitude/longitude information from the position information display device closest to the current position.
  • Azimuth information display devices are installed in the vicinity of where the latitude/longitude information is installed, and azimuth information is obtained by reading it from the azimuth information display device closest to the current position" (summary excerpt).
  • However, what is obtained by the technique disclosed in Patent Document 1 is only the position information of a position information display board installed in advance; position information of a place without a display board cannot be obtained. That is, the user's own current position information cannot be obtained. For example, in a large underground shopping mall or shopping center, errors are likely to occur when the user tries to grasp the current position and direction in order to search for a destination such as a store.
  • The object of the present invention, made in view of the above circumstances, is to provide a technique for accurately estimating a user's current position and direction with a simple configuration even in a place where radio waves from artificial satellites or the like do not reach.
  • In order to achieve the above object, the present invention is a mobile terminal comprising a camera and a processing unit that processes an image acquired by the camera. The processing unit includes an image acquisition unit that acquires, with the camera, a sign image that contains two sign points at different positions away from the mobile terminal and whose position information is known, and a calculation unit that analyzes the acquired sign image, calculates the plane orientation of a first triangle formed by the terminal position (the position of the mobile terminal) and the two sign points and the interior angles of the first triangle, and calculates the terminal position using the plane orientation and the interior angles.
  • The present invention is also a camera position estimation system that includes a camera and a processing device that processes an image acquired by the camera, and that estimates the camera position (the position of the camera) by analyzing the image.
  • The camera acquires a sign image that contains two sign points at different positions away from the camera and whose position information is known; the processing device analyzes the sign image acquired by the camera, calculates the plane orientation of a first triangle formed by the camera position and the two sign points and the interior angles of the first triangle, and calculates the camera position using the plane orientation and the interior angles.
  • The present invention also provides a camera position estimation method for estimating the camera position (the position of the camera) by analyzing an image in a system that includes a camera and a processing device that processes images acquired by the camera.
  • The camera acquires a sign image that contains two sign points at different positions away from the camera and whose position information is known; the acquired sign image is analyzed, the plane orientation of a first triangle formed by the camera position and the two sign points and the interior angles of the first triangle are calculated, and the camera position is calculated using the plane orientation and the interior angles.
  • (a) is a hardware configuration diagram of the mobile terminal of the first embodiment,
  • and (b) is a functional block diagram of the terminal position estimation unit of the mobile terminal of the first embodiment.
  • (a) is an explanatory diagram of the situation during terminal position estimation in the first embodiment,
  • and (b) is an explanatory diagram of the position calculation method by general triangulation.
  • (a)-(d) are explanatory diagrams of the method of calculating the angle formed between the mobile terminal and the sign surface in the first embodiment.
  • (a)-(c) are explanatory diagrams of the terminal position estimation method of the second embodiment.
  • (a) and (b) are explanatory diagrams of the terminal position estimation method of the second embodiment.
  • (a)-(c) are explanatory diagrams of the terminal position estimation method of the second embodiment.
  • (a) and (b) are explanatory diagrams of the terminal position estimation method of the second embodiment.
  • (a) and (b) are explanatory diagrams of displaying the three-dimensional position of a target point; a further figure is an explanatory diagram of an example of the sign point of the third embodiment.
  • (a)-(c) are explanatory diagrams of the terminal position estimation method of the third embodiment.
  • (a) and (b) are explanatory diagrams of the terminal position estimation method of the third embodiment.
  • (a)-(c) are explanatory diagrams of the terminal position estimation method of a modification of the third embodiment.
  • (a)-(d) are explanatory diagrams of examples of the positional relationship between the sign points and an additional sign point in a modification of the third embodiment.
  • (a) and (b) are explanatory diagrams of the terminal position estimation method of a modification of the third embodiment.
  • (a) and (b) are explanatory diagrams of the terminal position estimation method of a modification of the third embodiment; a further figure is an explanatory diagram of the terminal position estimation method of the modification of the third embodiment.
  • (a)-(c) are explanatory diagrams of the terminal position estimation method of a modification of the third embodiment; a further figure is an explanatory diagram of the terminal position estimation method of the modification of the third embodiment.
  • (a) and (b) are explanatory diagrams of the terminal position estimation method of a modification of the third embodiment.
  • (a) and (b) are explanatory diagrams of the terminal position estimation method of a modification of the third embodiment.
  • (a) and (b) are explanatory diagrams of the estimated position data of an application example of the third embodiment; a further figure is a flowchart of the terminal position estimation process of the application example of the third embodiment; and another is an explanatory diagram of the estimated position log.
  • a landmark whose position information is known and is located away from the camera is photographed by the camera, and the position of the camera is calculated (estimated) by image processing.
  • a landmark whose position information is known is a sign board attached or installed in advance on the surface of an object such as a pillar or a wall. It is assumed that position information of the sign board and direction information of the sign board surface are registered in the sign board.
  • The sign plate has a shape from which the angle formed between the sign surface and the line segment connecting the camera position and the center of the sign plate can be determined.
  • FIG. 1 is a diagram for explaining an outline of processing according to the present embodiment.
  • the owner 910 holds a portable information processing apparatus (hereinafter referred to as a portable terminal) 100 having a photographing function (camera) and photographs the sign board 200.
  • the mobile terminal 100 can be connected to the server 960 via a preset access point (AP) 970 and a network 940 or via a base station 950 of a mobile phone company.
  • the sign board 200 of this embodiment has, for example, a square shape with one side LS.
  • The sign board 200 has a position/orientation information display area 201 in which position information and azimuth information (hereinafter collectively referred to as position/orientation information) are displayed so that they can be photographed and read by analyzing the photographed image.
  • the position / orientation information may be described in a QR code (registered trademark).
  • the position / orientation information is described on the sign surface of the sign plate 200.
  • the position information is, for example, the latitude and longitude of the center point of the sign board 200.
  • the azimuth information is indicated by, for example, an angle formed by a horizontal arrow 202 (hereinafter referred to as a directional arrow 202) on the sign surface of the sign board 200 and a predetermined reference direction.
  • In the present embodiment, it is assumed that the angle formed by the direction of the azimuth arrow 202 and the north direction (the predetermined reference direction) is registered.
  • the owner 910 can visually check the description in the position / orientation information display area 201.
  • Alternatively, the sign board 200 may be photographed by the camera of the mobile terminal 100 and the description acquired by image processing.
  • the position / orientation information may not be clearly indicated on the sign board 200.
  • the URL of the information acquisition destination may be described in a QR code (registered trademark) or the like, and the position and orientation information of the sign board 200 may be acquired from a server or the like by photographing with a camera.
  • the mobile terminal 100 is an information processing apparatus having a shooting function and an information processing function.
  • a mobile phone for example, a mobile phone, a smartphone, a tablet terminal, a wearable terminal such as a watch or a head-mounted display, a feature phone, or other portable digital device.
  • an autonomous device such as a drone may be used.
  • FIG. 3A is a hardware configuration diagram of the mobile terminal 100 of the present embodiment.
  • The mobile terminal 100 includes a CPU (Central Processing Unit) 101, a storage device 110, a photographing device 120, a user interface (I/F) 130, a sensor device 140, a communication device 150, and an expansion I/F 160.
  • Each device is connected to the CPU 101 via the bus 102.
  • the CPU 101 is a microprocessor unit that controls the entire mobile terminal 100.
  • a bus 102 is a data communication path for performing data transmission / reception between the CPU 101 and each device of the portable terminal 100.
  • the storage device 110 includes a ROM (Read Only Memory) 111, a RAM (Random Access Memory) 112, and a storage 113.
  • the ROM 111 is a memory in which a basic operation program such as an operating system and other operation programs are stored.
  • a rewritable ROM such as an EEPROM (Electrically Erasable and Programmable Read Only Memory) or a flash ROM is used.
  • the storage 113 stores the operation program and operation setting value of the mobile terminal 100 and various programs and various data necessary for realizing each function of the present embodiment.
  • the storage 113 holds information stored in the portable terminal 100 even when external power is not supplied.
  • a device such as a flash ROM, SSD (Solid State Drive), or HDD (Hard Disk Drive) is used as the storage 113.
  • RAM 112 is a work area for executing basic operation programs and other operation programs.
  • the ROM 111 and the RAM 112 may be integrated with the CPU 101. Further, the ROM 111 may not use an independent configuration as shown in FIG. 3A but may use a partial storage area in the storage 113. That is, all or part of the functions of the ROM 111 may be replaced by a partial area of the storage 113.
  • each operation program stored in the ROM 111 or the storage 113 can be updated and function expanded by, for example, a download process from each distribution server on the network.
  • the photographing apparatus 120 includes a camera 121, an image processor 122, and an image memory 123.
  • The camera 121 acquires image data of the surroundings and of objects by converting light input through the lens into an electrical signal using an image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor.
  • the image processor 122 performs format conversion, menu and other OSD (On-Screen Display) signal superimposing processing on the image data acquired by the camera 121 as necessary.
  • the image processor 122 includes a video RAM (not shown), and drives a display 131 (to be described later) based on image data input to the video RAM.
  • The image processor 122 realizes the functions of, for example, a codec unit that compresses and decompresses captured images and video, an image quality improvement processing unit that improves the quality of images and video, and an image processing unit that recognizes information such as a QR code in a captured image and performs angle correction, rotation correction, and other corrections on that information.
  • the image memory 123 temporarily stores image data acquired by the camera 121 or image data processed by the image processor 122.
  • the user I / F 130 includes a display 131, for example.
  • the display 131 is a display device such as a liquid crystal panel, for example, and displays a processing result by each unit of the mobile terminal 100 as a display unit.
  • The display 131 may be provided with a touch panel function, in which an operation device functioning as a reception unit that receives input of operation instructions is stacked on the display.
  • the function of the operation device may be realized by a keyboard, a mouse, or the like connected via an expansion I / F 160 described later.
  • the sensor device 140 is a sensor group for detecting the state of the mobile terminal 100.
  • For example, a GPS (Global Positioning System) receiver 141, a three-axis gyro sensor 142, and a three-axis acceleration sensor 143 are provided.
  • A ranging sensor, a geomagnetic sensor, an illuminance sensor, a proximity sensor, a biological information sensor, an atmospheric pressure sensor, and the like may also be provided.
  • These sensors detect the position, tilt, direction, movement, etc. of the mobile terminal 100. If the current position is a position where GPS radio waves can be acquired, the position information is acquired by the GPS receiver 141.
  • the communication device 150 performs communication between the mobile terminal 100 and an external device.
  • For example, a LAN (Local Area Network) communication device 151, a telephone network communication device 152, and a short-range communication device 153 are provided.
  • the LAN communication device 151 is connected to the network via an access point (AP) device by wireless connection such as Wi-Fi (registered trademark), and transmits / receives data to / from other devices on the network.
  • The telephone network communication device 152 performs calls and data transmission/reception by wireless communication with a base station of the mobile telephone communication network.
  • the short-range communication device 153 transmits and receives data to and from other devices in the vicinity of the mobile terminal 100 by a wired connection means such as USB (Universal Serial Bus). In addition, data is transmitted and received by wireless communication with other devices including a short-range communication device.
  • The short-range communication device 153 is, for example, an I/F for near field communication (NFC) and may realize two-way communication at an extremely short distance of about several centimeters to ten centimeters between devices equipped with NFC chips. For example, it supports services using a non-contact IC chip such as electronic money mounted on the mobile terminal 100.
  • The short-range communication device 153 may also transmit and receive data by wireless communication with other devices equipped with a corresponding wireless communication device, for example Bluetooth (registered trademark). Such wireless communication realizes simple exchange of information using radio waves between information devices at distances of several meters to several tens of meters.
  • the LAN communication device 151, the telephone network communication device 152, and the short-range communication device 153 each include a coding circuit, a decoding circuit, an antenna, and the like.
  • the communication device 150 may further include a communication device that realizes infrared communication and other communication devices.
  • the extension I / F 160 is an interface group for extending the functions of the mobile terminal 100.
  • a video / audio I / F, an operation device I / F, a memory I / F, and the like are provided.
  • the video / audio I / F performs input of video signals / audio signals from external video / audio output devices, output of video signals / audio signals to external video / audio input devices, and the like.
  • External operation devices such as a keyboard are connected via an operation device I / F.
  • the memory I / F transmits and receives data by connecting a memory card and other memory media.
  • the configuration example of the mobile terminal 100 illustrated in FIG. 3A focuses on the configuration essential to the present embodiment.
  • the mobile terminal 100 may be further added with a configuration (not shown) such as a digital broadcast receiving function and an electronic money settlement function in addition to these configurations.
  • The mobile terminal 100 includes, as the terminal position estimation unit 170, an image acquisition unit 171, a calculation unit 172, a display control unit 173, and a data storage unit 174.
  • the image acquisition unit 171 controls the photographing device 120 to acquire an image.
  • the acquired image is further subjected to processing such as image compression / decompression processing and image improvement.
  • In the present embodiment, images of the sign boards 200 are acquired; more specifically, images of two different sign boards 200 are acquired.
  • the calculation unit 172 analyzes the image acquired by the image acquisition unit 171 and calculates the current position of the mobile terminal 100.
  • The calculation unit 172 analyzes the contents of the sign board 200 using various correction functions and character recognition techniques, specifies the direction and angle of the sign board 200 from the photographed shape of the sign board 200, and converts them into data.
  • the display control unit 173 controls display on the display 131.
  • the data storage unit 174 stores data necessary for processing, a processing halfway and processing result, and generated data.
  • a QR code analysis unit 175 may be further provided.
  • the QR code analysis unit 175 analyzes the content of the QR code photographed by the photographing device 120.
  • the analysis result may be displayed on the display 131 by the display control unit 173.
  • Each part of the terminal position estimation unit 170 except the data storage unit 174 is implemented in software; for example, a part of it may be implemented in hardware for speeding up.
  • the data storage unit 174 is constructed in the storage device 110.
  • terminal position estimation processing by the terminal position estimation unit 170 of this embodiment will be described.
  • A method in which the calculation unit 172 analyzes the information of the two different sign boards 200 acquired by the image acquisition unit 171 and calculates the current position of the owner 910 (terminal position estimation) will now be described.
  • the current position is calculated as the position of the mobile terminal 100 held by the owner 910.
  • the calculated current position is referred to as a terminal position.
  • the case where the owner 910 uses the camera 121 of the portable terminal 100 held by the owner 910 to photograph the sign plates 211 and 212 will be described as an example.
  • the portable terminal 100 analyzes the photographing result and calculates the terminal position.
  • the center position of the sign board 211 is point A
  • the center position of the sign board 212 is point B
  • the terminal position is point C.
  • the points A, B, and C are on the same horizontal plane.
  • If they are not, an error occurs in the estimated value of the terminal position.
  • this plane is referred to as x, y plane
  • the coordinates of point A are (xa, ya)
  • the coordinates of point B are (xb, yb)
  • the coordinates of point C are (xc, yc).
  • In the present embodiment, the angle α1 formed between point C and the sign surface of the sign board 211 and the angle β1 formed between point C and the sign surface of the sign board 212 are measured to obtain the coordinates of point C. Note that the coordinates of points A and B are actually expressed in latitude and longitude, but are converted into coordinates on the x-y plane described above for the calculation.
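  • As a concrete illustration of the latitude/longitude conversion mentioned above, the sketch below uses a simple local equirectangular approximation around a reference point; the approximation, the constant, and the function name are illustrative assumptions and are not taken from the patent.

        import math

        EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius, used only for a rough local scale

        def latlon_to_xy(lat_deg, lon_deg, ref_lat_deg, ref_lon_deg):
            """Convert latitude/longitude to local x (east) / y (north) coordinates in metres.

            Equirectangular approximation around the reference point; adequate for the
            short baselines between nearby sign boards assumed in this illustration.
            """
            lat, lon = math.radians(lat_deg), math.radians(lon_deg)
            ref_lat, ref_lon = math.radians(ref_lat_deg), math.radians(ref_lon_deg)
            x = EARTH_RADIUS_M * (lon - ref_lon) * math.cos(ref_lat)  # east
            y = EARTH_RADIUS_M * (lat - ref_lat)                      # north
            return x, y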
  • the terminal position C is calculated by triangulation.
  • Triangulation is a survey method using trigonometry and geometry that calculates the position of a point to be measured by measuring the angle from a known point at both ends of a certain baseline to the point to be measured.
  • The method for calculating the position of the point to be measured by triangulation will be briefly described.
  • In a triangle formed by a point C whose position is to be obtained and two points (points A and B) whose positions are known, the position of point C is calculated by measuring the angle formed by the line segment CA and the line segment AB and the angle formed by the line segment BC and the line segment AB.
  • Consider a coordinate system in which the direction connecting point A and point B is the x-axis direction.
  • Let L be the length of the line segment AB and d the distance between point C and the line segment AB.
  • Let α be the angle formed by the line segment CA and the line segment AB, and β the angle formed by the line segment BC and the line segment AB; these are the both-end angles.
  • The coordinates xc and yc of point C are then expressed using L, α, and β.
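  • As a sketch of this triangulation step, the function below computes point C from the known points A and B and the both-end angles α (at A) and β (at B), using d = L·tanα·tanβ / (tanα + tanβ) and x = d / tanα in a frame with A at the origin and B on the x-axis; the function name and the choice of the solution on the left of the directed line A to B are illustrative assumptions, not details from the patent.

        import math

        def triangulate(ax, ay, bx, by, alpha, beta):
            """Estimate point C from points A, B and the both-end angles (radians).

            alpha is the angle CAB at A, beta the angle CBA at B. Returns the
            solution on the left of the directed line A -> B; the mirror solution
            is obtained by negating the perpendicular offset d.
            """
            dx, dy = bx - ax, by - ay
            L = math.hypot(dx, dy)            # baseline length, cf. equation (5)
            gamma = math.atan2(dy, dx)        # baseline direction, cf. equation (6)

            # Local frame: A at the origin, B on the +x axis.
            d = L * math.tan(alpha) * math.tan(beta) / (math.tan(alpha) + math.tan(beta))
            x_local = d / math.tan(alpha)

            # Rotate the local solution (x_local, d) back into the original x-y plane.
            cx = ax + x_local * math.cos(gamma) - d * math.sin(gamma)
            cy = ay + x_local * math.sin(gamma) + d * math.cos(gamma)
            return cx, cy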
  • the image acquisition unit 171 acquires the images (label images) 221 and 222 of the two different sign plates 211 and 212, respectively.
  • the calculation unit 172 analyzes the sign images 221 and 222 acquired by the image acquisition unit 171 and calculates the position information and both end angle information of the sign plates 211 and 212.
  • The both-end angle information consists of the angles at the end points of the side connecting the sign plates 211 and 212 in the triangle whose vertices are the center point of the sign plate 211, the center point of the sign plate 212, and the terminal position, that is, the angles corresponding to α and β described above.
  • In the following, the angle formed between the intersection line of the sign surface of the sign board 200 with the horizontal plane and the line connecting the mobile terminal 100 to the center point of each sign surface is simply referred to as the angle formed between the sign surface and the mobile terminal 100.
  • the position information of the sign plates 211 and 212 is obtained by reading the position and orientation information from the sign images 221 and 222.
  • When the position/orientation information is described in letters and numbers, it is obtained by decoding with an OCR (Optical Character Recognition) function. When the position/orientation information is described in a QR code or the like, it is acquired by reading the QR code.
  • The angles α1 and β1 are calculated using the sizes of the sign images 221 and 222 and the azimuth information in the position/orientation information. Specifically, the ratio between the length of the vertical side and the length of the horizontal side of the sign image 221 and the azimuth information are used.
  • the sign board 200 is a square having a side length of LS0.
  • As shown in FIG. 5B, on the horizontal plane including the sign board 200 and the mobile terminal 100, when the angle formed between the sign surface of the sign board 200 and the mobile terminal 100 is α1, the ratio of the horizontal side LS2 to the vertical side LS1 of the sign image 221 captured by the mobile terminal 100 is sin α1 (LS2 / LS1 = sin α1).
  • When the sign board is viewed from the front, α1 = 90° and sin(90°) = 1.
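  • A minimal sketch of this aspect-ratio measurement, assuming the square sign model described above (the clamping of the ratio is only a numerical safeguard and not part of the patent):

        import math

        def angle_from_aspect_ratio(vertical_px, horizontal_px):
            """Angle between the sign surface and the line of sight, in radians.

            For a square sign viewed obliquely in the horizontal plane, the image's
            horizontal side shrinks to sin(alpha1) times its vertical side
            (LS2 / LS1 = sin(alpha1)); a frontal view gives alpha1 = 90 degrees.
            """
            ratio = min(max(horizontal_px / vertical_px, 0.0), 1.0)  # clamp measurement noise
            return math.asin(ratio)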
  • In the present embodiment, the angle between the mobile terminal 100 and the sign surface of the sign board 211 is denoted α1,
  • and the angle between the mobile terminal 100 and the sign surface of the sign plate 212 is denoted β1.
  • It is assumed that α2 is registered on the sign plate 211 as the direction of its azimuth arrow (the angle formed by the arrow on the sign surface with respect to north), and that β2 is registered on the sign plate 212 as the direction of its azimuth arrow.
  • The length L and the direction γ2 of the base line connecting the two sign boards are calculated from the latitude and longitude information of the sign board 211 and the sign board 212, that is, by the following formulas.
  • L = {(xa - xb)^2 + (ya - yb)^2}^(1/2)  (5)
  • tan(γ2) = (yb - ya) / (xb - xa)  (6)
  • FIG. 7 is a process flow of the terminal position estimation process of the present embodiment. This process is started by an instruction from the owner 910. For example, when the mobile terminal 100 has a plurality of modes, the process may be started when an instruction to shift to the mode for executing the terminal position estimation process of the present embodiment is received.
  • the owner 910 confirms that two different marker plates 211 and 212 are in a position where they can be photographed, and starts this processing.
  • the image acquisition unit 171 acquires the sign image 221 of the first sign plate 211 according to the instruction from the owner 910 (step S1101).
  • the calculation unit 172 analyzes the sign image 221 and obtains the position and orientation information of the first sign plate 211 (step S1102).
  • an area corresponding to the position / orientation information display area 201 on the sign image 221 is subjected to image processing, and the position information of the center position of the first sign board 211 and the direction of the direction arrow 202 are acquired.
  • The calculation unit 172 analyzes the sign image 221 and calculates the angle α1 formed between the sign surface and the mobile terminal 100 (step S1103).
  • the image acquisition unit 171 acquires the sign image 222 of the second sign plate 212 in accordance with the instruction from the owner 910 (step S1104).
  • the calculation unit 172 analyzes the sign image 222 and acquires the position / orientation information of the second sign plate 212 (step S1105).
  • the area corresponding to the position / orientation information display area 201 on the sign image 222 is subjected to image processing, and the position information of the center position of the second sign plate 212 and the direction of the direction arrow 202 are acquired.
  • The calculation unit 172 analyzes the sign image 222 and calculates the angle β1 formed between the sign surface and the mobile terminal 100 (step S1106).
  • the calculation unit 172 calculates the terminal position, which is the current position of the mobile terminal 100, using the above method, using the position and orientation information and angle of the first sign board and the position and orientation information and angle of the second sign board. (Step S1107).
  • the calculation unit 172 stores the calculation result in the data storage unit 174, and the display control unit 173 displays the calculation result on the display 131 (step S1108) and ends the process.
  • As described above, in the present embodiment, a sign image containing two sign points at different positions away from the mobile terminal 100, the position information of each sign point being known, is acquired; the acquired sign image is analyzed; the plane orientation of the first triangle formed by the terminal position C (the position of the mobile terminal 100) and the two sign points and the interior angles of the first triangle are calculated; and the terminal position C is calculated using the plane orientation and the interior angles.
  • In the present embodiment, two different sign boards 200 whose center points are on the same horizontal plane as the terminal position C are photographed as the sign images, and the center point of each sign board 200 is used as one of the two sign points.
  • Each sign board 200 has a position/orientation information display area 201 from which the position information of its center point and the direction of its sign surface with respect to the reference direction can be read by analyzing the sign image.
  • Each sign board 200 has a shape such that the angle formed between the normal of the sign surface and the line segment connecting the camera 121 and the center point can be measured from the distortion of the sign image.
  • The calculation unit calculates the both-end angles of the first triangle from the sign images and the position/orientation information, and calculates the terminal position C from them.
  • the portable terminal 100 can calculate its own position only by photographing the two sign boards 200. Therefore, according to the present embodiment, an accurate position can be acquired even when the GPS function cannot be used, such as in an underground shopping center or a large-scale shopping center.
  • QR code that can access information such as advertisements may be described on the sign board 200 in addition to the position information.
  • the mobile terminal 100 can also read this QR code when acquiring an image to estimate position information. Thereby, information that can be acquired based on the QR code can also be displayed.
  • The capacity of a general QR code is a maximum of 7,089 numeric characters or a maximum of 1,817 kanji/kana characters, which provides a sufficient amount of information. Therefore, with this configuration, the terminal position can be estimated and, at the same time, the names and positions (longitude/latitude/height) of stores in the underground shopping mall or large shopping center can be shown on a detailed map, and information on products, events, and the like can be provided. Nearby sign board position information may also be described.
  • The sign board 200 only needs to have an angle measurement configuration such that the angle formed between the normal of the sign surface and the line connecting the center of the sign board 200 to the position of the mobile terminal 100 (including the camera 121) that photographed the sign board 200 can be measured from the distortion of the photographed image of the sign board 200.
  • the shape of the sign plate 200 may be any shape that can calculate the angle between the camera position and the sign surface using the photographed sign image.
  • it may be a rectangle with a known ratio of the long side to the short side, a circle, or an ellipse with a known ratio of the major axis to the minor axis.
  • FIG. 8A shows an example where the sign plate 200a is circular.
  • Let f be the diameter of the circle.
  • When the sign board 200a is viewed obliquely on the same horizontal plane, its vertical (height) length in the image is f, but its horizontal length is shorter than f. This aspect ratio varies depending on the angle formed between the sign board 200a and the mobile terminal 100.
  • The calculation unit 172 analyzes the acquired sign image and calculates the angle α1 according to the relationship described above.
  • the shape of the sign plate 200 itself is not limited.
  • Four reference points 213 (points where marks indicated by crosses are present) may be provided on the sign surface; the ratio between the vertical distance and the horizontal distance between the reference points 213 is assumed to be known.
  • In this case, the shape of the sign plate 200 itself need not have the above characteristics;
  • it is sufficient to provide a mark having such a shape on the sign surface of the sign board 200.
  • The mark is, for example, a rectangular frame or a circular frame.
  • <Modification 2 of the first embodiment> When calculating the terminal position, a distance measuring sensor or the focal length calculation function provided in the camera 121 may be used. The calculation method in this case will be described with reference to FIGS. When the focal length calculation function of the camera 121 is used, the method is limited to the range in which the camera 121 can calculate a focal length.
  • the calculation principle will be explained.
  • The description will be made assuming that the position of point C is calculated when the positions of points A and B, the length (distance) LB of the side AC, and the length (distance) LA of the side BC are known.
  • the distance LB and the distance LA are acquired by the distance measuring sensor or the focal length calculation function of the camera 121.
  • FIG. 9B shows a case where the surfaces of the two sign plates 211 and 212 are not in a straight line.
  • In each case, the angles α1 and β1 formed with the respective sign surfaces are calculated by the method of the above embodiment.
  • The interior angles α and β are then calculated by the above formulas (8) and (10). Thereby, the position information of point C is calculated.
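  • Under the assumptions of this modification (positions of points A and B known, distances from the terminal to each sign measured), the terminal position can also be recovered by intersecting two circles, as the sketch below illustrates; the helper name and the flag selecting one of the two mirror solutions are illustrative, not from the patent.

        import math

        def position_from_distances(ax, ay, bx, by, dist_to_a, dist_to_b, left_of_ab=True):
            """Terminal position C from two known sign positions and measured distances.

            Intersects the circle of radius dist_to_a around A with the circle of
            radius dist_to_b around B and returns one of the two mirror solutions.
            """
            dx, dy = bx - ax, by - ay
            L = math.hypot(dx, dy)
            # Signed distance from A, along AB, to the foot of the perpendicular from C.
            a = (dist_to_a ** 2 - dist_to_b ** 2 + L ** 2) / (2 * L)
            h_sq = dist_to_a ** 2 - a ** 2
            if h_sq < 0:
                raise ValueError("measured distances are inconsistent with the baseline")
            h = math.sqrt(h_sq)
            ux, uy = dx / L, dy / L              # unit vector A -> B
            px, py = ax + a * ux, ay + a * uy    # foot of the perpendicular from C
            sign = 1.0 if left_of_ab else -1.0
            return px - sign * h * uy, py + sign * h * ux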
  • Only one sign board 200 may be photographed to calculate the terminal position.
  • a calculation method in this case will be described with reference to FIG.
  • Let α1 be the angle between the sign surface and the mobile terminal 100 obtained from the captured image, and α2 the registered azimuth direction.
  • the coordinates (xc, yc) of the point C are expressed by the following equations.
  • xc = xa + LB·cos α3  (15)
  • yc = ya - LB·sin α3  (16)
  • the sign board 200 is described as a plate-like thing created individually.
  • the sign board 200 is not limited to this, and may be printed on a part of an advertisement in a store, for example. Thereby, it is possible to renew the sign board 200 every time the advertisement is changed and to insert new information into the QR code.
  • the position where the sign board 200 is installed does not always have to be the same place, and the place may be moved. Each time it moves, position and orientation information after movement is given.
  • In the above description, the location of each sign board 200 is assumed not to be known to the user in advance.
  • the present invention is not limited to this.
  • the place where each sign board 200 is present may be displayed in the vicinity of the escalator or in a busy place.
  • the information may be registered on the website.
  • it may be installed in places where customers are relatively easy to see and check, such as the entrance of each store and each product corner.
  • In the above description, the calculated terminal position is displayed on the display 131, but the handling of the calculation result by the calculation unit 172 is not limited to this. For example, the result may be output to another application that requires the current position of the mobile terminal 100.
  • Hereinafter, a function executed by such an application is referred to as a navigation unit 176.
  • the store information of each store is registered in advance and stored in the server 960 or the like.
  • the store information includes a store name, a store description, and store location information.
  • the position information is registered in latitude and longitude.
  • the navigation unit 176 accesses the server 960 in which the store information is registered via the network 940, and acquires the store information. Then, it is displayed on the display 131 in a predetermined display form.
  • the navigation unit 176 causes the display 131 to display a store on the map according to its position information. Further, the current position of the mobile terminal 100 is also displayed. For example, the mobile terminal 100 stores the map information in advance in the data storage unit 174 or acquires the map information from the server 960.
  • the current position of the owner 910 of the mobile terminal 100 is the east corner of the area A21. Further, it is assumed that the store selected by the owner is in the area A13.
  • the navigation unit 176 displays the route on this map.
  • navigation information from the current position to the desired store may be displayed on the display 131 in text.
  • information such as “the target store is located one block north and two blocks east” may be displayed.
  • each store may be provided with detailed product information, and when entering the store, the location of the product may be displayed.
  • the merchandise information includes a merchandise name, a merchandise description, and an arrangement position of the merchandise. Then, a list of product names may be displayed, and when the user selects a desired product, navigation to the product may be performed similarly in the store.
  • the navigation unit 176 acquires display information from the server 960 via the access point 970 and the network 940 via the communication device 150, for example.
  • the position information that is the basis for calculating the position information of the mobile terminal 100 is not limited to latitude and longitude.
  • the coordinate value of an original coordinate system such as each shopping center may be used.
  • the position information of the product is also registered with coordinate values in the same coordinate system.
  • the QR code for each store may be shown on the information center of the shopping center, and the mobile terminal 100 may read the QR code to perform navigation to the store.
  • QR code of the handling product may be shown at the entrance of each store and the navigation may be performed in the same manner.
  • the position of the owner 910 changes every moment. However, it is troublesome to always measure the above-mentioned position. Therefore, regarding the movement between them, for example, the movement distance and direction may be detected by the triaxial acceleration sensor 143, the triaxial gyro sensor 142, and the clock function, and the corrected value may be indicated in the map information.
  • In that case, the position of a nearby sign board 200 may be indicated to prompt the owner 910 to perform position measurement, or the mobile terminal 100 may automatically perform position measurement. For example, when the movement distance from the previous position measurement point has become large, or when the terminal appears to be moving over a location that is not a passage on the map, it is determined that the position estimation error of the internal sensors has increased.
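  • One plausible way to realize this correction between sign measurements is simple pedestrian dead reckoning from the acceleration sensor and gyro sensor outputs; the sketch below only illustrates that idea, and the step-event interface and the drift threshold are assumptions rather than details from the patent.

        import math

        def dead_reckon(x, y, heading, step_events):
            """Propagate the last estimated position between sign measurements.

            step_events: iterable of (step_length_m, heading_change_rad) pairs derived
            from step detection (acceleration sensor) and the gyro sensor.
            """
            for step_length, heading_change in step_events:
                heading += heading_change
                x += step_length * math.cos(heading)
                y += step_length * math.sin(heading)
            return x, y, heading

        def should_remeasure(distance_since_fix_m, drift_per_metre=0.05, threshold_m=2.0):
            """Prompt a new sign measurement once the assumed drift bound grows too large."""
            return distance_since_fix_m * drift_per_metre > threshold_m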
  • In the first embodiment, the angles formed with the line segment connecting the center points of the two sign boards 200 are determined from the ratio of the distortion of the sign shape between the horizontal direction and the direction orthogonal thereto in the sign image,
  • and the terminal position is calculated by calculating the both-end angles.
  • In the present embodiment, a circle is drawn on the sign board, and the information necessary for terminal position estimation is acquired from the degree of distortion of the circular shape in the captured image.
  • In the present embodiment, the sign board and the mobile terminal 100 need not be on the same horizontal plane. Moreover, a single sign board from which an image is acquired is sufficient.
  • In the present embodiment, the position information is a three-dimensional quantity including height. Two line segments that are not on the same straight line and whose positions are known are considered, together with at least two triangles formed from each line segment and the terminal position; by calculating the angles and side lengths of those triangles, the terminal position is estimated three-dimensionally. Since the two triangles share the terminal position, the plane orientation of each triangle is determined by calculating the respective triangle shapes, and the terminal position can be calculated.
  • Prior to the description of the terminal position calculation method, the sign board 230 of the present embodiment will first be described. As shown in FIG. 12, the sign board 230 of this embodiment includes a circular sign circle 231, an azimuth line 232, and a position/orientation information display area 233.
  • In the position/orientation information display area 233, the information (position/orientation information) necessary for calculating the terminal position information by photographing the sign board 230 is recorded.
  • position information of the center of the marker circle 231, the diameter of the marker circle 231, the azimuth line direction, and the surface normal direction are described.
  • the azimuth line direction is a three-dimensional direction of the azimuth line 232, and is indicated by a unit vector component, for example.
  • the surface normal direction is the three-dimensional direction of the normal of the sign surface of the sign plate 230, and is indicated by, for example, a component of the surface normal vector.
  • an arrow, a triangle, or the like may be attached to the azimuth line 232 so that the orientation thereof can be easily grasped.
  • the hardware configuration and functional blocks of the mobile terminal 100 of the present embodiment have basically the same configuration as that of the first embodiment. However, the calculation method by the calculation unit 172 is different.
  • In the following, the distance in real space between a point I and a point J is denoted L_IJ.
  • an angle formed by the line segment IJ and the line segment JK at points I, J, and K that are not in a straight line is referred to as an angle IJK.
  • This angle IJK is also referred to as the expected angle at which the line segment IK is viewed from the point J, or the expected angle at which the span between the points I and K is viewed from the point J.
  • The vector product is represented as [V_A, V_B] and the scalar product as (V_A, V_B).
  • the point used for calculating the terminal position on the sign circle 231 of the sign board 230 is called a sign point.
  • the coordinates of the sign point can be calculated from the position / orientation information described on the sign plate 230.
  • The expected angle at which the span between two sign points is viewed from the terminal position can be calculated geometrically from the focal length of the camera 121 and the length on the image sensor between the sign points, obtained from an image that has the two sign points in its field of view.
  • the focal length is specified from the positional relationship between the lens of the camera 121 and the image sensor.
  • In the present embodiment, the distance between two sign points is obtained from the sign point coordinate values, the expected angle at which that span is viewed from the terminal position is measured, the unknown side lengths and interior angles of the triangle are thereby determined,
  • and the terminal position is calculated from these values.
  • the output of the distance measuring sensor may be used.
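  • A minimal sketch of this geometric calculation, assuming a simple pinhole model with the principal point at the sensor centre; the function name and the unit handling are illustrative assumptions.

        import math

        def expected_angle(p1, p2, focal_length):
            """Angle subtended at the camera by two image points, in radians.

            p1, p2: (x, y) positions on the image sensor relative to the principal
            point, in the same units as focal_length (e.g. millimetres or pixels).
            """
            # Back-project each image point to a viewing ray through the projection centre.
            ray1 = (p1[0], p1[1], focal_length)
            ray2 = (p2[0], p2[1], focal_length)
            dot = sum(a * b for a, b in zip(ray1, ray2))
            norm1 = math.sqrt(sum(a * a for a in ray1))
            norm2 = math.sqrt(sum(a * a for a in ray2))
            return math.acos(max(-1.0, min(1.0, dot / (norm1 * norm2))))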
  • FIG. 13A shows a state where the owner 910 located away from the place where the single sign board 230 is installed, such as a pillar, is shooting using the mobile terminal 100.
  • the position (terminal position) of the mobile terminal 100 is X (point X).
  • the owner 910 photographs a single sign board 230 from an oblique position (a direction different from the normal direction of the sign face) with respect to the face of the sign plate 230.
  • the sign circle 231 of the sign plate 230 is displayed in an elliptical shape as an image of the sign circle 231 (mark circle image 231i) shown in FIG.
  • The mobile terminal 100 and the sign board 230 may be at an angle to each other in the height direction. Accordingly, the calculation is performed on the assumption that the major axis 234i of the ellipse of the sign circle image 231i may not coincide with the azimuth line (azimuth line image 232i) on the image.
  • FIG. 13 (c) shows only the sign circle image 231i displayed on the display 131 of FIG. 13 (b). It should be noted that in the following explanatory diagrams, the images other than the images corresponding to the marker circle 231 and the azimuth line 232 are omitted in order to simplify the drawing.
  • the center point of the sign circle image 231i is Oi.
  • The points Ai and Bi are the intersections of the azimuth line image 232i and the sign circle image 231i, respectively;
  • the points Ci and Di are the intersections of the major axis 234i and the ellipse of the sign circle image 231i, respectively;
  • and the points Ei and Fi are the intersections of the minor axis and the ellipse of the sign circle image 231i, respectively.
  • calculation unit 172 obtains each point Ai to point Fi on the image by analyzing the photographed sign circle image 231i, and calculates the length of the major axis and the minor axis on the image.
  • FIG. 14A is a diagram obtained by expanding the image in the minor axis direction, using the ratio of the major axis length to the minor axis length on the image, and converting it so as to overlap the sign circle 231.
  • In FIG. 14A, the points Ai, Bi, Ci, Di, Ei, and Fi converted onto the actual sign circle 231 and azimuth line 232 by this image expansion are denoted A, B, C, D, E, and F, respectively.
  • The angle AOC is denoted λ_AC.
  • The length of each of the line segments AB, CD, and EF is known from the diameter of the sign circle 231, and since the line segment AB is the azimuth line 232, its three-dimensional direction is known. Further, the surface normal direction of the sign board 230 is also known. Using these, the three-dimensional direction of the line segment CD, the three-dimensional direction of the minor axis EF, and the spatial coordinates of the axis end points (C, D, E, F) used as sign points are obtained. The center point O of the sign circle 231 is also used as a sign point; the spatial coordinates of O are known from the position/orientation information.
  • a triangle XOC having the terminal position X, the point C, and the point O as vertices is considered (first triangle; may be a triangle XOD). Since the line segment CD (first diameter) is in the major axis direction on the image in the sign circle image 231i, the terminal position X exists at any position 236 orthogonal to the line segment CD.
  • the line segment CD is orthogonal to a straight line (XO) connecting the point X as the terminal position and the center point O of the marker circle 231. Therefore, the triangle XOC is a right triangle as shown in FIG.
  • The distance L_OC between the point O and the point C is the radius R (half the diameter) of the sign circle 231.
  • The diameter is acquired by analyzing the region corresponding to the position/orientation information display area 233 on the captured image.
  • Let θ_C be the angle CXO; θ_C is measured from the image captured by the camera 121.
  • Since the triangle XOC is right-angled at O, tan θ_C = L_OC / L_OX = R / L_OX, so the distance L_OX between the point X and the point O can be calculated by the following equation.
  • L_OX = R · cot θ_C  (19)
  • a triangle (second triangle) formed by two sign points that are not on the same straight line as the line segment CD and the terminal position X is considered.
  • the line segment EF is adopted as the second diameter
  • the triangle XOE and the triangle XOF shown in FIG. 15B are considered.
  • Let θ_E be the angle EXO and θ_F the angle FXO,
  • λ_E the angle EOX and λ_F the angle FOX,
  • and μ_E the angle OEX and μ_F the angle OFX.
  • L_OE and L_OF are equal to the radius R of the sign circle 231 and are therefore known. Further, the above θ_E and θ_F can be calculated geometrically from the focal length of the camera 121 and the length on the image sensor corresponding to EF. The focal length is specified from the positional relationship between the lens of the camera 121 and the image sensor. Therefore, θ_E and θ_F can be calculated using these.
  • cos λ_E = (L_XF·cos θ_F - L_XE·cos θ_E) / {(L_XE·sin θ_E + L_XF·sin θ_F)^2 + (L_XF·cos θ_F - L_XE·cos θ_E)^2}^(1/2)  (30)
  • sin λ_E is calculated as follows.
  • sin λ_E = (L_XE·sin θ_E + L_XF·sin θ_F) / {(L_XE·sin θ_E + L_XF·sin θ_F)^2 + (L_XF·cos θ_F - L_XE·cos θ_E)^2}^(1/2)  (31)
  • Since O is the midpoint of EF, L_XE·sin θ_E = L_XF·sin θ_F holds, and cos λ_E and sin λ_E reduce to the following.
  • cos λ_E = (cot θ_F - cot θ_E) / {4 + (cot θ_F - cot θ_E)^2}^(1/2)  (35)
  • sin λ_E = 2 / {4 + (cot θ_F - cot θ_E)^2}^(1/2)  (36)
  • L_OX is then obtained by the following equation.
  • L_OX = R·(cot θ_E + cot θ_F) / {4 + (cot θ_F - cot θ_E)^2}^(1/2)  (37)
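  • The sketch below simply evaluates equations (35) to (37) for the measured expected angles θ_E and θ_F and the circle radius R read from the sign; the function name is illustrative.

        import math

        def solve_second_triangle(theta_e, theta_f, radius):
            """Evaluate equations (35)-(37); returns (lambda_e, L_OX)."""
            diff = 1.0 / math.tan(theta_f) - 1.0 / math.tan(theta_e)   # cot(theta_F) - cot(theta_E)
            denom = math.sqrt(4.0 + diff * diff)
            lambda_e = math.atan2(2.0, diff)                           # combines (35) and (36)
            l_ox = radius * (1.0 / math.tan(theta_e) + 1.0 / math.tan(theta_f)) / denom  # (37)
            return lambda_e, l_ox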
  • The terminal position X is specified as the position, among the positions 236 at the distance L_OX from the line segment CD, at which the angle EOX is λ_E.
  • the terminal position X is calculated.
  • With respect to λ_AC, the central angle AOC of the arc AC shown in FIG. 16A is taken as positive in the counterclockwise direction from point A to point C.
  • FIG. 16B shows the direction of each unit vector to be described later.
  • Let V_A be the unit vector in the direction from the point O to the point A in real space,
  • V_S the surface normal vector (unit vector) of the sign board 230 in real space,
  • and V_T the vector obtained by rotating V_A by 90 degrees clockwise within the sign surface about the center O.
  • Since V_T is obtained as the vector product of V_A and V_S, it can be calculated by the following equation using the position/orientation information.
  • V_T = [V_A, V_S]  (38)
  • The angle formed by the vector V_T and the line segment OE is equal to λ_AC.
  • A unit vector in the direction from the point O toward the point X is defined as V_X.
  • V_X is orthogonal to the major axis CD. For this reason, the projection of V_X onto the sign surface lies on the line segment EF. Further, since the angle between V_X and the line segment OE is λ_E, V_X is calculated by the following equation.
  • V_X = cos λ_E · cos λ_AC · V_T + cos λ_E · sin λ_AC · V_A + sin λ_E · V_S  (39)
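  • Putting these quantities together, the sketch below computes V_T from the vector product (38), V_X from (39), and then the terminal position as X = O + L_OX · V_X, which is what the surrounding description implies; numpy and the variable names are illustrative choices, not from the patent.

        import numpy as np

        def terminal_position(center_o, v_a, v_s, lambda_ac, lambda_e, l_ox):
            """Terminal position X from the sign-circle quantities defined above.

            center_o: 3-D position of the circle centre O; v_a, v_s: unit vectors of
            the azimuth line and the surface normal; lambda_ac, lambda_e: angles as
            defined above, in radians; l_ox: distance O-X from equation (37) or (19).
            """
            v_a = np.asarray(v_a, dtype=float)
            v_s = np.asarray(v_s, dtype=float)
            v_t = np.cross(v_a, v_s)                                   # equation (38)
            v_x = (np.cos(lambda_e) * np.cos(lambda_ac) * v_t
                   + np.cos(lambda_e) * np.sin(lambda_ac) * v_a
                   + np.sin(lambda_e) * v_s)                           # equation (39)
            return np.asarray(center_o, dtype=float) + l_ox * v_x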
  • As described above, in the present embodiment, the sign circle 231 described on the sign board 230, whose center point position information and surface orientation information are known, is photographed; by analyzing the captured image, a predetermined diameter passing through the center of the sign circle 231 and the two points at which that diameter intersects the circumference of the sign circle 231 are obtained, and the both-end angles connecting the mobile terminal 100 with those two points are obtained, whereby the terminal position is calculated.
  • The sign board 230 has a structure from which the position information of these two points and the orientation of the surface containing them can be calculated.
  • the terminal position can be calculated by photographing the sign board 230 as in the first embodiment. Therefore, according to the present embodiment, as in the first embodiment, even when the GPS function cannot be used, the current position and orientation of the owner 910 of the mobile terminal 100 can be calculated with a simple configuration and with high accuracy. .
  • the terminal position X is obtained as a three-dimensional position including the height.
  • the direction from the terminal position X to the sign board 230 is obtained as a direction vector in real space. Therefore, it can be seen in which direction the casing of the mobile terminal 100 is oriented in the real space. Thereby, since precise calibration including the three-dimensional direction of the internal coordinate system of the portable terminal 100 can be performed, position guidance including the height is also possible. It is also possible to display an image indicating the three-dimensional position of the target point by superimposing it on the actual image.
  • the product position may be superimposed on the image of the shelf and displayed as an AR (Augmented Reality) image.
  • An example is shown in FIG.
  • a live-action image 301 of a product shelf photographed by the mobile terminal 100 is displayed, and an AR image 300 indicating the location of the target product is superimposed and displayed.
  • Detailed location information including the height of the product may be acquired from a store via the Internet.
  • the part in the store that the owner 910 is viewing may be estimated from the direction in which the mobile terminal 100 is facing and the video may be displayed.
  • For example, the navigation screen may display the position of the owner 910 on the plan view as shown in FIG. 11A until the owner arrives in the vicinity of the target product,
  • and when the owner arrives in the vicinity of the target product, the mode may be automatically switched to a mode that displays the actual image of the product shelf together with the target product image.
  • At that time, an instruction as to which direction the mobile terminal 100 should be pointed may be displayed (FIG. 17(b)). Whether or not the owner is near the target product may be determined by whether the condition that the product is visible and the owner 910 is within a predetermined distance from the product is satisfied.
  • the position guidance target is not limited to the product, but may indicate a route such as a door on the way to reach the target location.
•   As described above, in the present embodiment, the current position and direction can be calculated only by photographing one sign board 230. Even if the portable terminal 100 is not on the same horizontal plane as the sign board 230, the terminal position can be calculated. Compared to the first embodiment, the terminal position can be calculated with a simpler configuration. In addition, since the degree of freedom of installation of the sign board 230 is increased (for example, a ceiling or a floor may also be used), the owner 910 of the mobile terminal 100 has a wider range from which the sign board 230 can be seen, and the possibility of calculating the terminal position is increased.
  • the position of the mobile terminal 100 (terminal position) is calculated using a plurality of marker points, each of which has a known three-dimensional position, without using the shape of the marker.
  • the configuration of the mobile terminal 100 of the present embodiment is basically the same as that of the first embodiment.
  • a description will be given focusing on the configuration different from the first embodiment.
  • the notation of distance, angle, etc. is the same as in the second embodiment.
  • the shape of the sign is not used, but the three-dimensional position information of the sign point is used.
  • the terminal position is calculated using four marker points.
  • the four marker points are on the same plane.
•   An example of the sign points of this embodiment is shown in FIG. As shown in the figure, in this embodiment, four marker points 241 are arranged on one marker plate 240. In the present embodiment, it is assumed that the four points are arranged at the vertices of a virtual square on the sign board 240. That is, three or more points are not arranged on the same straight line.
  • each marker point 241 is referred to as point A, point B, point C, and point D, respectively.
  • the intersection point 242 of the diagonal line is set as a point O.
  • the point O may be displayed on the sign board 240 or may be processed as internal data for image processing.
  • the sign board 240 of this embodiment includes a position information display area 243.
  • the position coordinates of each marker point 241 are described.
  • the position coordinates are, for example, the latitude / longitude height of each marker point 241.
  • the position coordinates of the intersection 242 (point O) can be calculated from the position coordinates of each marker point 241, but may be described in the position information display area 243.
•   The image acquisition unit 171 acquires, as an image, the four marker points 241 and the position information of each marker point 241.
•   The calculation unit 172 analyzes the image acquired by the image acquisition unit 171 and calculates the position information of the terminal position X of the portable terminal 100 as the current position.
•   The four marker points 241 are assumed to be in a positional relationship in which the line segment connecting point A and point C and the line segment connecting point B and point D intersect at point O.
•   Here, as shown in FIG. 19A and FIG. 19B, the line segments formed by the sign points 241 are considered in two groups: AO and OC, and BO and OD. Line segments belonging to different groups are not on the same straight line. The angles and side lengths of the triangles formed by these line segments and the terminal position X are calculated.
•   The distances L_OA, L_OC, L_OB, and L_OD between the point O and the respective marker points 241 can be calculated using the position information.
•   The angle AXO (η_A), the angle CXO (η_C), the angle BXO (η_B), and the angle DXO (η_D) can be calculated geometrically from the captured image, using the focal length and the corresponding lengths on the image sensor, as described with reference to FIG.
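•   As a concrete illustration of the geometric calculation mentioned above, the following minimal sketch (written for this description, not part of the patent text; an ideal pinhole camera without lens distortion is assumed) computes the angle subtended at the camera by two points from their positions on the image sensor and the focal length:

```python
import numpy as np

def expected_angle(p1, p2, focal_length):
    """Angle (degrees) subtended at the camera centre by two image points.

    p1, p2       : (x, y) positions on the image sensor, measured from the
                   image centre, in the same units as focal_length
    focal_length : camera focal length
    Assumes an ideal pinhole camera with the optical axis through the
    image centre and no lens distortion.
    """
    d1 = np.array([p1[0], p1[1], focal_length], dtype=float)
    d2 = np.array([p2[0], p2[1], focal_length], dtype=float)
    cos_angle = d1 @ d2 / (np.linalg.norm(d1) * np.linalg.norm(d2))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

# e.g. two sign points imaged 2 mm apart on the sensor with f = 4 mm
print(expected_angle((-1.0, 0.0), (1.0, 0.0), 4.0))  # about 28.1 degrees
```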
•   The angle XAO is represented as ξ_A, the angle XCO as ξ_C, the angle XOA as ζ_A, the angle XOC as ζ_C, the angle XBO as ξ_B, the angle XDO as ξ_D, the angle XOB as ζ_B, and the angle XOD as ζ_D.
•   ζ_A, ζ_B, and L_OX can be calculated by the following equations, in the same way as in the modification of the second embodiment described with reference to FIG.:
•   cos ζ_A = (L_OC cot η_C − L_OA cot η_A) / {(L_OA + L_OC)^2 + (L_OC cot η_C − L_OA cot η_A)^2}^(1/2) ⋯ (41)
•   sin ζ_A = (L_OA + L_OC) / {(L_OA + L_OC)^2 + (L_OC cot η_C − L_OA cot η_A)^2}^(1/2) ⋯ (42)
•   cos ζ_B = (L_OD cot η_D − L_OB cot η_B) / {(L_OB + L_OD)^2 + (L_OD cot η_D − L_OB cot η_B)^2}^(1/2) ⋯ (43)
•   sin ζ_B = (L_OB + L_OD) / {(L_OB + L_OD)^2 + (L_OD cot η_D − L_OB cot η_B)^2}^(1/2) ⋯ (44)
•   Note that the distance L_OX can be calculated using only the triangle AXC or only the triangle BXD, as described above. As in the modification of the second embodiment, either value may be used.
•   Alternatively, the calculation unit 172 may calculate the expected angle (η_A + η_C) subtending the line segment AC and the expected angle (η_B + η_D) subtending the line segment BD as seen from the terminal position X, compare the two, and obtain the distance L_OX using the larger one. Or the average value of both may be used.
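•   The following sketch (illustrative code written for this description) applies equations (41) and (42) to the triangle AXC and then obtains L_OX by the law of sines in triangle AXO; the law-of-sines step is an assumption here, since the corresponding numbered formula is not reproduced in this excerpt:

```python
import numpy as np

def solve_triangle_axc(l_oa, l_oc, eta_a, eta_c):
    """Solve triangle AXC for the angle XOA (zeta_A) and the distance OX.

    l_oa, l_oc   : distances from O to the sign points A and C (point O lies
                   between A and C on the base line), from the position info
    eta_a, eta_c : viewing angles AXO and CXO measured from the image (radians)
    Equations (41)-(42) give zeta_A; the distance OX then follows from the
    law of sines in triangle AXO (this last step is an assumption, as the
    numbered formula for L_OX is not reproduced in this excerpt).
    """
    num = l_oc / np.tan(eta_c) - l_oa / np.tan(eta_a)
    den = np.hypot(l_oa + l_oc, num)
    cos_zeta_a = num / den                  # equation (41)
    sin_zeta_a = (l_oa + l_oc) / den        # equation (42)
    zeta_a = np.arctan2(sin_zeta_a, cos_zeta_a)
    # law of sines: OA / sin(AXO) = OX / sin(XAO), with XAO = pi - AXO - XOA
    l_ox = l_oa * np.sin(eta_a + zeta_a) / np.sin(eta_a)
    return zeta_a, l_ox

# numerical check: O=(0,0), A=(-1,0), C=(2,0), X=(0.5,2)
# gives zeta_A of about 104 degrees and L_OX of about 2.06
print(solve_triangle_axc(1.0, 2.0, np.radians(22.83), np.radians(50.91)))
```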
  • the terminal position X is obtained.
•   Let V_A be the unit vector in the direction from the intersection point O toward the point A, and V_B the unit vector in the direction from the intersection point O toward the point B. Both can be calculated from the position coordinates of the sign points.
•   Let V_S be the unit vector perpendicular to the sign plane at the intersection O, facing the front side of the sign surface, and let ψ_BA be the angle BOA measured counterclockwise from the point B toward the point A.
  • V S is obtained by the following equation.
•   V_S = [V_B, V_A] / sin ψ_BA ⋯ (47)
•   Let V_X be the unit vector in the direction from the intersection O toward the terminal position X, and let λ_X (≤ 90 degrees) be the angle that V_X forms with the sign surface of the sign board 240 (that is, 90 degrees minus the angle between V_X and V_S).
•   The relationship among V_X, V_B, and V_A is expressed by the following equation.
•   [V_X, [V_B, V_A]] = [V_X, V_S] sin ψ_BA ⋯ (49)
•   The left side and the right side are respectively expanded, with V_T = [V_X, V_S].
•   V_T is parallel to the sign surface, V_T and V_H are orthogonal, and when V_T is rotated 90 degrees counterclockwise it becomes equal to V_H. Therefore, V_H can be obtained from V_X and V_S.
•   When the position coordinates of the intersection point O are P_O and the position coordinates of the terminal position X are P_X, P_X is obtained as follows:
•   P_X = P_O + L_OX sin λ_X V_S + L_OX cos λ_X V_H
        = P_O + (L_OX / sin^2 ψ_BA) { sgn(sin ψ_BA)(sin^2 ψ_BA − cos^2 ζ_A − cos^2 ζ_B + 2 cos ζ_A cos ζ_B cos ψ_BA)^(1/2) [V_B, V_A] + (cos ζ_A − cos ζ_B cos ψ_BA) V_A + (cos ζ_B − cos ζ_A cos ψ_BA) V_B } ⋯ (55)
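•   A minimal sketch of equation (55) (illustrative code, assuming the terminal lies on the side toward which the cross product [V_B, V_A] points):

```python
import numpy as np

def terminal_position(p_o, v_a, v_b, zeta_a, zeta_b, l_ox):
    """Terminal position P_X according to equation (55).

    p_o            : position coordinates of the intersection point O
    v_a, v_b       : unit vectors from O toward the sign points A and B
    zeta_a, zeta_b : angles XOA and XOB in radians
    l_ox           : distance from O to the terminal position X
    The terminal is assumed to lie on the side toward which the cross
    product [V_B, V_A] points (this fixes the sign of the normal term).
    """
    p_o, v_a, v_b = (np.asarray(v, dtype=float) for v in (p_o, v_a, v_b))
    cross = np.cross(v_b, v_a)              # [V_B, V_A]
    cos_psi = float(v_a @ v_b)
    sin_psi = np.linalg.norm(cross)         # psi_BA taken in (0, pi)
    a, b = np.cos(zeta_a), np.cos(zeta_b)
    radicand = sin_psi**2 - a*a - b*b + 2*a*b*cos_psi
    normal = np.sqrt(max(radicand, 0.0))    # out-of-plane component
    u = (normal * cross + (a - b*cos_psi) * v_a + (b - a*cos_psi) * v_b) / sin_psi**2
    return p_o + l_ox * u                   # P_X = P_O + L_OX * u

# check: V_A along x, V_B along y, angles XOA = XOB = 60 degrees, L_OX = 2
print(terminal_position((0, 0, 0), (1, 0, 0), (0, 1, 0),
                        np.radians(60), np.radians(60), 2.0))
# -> about (1.0, 1.0, -1.414); the z sign follows [V_B, V_A]
```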
•   As described above, in the present embodiment, the four marker points 241 on the same plane are photographed, and the terminal position of the mobile terminal 100 is calculated by image analysis. For example, by image analysis, the angles (angle XAC and angle XCA) formed between the line segment connecting two of the four marker points 241 (point A and point C) and the line segments connecting the terminal position (point X) to each of the two marker points A and C, as well as the orientation of the plane including the points X, A, and C, are calculated.
•   The portable terminal 100 of this embodiment estimates the terminal position using these.
  • the terminal position can be calculated by photographing four different marker points. Therefore, according to the present embodiment, the current position and orientation of the owner 910 of the portable terminal 100 can be calculated with a simple configuration with high accuracy even when the GPS function cannot be used, as in the above embodiments.
  • the current position and direction can be calculated only by photographing one sign board 240 having four sign points 241.
  • the terminal position X is obtained as a three-dimensional position including the height.
  • the direction from the terminal position X to the sign board 240 is obtained as a direction vector in real space. Therefore, it can be seen in which direction the casing of the mobile terminal 100 is oriented in the real space. Thereby, position guidance including the height is also possible.
•   In addition, the current position can be calculated even when the mobile terminal 100 is not on the same horizontal plane as the sign board 240. For this reason, the degree of freedom in attaching the sign board 240 increases, and the opportunities for the owner 910 of the mobile terminal 100 to calculate the terminal position also increase.
•   Note that the position information does not have to be arranged on the sign board 240. It suffices that it is on the same virtual plane, is photographed at the same time as the four marker points 241, and is described at a position where the information can be acquired. When the sign points 241 are arranged at positions distant from one another, the position coordinates may be described for each sign point individually.
•   In such a case, all the sign points 241 may not fit in one image.
•   In this case, the expected angle between two sign points is measured from the terminal position using the output of the three-axis gyro sensor 142.
•   For example, suppose that the sign point A is photographed in the image a and the sign point B is photographed in the image b.
•   The expected angles between the image center and the sign point A in image a, and between the image center and the sign point B in image b, can be calculated geometrically from the focal length of the camera 121 and the corresponding lengths on the image sensor.
  • the spatial orientation in the internal coordinate system in the center direction of each image is calculated from the output of the three-axis gyro sensor 142. Accordingly, the orientation of the sign points A and B in the spatial orientation of the internal coordinate system can be calculated, and the prospective angle for predicting the distance between the two sign points A and B from the terminal position can be calculated.
  • the difference in the central direction between the image a and the image b only needs to be known relatively, so the orientation of the internal coordinate system calculated from the output of the three-axis gyro sensor 142 is It may be deviated from the actual orientation of the space.
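•   A minimal sketch of this measurement (assuming the terminal itself does not move between the two exposures, so that only the relative rotation obtained from the gyro output matters):

```python
import numpy as np

def angle_across_images(p_a, p_b, focal_length, rot_b_to_a):
    """Expected angle (degrees) between sign point A seen in image a and
    sign point B seen in image b, when the two points do not fit in one
    picture.

    p_a, p_b   : (x, y) sensor coordinates of the points in their images
    rot_b_to_a : 3x3 rotation matrix taking directions expressed in the
                 camera frame of image b into the camera frame of image a,
                 e.g. integrated from the 3-axis gyro sensor 142
    The camera centre is assumed to stay at the terminal position for both
    exposures; only the relative orientation between the two images is used.
    """
    d_a = np.array([p_a[0], p_a[1], focal_length], dtype=float)
    d_b = np.asarray(rot_b_to_a, dtype=float) @ np.array(
        [p_b[0], p_b[1], focal_length], dtype=float)
    cos_angle = d_a @ d_b / (np.linalg.norm(d_a) * np.linalg.norm(d_b))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
```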
•   As the position information, a position information acquisition destination such as a URL may be described on the sign board 240 (for example, as a QR code), and the position coordinates of each sign point 241 may be acquired from the management server via it.
  • the position coordinates may not be the latitude and longitude height.
  • the position coordinate based on the coordinate system peculiar to the building and the region where the sign board 240 is installed may be used.
•   For example, when the sign board 240 is arranged in a shopping center, the coordinates may be associated with a floor map of the shopping center.
  • each marker point 241 may not be a square vertex. It is sufficient that at least one point on the sign plate 240 is not on a straight line connecting the other two or more sign points 241.
  • the point A, the point O, and the point C are on a straight line, and only the point B is not on the line segment AC.
  • the coordinates of each marker point 241 are described in the position information display area 243.
  • the configuration other than the arrangement of the marker points 241 is the same as that of the present embodiment.
•   FIG. 21B shows the triangle AXC connecting the terminal position X, the point A, and the point C. The angle AXO is represented as η_A, the angle CXO as η_C, the angle XAO as ξ_A, the angle XCO as ξ_C, the angle XOA as ζ_A, and the angle XOC as ζ_C.
•   In this arrangement, the triangle AXC has the same configuration as the triangle AXC in FIG. Therefore, L_OX and the corresponding angles are obtained by the same calculation as described above.
•   FIG. 21C shows the triangle BXO connecting the terminal position X, the point O, and the point B. The angle BXO is represented as η_B, the angle XBO as ξ_B, and the angle XOB as ζ_B.
•   In this way, the terminal position X of the mobile terminal 100 can be calculated by measuring, from the terminal position X, the expected angles subtending the intervals between the marker points.
  • the same effects as in the above embodiment can be obtained.
•   <Modification 3 of the third embodiment> (When the four sign points are not on the same plane) Further, the four marker points 241 may not be on the same plane. As long as all the four sign points 241 are not arranged on a straight line, the terminal position X can be calculated. Hereinafter, a calculation method of the terminal position X by the calculation unit 172 in this case will be described.
  • FIG. 22A is a view of each sign point 241 viewed from the direction of the terminal position X.
  • FIGS. 22B, 22C, and 22D are diagrams in which each marker point 241 is viewed from the normal direction of the plane determined by the terminal position X and the line segment CD.
•   Two points, one on each of the two straight lines, that are seen in the same direction (or in opposite directions) when viewed from the mobile terminal 100 (referred to as apparent intersections) are treated as new additional sign points O and U (hereinafter simply referred to as the points O and U).
  • the “opposite direction” is a case where the mobile terminal 100 is sandwiched between two straight lines as shown in FIG.
  • Point O is a point on line AB
  • point U is a point on line CD.
  • the point O is closer to the terminal than the point U.
  • the point O and the point U are not necessarily between the point A and the point B and between the point C and the point D.
•   An example of the relationship between the marking points A, B, U and the terminal position X of the mobile terminal 100 on this plane is shown in FIG.
  • the intersection of the line segment AB and the line segment UX is O.
•   The angles at the vertices are represented as shown: the angle UAO as η_A, the angle UBO as η_B, the angle AUO as ζ_A, the angle OUB as ζ_B, the angle XAO as ξ_A, the angle XBO as ξ_B, the angle AOX as θ_A, the angle BOX as θ_B, the angle AXO as φ_A, and the angle BXO as φ_B.
  • the terminal position X is obtained.
  • the sign point U is on the straight line CD. Therefore, when the terminal position X is calculated using each point on the straight line CD as the marker point U, the candidate point of the terminal position X is placed on a predetermined curve (first curve) in the space.
•   Similarly, when the terminal position X is calculated using each point on the straight line AB as the sign point O, a curve (second curve) different from the first curve, on which the candidate points of the terminal position X are placed in the space, is determined.
  • the terminal position X is an intersection of the first curve and the second curve.
•   L_AO = L_AW − L_UW cot θ_A ⋯ (73)
•   L_OX = (L_AW − L_UW cot θ_A)(cot φ_A + cot θ_A) sin θ_A ⋯ (74)
•   L_CU = L_CQ + L_OQ cot ζ_C ⋯ (75)
•   L_UX = (L_CQ + L_OQ cot ζ_C)(cot φ_C + cot ζ_C) sin ζ_C ⋯ (76)
•   Here, W and Q correspond to the feet of the perpendiculars from the point U to the straight line AB and from the point O to the straight line CD, respectively, and ζ_C and φ_C denote the angles CUO and CXO, defined analogously to ζ_A and φ_A.
  • the point U When the point U is determined, the point W is determined, and the coordinates of the terminal position X are determined by using the equations (71), (73), and (74). On the other hand, when the point O is determined, the point Q is determined, and the coordinates of the terminal position X are determined using the equations (72), (75), and (76). The point U and the point O are searched so that the terminal position X calculated in both becomes the same coordinate.
•   As described above, by measuring the expected angle, seen from the terminal position X, between a sign point 241 and the intersection point, the angle between the straight line connecting sign points 241 lying in different directions and the straight line connecting a sign point 241 to the terminal position X, as well as the distance from the sign point 241 to the terminal position X, can be obtained, and the coordinate values of the terminal position X in the external coordinate system can thus be obtained.
  • the intersection point is one of apparent intersection points (point O, point U) viewed from the terminal position X of two straight lines connecting the sign points 241. Further, since the direction in the external coordinate system from the terminal position X toward the two or more sign points 241 is known, the orientation of the internal coordinate system of the mobile terminal 100 in the external coordinate system can also be grasped.
•   The measurement of the expected angle subtending the interval between two sign points 241 as seen from the terminal position X can be performed by analyzing the photograph when the two sign points 241 fit within one picture. In that case, even without a sensor for estimating the orientation of the mobile terminal 100 in space (hereinafter referred to as an orientation sensor), such as the three-axis gyro sensor 142, the terminal position X and the orientation of the terminal (its direction in the external coordinate system) can be estimated. Therefore, it is desirable that the camera 121 can capture all solid angles.
•   The marker points 241 are not limited to four points. As long as not all of the points are on one straight line, three or more points may be used. If the coordinates of three or more marker points 241 are known, the orientation of the surface including the marker points 241 can be calculated. As will be described later, two or more points may be used as long as the orientation of the internal coordinate system is correct.
  • FIG. 25 shows an example of three marker points A, B, and C, a projection plane 410, and a projection plane 420.
•   Each marker point 241 is projected onto the projection plane 410 and the projection plane 420, respectively.
  • projection points of each point onto the projection plane 410 are points A ′, B ′, and C ′, respectively.
  • the projection points on the projection plane 420 are points A ′′, B ′′, and C ′′, respectively.
  • the terminal position X is also projected onto each projection plane, and the projection points are X ′ and X ′′, respectively.
•   From the expected angles at which the sign points are seen from the terminal position X in the real space, the expected angles at which the projected points A′, B′, and C′ are seen from the projected point X′ are calculated. Since the projection direction is known in the internal coordinate system, the angles in the projection plane 410 can also be calculated.
  • FIGS. 26 (a) to 26 (c) show diagrams of the marker points projected on the projection plane 410 and the terminal position X.
•   Similarly, from the measured values of the expected angles subtending the intervals between the marker points 241 as seen from the terminal position X, the terminal position X′ in the projection plane 410 is obtained.
•   Let W′ be the foot of the perpendicular drawn from the sign point C′ to the straight line A′B′.
•   The signs are taken, for example, so that L_C′W′ > 0 and the corresponding angles are positive when C′ is on one side of the straight line A′B′, L_C′W′ < 0 and the angles are negative when C′ is on the opposite side, and L_C′W′ = 0 when C′ is on the straight line A′B′.
•   cot θ_A′ (L_A′B′ − L_C′W′ (cot φ_A′ + cot φ_B′)) = L_B′W′ cot φ_B′ − L_A′W′ cot φ_A′ ⋯ (77)
•   L_A′O′ = L_A′W′ − L_C′W′ cot θ_A′ ⋯ (78)
•   L_O′X′ = (L_A′W′ − L_C′W′ cot θ_A′)(cot φ_A′ + cot θ_A′) sin θ_A′ ⋯ (79)
  • the point W ′ and the point O ′ coincide.
•   Note that the terminal position X′ cannot be obtained when the marker points A′, B′, C′ and the terminal position X′ are aligned on the same straight line or lie on the same circumference in the projection plane 410. In this case, the projection plane is reset or moved and the measurement is performed again.
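•   The in-plane calculation of equations (77) to (79) can be sketched as follows (illustrative code; angles are in radians, the sign conventions above are assumed, and the degenerate arrangements just mentioned are not handled):

```python
import numpy as np

def projected_resection(l_ab, l_aw, l_bw, l_cw, phi_a, phi_b):
    """Resection in the projection plane 410 following equations (77)-(79).

    l_ab         : distance between the projected points A' and B'
    l_aw, l_bw   : distances from A' and B' to W', the foot of the
                   perpendicular from C' to the straight line A'B'
    l_cw         : signed distance C'W' (positive on the agreed side)
    phi_a, phi_b : expected angles A'X'O' and B'X'O' measured at the
                   projected terminal position X' (radians)
    Returns cot(theta_A'), the distance A'O' and the distance O'X'.
    """
    cot_phi_a, cot_phi_b = 1.0 / np.tan(phi_a), 1.0 / np.tan(phi_b)
    # equation (77): solve for cot(theta_A')
    cot_theta_a = ((l_bw * cot_phi_b - l_aw * cot_phi_a)
                   / (l_ab - l_cw * (cot_phi_a + cot_phi_b)))
    theta_a = np.arctan2(1.0, cot_theta_a)
    l_ao = l_aw - l_cw * cot_theta_a                            # equation (78)
    l_ox = l_ao * (cot_phi_a + cot_theta_a) * np.sin(theta_a)   # equation (79)
    return cot_theta_a, l_ao, l_ox

# check with A'=(0,0), B'=(4,0), C'=(2,1), X'=(1,-2):
# expected angles are about 45.0 and 37.9 degrees, giving A'O' = 5/3, O'X' = 2.11
print(projected_resection(4.0, 2.0, 2.0, 1.0, np.radians(45.0), np.radians(37.87)))
```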
•   In this way, the terminal positions X′ and X″ projected onto the projection plane 410 and the projection plane 420 are obtained.
•   The terminal position X in the real space can then be determined as the intersection of the straight lines extending in the projection directions from the terminal positions X′ and X″.
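•   This intersection step can be written, for example, as the midpoint of the shortest segment between the two projection lines; because of measurement error the lines rarely meet exactly, so this least-squares style compromise is an assumption made for illustration:

```python
import numpy as np

def intersect_projection_lines(x1, d1, x2, d2):
    """Terminal position X from the two projected solutions.

    x1, x2 : the projected terminal positions X' and X'' (3-D points on the
             projection planes 410 and 420)
    d1, d2 : unit vectors of the respective projection directions
    Returns the midpoint of the shortest segment between the two lines
    x1 + t1*d1 and x2 + t2*d2.
    """
    x1, d1, x2, d2 = (np.asarray(v, dtype=float) for v in (x1, d1, x2, d2))
    # solve for parameters minimising |x1 + t1*d1 - (x2 + t2*d2)|
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(x2 - x1) @ d1, (x2 - x1) @ d2])
    t1, t2 = np.linalg.solve(a, b)          # closest points on each line
    return (x1 + t1 * d1 + x2 + t2 * d2) / 2
```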
•   Since the direction in which the mobile terminal 100 faces can be estimated from this measurement, the direction of the internal coordinate system can be corrected.
  • the terminal position X may be obtained again by the same procedure using the corrected internal coordinate system. This procedure can be repeated until the terminal position X and the direction of the internal coordinate system have the required accuracy.
  • the terminal position X can be calculated from only the information of the two marker points 241.
•   In this case, the terminal position X is on a curved surface obtained by rotating the circumference 244 passing through the points A and B around the straight line AB as an axis. Note that the angle AXB (θ_AB) can be calculated from the acquired image.
•   The terminal position X(o) on the curved surface can then be specified.
•   However, the direction of the internal coordinate system of the portable terminal 100 cannot be corrected only by measuring the sign point direction (θ_AB).
•   Therefore, an estimated value X(d) of the terminal position is obtained by another method, and the weighted average of the estimated value X(d) and the terminal position X(o) specified by the above technique is set as the new terminal position X of the mobile terminal 100; the direction of the internal coordinate system is corrected so as to match the direction (V_A) of the marker point calculated from the terminal position X.
•   Specifically, let X(o) be the foot of the perpendicular drawn down to the straight line L from the terminal position X(d) estimated by the other method. The weighted average of the terminal position X(d) and the terminal position X(o) is set as the new terminal position X, and the direction of the internal coordinate system is corrected so as to match the direction (V_A) of the sign point calculated from that terminal position.
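•   A sketch of this fusion step (illustrative; the straight line L is taken here to be the line through the sign point A along the measured sight direction, and the weighting value is arbitrary):

```python
import numpy as np

def fuse_two_point_estimate(x_d, p_a, v_a, weight=0.7):
    """Combine a terminal position X(d) estimated by another method with
    the constraint that the terminal lies on the straight line L.

    x_d    : terminal position estimated by another method (e.g. sensors)
    p_a    : coordinates of sign point A
    v_a    : unit vector from the terminal toward the sign point A, so the
             line L is assumed here to be p_a - s * v_a (an illustrative choice)
    weight : weight given to X(o), the foot of the perpendicular from X(d)
             to L; (1 - weight) is given to X(d)
    """
    x_d, p_a, v_a = (np.asarray(v, dtype=float) for v in (x_d, p_a, v_a))
    s = (p_a - x_d) @ v_a          # signed distance of the foot along L
    x_o = p_a - s * v_a            # X(o): foot of the perpendicular from X(d)
    return weight * x_o + (1.0 - weight) * x_d   # new terminal position X
```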
  • the owner 910 images a plurality of sign points 241 at one place and calculates the terminal position X.
  • the owner 910 may calculate the terminal position X using a plurality of landmarks 241 acquired at different positions.
  • the owner 910 of the mobile terminal 100 may not be able to photograph a plurality of sign points 241 necessary for terminal position estimation at one location. This modification assumes such a case.
  • the owner 910 of the mobile terminal 100 moves when the plurality of sign points 241 cannot be photographed at one place, and moves as many times as necessary to estimate the terminal position.
•   While moving, the owner photographs the sign points 241 (in FIG. 29A, the points 241a, 241b, 241c, and 241d). At this time, the movement amount is calculated by an internal function of the mobile terminal 100.
  • the calculation unit 172 of the present modification corrects the position information of each marker point 241 using the movement amount (three-dimensional vector amount) so that the marker point 241 is photographed at the same location. That is, the position information of the sign points 241 acquired at different positions is corrected to position information based on the current position. Then, the terminal position X is calculated using the position information of the sign point 241 after correction.
•   Let Q_S denote the representation of a point S in the internal coordinate system of the portable terminal 100. It is assumed that the direction of the marker point A is measured at the point S and the direction of the marker point B is measured at the point T.
•   A point A′, obtained by virtually moving the marker point 241 (point A) that was measured at a point other than the point T, is set as a new marker point 241.
  • the movement amount is an amount that cancels the movement amount between the measurement points. Then, it is assumed that a new sign point 241 (point A ′) is measured at the point T, and the terminal position X is estimated by the above method using another sign point 241 (point B) measured at the point T.
•   The directions of the marker points 241 and the expected angles between them are obtained as described above. Also in this modification, they can be measured as sign directions and expected angles in the internal coordinate system of the mobile terminal 100.
  • the movement vector between measurement points is measured by an internal sensor.
•   The movement vector is obtained as the double time-integral of the output of the triaxial acceleration sensor 143.
  • the calculation unit 172 manages the estimated terminal position X as estimated position data in association with the estimated date and time.
  • the estimated position data is stored in the data storage unit 174, for example.
•   In the estimated position data 180, a point No. that is identification information uniquely identifying each estimated position, the coordinates (sign coordinates) 185 of the sign point 241 photographed at that time, and the direction (sign direction) 186 of the sign point 241 from the portable terminal 100 are registered.
  • FIG. 31 is a processing flow of terminal position estimation processing of this application example.
  • the current position is calculated using the four marker points 241.
  • position coordinates (label coordinates) of at least four sign points 241 are acquired, and terminal positions (terminal coordinates) are calculated.
  • the terminal position first calculated by the calculation unit 172 is registered in the estimated position data 180 in association with the calculation time.
  • the position information (marking coordinates) of each marking point 241 used for the calculation and the marking direction in the internal coordinate system of the portable terminal 100 are also registered.
•   When the owner 910, holding the mobile terminal 100, moves a predetermined distance and finds a new sign point 241, the owner 910 instructs a position estimation process via the user I/F 130 of the mobile terminal 100 and at the same time directs the camera 121 toward the new sign point 241.
•   Upon receiving the instruction from the owner 910, the terminal position estimation unit 170 causes the image acquisition unit 171 to acquire an image of the new sign point 241 (step S3101).
•   The calculation unit 172 analyzes the image, acquires the position information (sign coordinates) of the new sign point 241, and acquires the direction (sign direction) of the new sign point 241 in the internal coordinate system (step S3102).
  • the calculating unit 172 further refers to the estimated position data 180, extracts the date and time of the immediately preceding record, and calculates the average speed of the mobile terminal 100 from the date and time to the current time as the speed (step S3103). Note that the velocity is calculated and recorded at predetermined time intervals from the output of the triaxial acceleration sensor 143.
•   The calculation unit 172 registers the acquired sign coordinates, sign direction, and calculated speed in the estimated position data 180 as new point data under a new point No. (step S3104).
  • the calculation unit 172 calculates the terminal position X using the latest four sign points 241.
•   However, the sign points other than the new sign point 241 were acquired at a different location. Therefore, first, the movement vector is calculated by the above-described method. Then, by converting the sign coordinates and sign direction by the movement vector, the sign coordinates assumed to be measured from the current position are obtained (step S3105).
  • the estimated position data 180 is updated with the obtained sign direction and sign coordinates (step S3106).
•   In this example, the current position is calculated using P_B1, P_C1, and P_D1, which were acquired at the first position, together with P_E2. Therefore, P_B1, P_C1, and P_D1 are converted into coordinate values based on the current position X_2 according to the movement amount, and P_B2, P_C2, and P_D2 are obtained.
•   The sign directions are also changed from V_1B, V_1C, and V_1D, because both the terminal position and the sign coordinates have changed.
  • the calculation unit 172 calculates the estimated position coordinates of the current position X 2 using the latest four mark points after the conversion (label coordinates P B2 , P C2 , P D2 , P E2 ) (step S3107). Then, the calculation result is stored in the estimated position data 180 (step S3108), and the process ends.
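•   The coordinate conversion of step S3105 above can be sketched as follows (illustrative code; the sign coordinates are assumed to be held relative to the terminal position at which they were measured, so the conversion simply subtracts the movement vector):

```python
import numpy as np

def rebase_sign_points(relative_coords, movement_vector):
    """Convert sign-point coordinates measured relative to the previous
    terminal position into coordinates relative to the current position:
    the terminal moved by movement_vector, so every previously measured
    relative coordinate is shifted by its negative."""
    movement_vector = np.asarray(movement_vector, dtype=float)
    return {name: np.asarray(p, dtype=float) - movement_vector
            for name, p in relative_coords.items()}

# sign points B, C, D measured at position X1; the terminal then moves by (2, 0, 0)
p1 = {"B": (1.0, 3.0, 0.5), "C": (2.0, 3.0, 0.5), "D": (3.0, 3.0, 0.5)}
p2 = rebase_sign_points(p1, (2.0, 0.0, 0.0))   # coordinates as seen from X2
```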
•   After the estimation of the current position X_2, the estimated position data 180 becomes as shown in FIG. 30(b): the records used to calculate the new estimated position are rewritten. The information before rewriting may be kept as a history.
•   When a further new sign point 241 is found, the calculation unit 172 of the mobile terminal 100 creates and registers a record for it as point No. 3. Then, just enough of the immediately preceding records to make up four sign points (in the example of FIG. 30(b), the records whose sign coordinates are P_C2, P_D2, and P_E2) are selected and converted by the movement amount. Using the converted coordinates and directions together with the position information of the new sign point 241, the terminal position X_3 at the new photographing location is calculated.
•   <Application example 2> In this application example, each time the owner 910 captures an image, the presence or absence of a sign point 241 in the captured image is determined.
•   When the terminal position X is calculated, the number of sign points used for the calculation is not limited to four. Furthermore, even the same sign point may be handled as different sign point data when it is measured from a different terminal position. Hereinafter, the processing in this case will be described.
  • the data of the estimated position and the marker point 241 are stored in the data storage unit 174 in the acquired state.
  • An example of the estimated position history 190 in this case is shown in FIG. As shown in this figure, the estimated position history 190 holds data of items substantially similar to the estimated position data 180.
•   In the estimated position history 190, a data No. 191 that is identification information uniquely specifying each measurement data, the date and time (date/time) 192 at which the sign point used for the position estimation was measured, the estimated terminal position (point coordinates) 193, the terminal speed 194 at that time, the coordinates (sign coordinates) 195 of the sign point 241 photographed at that time, and the direction (sign direction) 196 of the sign point 241 from the portable terminal 100 are registered.
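•   One possible in-memory representation of such a record is sketched below (the field names and types are illustrative only; the patent lists the items but does not specify a concrete schema):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Tuple

Vector3 = Tuple[float, float, float]

@dataclass
class EstimatedPositionRecord:
    """One record of the estimated position history 190."""
    data_no: int                 # identification number of the measurement (191)
    measured_at: datetime        # date/time the sign point was measured (192)
    point_coordinates: Vector3   # estimated terminal position (193)
    speed: Vector3               # terminal speed at the estimated time (194)
    sign_coordinates: Vector3    # coordinates of the photographed sign point (195)
    sign_direction: Vector3      # direction of the sign point from the terminal (196)
```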
  • FIG. 33 is a processing flow of terminal position estimation processing of this example.
  • the image acquisition unit 171 starts processing upon acquisition of a new image. Note that when a moving image is being shot, the processing may be started at regular intervals.
•   The calculation unit 172 analyzes the acquired image and determines whether or not a sign point 241 has been detected (step S4101). If the moving speed is equal to or lower than a certain value, a plurality of pieces of image data taken continuously may be used on the assumption that the terminal is stationary. Whether or not a sign point has been detected is determined, for example, based on whether or not predetermined position information has been extracted by image analysis.
•   If no sign point could be detected (step S4101; No), the process is terminated as it is.
•   When one or more sign points 241 are detected (step S4101; Yes), first, the calculation unit 172 estimates the terminal position X_0(d) based on the internal sensors (step S4102).
•   Specifically, the integrated values of the acceleration of the mobile terminal 100 are added to the coordinates (point coordinates) 193 of the terminal position X_1 of point No. 1 and to the speed 194 of the mobile terminal 100.
  • the integral value of acceleration is obtained by subtracting the gravitational acceleration from the acceleration measurement value by the triaxial acceleration sensor 143 that is an internal sensor.
•   The terminal position coordinates X_0(d) and the terminal speed S_0(d) are estimated by the following equations, where a(u) is the acceleration measured by the triaxial acceleration sensor, g_1 is the gravitational acceleration plus a correction acceleration (both expressed in the internal coordinate system), and t is the elapsed time from the time when the measurement of point No. 1 was performed.
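•   The dead-reckoning step S4102 can be sketched as follows (a minimal discrete version of the double integration; the numbered formulas themselves are not reproduced in this excerpt, and the rotation between the internal and external coordinate systems is ignored here for simplicity):

```python
import numpy as np

def dead_reckon(x1, s1, times, accels, g1):
    """Estimate X_0(d) and S_0(d) from the last record (position x1, speed s1)
    by integrating the accelerometer output a(u) minus g_1 over the elapsed
    time, using simple trapezoidal integration.

    times  : sample times in seconds, starting at the point-No.-1 measurement
    accels : (N, 3) accelerations a(u) from the 3-axis acceleration sensor 143
    g1     : gravitational acceleration plus correction acceleration
    """
    x1, s1, g1 = (np.asarray(v, dtype=float) for v in (x1, s1, g1))
    a = np.asarray(accels, dtype=float) - g1
    t = np.asarray(times, dtype=float)
    dt = np.diff(t)[:, None]
    # velocity change: cumulative trapezoidal integral of (a - g1)
    dv = np.vstack([np.zeros(3), np.cumsum((a[1:] + a[:-1]) / 2 * dt, axis=0)])
    speed = s1 + dv[-1]                              # S_0(d)
    position = x1 + np.trapz(s1 + dv, t, axis=0)     # X_0(d)
    return position, speed
```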
  • the calculation unit 172 specifies the number M (M is an integer equal to or greater than 1) of the detected sign points 241 (step S4103). Specifically, the calculation unit 172 analyzes the image acquired at the terminal position X 0 (d) and detects the number of marker points in the image. The calculation unit 172 obtains an estimated value X 0 (o) of a new terminal position by combining the detected marker point 241 and information of the marker point 241 measured in the past.
•   Next, the calculation unit 172 determines the number N (N is an integer equal to or greater than 1) of sign points to be used, including the new sign points (step S4104).
  • an estimated value X 0 (o) of the terminal position is obtained by a method according to the number N of marker points described above (step S4105).
•   The coordinates of the newly measured sign point 241 are denoted P_n, and the direction vector of the new sign point based on the internal coordinate system at this time is denoted V_n(o) (when a plurality of sign points 241 are measured at the new terminal position, the directions of each of those sign points 241 are included as well).
•   Measurement points obtained by moving the terminal position and performing a new measurement may also be used instead.
•   Each weighting coefficient has, for example, the following relationship:
•   1 ≥ K_4 > K_3 > K_2 > K_1 > 0 ⋯ (87)
•   However, the relationship between the weighting coefficients is not limited to this.
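•   The way the two estimates are combined is not written out in this excerpt; one simple reading, assumed here purely for illustration, is a convex blend controlled by K_N:

```python
import numpy as np

def combine_estimates(x_o, x_d, n_points, k=(0.2, 0.4, 0.6, 0.8)):
    """Blend the sign-point based estimate X_0(o) with the sensor-based
    estimate X_0(d) as X_0 = K_N * X_0(o) + (1 - K_N) * X_0(d).

    The blending rule and the numeric weights are illustrative assumptions;
    equation (87) only requires 1 >= K_4 > K_3 > K_2 > K_1 > 0, i.e. more
    sign points means more trust in the image-based estimate.
    """
    k_n = k[min(n_points, len(k)) - 1]
    return k_n * np.asarray(x_o, dtype=float) + (1 - k_n) * np.asarray(x_d, dtype=float)
```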
  • the direction of the internal coordinate system is updated (step S4107).
  • the direction vector V n of the new sign point 241 viewed from the terminal position is calculated from the new terminal position coordinate X 0 with the external coordinate value and the new sign point coordinate P n with the same external coordinate value.
•   The internal coordinate system is rotated so that the direction vector measured in the internal coordinate system coincides with V_n, and this rotated coordinate system is set as the new target internal coordinate system.
  • the calculated value and the measured value of the direction vector V n may not be completely matched at all the new sign points 241.
  • a new target internal coordinate system is set so that the sum of direction errors is minimized.
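•   One standard way to realize this minimization in step S4107 (not specified in the patent; an SVD-based, Kabsch-style alignment is assumed here purely as an illustration):

```python
import numpy as np

def alignment_rotation(measured_dirs, computed_dirs):
    """Rotation R that best maps direction vectors measured in the internal
    coordinate system onto the directions V_n computed in the external
    coordinate system, minimising the sum of direction errors.

    measured_dirs, computed_dirs : (N, 3) arrays of corresponding unit vectors
    Returns a proper rotation matrix R with R @ measured approximately
    equal to computed.
    """
    p = np.asarray(measured_dirs, dtype=float)
    q = np.asarray(computed_dirs, dtype=float)
    h = q.T @ p                              # correlation of the two vector sets
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(u @ vt))       # guard against reflections
    return u @ np.diag([1.0, 1.0, d]) @ vt
```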
•   The target internal coordinate system may be adopted as the new internal coordinate system as it is, or the rotation angle for obtaining the target internal coordinate system may be reduced by using a weighting coefficient. For example, each rotation angle may be multiplied by a weighting coefficient K′_n according to the case n.
•   The weighting coefficients have, for example, the following relationship:
•   1 ≥ K′_4 > K′_3 > K′_2 > K′_1 > 0 ⋯ (88)
•   However, the relationship between the weighting coefficients is not limited to this.
•   In step S4108, a new estimated value of the speed at the point X_0 is obtained by the following equation:
•   S_0 = S_0(d) − (X_0(d) − X_0) / Δt ⋯ (89)
•   Here, Δt is the elapsed time from the measurement time of point No. 1 to the time when the new measurement was performed.
  • the acceleration measurement value is corrected by correcting the gravitational acceleration.
  • g 0 is the corrected gravitational acceleration.
  • the estimated position history 190 is rewritten. Specifically, it can be rewritten according to the following procedure.
•   First, the past data of No. (k) is copied to No. (k + N).
•   Then, as the data of No. (1) to No. (N), the date and time of the measurement are written as "date/time", X_0 as "point coordinates", S_0 as "speed", P_n as "sign coordinates", and V_n as "sign direction".
  • the corrected g 0 is set as the value of g 1 used in the equation (83) in the next step, and the process is terminated.
  • the position of the terminal in the external coordinate system is sequentially estimated, and correction can be performed so that the direction of the internal coordinate system matches the direction of the external coordinate system.
•   In the above description, the sign points 241 used for estimating the terminal position are assumed to be points attached in advance to a sign board or the like.
  • the present invention is not limited to this.
  • a characteristic point of a building such as a representative point of a landmark or a corner of a ceiling may be used.
  • the calculation unit 172 may acquire the position coordinates of these representative points or feature points from the server 960 and perform position calculation.
•   In the above description, the position coordinates of the sign point 241 are acquired from the position information or the QR code arranged near the sign point 241, but the present invention is not limited to this. For example, the sign point 241 may be configured to be able to transmit its position information (coordinate values).
•   For example, the sign point 241 may emit a signal with good straight-line propagation, such as light.
•   The electromagnetic wave transmitted from the sign point 241 may also be a radio wave, as long as it propagates in a straight line and a receiving device capable of measuring its direction is available.
  • the sign point 241 continues to send coordinate information of the sign point 241 as a beacon signal.
  • the identification signal of the sign point 241 may be placed on the beacon signal. In that case, the position information of the sign point 241 may be acquired from the server 960 using the identification signal.
  • a marker point 241 that outputs a beacon signal may be mounted on the moving body.
  • the moving body on which the marker point 241 is mounted or the marker point 241 itself has a configuration capable of calculating its own position from signals from other marker points 241.
  • the sign point 241 can automatically set its own position and can cope with movement.
•   Furthermore, the mobile terminal 100 itself may serve as a sign point 241 by transmitting a beacon.
•   When terminals are close to each other, the relative positions between the terminals can then be grasped accurately. For example, this is effective for collision avoidance when drones fly densely.
  • the current position is estimated (calculated) in the mobile terminal 100.
  • An image may be acquired by the mobile terminal 100 and transmitted to the server 960 via the access point 970 and the network 940, and the server 960 may calculate the current position of the mobile terminal 100 that is the transmission source. In this case, the calculation result is returned to the mobile terminal 100 that is the transmission source.
  • the present invention is not limited to the above-described embodiments and modifications, and includes various modifications.
  • the above-described embodiments and modification examples have been described in detail for easy understanding of the present invention, and are not necessarily limited to those having all the configurations described.
  • each of the above-described configurations, functions, processing units, processing means, and the like may be realized by hardware by designing a part or all of them with, for example, an integrated circuit.
  • Each of the above-described configurations, functions, and the like may be realized by software by interpreting and executing a program that realizes each function by the processor.
  • Information such as programs, tables, and files that realize each function can be stored in a memory unit, a recording device such as a hard disk or SSD (Solid State Drive), or a recording medium such as an IC card, SD card, or DVD.
  • control lines and information lines indicate what is considered necessary for the explanation, and not all the control lines and information lines on the product are necessarily shown. In practice, it may be considered that almost all the components are connected to each other.
  • 100 mobile terminal, 101: CPU, 102: bus, 110: storage device, 111: ROM, 112: RAM, 113: storage, 120: photographing device, 121: camera, 122: image processor, 123: image memory, 130 : User I / F, 131: Display, 140: Sensor device, 141: GPS receiver, 142: 3-axis gyro sensor, 143: 3-axis acceleration sensor, 150: Communication device, 151: LAN communication device, 152: Telephone network Communication device 153: Short-range communication device 160: Extended I / F 170: Terminal position estimation unit 171: Image acquisition unit 172: Calculation unit 173: Display control unit 174: Data storage unit 175: QR Code analysis unit, 176: navigation unit, 180: estimated position data, 184: speed, 190: estimated position history, 194: speed, 200: sign board, 200a: sign board, 201: position and orientation information display area, 202: bearing arrow, 211: sign board, 212: sign board, 213: reference point, 2

Abstract

The present invention makes it possible to accurately estimate, with a simple configuration, the current position and orientation of a user, even in a location that radio waves from a satellite or the like cannot reach. The present mobile terminal 100 comprises a camera 121 and a processing unit for processing an image obtained by the camera 121. The processing unit comprises an image acquisition unit 171 that uses the camera 121 to acquire a sign image, which is an image including two sign points that are located at different positions away from the mobile terminal 100 and whose position information is known, and a calculation unit 172 that analyzes the acquired sign image, calculates the plane orientation and the interior angles of a first triangle formed by the two sign points and a terminal position that is the position of the mobile terminal 100, and uses the plane orientation and the interior angles to calculate the terminal position.
PCT/JP2018/022120 2018-06-08 2018-06-08 Terminal mobile, système d'estimation de position de caméra, procédé d'estimation de position de caméra et panneau WO2019234936A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
PCT/JP2018/022120 WO2019234936A1 (fr) 2018-06-08 2018-06-08 Terminal mobile, système d'estimation de position de caméra, procédé d'estimation de position de caméra et panneau
JP2020523971A JP7041262B2 (ja) 2018-06-08 2018-06-08 携帯端末、カメラ位置推定システム、カメラ位置推定方法および標識板
JP2022037091A JP7413421B2 (ja) 2018-06-08 2022-03-10 位置および向き認識システム、携帯端末、及び位置および向き認識方法
JP2023220146A JP2024026547A (ja) 2018-06-08 2023-12-27 携帯端末、カメラ位置推定システム、カメラ位置推定方法および標識板

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/022120 WO2019234936A1 (fr) 2018-06-08 2018-06-08 Terminal mobile, système d'estimation de position de caméra, procédé d'estimation de position de caméra et panneau

Publications (1)

Publication Number Publication Date
WO2019234936A1 true WO2019234936A1 (fr) 2019-12-12

Family

ID=68769851

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/022120 WO2019234936A1 (fr) 2018-06-08 2018-06-08 Terminal mobile, système d'estimation de position de caméra, procédé d'estimation de position de caméra et panneau

Country Status (2)

Country Link
JP (3) JP7041262B2 (fr)
WO (1) WO2019234936A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021099682A (ja) * 2019-12-23 2021-07-01 株式会社構造計画研究所 位置推定装置、移動体、位置推定方法及びプログラム
CN113358032A (zh) * 2021-05-20 2021-09-07 中交第二公路工程局有限公司 一种基于激光跟踪测距的隧道内自动监控量测量系统
WO2021256239A1 (fr) * 2020-06-15 2021-12-23 Necソリューションイノベータ株式会社 Dispositif de navigation, système de navigation, procédé de navigation, programme et support de stockage
WO2022224316A1 (fr) * 2021-04-19 2022-10-27 日鉄ソリューションズ株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
JP7429049B2 (ja) 2021-04-23 2024-02-07 株式会社コンピュータサイエンス研究所 歩行者位置特定システム及び歩行者位置特定ソフトウェア
JP7478995B2 (ja) 2020-03-13 2024-05-08 パナソニックIpマネジメント株式会社 携帯端末、ナビゲーション方法、及び、コンピュータプログラム

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3501610B2 (ja) * 1997-01-31 2004-03-02 新日本製鐵株式会社 三次元座標測定方法
JP4282067B2 (ja) 2003-09-30 2009-06-17 キヤノン株式会社 指標識別方法および装置
JP4708752B2 (ja) 2004-09-28 2011-06-22 キヤノン株式会社 情報処理方法および装置
JP6237326B2 (ja) 2014-02-25 2017-11-29 富士通株式会社 姿勢推定装置、姿勢推定方法及び姿勢推定用コンピュータプログラム
JP6572618B2 (ja) 2015-05-08 2019-09-11 富士通株式会社 情報処理装置、情報処理プログラム、情報処理方法、端末装置、設定方法、設定プログラム
JP2017181374A (ja) 2016-03-31 2017-10-05 三井住友建設株式会社 表面高さ表示方法
JP6211157B1 (ja) 2016-09-01 2017-10-11 三菱電機株式会社 キャリブレーション装置およびキャリブレーション方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000205888A (ja) * 1999-01-07 2000-07-28 Hitachi Ltd 位置・方位情報取得方法及び装置
WO2015122389A1 (fr) * 2014-02-12 2015-08-20 ヤマハ発動機株式会社 Dispositif d'imagerie, véhicule et procédé de correction d'image
JP2016070674A (ja) * 2014-09-26 2016-05-09 富士通株式会社 3次元座標算出装置、3次元座標算出方法および3次元座標算出プログラム
JP2016142737A (ja) * 2015-02-04 2016-08-08 プリューフテヒニーク ディーター ブッシュ アクチェンゲゼルシャフト 2個の物体の目標位置偏差を検出する装置ならびに方法
JP2017201281A (ja) * 2016-05-06 2017-11-09 計測ネットサービス株式会社 測量機器による自動視準方法及び自動視準装置

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021099682A (ja) * 2019-12-23 2021-07-01 株式会社構造計画研究所 位置推定装置、移動体、位置推定方法及びプログラム
JP7304284B2 (ja) 2019-12-23 2023-07-06 株式会社構造計画研究所 位置推定装置、移動体、位置推定方法及びプログラム
JP7478995B2 (ja) 2020-03-13 2024-05-08 パナソニックIpマネジメント株式会社 携帯端末、ナビゲーション方法、及び、コンピュータプログラム
WO2021256239A1 (fr) * 2020-06-15 2021-12-23 Necソリューションイノベータ株式会社 Dispositif de navigation, système de navigation, procédé de navigation, programme et support de stockage
JPWO2021256239A1 (fr) * 2020-06-15 2021-12-23
JP7294735B2 (ja) 2020-06-15 2023-06-20 Necソリューションイノベータ株式会社 ナビゲーション装置、ナビゲーションシステム、ナビゲーション方法、プログラム、及び、記録媒体
WO2022224316A1 (fr) * 2021-04-19 2022-10-27 日鉄ソリューションズ株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
JP7429049B2 (ja) 2021-04-23 2024-02-07 株式会社コンピュータサイエンス研究所 歩行者位置特定システム及び歩行者位置特定ソフトウェア
CN113358032A (zh) * 2021-05-20 2021-09-07 中交第二公路工程局有限公司 一种基于激光跟踪测距的隧道内自动监控量测量系统
CN113358032B (zh) * 2021-05-20 2023-12-12 中交第二公路工程局有限公司 一种基于激光跟踪测距的隧道内自动监控量测量系统

Also Published As

Publication number Publication date
JP2024026547A (ja) 2024-02-28
JP2022082583A (ja) 2022-06-02
JP7041262B2 (ja) 2022-03-23
JP7413421B2 (ja) 2024-01-15
JPWO2019234936A1 (ja) 2021-02-25

Similar Documents

Publication Publication Date Title
WO2019234936A1 (fr) Terminal mobile, système d'estimation de position de caméra, procédé d'estimation de position de caméra et panneau
US11704869B2 (en) System and method for determining geo-location(s) in images
US9189853B1 (en) Automatic pose estimation from uncalibrated unordered spherical panoramas
US9953461B2 (en) Navigation system applying augmented reality
US20160327946A1 (en) Information processing device, information processing method, terminal device, and setting method
US10482659B2 (en) System and method for superimposing spatially correlated data over live real-world images
KR101285360B1 (ko) 증강현실을 이용한 관심 지점 표시 장치 및 방법
US9583074B2 (en) Optimization of label placements in street level images
Verma et al. Indoor navigation using augmented reality
JP6180647B2 (ja) クラウドポイントを利用した屋内地図構築装置および方法
US8369578B2 (en) Method and system for position determination using image deformation
Feng et al. Augmented reality markers as spatial indices for indoor mobile AECFM applications
CN110703805B (zh) 立体物体测绘航线规划方法、装置、设备、无人机及介质
JP6597259B2 (ja) プログラム、情報処理装置、画像表示方法、画像処理システム
CN113610702B (zh) 一种建图方法、装置、电子设备及存储介质
JP7220784B2 (ja) 測量用サンプリング点の計画方法、装置、制御端末及び記憶媒体
US10635925B2 (en) Method and system for display the data from the video camera
Hasler et al. Implementation and first evaluation of an indoor mapping application using smartphones and AR frameworks
JP7345366B2 (ja) 情報表示システム、情報表示装置、情報表示方法および情報表示プログラム
KR200488998Y1 (ko) 실내 지도 구축 장치
JP7144164B2 (ja) 情報提供システム、サーバ装置、及び端末用プログラム
CN108062786B (zh) 以三维信息模型为基础的综合感知定位技术应用系统
TWI639133B (zh) 使用點雲的室內地圖構建裝置及方法
JP7039535B2 (ja) 生成装置、生成方法、および生成プログラム
JP6959305B2 (ja) 生成装置、生成方法、および生成プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18921410

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020523971

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18921410

Country of ref document: EP

Kind code of ref document: A1