WO2023243558A1 - Information processing device, program, and information processing system - Google Patents
Information processing device, program, and information processing system
- Publication number
- WO2023243558A1 (PCT/JP2023/021513)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- map
- information processing
- priority
- update
- processing apparatus
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
Definitions
- the present technology relates to an information processing device, a program, and an information processing system.
- In a VPS (Visual Positioning System), a 3D map of the real space is used, and by associating the camera images sent from each device with the features of the 3D map, the position and orientation at which each camera image was captured are estimated in the 3D map coordinate system.
- 3D maps can be generated by simultaneously estimating the three-dimensional shape and the imaging points for multiple images from different viewpoints, using technologies such as SfM (Structure from Motion) and VIO (Visual Inertial Odometry).
- Although the technology of Patent Document 1 can be used to collect data for creating a 3D map of the environment, there remains an unresolved problem that generating a 3D map takes time.
- The present technology was developed in view of these problems, and its purpose is to provide an information processing device, a program, and an information processing system that can update 3D maps efficiently by setting update priorities for the 3D maps.
- To solve the above problems, a first technique is an information processing apparatus including a priority setting unit that sets a priority for updating each of a plurality of 3D maps that are targets of an update request, and a 3D map update unit that updates a 3D map targeted by an update request using sensing data, based on the priority.
- A second technique is a program that causes a computer to execute an information processing method that sets a priority for updating each of a plurality of 3D maps that are targets of an update request and, based on the priority, updates a 3D map targeted by an update request using sensing data.
- A third technique is an information processing system comprising a client that transmits sensing data acquired by a sensor unit to make an update request, and an information processing apparatus including a priority setting unit that sets a priority for updating each of a plurality of 3D maps that are targets of an update request, and a 3D map update unit that updates a 3D map targeted by an update request using the sensing data, based on the priority.
- FIG. 1 is a block diagram showing the configuration of a client 10.
- FIG. 2 is a block diagram showing the configuration of a server 20.
- FIG. 3 is a block diagram showing the configuration of an information processing system 1000.
- FIG. 4 is a block diagram showing the configurations of an information processing system 1000 and an information processing device 100.
- FIG. 5 is a sequence diagram showing processing in the client 10 and the information processing device 100.
- FIG. 6 is a flowchart showing processing in the information processing device 100.
- FIG. 7 is a sequence diagram showing processing in the client 10 and the information processing device 100.
- FIG. 8 is a flowchart showing processing in a specific example of the present embodiment.
- FIG. 9 is a diagram showing a camera image captured by the client 10A in the specific example of the present embodiment.
- Embodiments of the present technology will be described below with reference to the drawings. Note that the explanation will be given in the following order.
<1. Embodiment>
[1-1. Configuration of client 10 and server 20]
[1-2. Configuration of information processing system 1000 and information processing device 100]
[1-3. Processing in information processing system 1000 and information processing device 100]
[1-4. Specific example of processing in information processing system 1000 and information processing device 100]
<2. Modified example>
- the client 10 included in the information processing system 1000 will be described with reference to FIG. 1.
- the client 10 includes at least a control section 11, a storage section 12, a communication section 13, an input section 14, a display section 15, and a sensor section 16.
- the control unit 11 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like.
- the CPU controls the entire client 10 and each part by executing various processes and issuing commands according to programs stored in the ROM.
- the storage unit 12 is, for example, a large capacity storage medium such as a hard disk or flash memory.
- the storage unit 12 stores AR applications and various data used by the client 10.
- The communication unit 13 is a communication module that communicates with the server 20, the Internet, and the like. The communication method may be wired or wireless, and can include cellular communication, Wi-Fi, Bluetooth (registered trademark), NFC (Near Field Communication), Ethernet (registered trademark), HDMI (registered trademark) (High-Definition Multimedia Interface), USB (Universal Serial Bus), and the like.
- the input unit 14 is used by the provider to input information and give various instructions to the client 10.
- When an input is made to the input unit 14, a control signal corresponding to the input is generated and supplied to the control unit 11.
- the control unit 11 then performs various processes corresponding to the control signals.
- In addition to physical buttons, the input unit 14 includes a touch panel, a touch screen integrated with a monitor, and the like.
- the display unit 15 is a display device such as a display that displays images, videos, a UI (User Interface) for using the client 10, and the like.
- the sensor unit 16 is various sensors for acquiring sensing data.
- As the sensor unit 16, a camera, an IMU (Inertial Measurement Unit), LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), a ToF (Time of Flight) sensor, or the like can be used.
- sensing data may also be information that can be used to specify a location, such as location information obtained by GPS (Global Positioning System), network base station information, Wi-Fi access point name, etc.
- In this embodiment, a camera is used as the sensor unit 16. The camera includes a lens, an image sensor, a video signal processing circuit, and the like, and captures images of the real world as camera images, that is, as sensing data to be transmitted to the server 20.
- the client 10 is configured as described above. Examples of the client 10 include a smartphone, a tablet terminal, a digital camera, a wearable device, a head-mounted display, and a personal computer.
- the control program that performs the processing according to the present technology may be installed in the client 10 in advance, or may be downloaded, distributed on a storage medium, etc., and installed by the provider himself.
- the server 20 includes at least a control section 21, a storage section 22, and a communication section 23.
- the control unit 21, storage unit 22, and communication unit 23 have the same functions as those provided in the client 10.
- the server 20 may be a cloud server, for example.
- the information processing system 1000 includes a plurality of clients 10 and a server 20 that performs processing as the information processing device 100.
- a plurality of clients 10 and servers 20 are connected via a network.
- the client 10 makes one or both of a 3D map update request and a position and orientation estimation request to the information processing apparatus 100.
- the client 10 functions as an AR device by executing an AR application, but a client that only requests a 3D map update and does not request position and orientation estimation may not function as an AR device.
- There is no particular limitation on the number of clients 10, and the clients 10 that utilize 3D map updating and position and orientation estimation by the information processing apparatus 100 are connected to the information processing apparatus 100 via a network.
- FIG. 4 shows the information processing apparatus 100 constituting the information processing system 1000 and two of the plurality of clients, client A (denoted as client 10A) and client B (denoted as client 10B). Note that FIG. 4 shows only the processing blocks related to the information and signals relevant to the present technology.
- data transmission and reception between the client 10A and the information processing device 100 and data transmission and reception between the client 10B and the information processing device 100 are performed by communication functions provided in the client 10A, the client 10B, and the server 20.
- the client 10A requests a 3D map update by transmitting sensing data acquired by the sensor unit 16A to the server 20 as 3D map update data.
- the client 10A may function as an AR device by executing an AR application, or may not function as an AR device.
- the client 10B functions as an AR device by executing an AR application.
- the client 10B makes a request to estimate the position and orientation of the client 10B by transmitting sensing data acquired by the sensor unit 16B to the information processing device 100 as position and orientation estimation data.
- the client 10B includes an application control section 17B and a drawing section 18B.
- the application control unit 17B performs processing related to the AR application executed by the client 10, and acquires the position and orientation estimation results sent from the server 20 and outputs them to the drawing unit 18B.
- the drawing unit 18B draws a virtual object based on the position and orientation estimation results transmitted from the information processing device 100 to generate an output image, and outputs the output image to the display unit 15B. As a result, the output image is displayed on the display section 15B.
- Virtual objects include, for example, text, figures, and objects that present information in a map service, and characters placed on images captured of the real world.
- The application control unit 17B and the drawing unit 18B are realized by the client 10B executing an AR application. The application may be installed in the client 10B in advance, or may be downloaded or distributed on a storage medium and installed by the user. Alternatively, the client 10B may be provided with the functions of the application control section 17B and the drawing section 18B in advance.
- The information processing device 100 includes an update request DB (database) control unit 101, an update request DB 102, a 3D map usage record DB 103, a 3D map DB 104, a priority setting unit 105, a computer resource allocation unit 106, a 3D map update unit 107, a 3D map selection unit 108, and a position and orientation estimation unit 109.
- the storage unit 22 of the server 20 functions as the update request DB 102, the 3D map usage record DB 103, and the 3D map DB 104, but each DB may be configured with a different storage medium.
- the update request DB control unit 101 receives 3D map update data sent from the client 10 and stores it in the update request DB 102. In the example of FIG. 4, the update request DB control unit 101 stores the 3D map update data sent from the client 10A in the update request DB 102.
- the update request DB 102 is a database that stores 3D map update data sent from the client 10.
- the 3D map usage record DB 103 is a database that records which 3D map was used to perform position and orientation estimation based on a position and orientation estimation request from the client 10.
- the 3D map DB 104 is a database that stores 3D maps that model real space. Multiple 3D maps exist for each point in real space. A 3D map can be generated by simultaneously estimating the three-dimensional shape and imaging point of multiple images from different viewpoints using techniques such as SfM (Structure from Motion) and VIO (Visual Inertial Odometry).
- The 3D map contains data such as 3D shapes and feature points (landmarks) that model objects in real space, the feature points and feature amounts used for position and orientation estimation, and the images used for 3D shape estimation (keyframes). Examples of feature points include the position of an object, the position of a vertex of an acute-angled part of an object, and the outline of an object. In the present technology, it is assumed that a plurality of generated 3D maps are stored in the 3D map DB 104 in advance.
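- The text does not specify how these elements are stored; a minimal sketch of a map record holding them might look like the following (all type and field names are illustrative assumptions):

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class Landmark:
    """A feature point of the 3D map: 3D position plus the descriptor
    (feature amount) used when matching against camera images."""
    position: np.ndarray    # (3,) point in 3D map coordinates
    descriptor: np.ndarray  # feature vector for 2D-3D matching

@dataclass
class KeyFrame:
    """An image used for 3D shape estimation, with its camera pose."""
    image: np.ndarray       # H x W x 3 camera image
    pose: np.ndarray        # 4 x 4 camera-to-map transform

@dataclass
class Map3D:
    """One entry of the 3D map DB 104 (illustrative structure)."""
    map_id: str
    landmarks: list[Landmark] = field(default_factory=list)
    keyframes: list[KeyFrame] = field(default_factory=list)
```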
- the priority setting unit 105 sets priorities for each of the plurality of 3D maps that are the targets of update requests based on information regarding the 3D maps.
- the computer resource allocation unit 106 allocates computer resources to be used for updating the 3D map based on the priority.
- Here, the computer is the server 20 that updates the 3D map, and the computer resources refer to the hardware resources of the server 20, such as execution time on the server 20, processor usage, and the amount of space occupied in the storage unit 22.
- the 3D map update unit 107 updates the 3D map that is the target of the update request, using sensing data as 3D map update data and the allocated computer resources.
- The 3D map selection unit 108 selects a 3D map to be used for position and orientation estimation from among the plurality of 3D maps stored in the 3D map DB 104, based on the sensing data transmitted from the client 10 as position and orientation estimation data. In the example of FIG. 4, the 3D map selection unit 108 selects a 3D map based on the position and orientation estimation data transmitted from the client 10B.
- The position and orientation estimation unit 109 estimates the position and orientation of the client 10 based on the correspondence between the camera image transmitted from the client 10 as position and orientation estimation data and the 3D map selected by the 3D map selection unit 108. In the VPS, the position and orientation of the client 10 are estimated in the 3D map coordinate system using a method such as Perspective-n-Point, based on the correspondence between the feature points and feature amounts of the camera image and those of the 3D map.
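- As a concrete illustration of this step, the following sketch uses OpenCV's RANSAC-based Perspective-n-Point solver, assuming the 2D image keypoints have already been matched to 3D landmarks of the selected map and the camera intrinsic matrix is known (the function name and this particular solver choice are assumptions, not taken from the text):

```python
import cv2
import numpy as np

def estimate_pose(points_3d, points_2d, camera_matrix):
    """Estimate the client's position and orientation in 3D map
    coordinates from matched 3D landmarks and their 2D projections."""
    # RANSAC-based PnP tolerates outlier matches between image and map.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.asarray(points_3d, dtype=np.float64),  # (N, 3) map landmarks
        np.asarray(points_2d, dtype=np.float64),  # (N, 2) image keypoints
        camera_matrix, None)                      # no lens distortion assumed
    if not ok:
        raise RuntimeError("position and orientation estimation failed")
    R, _ = cv2.Rodrigues(rvec)    # rotation: map -> camera
    position = -R.T @ tvec        # camera center in map coordinates
    return position.ravel(), R.T  # orientation as camera-to-map rotation
```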
- the position and orientation estimation result by the position and orientation estimation unit 109 is transmitted to the client 10 that made the position and orientation estimation request.
- In the example of FIG. 4, the position and orientation estimation unit 109 estimates the position and orientation of the client 10B based on the correspondence between the position and orientation estimation data transmitted from the client 10B and the selected 3D map.
- the drawing unit 18B of the client 10B draws a virtual object in AR using the position and orientation estimation results.
- the information processing system 1000 and the information processing device 100 are configured as described above.
- The information processing device 100 operates on the server 20. The server 20 may be provided with the functions of the information processing device 100 in advance, or the information processing device 100 and the information processing method may be realized by the server 20, which has the functions of a computer, executing a program. In that case, the control unit 21 may function as the information processing device 100 by executing the program.
- the program may be installed in advance on the server 20, or may be downloaded, distributed on a storage medium, etc., and installed by a user or the like.
- the information processing device 100 may be configured as a single device. Further, the information processing device 100 is not limited to the server 20, and may operate on a personal computer, a smartphone, a tablet terminal, or the like.
- In step S101, the sensor unit 16A of the client 10A captures a camera image of the surroundings to obtain sensing data, based on the user's input operation.
- In step S102, the client 10A transmits the camera image, which is sensing data, to the information processing device 100 as 3D map update data.
- In step S201, the information processing device 100 receives the 3D map update data sent from the client 10A.
- In step S202, the update request DB control unit 101 stores the 3D map update data in the update request DB 102 and also records the reception time of the 3D map update data.
- the priority setting unit 105 identifies the 3D map that is the target of the update request from among the plurality of 3D maps stored in advance in the 3D map DB 104 based on the 3D map update data.
- Specifically, the priority setting unit 105 extracts the feature points and feature amounts of the camera image received as 3D map update data, compares them with the feature points, feature amounts, and the like of each of the plurality of 3D maps stored in the 3D map DB 104, and identifies a 3D map that matches or approximates them as a 3D map targeted by the update request.
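- A minimal sketch of this identification step is shown below, using ORB features and brute-force descriptor matching as one possible concrete choice (the text does not name a feature type, so the feature choice, the match threshold, and the `descriptors` field on each stored map are assumptions):

```python
import cv2

def identify_target_map(update_image, map_db, min_matches=50):
    """Return the stored 3D map whose features best match the received
    update image, or None if no map matches or approximates it."""
    orb = cv2.ORB_create()
    _, query_desc = orb.detectAndCompute(update_image, None)
    if query_desc is None:          # no features found in the image
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    best_map, best_count = None, 0
    for map_entry in map_db:        # each entry carries its descriptors
        matches = matcher.match(query_desc, map_entry.descriptors)
        if len(matches) > best_count:
            best_map, best_count = map_entry, len(matches)
    return best_map if best_count >= min_matches else None
```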
- In step S301, the priority setting unit 105 sets priorities for all the 3D maps that are targets of update requests.
- A score is set for the x-th 3D map among the n 3D maps targeted by update requests, based on information regarding the 3D map.
- The scores calculated from the respective pieces of information regarding the 3D map are denoted a_x, b_x, c_x, d_x, e_x, f_x, and g_x.
- The priority p_x of the x-th 3D map is set from the total of these scores.
- The first piece of information regarding the 3D map is "the number of times position and orientation estimation requests have been made since the time the 3D map update data was saved." This is the number of times position and orientation estimation was performed based on the pre-update 3D map even though the information processing device 100 had already received 3D map update data for the 3D map targeted by the update request.
- The time when the 3D map update data was saved is, for example, the time when the update request DB control unit 101 saved the 3D map update data in the update request DB 102, but it may also be the time when the server 20 received the 3D map update data.
- This number of times can be determined by comparing the times of the position and orientation estimation requests stored in the 3D map usage record DB 103 with the reception time of the update request. This number is defined as the score a_x. If position and orientation estimation is performed based on the pre-update 3D map even though 3D map update data has been received, the accuracy of position and orientation estimation using that 3D map has presumably deteriorated. Therefore, since such a 3D map should be updated preferentially, the priority is increased by the score a_x.
- The second piece of information regarding the 3D map is "the number of times it has been used for position and orientation estimation in the past." This number is recorded in the 3D map usage record DB 103 and is defined as the score b_x. The more times a 3D map has been used for position and orientation estimation in the past, the more times it is expected to be used in the future. Therefore, since such a 3D map should be updated preferentially, the priority is increased by the score b_x.
- The third piece of information regarding the 3D map is "the ratio of information that can be used for updating the 3D map within the 3D map update data." This ratio is defined as the score c_x.
- The 3D map update data sent from the client 10 may include information that cannot be used to update the 3D map. Examples include camera images with insufficient brightness (light intensity) or blacked-out areas, and camera images that cannot be used for a static 3D map because they contain many moving objects. Therefore, the higher the ratio of usable information in the 3D map update data, the more the priority is increased by the score c_x.
- Areas of insufficient brightness in a camera image serving as 3D map update data can be detected, for example, based on whether the average brightness value of a specific area in the camera image is below a specific threshold. Moving-object regions in an image serving as 3D map update data can be detected based on the results of semantic segmentation or the like. The number of pixels in regions of the camera image that cannot be used for updating the 3D map is calculated for each camera image, and the total value s is obtained.
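- A minimal sketch of this check is shown below; it computes the usable-pixel ratio for the score c_x directly from a brightness threshold and precomputed moving-object masks (the segmentation step, the threshold value, and the function name are assumptions):

```python
import numpy as np

def usable_ratio(images, moving_masks, brightness_thresh=30):
    """Score c_x: the fraction of pixels in the update images that are
    usable, i.e. bright enough and not on a moving object. `moving_masks`
    are boolean maps, e.g. from semantic segmentation (assumed given)."""
    usable = total = 0
    for img, moving in zip(images, moving_masks):
        gray = img.mean(axis=2)                    # rough per-pixel luminance
        bad = (gray < brightness_thresh) | moving  # blacked-out or moving
        usable += np.count_nonzero(~bad)           # complements the total s above
        total += bad.size
    return usable / total if total else 0.0
```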
- The fourth piece of information regarding the 3D map is "the degree of similarity between the 3D map update data and the existing 3D map."
- For example, the rate at which the semantic segmentation result of a camera image serving as 3D map update data matches the semantic segmentation result of a camera image included in the existing 3D map can be used as the degree of similarity.
- With the degree of similarity expressed by this ratio denoted d_x, the score is defined as 1/d_x.
- Camera images similar to the existing 3D map may exist among the 3D map update data; for example, a camera image of a place whose scenery has not changed corresponds to this. Since updating the 3D map is more necessary in places where the scenery is changing, the lower the similarity between the 3D map update data and the existing 3D map, the more the priority is increased by the score 1/d_x.
- The fifth piece of information regarding the 3D map is "the number of update requests for the 3D map." This number can be obtained by referring to the update request DB 102 and is defined as the score e_x. The more update requests there are for the same 3D map, the greater the need for updating is considered to be, so the larger the number of update requests, the more the priority is increased by the score e_x.
- The sixth piece of information regarding the 3D map is "the priority for the 3D map set by a person involved in the AR application."
- The required accuracy of position and orientation estimation may vary depending on the type of virtual object to be displayed. For example, if a virtual object floats in the air, the user will hardly perceive a slight positional shift, but if the virtual object is placed in contact with a real object, the user will perceive even a slight positional shift, so highly accurate position and orientation estimation is required to draw such a virtual object. It is therefore possible for a person involved in the AR application to set the update priority of the 3D map according to the superimposition accuracy required for the virtual object. This value is defined as the score f_x. Note that a person involved in the AR application is typically the developer of the AR application, but may also be an operator, an AR service provider, or the like.
- The seventh piece of information regarding the 3D map is "the priority for the 3D map set by the user." If a user perceives a shift in the superimposition position of a virtual object caused by deteriorated position and orientation estimation accuracy while using an AR application, the 3D map used for that estimation presumably needs to be updated. Therefore, the user is allowed to set the update priority of the 3D map in the hope of improving the superimposition accuracy of virtual objects. This value is defined as the score g_x.
- The priority is calculated based on the scores obtained from these first to seventh pieces of information regarding the 3D map.
- The priority p_x for the x-th 3D map can be calculated using Equation 1 below.
- In Equation 1, each score obtained as a number of times is converted into a percentage before the priority is calculated.
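- Equation 1 itself is not reproduced in this text. Since the priority is described as the total of the scores, with count-based scores first converted into percentages, a plausible reconstruction is the following (an assumption, not the equation from the original document):

```latex
% Count-based scores are first normalized into percentages, e.g.
%   a_x \leftarrow 100 \cdot a_x / \textstyle\sum_{i=1}^{n} a_i
% and the priority is then their total:
p_x = a_x + b_x + c_x + \frac{1}{d_x} + e_x + f_x + g_x
```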
- The first to seventh pieces of information described above are merely examples, and the priority need not always be set based on all seven of them.
- The priority may be set based on any one of them, or on several of them. The priority setting unit 105 therefore sets, as the priority, either the score obtained from one piece of information regarding the 3D map or the total of the scores obtained from a plurality of pieces of information regarding the 3D map. The priority may also be set based on information regarding the 3D map other than the seven described above. Reducing the amount of information used for priority setting shortens the time required to set priorities, which should allow the 3D map to be updated more quickly.
- The pieces of information described above may also be weighted according to various conditions. For example, an AR application developer may consider the sixth piece of information the most important and weight it accordingly, as in the sketch below.
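- A minimal sketch of such a weighted total, assuming the per-map scores have already been computed and normalized (the key names and weight values are illustrative):

```python
def priority(scores, weights=None):
    """Priority p_x as an optionally weighted total of the scores
    a_x..g_x computed for one 3D map."""
    weights = weights or {key: 1.0 for key in scores}
    return sum(weights[key] * value for key, value in scores.items())

# Example: weight the developer-set score f_x three times as heavily.
p_x = priority(
    scores={"a": 3.0, "b": 12.0, "c": 0.8, "inv_d": 1.25, "e": 4.0,
            "f": 2.0, "g": 1.0},
    weights={"a": 1.0, "b": 1.0, "c": 1.0, "inv_d": 1.0, "e": 1.0,
             "f": 3.0, "g": 1.0})
```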
- In step S302, the computer resource allocation unit 106 sets the allocation of computer resources to each of the plurality of 3D maps targeted by update requests, by proportional allocation based on the priorities.
- The proportional allocation coefficient r_x can be calculated using Equation 2 below.
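- Equation 2 is likewise not reproduced in this text; assuming standard proportional allocation over the n update targets, a plausible reconstruction is:

```latex
% Plausible reconstruction of Equation 2: the priorities normalized
% so that the coefficients sum to 1 over the n target maps.
r_x = \frac{p_x}{\sum_{i=1}^{n} p_i}
```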
- The computer resources V_x that can be used to update the x-th 3D map can then be calculated as shown in Equation 3 below, where V denotes the total computer resources available for 3D map updating:
- V_x = V × r_x (Equation 3)
- For example, if the proportional allocation coefficient r_x of 3D map α is 0.7 and that of 3D map β is 0.3, then 70% of the computer resources are allocated to updating 3D map α and 30% to updating 3D map β.
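- In code, Equations 2 and 3 amount to a simple proportional split (a sketch; the map identifiers and totals are illustrative):

```python
def allocate_resources(priorities, total_resources):
    """Split the available computer resources V across the maps under
    update in proportion to their priorities (Equations 2 and 3)."""
    p_sum = sum(priorities.values())
    return {map_id: total_resources * p / p_sum  # V_x = V * r_x
            for map_id, p in priorities.items()}

# With r_alpha = 0.7 and r_beta = 0.3, 70% of the resources go to
# map alpha and 30% to map beta, as in the example above.
shares = allocate_resources({"alpha": 7.0, "beta": 3.0},
                            total_resources=100.0)
```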
- In step S303, the 3D map update unit 107 uses the allocated computer resources to update the 3D maps targeted by update requests. Note that if the update of a 3D map was interrupted in the past, the update is resumed; otherwise, a new update of the 3D map is executed. As with 3D map generation, the 3D map can be updated using the camera images serving as 3D map update data and techniques such as SfM and VIO.
- In step S304, it is determined whether the update of the 3D map has been completed before the allocated computer resources are used up. If so, the process proceeds to step S305 (Yes in step S304).
- In step S305, the 3D map update unit 107 replaces the 3D map targeted by the update request and stored in the 3D map DB 104 with the updated 3D map.
- In step S306, the update request DB control unit 101 deletes the 3D map update data used for the update from the update request DB 102.
- If the update is not completed before the allocated computer resources are used up, the process proceeds to step S307 (No in step S304).
- In step S307, the 3D map update unit 107 saves the progress of the 3D map update in the 3D map DB 104 and suspends the update.
- the information processing apparatus 100 executes steps S303 to S307 for each 3D map that is the subject of an update request.
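- The per-map loop of steps S303 to S307 can be sketched as follows; the DB method names and the `run_update` routine (the SfM/VIO update itself, supplied by the caller) are illustrative assumptions:

```python
def update_maps(targets, allocations, map_db, update_request_db, run_update):
    """Run steps S303-S307 for each 3D map targeted by an update request."""
    for target in targets:
        budget = allocations[target.map_id]          # V_x from Equation 3
        state = map_db.load_progress(target.map_id)  # resume if interrupted (S303)
        result = run_update(target, state, budget)   # update until done or budget spent
        if result.completed:                         # S304: finished within budget?
            map_db.replace(target.map_id, result.map)          # S305: swap in new map
            update_request_db.delete_data(target.map_id)       # S306: drop used data
        else:
            map_db.save_progress(target.map_id, result.state)  # S307: checkpoint, suspend
```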
- In step S401, the sensor unit 16B of the client 10B captures a camera image of the surroundings to obtain sensing data, based on the user's input operation.
- In step S402, the client 10B transmits the camera image, which is sensing data, to the information processing device 100 as position and orientation estimation data.
- In step S501, the information processing device 100 receives the position and orientation estimation data transmitted from the client 10B.
- In step S502, the 3D map selection unit 108 identifies, from among the plurality of 3D maps stored in the 3D map DB 104, the 3D map to be used for position and orientation estimation, based on the features of the camera image received as position and orientation estimation data. At this time, coarse position information such as GPS or mobile base station information may be used to narrow down the candidate 3D maps.
- the 3D map selection unit 108 acquires the selected 3D map from the 3D map DB 104 and outputs it to the position and orientation estimation unit 109.
- Specifically, the 3D map selection unit 108 extracts feature points and feature amounts from the camera image serving as position and orientation estimation data, compares them with the feature points and feature amounts included in each 3D map, and identifies a 3D map that matches or approximates them as the 3D map to be used for position and orientation estimation.
- In step S503, the position and orientation estimation unit 109 estimates the position and orientation of the client 10B using the camera image, which is position and orientation estimation data, and the 3D map.
- In step S504, the information processing device 100 records, in the 3D map usage record DB 103, information identifying the 3D map used for position and orientation estimation (an ID or the like) and the time when the estimation was performed.
- In step S505, the information processing device 100 transmits the position and orientation estimation result to the client 10B.
- In step S403, the client 10B receives the position and orientation estimation result transmitted from the information processing device 100.
- In step S404, the client 10B executes the AR application using the position and orientation estimation result.
- Specifically, the application control unit 17B acquires the position and orientation estimation result transmitted from the server 20 and outputs it to the drawing unit 18B, and the drawing unit 18B draws a virtual object based on the result to generate an output image. The output image is output to and displayed on the display section 15B.
- Processing in the information processing system 1000 and the information processing device 100 is performed as described above.
- In step S601, the client 10A transmits a camera image as 3D map update data to the information processing device 100 based on an input instruction from user A, to request an update of the 3D map.
- Assume that the client 10A has transmitted camera image A shown in FIG. 9 to the information processing apparatus 100.
- In step S602, the client 10B transmits a camera image as sensing data to the information processing apparatus 100 based on an input instruction from user B, making a position and orientation estimation request in order to use an AR application.
- user B may or may not know that user A exists or that user A has requested an update of the 3D map.
- the client 10B transmits the camera image B shown in FIG. 10 to the information processing apparatus 100 as position and orientation estimation data.
- Camera image A shown in FIG. 9 and camera image B shown in FIG. 10 are images captured from the same position and in the same direction. It is assumed that camera image A and camera image B were captured at close, though not identical, dates and times, and that the objects other than the moving body, that is, the automobile, are the same. Therefore, the 3D map targeted by the 3D map update request from the client 10A and the 3D map used for the position and orientation estimation performed in response to the request from the client 10B are the same. Note that the same position is not limited to the exactly same position but allows an error of several meters to several tens of meters, and the same direction is not limited to the exactly same direction but allows an angular error of several degrees to several tens of degrees. Moreover, the imaging times of camera image A and camera image B need not be the same.
- The information processing device 100 performs position and orientation estimation in response to the position and orientation estimation request from the client 10B, but at this time the 3D map used for the estimation may be old.
- Assume that the past camera images used to generate that 3D map were as shown in FIG. 11, that is, the scenery at the time of 3D map generation was in the state shown in FIG. 11, and the scenery has since changed to the state shown in FIGS. 9 and 10.
- Comparing the past camera image in FIG. 11 with the current camera images in FIGS. 9 and 10, the building in the area surrounded by the broken line in FIG. 11 has changed. In such a case, the features of the changed area cannot be used for position and orientation estimation, so the estimation will fail or produce a result with low accuracy.
- In step S603, the information processing device 100 updates the 3D map in response to the update request from the client 10A.
- In step S604, based on an input instruction from user B, the client 10B transmits a camera image to the information processing device 100 as position and orientation estimation data and requests position and orientation estimation again. It is assumed that this camera image is captured at the same position and in the same direction as camera image B shown in FIG. 10.
- In step S604, user B is simply using the AR application and is not aware that the 3D map has been updated; the client 10B is not notified that the 3D map has been updated.
- User B simply used the AR application twice in step S602 and step S604.
- the information processing device 100 estimates the position and orientation of the client 10B using the received camera image and 3D map.
- Here, between the position and orientation estimation request in step S602 and that in step S604, the 3D map was updated in step S603 using camera image A, which reflects the current scenery. Since camera image A is close in content to camera image B, the position and orientation of the client 10B are estimated using camera image B and the 3D map updated with camera image A.
- Therefore, the position and orientation estimation result for this request has higher accuracy than the result for the request in step S602. Using this highly accurate result, the client 10B can draw virtual objects with higher accuracy.
- the processing in this technology is performed as described above.
- computer resources for updating a 3D map are allocated based on the priority, so the 3D map with a higher priority can be updated more quickly.
- the 3D map can be updated while suppressing a decrease in accuracy and an increase in generation cost.
- highly accurate position and orientation estimation becomes possible.
- the accuracy of position and orientation estimation in an environment with changing scenery can also be improved.
- Sensing data is not limited to camera images; acceleration information, distance information, position information, and the like acquired by an IMU, LiDAR, a ToF sensor, GPS (Global Positioning System), and the like may be used in place of, or in combination with, camera images.
- In the embodiment, a single information processing apparatus 100 performs the processing in the server 20, but as shown in FIG. 12, the processing may be divided among information processing devices 100A, 100B, and 100C, each handled by a different server. Note that the division of information processing devices and servers shown in FIG. 12 is merely an example, and the division is not limited to this example.
- In the embodiment, the client 10A that makes a 3D map update request and the client 10B that makes a position and orientation estimation request are described as different clients, but a single client 10 may make both a 3D map update request and a position and orientation estimation request.
- Although the embodiment has been described as having one server 20, the number of servers 20 is not limited.
- the present technology may be combined with a 3D map update method that uses multiple servers 20. This can reduce the time required to update the 3D map.
- the present technology may be used in conjunction with existing low-precision 3D map generation methods. For example, upon receiving the update request, the information processing apparatus 100 first generates a low-precision 3D map at high speed and updates the 3D map. Although the accuracy of the position and orientation estimation results will not improve because it has been updated with a low-precision 3D map, the success rate of position and orientation estimation can be expected to improve because the position and orientation can be estimated using a 3D map based on the new scenery. After that, a high-precision 3D map is generated using the present technology, and the low-precision 3D map is updated with the high-precision 3D map.
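- A sketch of this two-phase combination is shown below; the function names and the scheduler are illustrative assumptions, not taken from the text:

```python
def handle_update_request(request, map_db, scheduler, generate_low_precision_map):
    """Publish a quickly generated low-precision map first, so position and
    orientation estimation can already succeed against the new scenery,
    then queue the prioritized high-precision update."""
    rough = generate_low_precision_map(request.sensing_data)  # fast first pass
    map_db.replace(request.map_id, rough)  # success rate improves immediately
    scheduler.enqueue(request)             # accuracy improves when the
                                           # prioritized update completes
```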
- In the embodiment, the position and orientation estimation unit 109 estimates the position and orientation of the client 10, but the information processing device 100 can also use the VPS to estimate only the position or only the orientation of the client 10.
- the present technology can also have the following configuration.
- a priority setting unit that sets a priority when updating each of the plurality of 3D maps that are the targets of the update request;
- An information processing device comprising: a 3D map update unit that updates the 3D map that is the target of an update request using sensing data based on the priority.
- the priority setting unit sets the priority based on information regarding the 3D map.
- the priority setting unit sets a value obtained from information regarding one of the 3D maps or a total of values obtained from information regarding a plurality of the 3D maps as the priority.
- The information processing device according to (2) or (3), wherein the information regarding the 3D map is the number of times a position and orientation estimation request using the 3D map targeted by the update request has been made after the update request was made.
- the information processing device according to any one of (2) to (6), wherein the information regarding the 3D map is a degree of similarity between the sensing data and the 3D map.
- the information processing device according to any one of (2) to (7), wherein the information regarding the 3D map is the number of past update requests for the 3D map.
- the information processing device according to any one of (2) to (8), wherein the information regarding the 3D map is a priority set by a person involved in an application that uses the result of position and orientation estimation using the 3D map.
- the information processing device according to any one of (2) to (9), wherein the information regarding the 3D map is a priority set by a user of an application that uses the result of position and orientation estimation using the 3D map.
- the information processing device according to any one of (1) to (10), wherein the priority setting unit identifies the 3D map that is the target of an update request from among the plurality of 3D maps based on the sensing data.
- The information processing apparatus according to any one of (1) to (11), including a computer resource allocation unit that allocates, based on the priority, the computer resources to be used for updating the 3D map to each of the 3D maps to be updated.
- the computer resource allocation unit allocates the computer resources to each 3D map by proportionally allocating the computer resources based on the priority.
- the information processing device according to any one of (1) to (13), wherein the sensing data is an image of real space captured by a camera.
- the information processing apparatus including a 3D map selection unit that selects a 3D map to be used for estimating the position and orientation of the device based on sensing data.
- the information processing apparatus further comprising a position and orientation estimation unit that estimates the position and orientation of the device using the sensing data and the selected 3D map.
- the device is an AR device.
- An information processing system comprising: a client that transmits sensing data acquired by a sensor unit and makes an update request; and an information processing device including a priority setting unit that sets a priority for updating each of a plurality of 3D maps targeted by update requests, and a 3D map update unit that updates the 3D map targeted by the update request using the sensing data, based on the priority.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
- Instructional Devices (AREA)
Abstract
An information processing device comprising: a priority setting unit that sets an update priority for each of a plurality of 3D maps that are targets of an update request; and a 3D map update unit that updates the 3D maps targeted by the update request using sensing data, on the basis of the priorities.
Description
The present technology relates to an information processing device, a program, and an information processing system.
In AR (Augmented Reality), the same 3D object (virtual object) may be displayed on multiple devices at a specific location in real space. To do this, it is necessary to specify the position and orientation of each device using a common coordinate system. As a means for realizing this, there is a method called VPS (Visual Positioning System) that estimates the position and orientation of each terminal from camera images.
In VPS, a 3D map of the real space is used, and by associating the camera images sent from each device with the features of the 3D map, the position and orientation at which each camera image was captured are estimated in the 3D map coordinate system. 3D maps can be generated by simultaneously estimating the three-dimensional shape and the imaging points for multiple images from different viewpoints, using technologies such as SfM (Structure from Motion) and VIO (Visual Inertial Odometry). When the position and orientation of a terminal are estimated from the correspondence between a camera image captured at the time of use and a 3D map generated from past data, it is desirable that the current real-space scenery not diverge from the real-space scenery at the time the 3D map was generated.
However, since many objects in real space can move and deform, the environment may have changed between the time the 3D map was generated and the present. This change in the environment adversely affects the success rate and accuracy of position and orientation estimation, causing the problem that virtual objects are not displayed at their intended positions.
Such problems could be solved by updating the 3D map of the real space every time an object moves or deforms, but updating a 3D map takes a long processing time, and the old 3D map continues to be used until the update is completed, so the accuracy and success rate of position and orientation estimation remain low.
Therefore, a technique has been proposed that acquires additional data about an environment by comparing an image with a dataset stored in a database and requesting acquisition of additional data when it is determined that the stored dataset lacks one or more details of the environment.
Although the technology of Patent Document 1 can be used to collect data for creating a 3D map of the environment, there remains an unresolved problem that generating a 3D map takes time.
The present technology was developed in view of these problems, and its purpose is to provide an information processing device, a program, and an information processing system that can update 3D maps efficiently by setting update priorities for the 3D maps.
In order to solve the above-mentioned problems, a first technique is an information processing apparatus including a priority setting unit that sets a priority for updating each of a plurality of 3D maps that are targets of an update request, and a 3D map update unit that updates a 3D map targeted by an update request using sensing data, based on the priority.
In addition, a second technique is a program that causes a computer to execute an information processing method that sets a priority for updating each of a plurality of 3D maps that are targets of an update request and, based on the priority, updates a 3D map targeted by an update request using sensing data.
Furthermore, a third technique is an information processing system comprising a client that transmits sensing data acquired by a sensor unit to make an update request, and an information processing apparatus including a priority setting unit that sets a priority for updating each of a plurality of 3D maps that are targets of an update request, and a 3D map updating unit that updates a 3D map targeted by an update request using the sensing data, based on the priority.
Embodiments of the present technology will be described below with reference to the drawings. Note that the explanation will be given in the following order.
<1. Embodiment>
[1-1. Configuration of client 10 and server 20]
[1-2. Configuration of information processing system 1000 and information processing device 100]
[1-3. Processing in information processing system 1000 and information processing device 100]
[1-4. Specific example of processing in information processing system 1000 and information processing device 100]
<2. Modified example>
<1. Embodiment>
[1-1. Configuration of client 10 and server 20]
The client 10 included in the information processing system 1000 will be described with reference to FIG. 1. The client 10 includes at least a control section 11, a storage section 12, a communication section 13, an input section 14, a display section 15, and a sensor section 16.
The control unit 11 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like. The CPU controls the entire client 10 and each part by executing various processes and issuing commands according to programs stored in the ROM.
The storage unit 12 is, for example, a large capacity storage medium such as a hard disk or flash memory. The storage unit 12 stores AR applications and various data used by the client 10.
The communication unit 13 is a communication module that communicates with the server 20, the Internet, and the like. The communication method may be wired or wireless, and can include cellular communication, Wi-Fi, Bluetooth (registered trademark), NFC (Near Field Communication), Ethernet (registered trademark), HDMI (registered trademark) (High-Definition Multimedia Interface), USB (Universal Serial Bus), and the like.
The input unit 14 is used by the provider to input information and give various instructions to the client 10. When an input is made to the input unit 14 by the user, a control signal corresponding to the input is generated and supplied to the control unit 11. The control unit 11 then performs various processes corresponding to the control signal. In addition to physical buttons, the input unit 14 includes a touch panel, a touch screen integrated with a monitor, and the like.
The display unit 15 is a display device such as a display that displays images, videos, a UI (User Interface) for using the client 10, and the like.
The sensor unit 16 comprises various sensors for acquiring sensing data. As the sensor unit 16, a camera, an IMU (Inertial Measurement Unit), LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), a ToF (Time of Flight) sensor, or the like can be used. Furthermore, information that can be used to specify a location, such as location information obtained by GPS (Global Positioning System), network base station information, or a Wi-Fi access point name, may also be used as sensing data.
In this embodiment, a camera is used as the sensor unit 16. The camera includes a lens, an image sensor, a video signal processing circuit, and the like, and captures images of the real world as camera images, that is, as sensing data to be transmitted to the server 20.
The client 10 is configured as described above. Examples of the client 10 include a smartphone, a tablet terminal, a digital camera, a wearable device, a head-mounted display, and a personal computer. The control program that performs the processing according to the present technology may be installed in the client 10 in advance, or may be downloaded, distributed on a storage medium, etc., and installed by the provider himself.
Next, the server 20 that constitutes the information processing system 1000 will be explained with reference to FIG. 2. The server 20 includes at least a control section 21, a storage section 22, and a communication section 23. The control unit 21, storage unit 22, and communication unit 23 have the same functions as those provided in the client 10. The server 20 may be a cloud server, for example.
[1-2. Configuration of information processing system 1000 and information processing device 100]
Next, the configurations of the information processing system 1000 and the information processing apparatus 100 will be described with reference to FIGS. 3 and 4.
As shown in FIG. 3, the information processing system 1000 includes a plurality of clients 10 and a server 20 that performs processing as the information processing device 100. A plurality of clients 10 and servers 20 are connected via a network. The client 10 makes one or both of a 3D map update request and a position and orientation estimation request to the information processing apparatus 100. Note that the client 10 functions as an AR device by executing an AR application, but a client that only requests a 3D map update and does not request position and orientation estimation may not function as an AR device. There is no particular limitation on the number of clients 10, and the clients 10 that utilize 3D map updating and position and orientation estimation by the information processing apparatus 100 are connected to the information processing apparatus 100 via a network.
FIG. 4 extracts and shows the information processing device 100 constituting the information processing system 1000 and two of the plurality of clients, client A (denoted as client 10A) and client B (denoted as client 10B). Note that FIG. 4 shows only the processing blocks related to the information and signals involved in the present technology.
In FIG. 4, data transmission and reception between the client 10A and the information processing device 100 and data transmission and reception between the client 10B and the information processing device 100 are performed by communication functions provided in the client 10A, the client 10B, and the server 20.
The client 10A requests a 3D map update by transmitting sensing data acquired by the sensor unit 16A to the server 20 as 3D map update data. Note that the client 10A may function as an AR device by executing an AR application, or may not function as an AR device.
Furthermore, the client 10B functions as an AR device by executing an AR application. The client 10B makes a request to estimate the position and orientation of the client 10B by transmitting sensing data acquired by the sensor unit 16B to the information processing device 100 as position and orientation estimation data.
In addition to the configuration shown in FIG. 1, the client 10B includes an application control unit 17B and a drawing unit 18B.
The application control unit 17B performs processing related to the AR application executed by the client 10, and acquires the position and orientation estimation results sent from the server 20 and outputs them to the drawing unit 18B.
The drawing unit 18B draws a virtual object based on the position and orientation estimation result transmitted from the information processing device 100 to generate an output image, and outputs the output image to the display unit 15B, where it is displayed. Virtual objects include, for example, text, figures, and objects that present information in a map service, and characters placed on images of the real world.
The application control unit 17B and the drawing unit 18B are realized by the client 10B executing an AR application; the application may be installed in the client 10B in advance, or may be distributed by download or on a storage medium and installed by the user. Alternatively, the client 10B may be provided with the functions of the application control unit 17B and the drawing unit 18B in advance.
The information processing device 100 includes an update request DB (database) control unit 101, an update request DB 102, a 3D map usage record DB 103, a 3D map DB 104, a priority setting unit 105, a computer resource allocation unit 106, a 3D map update unit 107, a 3D map selection unit 108, and a position and orientation estimation unit 109. In this embodiment, the storage unit 22 of the server 20 functions as the update request DB 102, the 3D map usage record DB 103, and the 3D map DB 104, but each DB may be configured with a different storage medium.
The update request DB control unit 101 receives 3D map update data sent from the client 10 and stores it in the update request DB 102. In the example of FIG. 4, the update request DB control unit 101 stores the 3D map update data sent from the client 10A in the update request DB 102.
The update request DB 102 is a database that stores 3D map update data sent from the client 10.
The 3D map usage record DB 103 is a database that records which 3D map was used to perform position and orientation estimation based on a position and orientation estimation request from the client 10.
The 3D map DB 104 is a database that stores 3D maps that model real space. Multiple 3D maps exist, one for each point in real space. A 3D map can be generated by simultaneously estimating the three-dimensional shape and the imaging points of multiple images from different viewpoints using techniques such as SfM (Structure from Motion) and VIO (Visual Inertial Odometry). A 3D map contains data such as three-dimensional shapes and feature points (landmarks) that model objects in real space, the feature points and feature amounts used for position and orientation estimation, and the images used for three-dimensional shape estimation (key frames). Examples of feature points include the position of an object, the positions of the vertices of its acute-angled parts, and the outline of the object. In the present technology, it is assumed that a plurality of generated 3D maps are stored in the 3D map DB 104 in advance.
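To make the stored contents concrete, the following is a minimal sketch of what one record of the 3D map DB 104 might hold; the field names and types are illustrative assumptions, not taken from the publication.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class Map3DRecord:
    """Hypothetical layout of one entry in the 3D map DB 104."""
    map_id: str
    landmarks: np.ndarray    # (N, 3) 3D positions of feature points
    descriptors: np.ndarray  # (N, D) feature amounts used for matching
    keyframes: list[np.ndarray] = field(default_factory=list)  # images used in SfM/VIO reconstruction
```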
The priority setting unit 105 sets priorities for each of the plurality of 3D maps that are the targets of update requests based on information regarding the 3D maps.
The computer resource allocation unit 106 allocates, based on the priorities, the computer resources to be used for updating the 3D maps. In this embodiment, the computer is the server 20 that updates the 3D maps, and the computer resources are amounts of the server 20's hardware that are occupied, such as execution time on the server 20, processor utilization, and the amount of the storage unit 22 that is occupied.
The 3D map update unit 107 updates the 3D map that is the target of the update request, using sensing data as 3D map update data and the allocated computer resources.
The 3D map selection unit 108 selects the 3D map to be used for position and orientation estimation from among the plurality of 3D maps stored in the 3D map DB 104, based on the sensing data transmitted from the client 10 as position and orientation estimation data. In the example of FIG. 4, the 3D map selection unit 108 selects a 3D map based on the position and orientation estimation data transmitted from the client 10B.
The position and orientation estimation unit 109 uses VPS to estimate the position and orientation of the client 10 based on the correspondence between the camera image transmitted from the client 10 as position and orientation estimation data and the 3D map selected by the 3D map selection unit 108. In VPS, the position and orientation of the client 10 are estimated in the 3D map coordinate system from the correspondences between the feature points and feature amounts of the camera image and those of the 3D map, using a method such as Perspective-n-Point.
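As a rough illustration of this step, the sketch below matches image features against a map's stored feature amounts and solves Perspective-n-Point with OpenCV; the variable layout and the intrinsic matrix K are assumptions, and real VPS pipelines differ in their matching and verification details.

```python
import cv2
import numpy as np

def estimate_pose(img_keypoints, img_descriptors, map_landmarks, map_descriptors, K):
    """Estimate camera position/orientation in the 3D map coordinate system."""
    # 2D-3D correspondences from descriptor matching (brute force for brevity).
    matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
    matches = matcher.match(img_descriptors, map_descriptors)
    pts_2d = np.float32([img_keypoints[m.queryIdx].pt for m in matches])
    pts_3d = np.float32([map_landmarks[m.trainIdx] for m in matches])

    # Perspective-n-Point with RANSAC to reject outlier correspondences.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(pts_3d, pts_2d, K, None)
    return (rvec, tvec) if ok else None
```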
The position and orientation estimation result from the position and orientation estimation unit 109 is transmitted to the client 10 that made the position and orientation estimation request. In the example of FIG. 4, the position and orientation estimation unit 109 estimates the position and orientation of the client 10B based on the correspondence between the position and orientation estimation data transmitted from the client 10B and the 3D map, and the information processing device 100 transmits the result to the client 10B. The drawing unit 18B of the client 10B uses this position and orientation estimation result to draw virtual objects in AR.
The information processing system 1000 and the information processing device 100 are configured as described above. In this embodiment, the information processing device 100 operates on the server 20; the server 20 may have the functions of the information processing device 100 in advance, or the information processing device 100 and the information processing method may be realized by the server 20, which functions as a computer, executing a program. Alternatively, the control unit 21 may function as the information processing device 100 by executing a program. The program may be installed in the server 20 in advance, or may be distributed by download or on a storage medium and installed by a user or the like. The information processing device 100 may also be configured as a standalone device. Furthermore, the information processing device 100 is not limited to the server 20, and may operate on a personal computer, a smartphone, a tablet terminal, or the like.
[1-3. Processing in information processing system 1000 and information processing device 100]
Next, processing in the information processing system 1000 and the information processing device 100 will be described with reference to FIGS. 5 to 7. In the following description, it is assumed that the sensing data acquired by the sensor unit 16 is a camera image captured by a camera.
As shown in FIG. 5, first, in step S101, based on the user's input operation, the sensor unit 16A of the client 10A captures a camera image of the surroundings to acquire sensing data.
Next, in step S102, the client 10A transmits the camera image, which is the sensing data, to the information processing device 100 as 3D map update data.
Next, in step S201, the information processing device 100 receives 3D map update data sent from the client 10A.
Next, in step S202, the update request DB control unit 101 stores the 3D map update data in the update request DB 102, and further records the reception time of the 3D map update data.
Next, in step S203, the priority setting unit 105 identifies, based on the 3D map update data, the 3D map that is the target of the update request from among the plurality of 3D maps stored in advance in the 3D map DB 104. The priority setting unit 105 extracts feature points and feature amounts from the camera image received as 3D map update data, compares them with the feature points and feature amounts of each of the plurality of 3D maps stored in the 3D map DB 104, and identifies the 3D map whose features match or approximate them as the target of the update request.
It is assumed that, in addition to the client 10A, a plurality of other clients 10 have made 3D map update requests, and that each of these clients 10 requests an update of a different 3D map.
Next, priority setting, computer resource allocation, and 3D map updating in the information processing device 100 will be described with reference to FIG. 6.
First, in step S301, the priority setting unit 105 sets priorities for all 3D maps that are the targets of update requests.
Here, the setting of priorities will be explained. Specifically, for the x-th 3D map among the n 3D maps that are targets of update requests, scores are set based on information regarding the 3D map. In this embodiment, it is assumed that there are seven types of information regarding a 3D map, and the scores calculated from the respective pieces of information are denoted a_x, b_x, c_x, d_x, e_x, f_x, and g_x. The priority p_x of the x-th 3D map is then set from the total of these scores.
The first piece of information regarding the 3D map is "the number of times position and orientation estimation requests have been made since the 3D map update data was saved." This is the number of times position and orientation estimation was performed based on the 3D map that is the target of the update request before it was updated, even though the information processing device 100 had already received the 3D map update data. The time at which the 3D map update data is received is, for example, the time at which the update request DB control unit 101 saves the 3D map update data in the update request DB 102, but it may also be the time at which the server 20 receives the 3D map update data. This number of times can be determined by comparing the times of the position and orientation estimation requests stored in the 3D map usage record DB 103 with the time at which the update request was received. This count is used as the score a_x. Because position and orientation estimation was performed based on the pre-update 3D map even though update data had been received, the accuracy of position and orientation estimation using that 3D map is considered to have deteriorated. Such a 3D map should therefore be updated preferentially, so the score a_x raises its priority.
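A minimal sketch of counting a_x from the recorded timestamps follows; the record shapes are assumptions, since the text only specifies that the request times in the 3D map usage record DB 103 are compared with the time the update request was received.

```python
from datetime import datetime

def score_a(map_id: str,
            usage_records: list[tuple[str, datetime]],  # (map_id, pose-estimation request time)
            update_received_at: datetime) -> int:
    """a_x: pose estimations served by this map after its update data was saved."""
    return sum(1 for mid, t in usage_records
               if mid == map_id and t >= update_received_at)
```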
The second piece of information regarding the 3D map is "the number of times it has been used for position and orientation estimation in the past." This count is recorded in the 3D map usage record DB 103 and is used as the score b_x. A 3D map that has been used for position and orientation estimation many times in the past is expected to be used many times in the future as well. Such a 3D map should therefore be updated preferentially, so the score b_x raises its priority.
The third piece of information regarding the 3D map is "the proportion of the 3D map update data that can be used to update the 3D map." This proportion is used as the score c_x. The 3D map update data transmitted from the client 10 may contain information that cannot be used to update the 3D map: for example, camera images whose dark areas are crushed because of insufficient brightness (light intensity), or camera images that cannot be used for a static 3D map because they contain many moving objects. The higher the proportion of usable information in the 3D map update data, the higher the priority given by the score c_x.
Regions of a camera image with insufficient brightness can be detected, for example, based on whether the average brightness value over a specific region of the camera image falls below a specific threshold. Moving-object regions in an image can be detected from the results of semantic segmentation or the like. The number of pixels in the regions of each camera image that cannot be used for updating the 3D map is calculated, and their total s is determined. Since more usable update data should mean a higher priority, s is subtracted from the total number of pixels g of the camera images, and the result is divided by g to obtain the proportion of usable data in the 3D map update data, i.e., c_x = (g - s) / g.
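The following sketch computes c_x = (g - s) / g under assumed details: a per-pixel brightness threshold stands in for the region-average test described above, and moving-object pixels are given as precomputed boolean segmentation masks.

```python
import numpy as np

def score_c(images: list[np.ndarray],
            moving_masks: list[np.ndarray],
            brightness_thresh: float = 30.0) -> float:
    """c_x: proportion of pixels in the update data usable for map updating."""
    g = sum(img.shape[0] * img.shape[1] for img in images)  # total pixel count g
    s = 0                                                   # unusable pixel count s
    for img, mask in zip(images, moving_masks):
        gray = img.mean(axis=2) if img.ndim == 3 else img
        too_dark = gray < brightness_thresh                 # under-exposed pixels
        s += int(np.count_nonzero(too_dark | mask))         # dark or moving-object pixels
    return (g - s) / g
```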
The fourth piece of information regarding the 3D map is "the degree of similarity between the 3D map update data and the existing 3D map." For example, the rate at which the semantic segmentation result of a camera image serving as 3D map update data matches the semantic segmentation results of the camera images contained in the existing 3D map can be used as the similarity; with this rate denoted d_x, the score is 1/d_x. Some camera images in the 3D map update data may be similar to the existing 3D map, for example images of places where the scenery has not changed. Since the need to update a 3D map is greater in places where the scenery is changing, the lower the similarity between the update data and the existing 3D map, the higher the priority given by the score 1/d_x.
The fifth piece of information regarding the 3D map is "the number of update requests for the 3D map." This count can be obtained by referring to the update request DB 102 and is used as the score e_x. The more update requests a given 3D map has received, the greater its need for updating is considered to be, so a larger number of update requests gives a higher priority through the score e_x.
The sixth piece of information regarding the 3D map is "a priority for the 3D map set by a person involved in the AR application." When a virtual object is displayed in AR using a position and orientation estimation result, the required estimation accuracy may differ depending on the type of virtual object to be displayed. For example, if a virtual object floats in the air, the user cannot perceive a slight positional shift; but if a virtual object is placed in contact with a real object, the user perceives any deviation from the relative arrangement of the objects, so highly accurate position and orientation estimation is required to draw that virtual object. Therefore, persons involved in the AR application are allowed to set the update priority of a 3D map according to the required superimposition accuracy of virtual objects. This value is f_x. A person involved in the AR application is typically its developer, but may also be an operator, an AR service provider, or the like.
The seventh piece of information regarding the 3D map is "a priority for the 3D map set by the user." If a user perceives a shift in the superimposed position of a virtual object caused by degraded position and orientation estimation accuracy while using the AR application, the 3D map used for that estimation likely needs to be updated. The user is therefore allowed to set the update priority of the 3D map in the hope of improving the superimposition accuracy of virtual objects. This value is g_x.
The priority is then calculated based on the scores obtained from these first to seventh pieces of information regarding the 3D map. For example, the priority p_x for the x-th 3D map can be calculated using Equation 1 below.
Note that in Equation 1, scores obtained as counts are converted into ratios before the priority is calculated.
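Equation 1 itself is not reproduced in this text. A plausible form consistent with the description, in which the seven scores are summed after the count-based ones are converted to ratios, is sketched below; the normalization by overall totals is an assumption.

```python
def priority_p(a_x, b_x, c_x, d_x, e_x, f_x, g_x,
               a_total, b_total, e_total) -> float:
    """Hypothetical reading of Equation 1 for the priority p_x."""
    return (a_x / a_total    # requests since update data arrived, as a ratio
            + b_x / b_total  # past usage count, as a ratio
            + c_x            # usable-data proportion, already in [0, 1]
            + 1.0 / d_x      # low similarity d_x -> high score
            + e_x / e_total  # update-request count, as a ratio
            + f_x            # priority set by AR application personnel
            + g_x)           # priority set by the user
```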
The first to seventh pieces of information described above are merely examples, and the priority need not always be set based on all seven of them. The priority may be set based on any one of them, or on any combination of them. The priority setting unit 105 therefore sets, as the priority, the score obtained from one piece of information regarding the 3D map or the total of the scores obtained from a plurality of pieces of information. The priority may also be set based on information regarding the 3D map other than these seven. Reducing the number of pieces of information used for priority setting shortens the time the setting takes and is thus expected to allow the 3D maps to be updated sooner.
Note that the pieces of information regarding the 3D map described above may be weighted according to various conditions; for example, the sixth piece of information may be weighted on the grounds that the AR application developer's judgment is the most important.
Returning to the flowchart: next, in step S302, the computer resource allocation unit 106 sets the allocation of computer resources to each of the plurality of 3D maps that are targets of update requests by proportional allocation based on the priorities. The proportional allocation coefficient r_x can be calculated using Equation 2 below.
For example, when the available computer resources of the server 20 are V, the computer resources V_x that can be used to update the x-th 3D map can be calculated as in Equation 3 below.
V_x = V · r_x
For example, if the proportional allocation coefficient r_x of 3D map α is 0.7 and that of 3D map β is 0.3, then 70% of the computer resources are allocated to updating 3D map α and 30% to updating 3D map β.
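Equation 2 is likewise not reproduced here; the natural reading r_x = p_x / Σ_i p_i is assumed in the sketch below, which also matches the 70%/30% example just given.

```python
def allocate_resources(priorities: dict[str, float],
                       total_resources: float) -> dict[str, float]:
    """Assumed Equation 2: r_x = p_x / sum(p_i); Equation 3: V_x = V * r_x."""
    total_p = sum(priorities.values())
    return {map_id: total_resources * (p / total_p)
            for map_id, p in priorities.items()}

# Example: priorities 7 and 3 yield 70% and 30% of the resources.
# allocate_resources({"alpha": 7.0, "beta": 3.0}, total_resources=100.0)
# -> {"alpha": 70.0, "beta": 30.0}
```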
Next, in step S303, the 3D map update unit 107 uses the allocated computer resources to update the 3D map that is the target of the update request. If updating of that 3D map was interrupted in the past, the update is resumed; otherwise, a new update of the 3D map is executed. As with 3D map generation, the 3D map can be updated using the camera images serving as 3D map update data together with techniques such as SfM and VIO.
Next, in step S304, if the update of the 3D map is completed before the allocated computer resources are used up, the process proceeds to step S305 (Yes in step S304).
Then, in step S305, the 3D map updating unit 107 replaces the 3D map that is the target of the update request and is stored in the 3D map DB 104 with the updated 3D map.
Further, in step S306, the update request DB control unit 101 deletes the 3D map update data used for updating the 3D map from the update request DB 102.
On the other hand, if the update of the 3D map is not completed before the allocated computer resources are used up, the process proceeds to step S307 (No in step S304).
Then, in step S307, the 3D map update unit 107 saves the progress of the 3D map update in the 3D map DB 104 and interrupts the update midway.
The information processing apparatus 100 executes steps S303 to S307 for each 3D map that is the subject of an update request.
Next, a request for position and orientation estimation from the client 10B and execution of position and orientation estimation in the information processing device 100 will be described with reference to FIG.
First, in step S401, based on the user's input operation, the sensor unit 16B of the client 10B captures a camera image of the surroundings to acquire sensing data.
Next, in step S402, the client 10B transmits the camera image, which is the sensing data, to the information processing device 100 as position and orientation estimation data.
Next, in step S501, the information processing device 100 receives position and orientation estimation data transmitted from the client 10B.
Next, in step S502, the 3D map selection unit 108 identifies, based on the features of the camera image received as position and orientation estimation data, the 3D map to be used for position and orientation estimation from among the plurality of 3D maps stored in the 3D map DB 104. At this time, coarse position information such as GPS or mobile-network base station information may be used to narrow down the candidate 3D maps. The 3D map selection unit 108 extracts feature points and feature amounts from the camera image serving as position and orientation estimation data, compares them with the feature points and feature amounts contained in each 3D map, and identifies the 3D map whose features match or approximate them as the one to use for position and orientation estimation. It then acquires the selected 3D map from the 3D map DB 104 and outputs it to the position and orientation estimation unit 109.
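A sketch of this selection step follows; the match-count scoring, the GPS pre-filter radius, and the per-map origin_gps field are assumptions about details the text leaves open.

```python
import math
import cv2

def distance_m(a: tuple[float, float], b: tuple[float, float]) -> float:
    """Rough planar distance in meters between two (lat, lon) pairs."""
    lat = math.radians((a[0] + b[0]) / 2)
    dx = (b[1] - a[1]) * 111_320 * math.cos(lat)
    dy = (b[0] - a[0]) * 110_540
    return math.hypot(dx, dy)

def select_map(query_descriptors, maps, gps=None, radius_m=200.0):
    """Pick the stored 3D map whose feature amounts best match the query image."""
    # Optional coarse narrowing by GPS or base-station position, as in step S502.
    candidates = [m for m in maps
                  if gps is None or distance_m(m.origin_gps, gps) < radius_m]
    matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
    # Score by the number of mutually consistent descriptor matches.
    return max(candidates,
               key=lambda m: len(matcher.match(query_descriptors, m.descriptors)))
```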
Next, in step S503, the position and orientation estimating unit 109 estimates the position and orientation of the client 10B using the camera image and the 3D map, which are position and orientation estimation data.
Next, in step S504, the information processing device 100 records information (ID, etc.) that identifies the 3D map used for position and orientation estimation and the time when position and orientation estimation was performed in the 3D map usage record DB 103.
Next, in step S505, the information processing device 100 transmits the position and orientation estimation results to the client 10B.
Next, in step S403, the client 10B receives the position and orientation estimation results transmitted from the information processing device 100.
Then, in step S404, the client 10B executes the AR application using the position and orientation estimation result. The application control unit 17B acquires the position and orientation estimation result transmitted from the server 20 and outputs it to the drawing unit 18B; the drawing unit 18B draws a virtual object based on the result to generate an output image and outputs it to the display unit 15B, where it is displayed.
Processing in the information processing system 1000 and the information processing device 100 is performed as described above.
[1-4. Specific example of processing in information processing system 1000 and information processing device 100]
Next, as a specific example, consider a case where user A, who uses the client 10A, and user B, who uses the client 10B, are present in the same space. For convenience of explanation, consider the simple case in which a 3D map update request and position and orientation estimation requests are made in the order shown in FIG. 8.
First, in step S601, based on an input instruction from user A, the client 10A transmits a camera image to the information processing device 100 as 3D map update data to request an update of the 3D map. Here, it is assumed that the client 10A has transmitted the camera image shown in FIG. 9.
Furthermore, suppose that in step S602, before the information processing device 100 updates the 3D map in response to the 3D map update request, the client 10B, based on an input instruction from user B, transmits a camera image as sensing data to the information processing device 100 and requests position and orientation estimation in order to use an AR application.
Note that user B may or may not know that user A exists or that user A has requested an update of the 3D map. Here, it is assumed that the client 10B has transmitted the camera image B shown in FIG. 10 to the information processing device 100 as position and orientation estimation data.
It is assumed that camera image A shown in FIG. 9 and camera image B shown in FIG. 10 were captured from the same position in the same direction. They were captured not simultaneously but at close dates and times, and the subjects other than the automobile, which is a moving object, are identical. Therefore, the 3D map that is the target of the update request from the client 10A and the 3D map used for the position and orientation estimation requested by the client 10B are the same. Note that "the same position" is not limited to exactly the same position and may include an error of several meters or several tens of meters, and "the same direction" is not limited to exactly the same direction and may include an angular error of several degrees or several tens of degrees. The capture times of camera image A and camera image B also need not be the same.
The information processing device 100 performs position and orientation estimation in response to the request from the client 10B, but the 3D map used for the estimation may be out of date. For example, the past camera image used to generate the 3D map may be the one shown in FIG. 11; that is, the scenery at the time of 3D map generation was as shown in FIG. 11, but it has since changed to that shown in FIGS. 9 and 10. Between the past camera image in FIG. 11 and the current camera images in FIGS. 9 and 10, the building in the area enclosed by the broken line in FIG. 11 has changed. In such a case, the features of the changed area cannot be used to estimate the position and orientation, so the estimation either fails or yields a result with low accuracy.
Next, in step S603, the information processing device 100 updates the 3D map in response to the update request from the client 10A.
Then, suppose that in step S604, based on an input instruction from user B, the client 10B transmits a camera image to the information processing device 100 as position and orientation estimation data and requests position and orientation estimation again. The camera image at this time is assumed to have been captured at the same position and in the same direction as camera image B shown in FIG. 10.
Note that in step S604 user B is merely using the AR application; the user is not aware that the 3D map has been updated, and the client 10B is not notified of the update either. User B has simply used the AR application twice, in step S602 and step S604.
In response to this position and orientation estimation request, the information processing device 100 estimates the position and orientation of the client 10B using the received camera image and the 3D map. Between the position and orientation estimation requests of steps S602 and S604, the 3D map was updated in step S603 with camera image A, which reflects the current scenery. Since camera image A is close in state to camera image B, estimating the position and orientation of the client 10B with the 3D map updated by camera image A and with camera image B yields a more accurate result for the request in step S604 than for the request in step S602. Using this highly accurate position and orientation estimation result, the client 10B can draw virtual objects with higher accuracy.
The processing in the present technology is performed as described above. According to the present technology, computer resources for updating 3D maps are allocated based on priority, so a 3D map with a higher priority can be updated sooner, and 3D maps can be updated while suppressing decreases in accuracy and increases in generation cost. Highly accurate position and orientation estimation then becomes possible using the updated 3D maps, and the accuracy of position and orientation estimation in environments with changing scenery is also improved.
<2. Modified example>
Although the embodiments of the present technology have been specifically described above, the present technology is not limited to the above-described embodiments, and various modifications based on the technical idea of the present technology are possible.
Although camera images are used as the sensing data in the embodiment, the sensing data is not limited to camera images; acceleration information, distance information, position information, and the like acquired by an IMU, LiDAR, a ToF sensor, or GPS (Global Positioning System) may be used instead of, or in combination with, camera images.
In the embodiment, a single information processing device 100 performs processing on the server 20; however, as shown in FIG. 12, the information processing device 100 may be divided by function into information processing devices 100A, 100B, and 100C, each performing its processing on a different server. Note that the division of information processing devices and servers shown in FIG. 12 is merely an example, and the division is not limited to this example.
In the example of FIG. 4, the client 10A that makes the 3D map update request and the client 10B that makes the position and orientation estimation request are described as different clients, but a single client 10 may make both the 3D map update request and the position and orientation estimation request.
Although the embodiment has been described assuming a single server 20, the number of servers 20 is not limited. The present technology may be combined with a 3D map update method that uses a plurality of servers 20, which can reduce the time required to update the 3D maps.
The present technology may also be used in combination with an existing low-accuracy 3D map generation method. For example, upon receiving an update request, the information processing device 100 first generates a low-accuracy 3D map at high speed and updates the 3D map with it. Because the map was updated with a low-accuracy 3D map, the accuracy of the position and orientation estimation results does not improve, but since estimation can now use a 3D map based on the new scenery, the success rate of position and orientation estimation can be expected to improve. Afterwards, a high-accuracy 3D map is generated using the present technology, and the low-accuracy 3D map is replaced with the high-accuracy one.
In the embodiment, the position and orientation estimation unit 109 estimates both the position and the orientation of the client 10, but the information processing device 100 can also use VPS to estimate only the position, or only the orientation, of the client 10.
The present technology can also have the following configurations.
(1)
An information processing apparatus comprising: a priority setting unit that sets a priority for updating each of a plurality of 3D maps that are targets of update requests; and a 3D map update unit that updates, based on the priority, the 3D map that is the target of an update request using sensing data.
(2)
The information processing apparatus according to (1), wherein the priority setting unit sets the priority based on information regarding the 3D map.
(3)
The information processing apparatus according to (2), wherein the priority setting unit sets, as the priority, a value obtained from one piece of information regarding the 3D map or a total of values obtained from a plurality of pieces of information regarding the 3D map.
(4)
The information processing apparatus according to (2) or (3), wherein the information regarding the 3D map is the number of times a position and orientation estimation request using the 3D map that is the target of the update request has been made after the update request was made.
(5)
The information processing apparatus according to any one of (2) to (4), wherein the information regarding the 3D map is the number of times the 3D map has been used for the position and orientation estimation in the past.
(6)
The information processing apparatus according to any one of (2) to (5), wherein the information regarding the 3D map is the ratio of information in the sensing data that can be used to update the 3D map.
(7)
The information processing apparatus according to any one of (2) to (6), wherein the information regarding the 3D map is the degree of similarity between the sensing data and the 3D map.
(8)
The information processing apparatus according to any one of (2) to (7), wherein the information regarding the 3D map is the number of past update requests for the 3D map.
(9)
The information processing apparatus according to any one of (2) to (8), wherein the information regarding the 3D map is a priority set by a person involved in an application that uses the result of position and orientation estimation using the 3D map.
(10)
The information processing apparatus according to any one of (2) to (9), wherein the information regarding the 3D map is a priority set by a user of an application that uses the result of position and orientation estimation using the 3D map.
(11)
The information processing apparatus according to any one of (1) to (10), wherein the priority setting unit identifies the 3D map that is the target of an update request from among the plurality of 3D maps based on the sensing data.
(12)
The information processing apparatus according to any one of (1) to (11), comprising a computer resource allocation unit that allocates, based on the priority, the computer resources to be used for updating the 3D maps to each of the 3D maps to be updated.
(13)
The information processing apparatus according to (12), wherein the computer resource allocation unit allocates the computer resources to each 3D map by proportionally distributing them based on the priority.
(14)
The information processing apparatus according to any one of (1) to (13), wherein the sensing data is an image of real space captured by a camera.
(15)
The information processing apparatus according to any one of (1) to (14), comprising a 3D map selection unit that selects a 3D map to be used for estimating the position and orientation of a device based on sensing data.
(16)
The information processing apparatus according to (15), comprising a position and orientation estimation unit that estimates the position and orientation of the device using the sensing data and the selected 3D map.
(17)
The information processing apparatus according to (15) or (16), wherein the device is an AR device.
(18)
A program that causes a computer to execute an information processing method comprising: setting a priority for updating each of a plurality of 3D maps that are targets of update requests; and updating, based on the priority, the 3D map that is the target of an update request using sensing data.
(19)
An information processing system comprising: a client that transmits sensing data acquired by a sensor unit to make an update request; and an information processing apparatus including a priority setting unit that sets a priority for updating each of a plurality of 3D maps that are targets of update requests, and a 3D map update unit that updates, based on the priority, the 3D map that is the target of an update request using sensing data.
(1)
更新要求の対象である複数の3Dマップのそれぞれについて更新を行う際の優先度を設定する優先度設定部と、
前記優先度に基づいて、センシングデータを用いて更新要求の対象である前記3Dマップを更新する3Dマップ更新部と
を備える情報処理装置。
(2)
前記優先度設定部は、前記3Dマップに関する情報に基づいて前記優先度を設定する(1)に記載の情報処理装置。
(3)
前記優先度設定部は、一つの前記3Dマップに関する情報から得た値、または複数の前記3Dマップに関する情報から得た値の合計を前記優先度として設定する(2)に記載の情報処理装置。
(4)
前記3Dマップに関する情報は、前記更新要求が行われた後に前記更新要求の対象である前記3Dマップを利用する位置姿勢推定要求が行われた回数である(2)または(3)に記載の情報処理装置。
(5)
前記3Dマップに関する情報は、過去に前記3Dマップが前記位置姿勢推定に利用された回数である(2)から(4)のいずれかに記載の情報処理装置。
(6)
前記3Dマップに関する情報は、前記センシングデータのうち、前記3Dマップの更新に利用できる情報の割合である(2)から(5)のいずれかに記載の情報処理装置。
(7)
前記3Dマップに関する情報は、前記センシングデータと前記3Dマップの類似度である(2)から(6)のいずれかに記載の情報処理装置。
(8)
前記3Dマップに関する情報は、過去の前記3Dマップに対する更新要求の回数である
(2)から(7)のいずれかに記載の情報処理装置。
(9)
前記3Dマップに関する情報は、前記3Dマップを利用した位置姿勢推定の結果を利用するアプリケーションの関係者が設定した優先度である(2)から(8)のいずれかに記載の情報処理装置。
(10)
前記3Dマップに関する情報は、前記3Dマップを利用した位置姿勢推定の結果を利用するアプリケーションのユーザが設定した優先度である(2)から(9)のいずれかに記載の情報処理装置。
(11)
前記優先度設定部は、前記センシングデータに基づいて複数の前記3Dマップの中から更新要求の対象である前記3Dマップを特定する(1)から(10)のいずれかに記載の情報処理装置。
(12)
前記優先度に基づいて前記3Dマップの更新に利用する計算機資源を更新の対象である前記3Dマップごとに割り当てる計算機資源割当部を備える(1)から(11)のいずれかに記載の情報処理装置。
(13)
前記計算機資源割当部は、前記優先度に基づいて前記計算機資源を比例配分することにより前記3Dマップごとに割り当てる(12)に記載の情報処理装置。
(14)
前記センシングデータはカメラにより現実空間を撮像した画像である(1)から(13)のいずれかに記載の情報処理装置。
(15)
センシングデータに基づいてデバイスの位置姿勢推定に利用する3Dマップを選択する3Dマップ選択部を備える(1)から(14)のいずれかに記載の情報処理装置。
(16)
前記センシングデータと、選択された前記3Dマップを用いて前記デバイスの位置姿勢を推定する位置姿勢推定部を備える(15)に記載の情報処理装置。
(17)
前記デバイスはARデバイスである(15)または(16)に記載の情報処理装置。
(18)
更新要求の対象である複数の3Dマップのそれぞれについて更新を行う際の優先度を設定し、
前記優先度に基づいて、センシングデータを用いて更新要求の対象である前記3Dマップを更新する
情報処理方法をコンピュータに実行させるプログラム。
(19)
センサ部で取得したセンシングデータを送信して更新要求を行う
クライアントと、
更新要求の対象である複数の3Dマップのそれぞれについて更新を行う際の優先度を設定する優先度設定部と、
前記優先度に基づいて、センシングデータを用いて更新要求の対象である前記3Dマップを更新する3Dマップ更新部と
を備える情報処理装置と、
からなる情報処理システム。 The present technology can also have the following configuration.
(1)
a priority setting unit that sets a priority when updating each of the plurality of 3D maps that are the targets of the update request;
An information processing device comprising: a 3D map update unit that updates the 3D map that is the target of an update request using sensing data based on the priority.
(2)
The information processing device according to (1), wherein the priority setting unit sets the priority based on information regarding the 3D map.
(3)
The information processing device according to (2), wherein the priority setting unit sets a value obtained from information regarding one of the 3D maps or a total of values obtained from information regarding a plurality of the 3D maps as the priority.
(4)
The information regarding the 3D map is the information according to (2) or (3), which is the number of times a position/orientation estimation request using the 3D map that is the subject of the update request has been made after the update request has been made. Processing equipment.
(5)
The information processing device according to any one of (2) to (4), wherein the information regarding the 3D map is the number of times the 3D map has been used for the position and orientation estimation in the past.
(6)
The information processing device according to any one of (2) to (5), wherein the information regarding the 3D map is a ratio of information that can be used to update the 3D map among the sensing data.
(7)
The information processing device according to any one of (2) to (6), wherein the information regarding the 3D map is a degree of similarity between the sensing data and the 3D map.
(8)
The information processing device according to any one of (2) to (7), wherein the information regarding the 3D map is the number of past update requests for the 3D map.
(9)
The information processing device according to any one of (2) to (8), wherein the information regarding the 3D map is a priority set by a person involved in an application that uses the result of position and orientation estimation using the 3D map.
(10)
The information processing device according to any one of (2) to (9), wherein the information regarding the 3D map is a priority set by a user of an application that uses the result of position and orientation estimation using the 3D map.
(11)
The information processing device according to any one of (1) to (10), wherein the priority setting unit identifies the 3D map that is the target of an update request from among the plurality of 3D maps based on the sensing data.
(12)
The information processing apparatus according to any one of (1) to (11), including a computer resource allocation unit that allocates computer resources to be used for updating the 3D map for each of the 3D maps to be updated based on the priority. .
(13)
The information processing device according to (12), wherein the computer resource allocation unit allocates the computer resources to each 3D map by proportionally allocating the computer resources based on the priority.
(14)
The information processing device according to any one of (1) to (13), wherein the sensing data is an image of real space captured by a camera.
(15)
The information processing apparatus according to any one of (1) to (14), including a 3D map selection unit that selects a 3D map to be used for estimating the position and orientation of the device based on sensing data.
(16)
The information processing apparatus according to (15), further comprising a position and orientation estimation unit that estimates the position and orientation of the device using the sensing data and the selected 3D map.
(17)
The information processing apparatus according to (15) or (16), wherein the device is an AR device.
(18)
A program that causes a computer to execute an information processing method comprising: setting a priority when updating each of a plurality of 3D maps that are the targets of an update request; and updating the 3D map that is the target of the update request using sensing data based on the priority.
(19)
An information processing system comprising:
a client that transmits sensing data acquired by a sensor unit and makes an update request; and
an information processing device including: a priority setting unit that sets a priority when updating each of the plurality of 3D maps that are the targets of the update request; and a 3D map update unit that updates the 3D map that is the target of the update request using the sensing data based on the priority.
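Items (3) through (10) describe the priority as a value, or a sum of values, derived from information about each 3D map, without fixing a concrete formula. The following Python sketch shows one way a priority setting unit might combine those signals as a weighted sum; every field name and weight is an assumption made for illustration, not something specified in this publication.

```python
# Minimal sketch of a priority setting unit per items (3)-(10).
# All names and weights are illustrative assumptions, not from the publication.
from dataclasses import dataclass

@dataclass
class MapSignals:
    pending_pose_requests: int   # (4) pose-estimation requests made after the update request
    past_pose_uses: int          # (5) times the map was used for pose estimation in the past
    usable_data_ratio: float     # (6) share of the sensing data usable for the update, 0..1
    similarity: float            # (7) similarity between the sensing data and the map, 0..1
    past_update_requests: int    # (8) number of past update requests for the map
    operator_priority: float     # (9) priority set by the application operator
    user_priority: float         # (10) priority set by the application user

# Hypothetical weights; item (3) only requires a value or a sum of values.
WEIGHTS = {
    "pending_pose_requests": 2.0,
    "past_pose_uses": 0.5,
    "usable_data_ratio": 10.0,
    "similarity": 5.0,
    "past_update_requests": 1.0,
    "operator_priority": 3.0,
    "user_priority": 3.0,
}

def priority(signals: MapSignals) -> float:
    """Sum the weighted per-signal values into a single priority."""
    return sum(w * getattr(signals, name) for name, w in WEIGHTS.items())
```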
10 ... Client
16 ... Sensor unit
100 ... Information processing device
105 ... Priority setting unit
106 ... Computer resource allocation unit
107 ... 3D map update unit
108 ... 3D map selection unit
109 ... Position and orientation estimation unit
1000 ... Information processing system
Claims (19)
- An information processing device comprising: a priority setting unit that sets a priority when updating each of a plurality of 3D maps that are the targets of an update request; and a 3D map update unit that updates the 3D map that is the target of the update request using sensing data based on the priority.
- The information processing device according to claim 1, wherein the priority setting unit sets the priority based on information regarding the 3D map.
- The information processing device according to claim 2, wherein the priority setting unit sets, as the priority, a value obtained from a single piece of information regarding the 3D map or the sum of values obtained from a plurality of pieces of information regarding the 3D map.
- The information processing device according to claim 2, wherein the information regarding the 3D map is the number of times a position and orientation estimation request using the 3D map that is the target of the update request has been made after the update request was made.
- The information processing device according to claim 2, wherein the information regarding the 3D map is the number of times the 3D map has been used for position and orientation estimation in the past.
- The information processing device according to claim 2, wherein the information regarding the 3D map is the proportion of the sensing data that can be used to update the 3D map.
- The information processing device according to claim 2, wherein the information regarding the 3D map is the degree of similarity between the sensing data and the 3D map.
- The information processing device according to claim 2, wherein the information regarding the 3D map is the number of past update requests for the 3D map.
- The information processing device according to claim 2, wherein the information regarding the 3D map is a priority set by a party involved in an application that uses the result of position and orientation estimation using the 3D map.
- The information processing device according to claim 2, wherein the information regarding the 3D map is a priority set by a user of an application that uses the result of position and orientation estimation using the 3D map.
- The information processing device according to claim 1, wherein the priority setting unit identifies the 3D map that is the target of an update request from among the plurality of 3D maps based on the sensing data.
- The information processing device according to claim 1, further comprising a computer resource allocation unit that allocates, based on the priority, the computer resources used for updating the 3D maps to each 3D map to be updated.
- The information processing device according to claim 12, wherein the computer resource allocation unit allocates the computer resources to each 3D map by proportional allocation based on the priority.
- The information processing device according to claim 1, wherein the sensing data is an image of real space captured by a camera.
- The information processing device according to claim 1, further comprising a 3D map selection unit that selects a 3D map to be used for estimating the position and orientation of a device based on the sensing data.
- The information processing device according to claim 15, further comprising a position and orientation estimation unit that estimates the position and orientation of the device using the sensing data and the selected 3D map.
- The information processing device according to claim 15, wherein the device is an AR device.
- A program that causes a computer to execute an information processing method comprising: setting a priority when updating each of a plurality of 3D maps that are the targets of an update request; and updating the 3D map that is the target of the update request using sensing data based on the priority.
- An information processing system comprising: a client that transmits sensing data acquired by a sensor unit and makes an update request; and an information processing device including a priority setting unit that sets a priority when updating each of the plurality of 3D maps that are the targets of the update request, and a 3D map update unit that updates the 3D map that is the target of the update request using the sensing data based on the priority.
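Claim 13 specifies that computer resources are split across the maps in proportion to their priorities. Below is a minimal Python sketch of one such computer resource allocation unit, assuming a fixed pool of integer worker slots and largest-remainder rounding; both of those choices, and all names, are assumptions for illustration rather than details given in the claims.

```python
# Illustrative proportional allocation of worker slots by priority (claim 13).
from math import floor

def allocate_workers(priorities: dict[str, float], total_workers: int) -> dict[str, int]:
    """Split `total_workers` across maps in proportion to their priorities,
    using largest-remainder rounding so every worker is assigned."""
    total_priority = sum(priorities.values())
    if total_priority <= 0:
        return {map_id: 0 for map_id in priorities}
    raw = {m: total_workers * p / total_priority for m, p in priorities.items()}
    alloc = {m: floor(x) for m, x in raw.items()}
    leftover = total_workers - sum(alloc.values())
    # Hand the remaining workers to the maps with the largest fractional parts.
    for m in sorted(raw, key=lambda m: raw[m] - alloc[m], reverse=True)[:leftover]:
        alloc[m] += 1
    return alloc

# Example: a map with twice the priority receives twice the resources.
print(allocate_workers({"map_a": 6.0, "map_b": 3.0, "map_c": 1.0}, 10))
# -> {'map_a': 6, 'map_b': 3, 'map_c': 1}
```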
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022096256 | 2022-06-15 | |
JP2022-096256 | 2022-06-15 | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023243558A1 | 2023-12-21 |
Family
ID=89191259
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/021513 (WO2023243558A1) | Information processing device, program, and information processing system | 2022-06-15 | 2023-06-09 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2023243558A1 (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
US20070124546A1 * | 2005-11-29 | 2007-05-31 | Anton Blanchard | Automatic yielding on lock contention for a multi-threaded processor
JP2009294534A * | 2008-06-06 | 2009-12-17 | Mitsubishi Electric Corp | Map information processing apparatus
WO2013161081A1 * | 2012-04-27 | 2013-10-31 | Hitachi, Ltd. | Database management system, computer, and database management method
WO2017057055A1 * | 2015-09-30 | 2017-04-06 | Sony Corporation | Information processing device, information terminal and information processing method
JP2020077363A * | 2018-09-26 | 2020-05-21 | Apple Inc. | Localization for mobile devices
KR20220001274A * | 2020-06-29 | 2022-01-05 | Neubility Co., Ltd. | 3D map change area update system and method
Similar Documents
Publication | Title
---|---
CN113811920B | Distributed pose estimation
EP3827411B1 | Conditional modification of augmented reality object
CN111081199B | Selecting a temporally distributed panoramic image for display
JP2020509633A | Location Determination for Mixed Reality Systems
US20140015858A1 | Augmented reality system
US20240118083A1 | Localisation of mobile device using image and non-image sensor data in server processing
CN107204044B | Picture display method based on virtual reality and related equipment
US11295142B2 | Information processing apparatus, information processing method, and non-transitory computer-readable storage medium
JP2017505923A | System and method for geolocation of images
US11727648B2 | Method and device for synchronizing augmented reality coordinate systems
US11627302B1 | Stereoscopic viewer
US20140152562A1 | Display controller, display system, storage medium and method
CN111127584A | Method and device for establishing visual map, electronic equipment and storage medium
CN113610702B | Picture construction method and device, electronic equipment and storage medium
WO2023243558A1 | Information processing device, program, and information processing system
CN112870714A | Map display method and device
US11847750B2 | Smooth object correction for augmented reality devices
KR20210051002A | Method and apparatus for estimating pose, computer-readable storage medium and computer program for controlling the holder device
JP2017184136A | Information processing device, information processing method, information processing system, and program
KR101877901B1 | Method and appratus for providing vr image
CN114450970B | Method, electronic device and medium for displaying different content to different users
JP5932107B2 | Image processing server and imaging apparatus
WO2024057779A1 | Information processing device, program, and information processing system
US10599940B2 | Information processing apparatus, information processing system, and information processing method
JP7394813B2 | Terminal device, control method and control program
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23823848; Country of ref document: EP; Kind code of ref document: A1 |