WO2021075307A1 - Information processing device, information processing method, and information processing program - Google Patents

Information processing device, information processing method, and information processing program

Info

Publication number
WO2021075307A1
Authority
WO
WIPO (PCT)
Prior art keywords
score
information
image
information processing
virtual
Application number
PCT/JP2020/037806
Other languages
French (fr)
Japanese (ja)
Inventor
浩一 川崎
Original Assignee
Sony Corporation (ソニー株式会社)
Application filed by Sony Corporation (ソニー株式会社)
Publication of WO2021075307A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics

Definitions

  • This technology relates to information processing devices, information processing methods, and information processing programs.
  • In recent years, technologies called augmented reality (AR) and mixed reality (MR), which superimpose additional information on the real space and present it to the user, have been attracting attention.
  • Autonomous mobile bodies that move autonomously, such as cleaning robots and pet-type robots, are also attracting attention.
  • Such AR technology, MR technology, and autonomous mobile body technology require recognition of the three-dimensional structure of the real space and of objects within it, and one recognition technique is SLAM (Simultaneous Localization and Mapping).
  • SLAM is a technique that simultaneously performs self-position estimation and environment map creation from information acquired from various sensors. For example, using feature points detected from an input image, the positions of the feature points and the position and posture of the camera in the environment are estimated at the same time.
  • The accuracy of environment recognition technology based on a feature point cloud in an image depends on the distribution of the feature points in the image: the larger the number of feature points, the higher the accuracy, and the smaller the number, the lower the accuracy.
  • However, a user of AR or the like does not pay attention to such characteristics of the environment recognition technology, and depending on how the user uses it, may point it in a direction unfavorable for environment recognition, so that recognition of the environment becomes unstable and use is hindered. A mechanism has therefore been proposed that can avoid such a situation when AR or the like is provided to the user (Patent Document 1).
  • The present technology has been made in view of these points, and aims to provide an information processing device, an information processing method, and an information processing program that make it possible to easily grasp information about the environment.
  • To solve this problem, a first technology is an information processing device including an information acquisition unit that acquires information within a shooting range from an image, a plot processing unit that plots the information on the surface of a virtual three-dimensional object, and an arrangement processing unit that arranges the virtual three-dimensional object on an arrangement target.
  • A second technology is an information processing method in which information within a shooting range is acquired from an image, the information is plotted on the surface of a virtual three-dimensional object, and the virtual three-dimensional object is arranged on an arrangement target.
  • A third technology is an information processing program that causes a computer to execute an information processing method in which information within a shooting range is acquired from an image, the information is plotted on the surface of a virtual three-dimensional object, and the virtual three-dimensional object is arranged on an arrangement target.
  • FIG. 3A is an explanatory diagram of the score plot on the sphere
  • FIG. 3B is an explanatory diagram of the arrangement of the sphere.
  • FIG. 18A is an explanatory diagram of an example showing a score value on a sphere
  • FIG. 18B is an explanatory diagram of score interpolation, and FIG. 19 is an explanatory diagram of the presentation of a recommended route.
  • <1. Embodiment> [1-1. Configuration of terminal device 100] [1-2. Configuration of information processing device 200] [1-3. Processing by information processing device 200] <2. Modification example>
  • The terminal device 100 captures images used by the information processing device 200 to calculate scores, displays information calculated by the information processing device 200, and so on.
  • The terminal device 100 includes a control unit 101, a camera unit 102, a sensor unit 103, an input unit 104, a display unit 105, a communication unit 106, a storage unit 107, and an information processing device 200.
  • The control unit 101 is composed of a CPU (Central Processing Unit), RAM (Random Access Memory), ROM (Read Only Memory), and the like.
  • The ROM stores programs that are read and executed by the CPU.
  • The RAM is used as the work memory of the CPU.
  • The CPU controls the entire terminal device 100 and each of its parts by executing various processes and issuing commands according to the programs stored in the ROM.
  • The camera unit 102 includes an image sensor, an image processing engine, and the like, and functions as a camera capable of capturing RGB or monochrome two-dimensional still images and moving images.
  • The camera unit 102 may be built into the terminal device 100 itself, or it may be a separate device that can communicate with the terminal device 100, such as a digital camera or a single-lens reflex camera, or a so-called action camera that can be attached to the head or clothing for shooting.
  • The sensor unit 103 includes a sensor that can detect position, such as a GPS (Global Positioning System) module.
  • GPS is a system that determines the current position by receiving, with a receiver, signals from artificial satellites orbiting the earth.
  • The sensor unit 103 may also include a sensor capable of measuring distance, such as LiDAR (Laser Imaging Detection and Ranging). LiDAR measures the light scattered from pulsed laser irradiation and analyzes the distance to a distant object and the properties of that object.
  • The sensor unit 103 may further include a sensor that detects acceleration and angular velocity, such as an IMU (Inertial Measurement Unit) module.
  • The IMU module is an inertial measurement unit that detects the attitude, tilt, and so on of the information processing device 200 by obtaining three-dimensional angular velocity and acceleration from acceleration sensors, angular velocity sensors, gyro sensors, and the like along two or three axes.
  • When the user shoots with the camera unit 102, the sensor unit 103 detects the shooting position and the shooting posture and supplies them to the information processing device 200.
  • The input unit 104 is used by the user to make inputs for operating the terminal device 100.
  • When the user makes an input to the input unit 104, a control signal corresponding to the input is generated and supplied to the control unit 101.
  • The control unit 101 then performs various processes corresponding to the control signal.
  • The input unit 104 includes a touch screen integrated with the display serving as the display unit 105, physical buttons, a voice input function based on voice recognition, and the like.
  • The input unit 104 also includes a release button for taking images.
  • The display unit 105 is a display device, such as a display, that shows images/videos, a GUI (Graphical User Interface), a through image at the time of shooting by the camera unit 102, information processed by the information processing device 200, and the like.
  • The communication unit 106 is a communication module for transmitting and receiving data to and from an external server or the like.
  • Communication is performed wirelessly, for example by wireless LAN (Local Area Network), WAN (Wide Area Network), Wi-Fi (Wireless Fidelity), 4G (4th generation mobile communication system), 5G (5th generation mobile communication system), Bluetooth (registered trademark), or ZigBee (registered trademark).
  • Communication with the image pickup apparatus 400 may also be wired communication, such as USB (Universal Serial Bus) communication, in addition to wireless communication.
  • The storage unit 107 is a storage medium composed of, for example, an HDD (Hard Disc Drive), semiconductor memory, or SSD (Solid State Drive), and stores information necessary for processing by the information processing device 200, such as the data detected by the sensor unit 103, scores, and the shooting positions and shooting postures of images.
  • The information processing device 200 includes an information acquisition unit 201, a plot processing unit 202, and an arrangement processing unit 203.
  • The information acquisition unit 201 acquires information about the shooting range from the image captured by the camera unit 102 of the terminal device 100.
  • The information includes a score and a reliability. Details of the score and the reliability will be described later.
  • The information acquisition unit 201 also performs a process of estimating scores outside the shooting range of the image using the calculated score.
  • The image may be a still image or a frame image constituting a moving image. In addition to images taken by the camera unit 102, the image may also be one taken by another camera and supplied to the information processing device 200.
  • The plot processing unit 202 performs a process of plotting the score and reliability calculated and estimated by the information acquisition unit 201 on a sphere, which is a virtual three-dimensional object.
  • The arrangement processing unit 203 performs a process of arranging the sphere on which the score and reliability are plotted on an arrangement target.
  • The arrangement target is spatial data, such as two-dimensional or three-dimensional map data, images or videos of the space, or a virtual space in AR or MR. Details of the arrangement of the spheres will be described later.
  • The information processing device 200 is configured as described above.
  • The information processing device 200 may operate in the terminal device 100 as in the present embodiment, or it may operate in another device different from the terminal device 100, such as an external server.
  • The information processing device 200 is realized by executing a program, and the program may be installed in the terminal device 100 or the like in advance, or it may be distributed by download or on a storage medium so that the user can install it. Further, the information processing device 200 may be realized not only by a program but also by combining dedicated hardware devices, circuits, and the like having its functions.
  • The score is an index indicating the characteristics of the environment within the shooting range of an image, determined in a composite manner from the number, positions, and feature amounts of the feature points that can be acquired from the image.
  • The score can represent, for example, how easily and how accurately SLAM self-position estimation and environment map creation can be performed.
  • Since the score is calculated as a value from the feature points within the shooting range detected from the image, even at the same position, if the shooting direction (viewpoint) differs and the feature points in the shooting range therefore differ, the score will also take a different value.
  • The score is calculated so that one score corresponds to one combination of shooting position and shooting posture.
  • The reliability is an index indicating how reliable the score is.
  • A score calculated directly from an image has a reliability of 100%, and a score estimated using other scores has a reliability of 100% or less.
  • The score is plotted on the surface of a sphere, the virtual three-dimensional object, using the hue of the HSV color space.
  • FIG. 3A-a is a front view of the sphere,
  • FIG. 3A-b is a top view of the sphere, and
  • FIG. 3A-c is a side view of the sphere.
  • The reliability is also plotted on the surface of the sphere using saturation or lightness.
  • This sphere is arranged, on two-dimensional map data, at the position corresponding to the position where the image used to calculate the score was taken, and is then displayed on the display unit 105 or the like of the terminal device 100. This allows the user to easily grasp the score and the reliability.
  • The present technology creates a map (score map) having the information acquired from images by arranging spheres on which scores are plotted on an arrangement target in this way.
  • FIG. 4 is a flowchart showing the score acquisition process performed by the information acquisition unit 201.
  • FIG. 5 is a flowchart showing the plotting process by the plot processing unit 202 and the sphere arrangement process by the arrangement processing unit 203.
  • In the score acquisition process, first, in step S101, an image of the range for which the score is to be calculated, captured by the camera unit 102 of the terminal device 100, is input. Next, in step S102, the shooting position and shooting posture at which the image was taken are estimated.
  • The shooting position and shooting posture can be estimated from the shooting position information and shooting posture information, that is, the sensor information acquired by the GPS, LiDAR, IMU module, and the like constituting the sensor unit 103.
  • In step S103, the score is calculated from the image.
  • The score calculation method will be described with reference to FIGS. 6 and 7.
  • The image is divided into L layers, and each layer is divided into cells (16 cells in this example).
  • This number of cells is only an example, and the present technology is not limited to it.
  • A weight Wl is calculated for each layer l.
  • The weight Wl can be calculated by Equation 1.
  • Feature points are then detected for each layer l, and the number Nl of cells containing feature points is counted.
  • As a method of detecting feature points, there is, for example, a method based on the luminance gradient.
  • In the luminance gradient method, attention is first paid to one pixel in the image, and when the luminance gradient between that pixel and an adjacent pixel is equal to or greater than a predetermined amount, that pixel is detected as a feature point. The larger the luminance gradient, the larger the feature amount of the feature point. This process is performed on all the pixels that make up the image.
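  • As a rough illustration of this kind of gradient thresholding, the following sketch assumes a grayscale image stored as a NumPy array; the threshold value and the restriction to right/down neighbors are illustrative assumptions, not details taken from the source.

```python
import numpy as np

def detect_feature_points(image: np.ndarray, threshold: float = 30.0):
    """Detect feature points as pixels whose luminance gradient to an
    adjacent (right or lower) pixel is at least `threshold`."""
    img = image.astype(np.float32)
    dx = np.zeros_like(img)
    dy = np.zeros_like(img)
    dx[:, :-1] = np.abs(img[:, 1:] - img[:, :-1])   # gradient to the right neighbour
    dy[:-1, :] = np.abs(img[1:, :] - img[:-1, :])   # gradient to the lower neighbour
    magnitude = np.maximum(dx, dy)                   # feature amount grows with the gradient
    ys, xs = np.nonzero(magnitude >= threshold)
    return list(zip(xs.tolist(), ys.tolist())), magnitude
```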
  • In Equation 2, the score is represented as S.
  • In the example of FIG. 6B, the score is calculated as 66.
  • In the example of FIG. 7A, the score is calculated as 80; in the example of FIG. 7B, as 146; and in the example of FIG. 7C, as 200.
  • The rectangles in FIGS. 6B and 7A to 7C each indicate an image, and the points inside each rectangle represent the feature points detected in that image.
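  • Since Equations 1 and 2 are not reproduced in this text, the following minimal sketch simply assumes that the score is a weighted sum, over the layers, of the number of cells that contain at least one feature point; the particular weight formula and the 4×4 cell grid are illustrative assumptions.

```python
def layer_score(feature_points_per_layer, image_size, cells_per_side=4):
    """Sketch of the per-image score S, assuming S = sum over layers l of
    W_l * N_l, where N_l is the number of grid cells in layer l containing
    at least one feature point and W_l is a per-layer weight."""
    width, height = image_size
    score = 0.0
    for l, points in enumerate(feature_points_per_layer):
        w_l = 2.0 ** l                      # illustrative stand-in for Equation 1
        occupied = set()
        for x, y in points:                 # feature points detected in layer l
            cx = min(int(x * cells_per_side / width), cells_per_side - 1)
            cy = min(int(y * cells_per_side / height), cells_per_side - 1)
            occupied.add((cx, cy))
        score += w_l * len(occupied)        # Equation 2 analogue: S = sum of W_l * N_l
    return score
```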
  • In step S104, scores are estimated and the reliability of each estimated score is calculated.
  • A score is estimated when it is necessary to plot on the sphere a score corresponding to a position other than the shooting position at which the score was calculated, or to a posture other than the shooting posture. Score estimation is therefore not an essential process, and it is preferable that the user can decide whether or not to estimate scores.
  • The solid rectangular frame in FIG. 8A indicates the shooting range (field of view) of the image shot by the camera unit 102, and the score for this shooting range can be calculated from the image by the method described above.
  • The reliability of the score calculated from the captured image is 100%.
  • Scores for other ranges are then estimated using the score calculated from this image.
  • The broken-line rectangular frame in FIG. 8A indicates a first estimation range for estimating the score, and the alternate long and short dash line rectangular frame indicates a second estimation range for estimating the score.
  • An estimation range for estimating a score needs to partially overlap the shooting range of the image for which the score has been calculated.
  • The score of the first estimation range, which overlaps the shooting range, is estimated to be the same as the score of the shooting range. For example, when the score of the shooting range is calculated to be 100, the score of the first estimation range is estimated to be 100. The score of the second estimation range, which also partially overlaps the shooting range, is likewise estimated to be the same value as the score of the shooting range.
  • The reliability of the score of the first estimation range is calculated from the ratio of the solid angle over which the shooting range and the first estimation range overlap. For example, when the ratio of the overlapping solid angles of the shooting range and the first estimation range is 20%, the reliability of the score in the first estimation range is 20%.
  • The reliability of the score of the second estimation range is calculated from the ratio of the solid angle over which the shooting range and the second estimation range overlap. For example, when the ratio of the overlapping solid angles of the shooting range and the second estimation range is 40%, the reliability of the score in the second estimation range is 40%.
  • The solid line on the left side of FIG. 8B shows the shooting range (field of view) of the image shot by the camera unit 102, and the score for this shooting range can be calculated from the image.
  • The points in FIG. 8B show the feature points detected from the image for score calculation. In the example of FIG. 8B, it is assumed that eight feature points (black points and white points) are detected from the image. The feature points are detected as three-dimensional coordinates.
  • The reliability of this score is 100%.
  • Scores are then estimated using the score calculated from this image.
  • The broken line on the right side of FIG. 8B shows the estimation range (field of view) from the position/posture for which the score is estimated.
  • The score of an estimation range whose field of view partially overlaps the shooting range is estimated to be the same as the score of the shooting range. For example, when the score of the shooting range in FIG. 8B is calculated to be 100, the score of the broken-line estimation range whose field of view partially overlaps the shooting range is estimated to be 100.
  • The reliability of the score in the estimation range can be calculated by Equation 3 using the number of feature points.
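  • The two reliability calculations described above can be sketched as follows. Equation 3 is not reproduced in this text, so the feature-point-based version assumes the reliability is the share of the original feature points that remain visible from the estimated position/posture; the function names and formulas are illustrative, not from the source.

```python
def reliability_from_overlap(overlap_solid_angle: float, range_solid_angle: float) -> float:
    """Reliability (%) from the solid-angle overlap between the shooting range
    and an estimation range, as in the FIG. 8A examples (20%, 40%)."""
    return 100.0 * overlap_solid_angle / range_solid_angle

def reliability_from_features(shared_points: int, total_points: int) -> float:
    """Assumed Equation 3 analogue: the share of the feature points detected for
    the original score that are also visible from the estimated position/posture
    (FIG. 8B), expressed as a percentage."""
    if total_points == 0:
        return 0.0
    return 100.0 * shared_points / total_points
```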
  • In step S105, the shooting position and shooting posture of the image are associated with the score.
  • The score associated with each shooting position/shooting posture is stored in, for example, the storage unit 107 of the terminal device 100.
  • The score acquisition process is performed in this way, and it is performed every time a photograph is taken to acquire a score.
  • When the sphere arrangement process is performed later, rather than in parallel with the score acquisition process, the score acquisition process needs to keep each score associated with the shooting position and shooting posture at which the image used to calculate it was taken. A score map can then be created by arranging the spheres on the grid cells corresponding to the shooting position information in the later arrangement process. To realize this, the shooting position and shooting posture need to be detected with GPS, LiDAR, an AR marker, or the like at the time of shooting.
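  • A minimal sketch of keeping scores associated with their shooting position and shooting posture follows; the record and field names are hypothetical, since the source only specifies that one score corresponds to one position/posture combination and is saved in the storage unit 107.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass(frozen=True)
class ShotPose:
    """Hypothetical key: one (shooting position, shooting posture) combination."""
    position: Tuple[float, float, float]   # e.g. from GPS / LiDAR
    posture: Tuple[float, float, float]    # e.g. from the IMU module

@dataclass
class ScoreEntry:
    score: float
    reliability: float = 100.0             # 100% for scores computed directly from an image

score_store: Dict[ShotPose, ScoreEntry] = {}

def record_score(pose: ShotPose, score: float, reliability: float = 100.0) -> None:
    """Associate one score with one shooting position/posture (step S105)."""
    score_store[pose] = ScoreEntry(score, reliability)
```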
  • In step S201, the plot processing unit 202 reads out the score and reliability for each shooting position/shooting posture stored in the storage unit 107.
  • In step S202, the arrangement processing unit 203 determines the position and orientation at which the sphere is arranged.
  • A grid of a predetermined size is set on the map data representing the entire score-map creation area in which images are taken and the spheres are arranged.
  • The size of this grid corresponds to the size of the spheres finally placed, and the size may be determined in advance by default or set arbitrarily by the user. For example, to grasp the score roughly, the grid cells can be made larger so that the placed spheres are larger; to grasp the score in detail, the grid cells can be made smaller so that the placed spheres are smaller.
  • The grid may or may not be superimposed on the map data and displayed on the display unit 105.
  • The sensor unit 103 of the terminal device 100 acquires the shooting position and shooting posture each time shooting is performed. When the information acquisition unit 201 calculates a score from an image, a sphere on which the score calculated from the image taken at that shooting position is plotted is arranged on the grid cell containing the shooting position, as shown in FIG. 10A. The size of the sphere must be less than or equal to the size of the grid cell, and the sphere in each cell is plotted with all the scores calculated from images whose shooting positions fall within that cell.
  • No sphere is placed on a grid cell in which no shooting has been performed, that is, a cell that contains no shooting position.
  • However, a sphere can also be arranged on such a cell by estimating its score from the scores plotted on the spheres arranged on adjacent cells.
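  • The mapping from a shooting position to the grid cell whose sphere collects its score might look like the following sketch; the grid origin, cell size, and two-dimensional coordinates are illustrative assumptions.

```python
from collections import defaultdict

def grid_cell(position, origin=(0.0, 0.0), cell_size=1.0):
    """Index of the grid cell containing a 2-D shooting position.
    `origin` and `cell_size` describe the grid laid over the score-map
    creation area (values here are illustrative)."""
    x, y = position
    ox, oy = origin
    return (int((x - ox) // cell_size), int((y - oy) // cell_size))

# One sphere per grid cell; each cell's sphere collects every score whose
# shooting position falls inside that cell.
sphere_scores = defaultdict(list)

def add_shot_to_grid(position, score):
    sphere_scores[grid_cell(position)].append(score)
```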
  • In step S203, the scores included in the grid cell on which the sphere is arranged (that is, scores having position information within that cell) are read from the storage unit 107 and listed.
  • In step S204, the plot processing unit 202 plots the score and reliability on the sphere.
  • The plotting process will be described with reference to FIGS. 11 and 12.
  • First, the determination of the reference position (plot reference position) used when plotting a score on the surface of the sphere will be explained.
  • In FIG. 11A, it is assumed that images P1, P2, and P3 are acquired by shooting at three shooting positions.
  • In FIG. 11B, it is assumed that the score of image P1 is calculated to be 100, the score of image P2 to be 200, and the score of image P3 to be 300, based on the shooting position and shooting posture of each image.
  • A score and a shooting position/shooting posture are associated with each other by assuming a pseudo sphere of a predetermined or arbitrary size and aligning the shooting position with the center of the sphere. The score is then associated with the point where the extension line, extending from the shooting position in the direction opposite to the shooting direction, intersects the surface of the sphere.
  • When the three shooting positions fall within the same grid cell, the three scores are plotted on the single sphere corresponding to that cell, as if they had been shot at the same position, as shown in FIG. 12B.
  • In this case, the plot reference position of each score on the surface of the sphere is the point where the extension line, extending in the direction opposite to the line segment that extends from the shooting position of that score in its shooting direction, intersects the surface of the sphere.
  • The score differs depending on the shooting posture even at the same shooting position, but by plotting the scores on the surface of the sphere in this way, the scores based on each shooting posture can be shown on one sphere. As a result, the user can grasp the scores plotted on the sphere even when viewing it from one direction, without moving to change the viewpoint.
  • Alternatively, the plot reference position may be determined for a sphere set for each shooting position and shooting posture, in the state shown in FIG. 11B.
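  • A minimal sketch of computing the plot reference position follows, assuming the shooting position is aligned with the sphere center and the shooting direction is given as a vector; the NumPy representation is an assumption.

```python
import numpy as np

def plot_reference_point(center, shooting_direction, radius=1.0):
    """Plot reference position on the sphere surface: the intersection of the
    sphere with the half-line from the sphere center in the direction OPPOSITE
    to the shooting direction (the shooting position is aligned with the center)."""
    d = np.asarray(shooting_direction, dtype=float)
    d = d / np.linalg.norm(d)
    return np.asarray(center, dtype=float) - radius * d

# Example: a shot taken looking along +x is plotted on the -x side of the sphere.
print(plot_reference_point((0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))   # -> [-1.  0.  0.]
```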
  • More specifically, the sphere is a solid that approximates a sphere and is composed only of triangular faces, and the score of each triangular face of the solid is estimated based on the scores calculated from the images.
  • The first method of estimating the score for each triangular face associates a score calculated from an image with the triangular face intersected by the extension line extending from the center of the sphere in the direction opposite to the shooting direction, as shown in the figure.
  • The point where this extension line intersects the triangular face is the plot reference position.
  • The second method calculates scores for triangular faces to which no score could be associated by the first method.
  • The plot reference position in one triangular face to which a score was associated by the first method and the plot reference position in another such triangular face are connected by a line segment. When that line segment passes over a triangular face to which no score has been associated, the score of that face is calculated by linear interpolation.
  • For example, when the triangular faces to which scores have been associated by the first method are face a and face b, and the line segment connecting the plot reference position in face a and the plot reference position in face b passes over face c, the score of face c is calculated by linear interpolation from the score associated with face a and the score associated with face b.
  • The third method assigns, to each triangular face to which no score could be associated by either the first or the second method, the average of the scores of the adjacent triangular faces.
  • In this way, a score can be associated with the entire surface of the sphere.
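  • The following sketch illustrates the first and third association methods on a sphere-approximating triangular mesh (the segment-based linear interpolation of the second method is omitted for brevity); representing the mesh via face-center vectors and an adjacency list is an assumption, not a detail from the source.

```python
import numpy as np

def assign_face_scores(face_centers, face_adjacency, shots):
    """Associate scores with the triangular faces of a sphere-approximating solid.

    `face_centers`: unit vectors to each face center; `face_adjacency[i]`: indices
    of the faces adjacent to face i; `shots`: list of (shooting_direction, score).
    Method 1: each shot's score goes to the face hit by the ray opposite to its
    shooting direction.  Method 3: remaining faces take the average of already
    scored neighbouring faces."""
    centers = np.asarray(face_centers, dtype=float)
    scores = {}
    for direction, score in shots:
        d = -np.asarray(direction, dtype=float)
        d = d / np.linalg.norm(d)
        face = int(np.argmax(centers @ d))      # face whose center best matches the ray
        scores[face] = score
    changed = True
    while changed:                               # propagate until every reachable face has a score
        changed = False
        for i in range(len(centers)):
            if i in scores:
                continue
            neighbour_scores = [scores[j] for j in face_adjacency[i] if j in scores]
            if neighbour_scores:
                scores[i] = sum(neighbour_scores) / len(neighbour_scores)
                changed = True
    return scores
```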
  • The scores over the entire surface of the sphere are displayed in the HSV color space as shown in FIG. 15.
  • The score is plotted using the hue of the HSV color space: for example, the color corresponding to the highest score is red, and as the score decreases, the color changes through orange, yellow, green, light blue, and purple.
  • The scores of all the captured images are compared and normalized between the minimum and maximum scores, and 1.0 then corresponds to red and 0.0 to purple.
  • The color corresponding to the score of an arbitrary position/posture can be determined by Equation 4.
  • Such a color mapping is a method commonly used to represent data with differences in magnitude, such as temperature, as colors, and it makes it possible to grasp at a glance which parts have high scores and which have low scores.
  • The plot reference position has the highest score, and the score decreases concentrically around the plot reference position.
  • Alternatively, the hue scale value or RGB value may be directly associated with the score value and plotted, or a unique standard for associating scores with hues may be established and colors based on that standard may be used to plot the score on the sphere.
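  • Since Equation 4 is not reproduced in this text, the following sketch simply normalizes the score between the minimum and maximum of all captured images and maps it to a hue running from purple (0.0) to red (1.0), with saturation carrying the reliability; the exact hue endpoints are illustrative assumptions.

```python
import colorsys

def score_to_rgb(score, score_min, score_max, reliability=100.0):
    """Map a score to an RGB color via HSV hue (an Equation 4 analogue): the
    score is normalized between the minimum and maximum of all captured images,
    1.0 maps to red and 0.0 to purple, and saturation carries the reliability."""
    t = (score - score_min) / (score_max - score_min) if score_max > score_min else 1.0
    hue = (1.0 - t) * 0.75                 # 0.0 -> red, 0.75 -> purple (hue in [0, 1])
    saturation = reliability / 100.0
    return colorsys.hsv_to_rgb(hue, saturation, 1.0)

# Example: the highest score with 100% reliability is pure red.
print(score_to_rgb(200, score_min=66, score_max=200))   # -> (1.0, 0.0, 0.0)
```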
  • For convenience of explanation and drawing, FIGS. 15B and 15C are each expressed by diagrams showing the display state in which the actual color of the sphere on which the score is plotted changes as a gradation (FIGS. 15B-a, 15B-c, 15C-a, and 15C-c) and diagrams showing the colors in discrete steps to explain the color changes (FIGS. 15B-b, 15B-d, 15C-b, and 15C-d).
  • FIG. 15B shows a case where one score is calculated from the image.
  • FIG. 15B-b shows, in discrete steps, the color change of FIG. 15B-a shown as a gradation, and
  • FIG. 15B-d shows, in discrete steps, the color change of FIG. 15B-c shown as a gradation.
  • In this case, the plot reference position on the surface of the sphere has the highest score, and the score is plotted so that it decreases concentrically from the plot reference position through red, orange, yellow, green, light blue, and blue.
  • FIG. 15C shows a case where there are two scores calculated from images.
  • FIG. 15C-b shows, in discrete steps, the color change of FIG. 15C-a shown as a gradation, and
  • FIG. 15C-d shows, in discrete steps, the color change of FIG. 15C-c shown as a gradation.
  • Each of the two plot reference positions is displayed in the color of its score, and the scores are plotted so as to decrease concentrically around each of the two plot reference positions.
  • The reliability is also plotted on the sphere by changing the saturation. Since the score calculated from the image corresponds to the plot reference position, the reliability there is the highest; therefore, the reliability is plotted so that the plot reference position has the highest saturation and the saturation decreases concentrically from the plot reference position.
  • The two plot reference positions correspond to different shooting postures.
  • The plotting method of FIG. 15C is thus a process of plotting information on a plurality of environments, corresponding to a plurality of shooting postures, on a single virtual three-dimensional object corresponding to a single shooting position.
  • This allows the user to confirm, simultaneously and three-dimensionally, information on a plurality of environments corresponding to a plurality of shooting postures for a single shooting position, which improves the user's convenience (usability) when changing the shooting posture and re-shooting.
  • The number of scores calculated from images and plotted on one sphere may also be three or more, and the number is not particularly limited.
  • In step S205, the arrangement processing unit 203 arranges the sphere on the arrangement target so that the sphere is visible to the user.
  • The spheres can be arranged on map data as shown in FIG. 16.
  • The sphere can be arranged on a two-dimensional map as shown in FIG. 16A, or on a three-dimensional map as shown in FIG. 16B.
  • The display on the display unit 105 may be left unchanged until the sphere arrangement process in step S205 is completed and then updated to show the spheres on which the scores are plotted, or the display may be updated every time each process is completed.
  • The present technology is not limited to either approach.
  • Arrangement on map data can be realized by identifying, based on the position information of the shooting position acquired in advance by the sensor unit 103, the grid cell to which the shooting position belongs, and then arranging the sphere on which the score calculated from the image taken at that shooting position is plotted at the position on the map corresponding to that grid cell.
  • A score map can be created in this way.
  • A two-dimensional rendering of the sphere can also be composited onto and arranged on an image.
  • The image on which the sphere is arranged may be the same as, or different from, the image taken to calculate the score.
  • The sphere can also be arranged on a moving image by arranging it on the frame images constituting the moving image.
  • In this case, the position information of the shooting position is acquired in advance by the sensor unit 103.
  • The position information of the shooting position of the image on which the sphere is arranged is also acquired. The arrangement can then be realized by superimposing and compositing a sphere displaying the score onto an image whose position information corresponds to the position information of the shooting position used for score calculation.
  • As shown in FIG. 17B, a sphere can also be superimposed and displayed on a real-space image in AR and MR. This makes it possible to present the score to the user as if the sphere were placed in the real world.
  • FIG. 17B is an example in which a sphere is superimposed and displayed on a real-space image captured and displayed by the camera of the smartphone SP.
  • In this case, the position information of the shooting position is acquired in advance by the sensor unit 103, and a sphere displaying the score is created as an AR object. Furthermore, an AR marker associated with the display of the sphere is installed in the real world at the location corresponding to the position information. Then, when the user observes the AR marker in the real world through an AR device such as a smartphone, a wearable device, or a head-mounted display, or through an MR device, the sphere is displayed on the display of the AR or MR device. In addition to the method using an AR marker, a method of displaying the AR object on the AR or MR device using GPS information may be used.
  • In the above description, the spheres are placed at shooting positions.
  • However, spheres on which estimated scores are plotted can also be placed at positions where no shooting has been performed, by estimating the scores with the score estimation method described above.
  • Spheres with plotted scores can thus be placed at positions where no shooting has been performed in order to supplement the score information, or scores can be estimated and spheres placed in a grid pattern so as to fill in positions where no shooting has been performed, making the score information more detailed.
  • To prevent densely arranged spheres from filling up the space and becoming difficult to see, the spheres may be rendered semi-transparently, an interval between spheres may be set automatically, and so on.
  • The processing by the information processing device 200 is performed as described above. The score acquisition process and the score plotting process can be executed synchronously in parallel, or the scores can be acquired first and the score display process performed afterwards.
  • When shooting, the score acquisition process, score plotting, and arrangement process are executed synchronously in parallel, the user can also move from grid cell to grid cell while shooting with the camera unit 102, continuing to shoot and move while spheres are arranged on the cells that have been passed.
  • Only the score may be plotted on the sphere, or only the reliability may be plotted.
  • By treating the reliability of positions and postures for which no score has been calculated as 0% and plotting this on the sphere, the positions and postures for which no score has been calculated can be confirmed at a glance. For example, when this 0% reliability is represented by a black sphere, a score map can be created efficiently by calculating and plotting scores so as to eliminate the black spheres.
  • In addition to plotting the score on the sphere in the HSV color space, the numerical value of the score may be plotted on the sphere. Further, in addition to the score value, the reliability value may be plotted.
  • A score plotted on a sphere 3 calculated in this way has a color scheme indicating an intermediate value of the scores plotted on sphere 1 and sphere 2, as shown in FIGS. 18B-b and 18B-c.
  • Positions and postures for which no score has been calculated, or for which the reliability is lower than a predetermined value, may be presented to the user, and the user may be encouraged to take images at those positions and postures.
  • The thick line on the score map represents the recommended movement route for photographing positions and postures for which there is no score, and the thin arrow lines indicate directions for which there is no score.
  • This recommended route can also be presented on an image on which spheres are superimposed, as shown in FIG. 19B.
  • The processing by the information processing device 200 is performed as described above.
  • According to the present technology, the user can easily grasp the score, which is the information obtained from the captured image, and its reliability, by visual inspection alone.
  • By plotting the score and reliability in color on the surface of the virtual sphere, the user can intuitively grasp the score.
  • By plotting the score and reliability on a spherical surface instead of a flat surface, it is possible to grasp scores and reliabilities that differ depending on the shooting posture, even when the sphere is viewed from one direction.
  • The terminal device 100 may be, in addition to a smartphone, a personal computer, a tablet terminal, a digital camera, a mobile phone, a portable game machine, a head-mounted display, a wearable device, or the like.
  • The terminal device 100 used as a camera via the camera unit 102 and the terminal device 100 used as a display device for displaying the sphere, which is a virtual three-dimensional object, may be the same device or separate devices. Further, the device used as a camera via the camera unit 102, the device that displays the sphere, which is a virtual three-dimensional object, and the device that operates as the information processing device 200 may all be separate devices.
  • The present technology can also have the following configurations.
  • (1) An information processing device including: an information acquisition unit that acquires information on the environment within the shooting range from an image; a plot processing unit that plots the information on the surface of a virtual three-dimensional object; and an arrangement processing unit that arranges the virtual three-dimensional object on an arrangement target.
  • The information processing device, wherein the plot processing unit plots the information on the virtual three-dimensional object based on the shooting position and shooting posture of the image.
  • The information processing device, wherein the information is the reliability of the score.
  • The information processing device, wherein the information acquisition unit calculates a score as the information from the image.
  • The information processing device, wherein the information acquisition unit estimates a score outside the shooting range of the image from the score calculated from the image.
  • The information processing device according to any one of (1) to (9), wherein the virtual three-dimensional object is a sphere.
  • The information processing device according to any one of (1) to (10), wherein the arrangement target is spatial data.
  • (13) The information processing device according to (11) or (12), wherein the arrangement target is an image and/or a moving image.
  • The information processing device according to any one of (11) to (13), wherein the arrangement target is a virtual space.
  • The information processing device, wherein the plot processing unit plots the information on the virtual three-dimensional object using the HSV color space.
  • The information processing device, wherein the plot processing unit plots the score on the virtual three-dimensional object using hue, and plots the reliability on the virtual three-dimensional object using saturation.
  • The information processing device, wherein the arrangement processing unit sets a grid in the real space as the area in which the image is captured, and arranges the sphere, which is the virtual three-dimensional object, at the position in the spatial data serving as the arrangement target that corresponds to the grid.
  • The information processing device according to any one of (1) to (16), wherein the plot processing unit plots the score calculated from the image on the surface of the sphere.
  • (18) An information processing method in which information within the shooting range is acquired from an image, the information is plotted on the surface of a virtual three-dimensional object, and the virtual three-dimensional object is arranged on an arrangement target.
  • (19) An information processing program that causes a computer to execute an information processing method in which information within the shooting range is acquired from an image, the information is plotted on the surface of a virtual three-dimensional object, and the virtual three-dimensional object is arranged on an arrangement target.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

This information processing device comprises: an information acquisition unit that acquires, from an image, information within an image-capture range; a plotting unit that plots the information onto the surface of a virtual three-dimensional object; and a disposing unit that disposes the three-dimensional object at a disposition target.

Description

Information processing device, information processing method, and information processing program
The present technology relates to an information processing device, an information processing method, and an information processing program.
In recent years, technologies called augmented reality (AR) and mixed reality (MR), which superimpose additional information on the real space and present it to the user, have been attracting attention. In addition, autonomous mobile bodies that move autonomously, such as cleaning robots and pet-type robots, are also attracting attention.
Such AR technology, MR technology, autonomous mobile body technology, and the like require recognition of the three-dimensional structure of the real space and of objects in the real space, and one recognition method is, for example, a technique called SLAM (Simultaneous Localization and Mapping). SLAM is a technique that simultaneously performs self-position estimation and environment map creation from information acquired from various sensors; for example, using feature points detected from an input image, the positions of the feature points and the position and posture of the camera in the environment are estimated at the same time.
The accuracy of environment recognition technology based on a feature point cloud in an image depends on the distribution of the feature points in the image: the larger the number of feature points, the higher the accuracy, and the smaller the number, the lower the accuracy. However, a user of AR or the like does not pay attention to such characteristics of the environment recognition technology, and depending on how the user uses it, may point it in a direction unfavorable for environment recognition, so that recognition of the environment becomes unstable and use is hindered. Therefore, a mechanism has been proposed that can avoid such a situation when AR or the like is provided to a user (Patent Document 1).
Japanese Unexamined Patent Publication No. 2013-225245
In SLAM, if the self-position that can be acquired contains an error, there is also a problem that the error accumulates and the estimate gradually deviates from the real space. To solve this problem, it is necessary to reduce or reset the error, and to do so, it is necessary to grasp in advance information about the environment, such as where the accuracy of feature point detection in the real space is low.
The present technology has been made in view of these points, and aims to provide an information processing device, an information processing method, and an information processing program that make it possible to easily grasp information about the environment.
To solve the above problem, a first technology is an information processing device including an information acquisition unit that acquires information within a shooting range from an image, a plot processing unit that plots the information on the surface of a virtual three-dimensional object, and an arrangement processing unit that arranges the virtual three-dimensional object on an arrangement target.
A second technology is an information processing method in which information within a shooting range is acquired from an image, the information is plotted on the surface of a virtual three-dimensional object, and the virtual three-dimensional object is arranged on an arrangement target.
Further, a third technology is an information processing program that causes a computer to execute an information processing method in which information within a shooting range is acquired from an image, the information is plotted on the surface of a virtual three-dimensional object, and the virtual three-dimensional object is arranged on an arrangement target.
FIG. 1 is a block diagram showing the configuration of the terminal device 100.
FIG. 2 is a block diagram showing the configuration of the information processing device 200.
FIG. 3A is an explanatory diagram of the score plot on a sphere, and FIG. 3B is an explanatory diagram of the arrangement of the sphere.
FIG. 4 is a flowchart showing the score acquisition process.
FIG. 5 is a flowchart showing the plotting process on a sphere and the sphere arrangement process.
FIGS. 6 and 7 are explanatory diagrams of score calculation.
FIG. 8 is an explanatory diagram of score estimation.
FIGS. 9 and 10 are explanatory diagrams of the process of determining the arrangement position of a sphere.
FIGS. 11 and 12 are explanatory diagrams of plotting scores on a sphere.
FIGS. 13 and 14 are explanatory diagrams of score calculation over the entire surface of a sphere.
FIG. 15 is an explanatory diagram of plotting scores on a sphere.
FIGS. 16 and 17 are diagrams showing specific examples of the arrangement of spheres.
FIG. 18A is an explanatory diagram of an example showing score values on a sphere, and FIG. 18B is an explanatory diagram of score interpolation.
FIG. 19 is an explanatory diagram of the presentation of a recommended route.
Hereinafter, embodiments of the present technology will be described with reference to the drawings. The description will be given in the following order.
<1. Embodiment>
[1-1. Configuration of terminal device 100]
[1-2. Configuration of information processing device 200]
[1-3. Processing by information processing device 200]
<2. Modification example>
<1. Embodiment>
[1-1. Configuration of terminal device 100]
First, the configurations of the terminal device 100 and the information processing device 200 in the present embodiment will be described. The terminal device 100 captures the images used by the information processing device 200 to calculate scores, displays the information calculated by the information processing device 200, and so on. As shown in FIG. 1, the terminal device 100 includes a control unit 101, a camera unit 102, a sensor unit 103, an input unit 104, a display unit 105, a communication unit 106, a storage unit 107, and an information processing device 200.
The control unit 101 is composed of a CPU (Central Processing Unit), RAM (Random Access Memory), ROM (Read Only Memory), and the like. The ROM stores programs that are read and executed by the CPU. The RAM is used as the work memory of the CPU. The CPU controls the entire terminal device 100 and each of its parts by executing various processes and issuing commands according to the programs stored in the ROM.
The camera unit 102 includes an image sensor, an image processing engine, and the like, and functions as a camera capable of capturing RGB or monochrome two-dimensional still images and moving images. The camera unit 102 may be built into the terminal device 100 itself, or it may be a separate device that can communicate with the terminal device 100, such as a digital camera or a single-lens reflex camera, or a so-called action camera that can be attached to the head or clothing for shooting.
The sensor unit 103 includes a sensor that can detect position, such as a GPS (Global Positioning System) module. GPS is a system that determines the current position by receiving, with a receiver, signals from artificial satellites orbiting the earth.
In addition to GPS, the sensor unit 103 may include a sensor capable of measuring distance, such as LiDAR (Laser Imaging Detection and Ranging). LiDAR measures the light scattered from pulsed laser irradiation and analyzes the distance to a distant object and the properties of that object.
Further, the sensor unit 103 may include a sensor that detects acceleration and angular velocity, such as an IMU (Inertial Measurement Unit) module. The IMU module is an inertial measurement unit that detects the attitude, tilt, and so on of the information processing device 200 by obtaining three-dimensional angular velocity and acceleration from acceleration sensors, angular velocity sensors, gyro sensors, and the like along two or three axes.
In the present embodiment, when the user takes a picture with the camera unit 102, the sensor unit 103 detects the shooting position and the shooting posture and supplies them to the information processing device 200.
The input unit 104 is used by the user to make inputs for operating the terminal device 100. When the user makes an input to the input unit 104, a control signal corresponding to the input is generated and supplied to the control unit 101, and the control unit 101 performs various processes corresponding to the control signal. The input unit 104 includes a touch screen integrated with the display serving as the display unit 105, physical buttons, a voice input function based on voice recognition, and the like. The input unit 104 also includes a release button for taking images.
The display unit 105 is a display device, such as a display, that shows images/videos, a GUI (Graphical User Interface), a through image at the time of shooting by the camera unit 102, information processed by the information processing device 200, and the like.
The communication unit 106 is a communication module for transmitting and receiving data to and from an external server or the like. Communication is performed wirelessly, for example by wireless LAN (Local Area Network), WAN (Wide Area Network), Wi-Fi (Wireless Fidelity), 4G (4th generation mobile communication system), 5G (5th generation mobile communication system), Bluetooth (registered trademark), or ZigBee (registered trademark). Communication with the image pickup apparatus 400 may also be wired communication, such as USB (Universal Serial Bus) communication, in addition to wireless communication.
The storage unit 107 is a storage medium composed of, for example, an HDD (Hard Disc Drive), semiconductor memory, or SSD (Solid State Drive), and stores information necessary for processing by the information processing device 200, such as the data detected by the sensor unit 103, scores, and the shooting positions and shooting postures of images.
[1-2. Configuration of information processing device 200]
Next, the configuration of the information processing apparatus 200 will be described with reference to FIG. The information processing device 200 includes an information acquisition unit 201, a plot processing unit 202, and an arrangement processing unit 203.
The information acquisition unit 201 acquires information about the shooting range from the image captured by the camera unit 102 of the terminal device 100. In the present embodiment, the information includes a score and a reliability; details of the score and reliability will be described later. The information acquisition unit 201 also performs a process of estimating scores outside the shooting range of the image using the calculated score. The image may be a still image or a frame image constituting a moving image, and in addition to images taken by the camera unit 102, it may also be an image taken by another camera and supplied to the information processing device 200.
The plot processing unit 202 performs a process of plotting the score and reliability calculated and estimated by the information acquisition unit 201 on a sphere, which is a virtual three-dimensional object.
The arrangement processing unit 203 performs a process of arranging the sphere on which the score and reliability are plotted on an arrangement target. The arrangement target is spatial data, such as two-dimensional or three-dimensional map data, images or videos of the space, or a virtual space in AR or MR. Details of the arrangement of the spheres will be described later.
The information processing device 200 is configured as described above. The information processing device 200 may operate in the terminal device 100 as in the present embodiment, or it may operate in another device different from the terminal device 100, such as an external server. The information processing device 200 is realized by executing a program, and the program may be installed in the terminal device 100 or the like in advance, or it may be distributed by download or on a storage medium so that the user can install it. Further, the information processing device 200 may be realized not only by a program but also by combining dedicated hardware devices, circuits, and the like having its functions.
[1-3. Processing by information processing device 200]
Next, the processing performed by the information processing device 200 will be described. First, an overview is given of the score and reliability that serve as the information in the present embodiment, of plotting the score and reliability on a sphere as the virtual three-dimensional object, and of arranging the sphere.
The score is an index representing characteristics of the environment within the shooting range of an image, and is determined compositely from the number, positions, and feature amounts of the feature points that can be acquired from the image. The score can represent, for example, how easily and how accurately SLAM self-position estimation and environment map creation can be performed.
Since the score is calculated from the feature points detected within the shooting range of an image, even at the same position the score takes a different value when the shooting direction (viewpoint) differs and therefore the feature points within the shooting range differ. One score is calculated for each combination of shooting position and shooting posture.
The reliability is an index indicating how trustworthy a score is. A score calculated directly from an image has a reliability of 100%, and a score estimated from other scores has a reliability of 100% or less.
In the present embodiment, as shown in FIG. 3A, the score is plotted on the surface of a sphere serving as the virtual three-dimensional object using the hue of the HSV color space. FIG. 3A-a is a front view of the sphere, FIG. 3A-b is a top view, and FIG. 3A-c is a side view. The reliability is plotted on the surface of the sphere using saturation or lightness.
As shown in FIG. 3B, for example, this sphere is arranged on two-dimensional map data at a position corresponding to the position where the image used for calculating the score was captured, and is then displayed on the display unit 105 of the terminal device 100 or the like. This allows the user to easily grasp the score and the reliability. In this way, the present technology creates a map carrying the information acquired from images (a score map) by arranging spheres on which scores are plotted on the arrangement target.
Next, specific processing by the information processing device 200 will be described. FIG. 4 is a flowchart showing the score acquisition processing performed by the information acquisition unit 201, and FIG. 5 is a flowchart showing the plotting processing by the plot processing unit 202 and the sphere arrangement processing by the arrangement processing unit 203.
In the score acquisition processing, first, in step S101, an image captured by the camera unit 102 of the terminal device 100 and covering the range for which the score is to be calculated is input. Next, in step S102, the shooting position and shooting posture at the time the image was captured are estimated. The shooting position and shooting posture can be estimated from shooting position information and shooting posture information, which are sensor information acquired by the GPS, LiDAR, IMU module, and the like serving as the sensor unit 103.
Next, in step S103, the score is calculated from the image. An example of the score calculation method is described here with reference to FIGS. 6 and 7.
First, as shown in FIG. 6A, the image is divided by quadtree decomposition into L layers, here yielding 16 cells. The number of cells is merely an example, and the present technology is not limited to it. Next, a weight Wl is calculated for each layer l. The weight Wl can be calculated by Equation 1 below.
[Equation 1]
Wl = 2^l
Next, feature points are detected for each layer l, and the number Nl of cells containing at least one feature point is counted. One method of detecting feature points is the so-called luminance gradient method: attention is paid to one pixel in the image, and if the luminance gradient between that pixel and an adjacent pixel is equal to or larger than a predetermined amount, that pixel is detected as a feature point. The larger the luminance gradient, the larger the feature amount of that feature point. This processing is performed for every pixel constituting the image.
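As a rough, non-authoritative sketch of such a gradient-based detector, the following Python fragment marks pixels whose luminance difference to a neighboring pixel exceeds a threshold; the function name, the threshold value, and the use of simple horizontal and vertical differences are assumptions for illustration and are not taken from the specification.

import numpy as np

def detect_feature_points(gray, threshold=30.0):
    # gray: 2D array of luminance values.
    # A pixel is treated as a feature point when the luminance gradient
    # to an adjacent pixel is at least `threshold` (assumed criterion).
    g = gray.astype(np.float32)
    grad = np.zeros_like(g)
    grad[:, :-1] = np.maximum(grad[:, :-1], np.abs(np.diff(g, axis=1)))
    grad[:-1, :] = np.maximum(grad[:-1, :], np.abs(np.diff(g, axis=0)))
    rows, cols = np.where(grad >= threshold)
    return list(zip(rows.tolist(), cols.tolist()))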
The score is then calculated by Equation 2 below; in Equation 2 the score is denoted S.
[Equation 2]
(The formula of Equation 2 is given as an image in the original publication and is not reproduced here.)
With this calculation, the score in the example shown in FIG. 6B is 66. Likewise, the score is calculated as 80 in the example of FIG. 7A, 146 in the example of FIG. 7B, and 200 in the example of FIG. 7C. The rectangles in FIGS. 6B and 7A to 7C represent images, and the points inside the rectangles represent the feature points detected in the images.
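Since Equation 2 itself is only given as an image, its exact form cannot be reproduced here. The following sketch therefore assumes a plausible layer-weighted sum, S = sum over layers l of Wl*Nl with Wl = 2^l, consistent with Equation 1 and the cell counting described above; the function name, the layer count, and the summation form are assumptions.

def layer_weighted_score(feature_points, image_w, image_h, num_layers=2):
    # Assumed score: S = sum over layers l of Wl * Nl, where Wl = 2**l and
    # Nl is the number of layer-l quadtree cells containing a feature point.
    score = 0
    for l in range(1, num_layers + 1):
        cells_per_side = 2 ** l          # layer l has 2^l x 2^l cells
        occupied = set()
        for row, col in feature_points:  # (row, col) pixel coordinates
            occupied.add((int(col * cells_per_side / image_w),
                          int(row * cells_per_side / image_h)))
        score += (2 ** l) * len(occupied)
    return score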
Feature points exist at various locations in an image. However, when the subject in the image has no feature points, for example a plain white wall, or when feature points are difficult to detect, the score of the image as a whole becomes low.
Returning to the flowchart of FIG. 4, in step S104, score estimation processing and calculation of the reliability of the estimated scores are performed. Score estimation is performed when a score corresponding to a position other than the shooting position, or to a posture other than the shooting posture, for which a score was calculated needs to be plotted on the sphere. Score estimation is therefore not an essential process, and it is preferable that the user can decide whether or not to perform it.
First, a first method of estimating a score and calculating its reliability will be described with reference to FIG. 8A. The solid rectangular frame in FIG. 8A indicates the shooting range (field of view) of the image captured by the camera unit 102; for this shooting range, the score can be calculated from the image by the method described above. The reliability of a score calculated from a captured image is 100%. Score estimation is performed using the score calculated from this image.
The broken-line rectangular frame in FIG. 8A indicates a first estimation range for which a score is estimated, and the dash-dotted rectangular frame indicates a second estimation range. In this first method, as shown in FIG. 8A, an estimation range must partially overlap the shooting range of the image for which the score was calculated.
In the first method, the score of the first estimation range, which overlaps the shooting range, is estimated to be identical to the score of the shooting range. For example, if the score of the shooting range is calculated to be 100, the score of the first estimation range is estimated to be 100. The score of the second estimation range, which partially overlaps the shooting range, is likewise estimated to be the same value as the score of the shooting range.
The reliability of the score of the first estimation range is then calculated from the proportion of the solid angle over which the shooting range and the first estimation range overlap. For example, if the overlapping solid angle between the shooting range and the first estimation range amounts to 20%, the reliability of the score of the first estimation range is 20%.
Similarly, the reliability of the score of the second estimation range is calculated from the proportion of the solid angle over which the shooting range and the second estimation range overlap. For example, if the overlapping solid angle between the shooting range and the second estimation range amounts to 40%, the reliability of the score of the second estimation range is 40%.
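A minimal sketch of this first method, under the simplifying assumption that the overlapping solid angle of two rectangular fields of view can be approximated by intersecting their yaw and pitch extents, might look as follows; the function names and the choice of normalizing by the captured field of view are assumptions.

def angular_overlap_percent(fov_a, fov_b):
    # fov = (yaw_min, yaw_max, pitch_min, pitch_max) in degrees.
    # Approximate the overlapping solid angle as the product of the 1D
    # overlaps, normalized by the extent of the captured field of view fov_a.
    def overlap(a0, a1, b0, b1):
        return max(0.0, min(a1, b1) - max(a0, b0))
    yaw = overlap(fov_a[0], fov_a[1], fov_b[0], fov_b[1])
    pitch = overlap(fov_a[2], fov_a[3], fov_b[2], fov_b[3])
    area_a = (fov_a[1] - fov_a[0]) * (fov_a[3] - fov_a[2])
    return 100.0 * yaw * pitch / area_a if area_a > 0 else 0.0

def estimate_by_overlap(captured_score, captured_fov, estimation_fov):
    # Method 1: the estimation range inherits the captured score; its
    # reliability is the percentage of overlap between the two ranges.
    return captured_score, angular_overlap_percent(captured_fov, estimation_fov)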
Next, a second method of estimating a score and calculating its reliability will be described with reference to FIG. 8B.
The solid line on the left side of FIG. 8B indicates the shooting range (field of view) of the image captured by the camera unit 102; for this shooting range, the score can be calculated from the image. The points in FIG. 8B are the feature points detected from the image for score calculation. In the example of FIG. 8B, eight feature points (black points and white points) are detected from the image. Note that feature points are detected as three-dimensional coordinates.
Since the score of the shooting range can be calculated from the image, its reliability is 100%. Score estimation is performed using the score calculated from this image.
The broken line on the right side of FIG. 8B indicates the estimation range (field of view) from the position and posture for which a score is to be estimated. In the second method, the score of an estimation range whose field of view partially overlaps the shooting range is estimated to be identical to the score of the shooting range. For example, if the score of the shooting range in FIG. 8B is calculated to be 100, the score of the broken-line estimation range whose field of view partially overlaps it is estimated to be 100. The reliability of the score in the estimation range can then be calculated from the numbers of feature points by Equation 3 below.
[Equation 3]
Reliability (%) = (number of feature points in the estimation range shared with the shooting range / number of feature points in the shooting range) x 100
In the example of FIG. 8B, eight feature points (black points and white points) are detected in the shooting range, and six feature points (black points) are visible in the estimation range; since those six feature points are shared with the field of view of the shooting range, the reliability is calculated to be 75%.
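The second method can be sketched directly from Equation 3; in the fragment below the feature points are assumed to be hashable 3D coordinate tuples, and the function name is hypothetical.

def estimate_by_shared_points(captured_score, captured_points, estimation_points):
    # Method 2 (Equation 3): the estimation range inherits the captured score;
    # reliability = shared feature points / feature points in the shooting range * 100.
    if not captured_points:
        return captured_score, 0.0
    shared = set(captured_points) & set(estimation_points)
    return captured_score, 100.0 * len(shared) / len(captured_points)

# Example corresponding to FIG. 8B: 8 captured points, 6 of them also visible
# from the estimated viewpoint, giving a reliability of 75%.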
Returning to the flowchart of FIG. 4, in step S105, the score is associated with the shooting position and shooting posture of the image. The score associated with each shooting position and posture is stored, for example, in the storage unit 107 of the terminal device 100. The score acquisition processing is performed in this way, and is carried out each time an image is captured for score acquisition.
When the sphere arrangement processing is performed later rather than in parallel with the score acquisition processing, the score acquisition processing must associate each score with the shooting position and shooting posture of the image captured for its calculation. A score map can then be created in the later arrangement processing by placing a sphere on the grid cell corresponding to that shooting position information. To realize this, the shooting position and shooting posture must be detected at the time of shooting using GPS, LiDAR, an AR marker, or the like.
Next, the processing of plotting the score and reliability on the sphere, which is the virtual three-dimensional object, and the sphere arrangement processing will be described with reference to the flowchart of FIG. 5. First, in step S201, the plot processing unit 202 reads out the scores and reliabilities stored in the storage unit 107 for each shooting position and shooting posture. Next, in step S202, the arrangement processing unit 203 determines the position and orientation at which the sphere is to be arranged.
The method of determining the sphere arrangement position is as follows. First, as shown in FIG. 9A, a grid of a predetermined size is set on map data representing the entire score map creation area in which images are captured and spheres are arranged. The grid size corresponds to the size of the spheres that will eventually be placed; it may be determined in advance by default, or set to an arbitrary size by the user. For example, when only a rough view of the scores is wanted, the grid, and hence the placed spheres, can be made larger; when detailed scores are wanted, the grid and the spheres can be made smaller. The grid may or may not be superimposed on the map data and displayed on the display unit 105.
Then, as indicated by the arrows in FIG. 9B, when the user shoots while moving within the score map creation area, the sensor unit 103 of the terminal device 100 acquires the shooting position and shooting posture at each shot. When the information acquisition unit 201 calculates the score from the image, a sphere on which the score calculated from the image captured at that shooting position is plotted is placed, as shown in FIG. 10A, in the grid cell that contains the shooting position. The sphere size must be no larger than the grid cell size, and the sphere in each grid cell is plotted with all the scores calculated from images whose shooting positions lie within that cell.
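How shooting positions might be binned into grid cells can be sketched as below; the cell size, the dictionary-based grouping, and the field names are assumptions made for illustration rather than part of the specification.

from collections import defaultdict

def grid_cell(position_xy, origin_xy=(0.0, 0.0), cell_size=1.0):
    # Index of the grid cell that contains a 2D shooting position.
    return (int((position_xy[0] - origin_xy[0]) // cell_size),
            int((position_xy[1] - origin_xy[1]) // cell_size))

def group_shots_by_cell(shots, cell_size=1.0):
    # shots: list of dicts such as {"position": (x, y), "pose": ..., "score": ...}.
    # All scores whose shooting positions fall in the same cell are later
    # plotted on that cell's single sphere.
    by_cell = defaultdict(list)
    for shot in shots:
        by_cell[grid_cell(shot["position"], cell_size=cell_size)].append(shot)
    return by_cell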
No sphere is placed in a grid cell in which no shooting has been performed, that is, a cell that contains no shooting position. However, as shown in FIG. 10B, even for a cell in which no shooting has been performed, a sphere can be placed by estimating its score from the scores plotted on the spheres placed in the adjacent cells.
Returning to the flowchart of FIG. 5, in step S203, the scores contained in the grid cell in which the sphere is placed (that is, the scores whose position information lies within that cell) are read out from the storage unit 107 and listed.
Next, in step S204, the plot processing unit 202 plots the score and reliability on the sphere. This plotting processing will be described with reference to FIGS. 11 and 12.
First, the determination of the reference position for plotting a score on the surface of the sphere (the plot reference position) will be described. For example, as shown in FIG. 11A, assume that images P1, P2, and P3 are acquired by shooting at three shooting positions. Then, as shown in FIG. 11B, assume that, in correspondence with each shooting position and posture, the score of image P1 is calculated as 100, the score of image P2 as 200, and the score of image P3 as 300.
As shown in FIG. 11B, a score is associated with a shooting position and posture by assuming a pseudo-sphere of a predetermined or arbitrary size and aligning the shooting position with the center of the sphere. The score is then associated with the point where the surface of the sphere intersects the extension, in the opposite direction, of the line segment extending from the shooting position in the shooting direction.
When the shooting positions of three scores lie within one grid cell, as shown in FIG. 12A, the three scores are plotted on the single sphere corresponding to that cell as if they had been captured at the same position within that cell, as shown in FIG. 12B. When scores are plotted on the sphere corresponding to a grid cell in this way, the plot reference position of each score on the surface of the sphere is the point where the extension, in the opposite direction, of the line segment extending from its shooting position in the shooting direction intersects the surface of the sphere.
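Geometrically, the plot reference position is simply the point on the sphere opposite to the shooting direction as seen from the sphere's center, which the following hedged sketch computes (the names and the default unit radius are assumptions):

import numpy as np

def plot_reference_position(center, shooting_direction, radius=1.0):
    # Point where the reverse extension of the shooting direction,
    # drawn from the sphere center, intersects the sphere surface.
    d = np.asarray(shooting_direction, dtype=float)
    d = d / np.linalg.norm(d)
    return np.asarray(center, dtype=float) - radius * d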
Although a score takes a different value depending on the shooting posture even at the same shooting position, plotting the score of each shooting posture on the surface of the sphere in this way allows a single sphere to show the scores based on the respective shooting postures. As a result, the user can grasp the scores plotted on the sphere even when viewing it from one direction, without moving and changing his or her own viewpoint.
Since all the scores calculated from images captured within the same grid cell are plotted on the sphere as if they had been captured at the same position (although with different postures), the grid cell size also serves as a reference for the permissible error of the shooting position.
If no grid is set, or if the shooting positions and postures do not come into contact with one another given the grid size, the plot reference position may be determined for a sphere set for each individual shooting position and posture, in the state shown in FIG. 11B.
Once the plot reference position of a score on the surface of the sphere has been determined in this way, the scores to be plotted over the entire surface of the sphere must next be estimated in order to plot the score in the HSV color space concentrically around the plot reference position over the entire surface of the sphere.
The calculation of the scores over the entire surface of the sphere will be described with reference to FIGS. 13 and 14. In this calculation, as shown in FIG. 13A, the sphere is approximated by a solid composed only of triangular faces, and the score of each triangular face of the solid is estimated based on the scores calculated from the images.
The first method of estimating the score of each triangular face, as shown in FIG. 14, associates a score calculated from an image with the triangular face intersected by the extension, in the opposite direction, of the line segment extending from the center of the sphere in the shooting direction. The point at which the extension intersects the triangular face becomes the plot reference position.
The second method of calculating the score of each triangular face addresses the triangular faces to which no score could be associated by the first method. In the second method, a plot reference position in a triangular face to which a score was associated by the first method is connected by a line segment to the plot reference position in another triangular face to which a score was associated. When that line segment passes over a triangular face to which no score has been associated, the score of that face is calculated by linear interpolation.
As shown in FIG. 13B, assuming that the triangular faces to which scores were associated by the first method are face a and face b, the line connecting the plot reference position in face a and the plot reference position in face b passes over face c, so the score of face c is calculated by linear interpolation from the score corresponding to face a and the score corresponding to face b.
In the third method of calculating the score of each triangular face, a triangular face to which no score could be associated by either the first or the second method is assigned the average of the scores of its adjacent triangular faces.
By applying the first to third methods to all the triangular faces, a score can be associated with the entire surface of the sphere. The scores over the entire surface of the sphere, expressed in the HSV color space, are as shown in FIG. 15.
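A compressed sketch of how scores might be propagated over the triangular faces is given below; it implements the first method by nearest face direction and the third method by repeated neighbor averaging, omits the second (segment-based interpolation) method for brevity, and all names and data layouts are assumptions.

import numpy as np

def assign_face_scores(face_directions, shots, adjacency):
    # face_directions: unit vectors toward each triangular face center.
    # shots: list of {"direction": (x, y, z), "score": float}.
    # adjacency: list of neighbor-index lists, one per face.
    scores = {}
    for shot in shots:
        ref = -np.asarray(shot["direction"], dtype=float)
        ref = ref / np.linalg.norm(ref)
        # Method 1: the face hit by the reverse shooting direction gets the score.
        best = max(range(len(face_directions)),
                   key=lambda i: float(np.dot(face_directions[i], ref)))
        scores[best] = shot["score"]
    # Method 3: fill remaining faces with the average of already-scored neighbors.
    while True:
        updates = {}
        for i in range(len(face_directions)):
            if i in scores:
                continue
            vals = [scores[j] for j in adjacency[i] if j in scores]
            if vals:
                updates[i] = sum(vals) / len(vals)
        if not updates:
            break
        scores.update(updates)
    return scores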
As shown in FIG. 15A, the score is expressed using the hue of the HSV color space: for example, the color corresponding to the highest score is red, and as the score decreases the color changes through orange, yellow, green, light blue, and purple. For example, the scores of all the captured images are compared and normalized from the minimum to the maximum score, and 1.0 is mapped to red and 0.0 to purple. The color corresponding to the score of an arbitrary position and posture can be determined by Equation 4 below.
[Equation 4]
(S - Smin) / (Smax - Smin)
Such a color change is a technique commonly used to represent data with high and low values, such as air temperature, by color, and it allows locations with high scores and locations with low scores to be grasped intuitively by sight. By calculating the scores to be plotted over the entire surface of the sphere in this way, the plot reference position has the highest score, and the score decreases concentrically around the plot reference position. As for the correspondence between hue and score, the hue scale values or RGB values may be mapped directly to the score values, or an original criterion associating scores with hues may be established and the scores plotted on the sphere in color based on it.
Specific examples of spheres on which scores are plotted are shown in FIGS. 15B and 15C. For convenience of explanation and drawing, each sphere is shown both as a view of the actual display state, in which the color changes as a gradation (FIGS. 15B-a, 15B-c, 15C-a, and 15C-c), and as a view showing the colors in discrete steps to explain the color change (FIGS. 15B-b, 15B-d, 15C-b, and 15C-d).
FIG. 15B shows the case where one score is calculated from an image. FIG. 15B-b shows the stepped color change of the gradation in FIG. 15B-a, and FIG. 15B-d shows the stepped color change of the gradation in FIG. 15B-c. The plot reference position on the surface of the sphere has the highest score, and the score is plotted so that it decreases concentrically from the plot reference position, changing through red, orange, yellow, green, light blue, and blue.
FIG. 15C shows the case where two scores are calculated from images. FIG. 15C-b shows the stepped color change of the gradation in FIG. 15C-a, and FIG. 15C-d shows the stepped color change of the gradation in FIG. 15C-c. Each of the plot positions of the two scores is displayed in the color of the highest score, and the scores are plotted so that they decrease concentrically around each of the two plot reference positions.
The reliability is likewise plotted on the sphere as a change in saturation. Since the score calculated from the image corresponds to the plot reference position, that position has the highest reliability; the reliability is therefore plotted so that the saturation is highest at the plot reference position and decreases concentrically away from it.
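A minimal color-mapping sketch combining Equation 4 with the hue and saturation convention described above is shown below; the specific hue endpoints (red for the highest score, purple for the lowest) and the function name are assumptions.

import colorsys

def score_to_rgb(score, s_min, s_max, reliability=100.0):
    # Hue from the normalized score (Equation 4): 1.0 -> red, 0.0 -> purple.
    t = (score - s_min) / (s_max - s_min) if s_max > s_min else 0.0
    hue = (1.0 - t) * 0.75                      # 0.0 = red, 0.75 = purple
    # Saturation from the reliability: 100% -> fully saturated.
    sat = max(0.0, min(1.0, reliability / 100.0))
    return colorsys.hsv_to_rgb(hue, sat, 1.0)   # RGB in the range [0, 1]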
In the plotting method of FIG. 15C, the two plot reference positions correspond to mutually different shooting postures. In other words, the plotting method of FIG. 15C is a process of plotting information about a plurality of environments corresponding to a plurality of shooting postures on a single virtual three-dimensional object corresponding to a single shooting position. With this plotting method, the user can confirm, simultaneously and three-dimensionally, the information about a plurality of environments corresponding to a plurality of shooting postures for a single shooting position. This improves the user's convenience (usability) when changing the shooting posture and shooting again. Note that three or more scores calculated from images may be plotted on one sphere; there is no particular limit on the number.
Returning to the flowchart, in step S205, the arrangement processing unit 203 arranges the sphere on the arrangement target so that the sphere becomes visible to the user. As shown in FIG. 16, the sphere can be arranged on map data: on a two-dimensional map as in FIG. 16A, or on a three-dimensional map as in FIG. 16B. The display on the display unit 105 may be left unchanged until the sphere arrangement processing of step S205 is completed and then updated to show the sphere on which the score is plotted, or the display may be updated each time each process is completed; the present technology is not limited to either approach.
To arrange a sphere on such a map, the grid cell to which a shooting position belongs is first identified based on the position information of the shooting position acquired in advance by the sensor unit 103. The sphere on which the score calculated from the image captured at that shooting position is plotted is then placed at the map position of the grid cell to which the shooting position belongs. A score map can be created in this way.
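Building on the grid grouping sketched earlier, placing one sphere per occupied cell on the map could look roughly as follows; the cell-center placement and the returned data layout are assumptions.

def cell_center(cell_index, origin_xy=(0.0, 0.0), cell_size=1.0):
    # Map position at which the sphere for a grid cell is drawn.
    cx, cy = cell_index
    return (origin_xy[0] + (cx + 0.5) * cell_size,
            origin_xy[1] + (cy + 0.5) * cell_size)

def build_score_map(shots_by_cell, cell_size=1.0):
    # One sphere per occupied grid cell; the sphere carries every shot
    # (score, pose) whose shooting position fell inside that cell.
    return [{"position": cell_center(cell, cell_size=cell_size),
             "radius": cell_size / 2.0,          # sphere no larger than the cell
             "shots": shots}
            for cell, shots in shots_by_cell.items()]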
A two-dimensional image of a sphere can also be composited onto an image, as shown in FIG. 17A. The image onto which the sphere is placed may be the same as, or different from, the image captured for score calculation. A sphere can also be placed on a video by placing it on the frame images constituting the video.
To composite a sphere onto an image in this way, the position information of the shooting position is first acquired in advance by the sensor unit 103, and the position information of the position where the target image was captured is also acquired. The composition is then realized by superimposing the sphere displaying the score onto an image whose position information corresponds to the position information of the shooting position used for score calculation.
Furthermore, as shown in FIG. 17B, a sphere can be superimposed on a real-space image in AR or MR. This makes it possible to present the score to the user as if the sphere were placed in the real world. FIG. 17B is an example in which a sphere is superimposed on a real-space image captured and displayed by the camera of a smartphone SP.
To arrange spheres in AR or MR, the position information of the shooting position is acquired in advance by the sensor unit 103, a sphere displaying the score is created as an AR object, and an AR marker associated with the display of the sphere is installed in the real world at the location corresponding to that position information. When the user then observes the real-world AR marker through an AR device or MR device such as a smartphone, a wearable device, or a head-mounted display, the sphere is displayed on the display of that device. Instead of the method using AR markers, a method of displaying AR objects on the AR or MR device using GPS information may also be used.
Spheres are arranged at the shooting positions in this way. Although in the above description spheres are placed at shooting positions, a sphere plotted with an estimated score can also be placed at a position where no shooting has been performed by estimating the score with the score estimation methods described above. This makes it possible, for example, to complement the score information by placing score-plotted spheres at positions where no shooting was performed, or to make the score information more detailed by estimating scores so as to fill the positions where no shooting was performed and arranging the spheres in a lattice pattern.
In any of the sphere arrangement methods described above, in order to prevent densely arranged spheres from filling the space and becoming hard to see, the spheres may be displayed semi-transparently, or intervals may be provided automatically between the arranged spheres.
The processing by the information processing device 200 is performed as described above. The score acquisition processing and the score plotting processing can be executed synchronously in parallel, or the scores can be acquired first and the score display processing performed later. If the shooting, score acquisition processing, score plotting, and arrangement processing are executed synchronously in parallel, the user can, for example, move from grid cell to grid cell while shooting with the camera unit 102, continuing to shoot and move while spheres are placed in the cells that have been passed through.
Only the score, or only the reliability, may be plotted on a sphere. By defining the reliability of positions and postures for which no score has been calculated as 0% and plotting it on the spheres, the positions and postures for which no score has been calculated can be confirmed at a glance. For example, if this 0% reliability is represented by a solid black sphere, a score map can be created efficiently by calculating and plotting scores so as to progressively erase the black spheres.
As shown in FIG. 18A, the score may be plotted on the sphere in the HSV color space and, in addition, the numerical value of the score may be plotted on the sphere. The numerical value of the reliability may also be plotted in addition to that of the score.
It is also possible to estimate, from the scores plotted on spheres, the score at a position for which no score has been calculated. As shown in FIG. 18B-a, consider the case where two spheres, sphere 1 and sphere 2, plotted with scores calculated from images, are arranged with no sphere in the space between them, and a sphere 3 is to be placed at the midpoint between sphere 1 and sphere 2. In this case, the plot positions of the two score-plotted spheres 1 and 2 are connected by a line, and when that line passes over sphere 3, on which no score is plotted, the score of sphere 3 can be calculated by linear interpolation from the score at the plot position of sphere 1 and the score at the plot position of sphere 2. The score plotted on sphere 3 calculated in this way has a color scheme indicating an intermediate value between the scores plotted on sphere 1 and sphere 2, as shown in FIGS. 18B-b and 18B-c.
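The linear interpolation between two plotted spheres described above can be sketched as follows; treating the interpolation parameter as the distance ratio along the connecting segment is an assumption, as are the names.

import math

def interpolate_sphere_score(score_1, pos_1, score_2, pos_2, pos_3):
    # Score for an unplotted sphere at pos_3, assumed to lie on the segment
    # between the plot positions pos_1 and pos_2 of spheres 1 and 2.
    d_total = math.dist(pos_1, pos_2)
    t = math.dist(pos_1, pos_3) / d_total if d_total > 0 else 0.0
    return (1.0 - t) * score_1 + t * score_2

# Midpoint example: scores 100 and 300 give 200 for the sphere in between.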
In a score map on which score-plotted spheres are arranged, as shown in FIG. 19A, positions and postures for which no score has been calculated, or whose reliability is lower than a predetermined value, may be presented so as to prompt the user to shoot at those positions and postures. In FIG. 19A, the thick line on the score map represents a recommended movement route for capturing positions and postures that have no score, and the thin arrow lines indicate directions that have no score. This recommended route can also be presented on an image onto which spheres are superimposed, as shown in FIG. 19B.
According to the present technology, the user can easily grasp, simply by looking, the score obtained from a captured image and its reliability. By representing the score and reliability in color on the surface of a virtual sphere, the user can grasp the score intuitively. In addition, by plotting the score and reliability on a spherical surface rather than on a plane, scores and reliabilities that differ for each shooting posture can be grasped even when viewed from one direction.
<2. Modification examples>
Although an embodiment of the present technology has been described specifically above, the present technology is not limited to the above-described embodiment, and various modifications based on the technical idea of the present technology are possible.
Besides a smartphone, the terminal device 100 may be a personal computer, a tablet terminal, a digital camera, a mobile phone, a portable game machine, a head-mounted display, a wearable device, or the like.
The terminal device 100 used as a camera via the camera unit 102 and the terminal device 100 used as a display device that displays the sphere as the virtual three-dimensional object may be the same device or separate devices. Furthermore, the device used as a camera via the camera unit 102, the device that displays the sphere as the virtual three-dimensional object, and the device that operates as the information processing device 200 may all be separate devices.
The present technology can also have the following configurations.
(1)
An information processing device including:
an information acquisition unit that acquires information on the environment within a shooting range from an image;
a plot processing unit that plots the information on a surface of a virtual three-dimensional object; and
an arrangement processing unit that arranges the virtual three-dimensional object on an arrangement target.
(2)
The information processing device according to (1), wherein the plot processing unit plots the information on the virtual three-dimensional object based on the shooting position and shooting posture of the image.
(3)
The information processing device according to (1) or (2), wherein the plot processing unit plots the information corresponding to a plurality of the shooting postures on a single virtual three-dimensional object corresponding to a single shooting position.
(4)
The information processing device according to any one of (1) to (3), wherein the arrangement processing unit arranges the virtual three-dimensional object at a position on the arrangement target corresponding to the shooting position of the image.
(5)
The information processing device according to (4), wherein a grid is set in the area where the image is captured, the arrangement processing unit arranges the virtual three-dimensional object in a grid cell that contains the shooting position, and the plot processing unit plots the information acquired from the image whose shooting position lies within the grid cell on the virtual three-dimensional object arranged in that cell.
(6)
The information processing device according to any one of (1) to (5), wherein the information is an index indicating characteristics of the environment within the shooting range of the image and is a score represented by a numerical value.
(7)
The information processing device according to (6), wherein the information is the reliability of the score.
(8)
The information processing device according to (6), wherein the information acquisition unit calculates the score as the information from the image.
(9)
The information processing device according to (8), wherein the information acquisition unit estimates a score outside the shooting range of the image from the score calculated from the image.
(10)
The information processing device according to any one of (1) to (9), wherein the virtual three-dimensional object is a sphere.
(11)
The information processing device according to any one of (1) to (10), wherein the arrangement target is spatial data.
(12)
The information processing device according to (11), wherein the arrangement target is two-dimensional and/or three-dimensional map data.
(13)
The information processing device according to (11) or (12), wherein the arrangement target is an image and/or a video.
(14)
The information processing device according to any one of (11) to (13), wherein the arrangement target is a virtual space.
(15)
The information processing device according to any one of (1) to (14), wherein the plot processing unit plots the information on the virtual three-dimensional object using the HSV color space.
(16)
The information processing device according to (15), wherein the plot processing unit plots the score on the virtual three-dimensional object using hue and plots the reliability on the virtual three-dimensional object using saturation.
(17)
The information processing device according to any one of (1) to (16), wherein the arrangement processing unit sets a grid in a real space as the area where the image is captured and arranges the sphere as the virtual three-dimensional object at a specific position in spatial data as the arrangement target corresponding to the grid, and the plot processing unit plots the score calculated from the image on the surface of the sphere.
(18)
An information processing method including: acquiring information within a shooting range from an image; plotting the information on a surface of a virtual three-dimensional object; and arranging the virtual three-dimensional object on an arrangement target.
(19)
An information processing program that causes a computer to execute an information processing method including: acquiring information within a shooting range from an image; plotting the information on a surface of a virtual three-dimensional object; and arranging the virtual three-dimensional object on an arrangement target.
200 ... Information processing device
201 ... Information acquisition unit
202 ... Plot processing unit
203 ... Arrangement processing unit

Claims (19)

1. An information processing device comprising:
an information acquisition unit that acquires information on the environment within a shooting range from an image;
a plot processing unit that plots the information on a surface of a virtual three-dimensional object; and
an arrangement processing unit that arranges the virtual three-dimensional object on an arrangement target.
2. The information processing device according to claim 1, wherein the plot processing unit plots the information on the virtual three-dimensional object based on the shooting position and shooting posture of the image.
3. The information processing device according to claim 1, wherein the plot processing unit plots the information corresponding to a plurality of the shooting postures on a single virtual three-dimensional object corresponding to a single shooting position.
4. The information processing device according to claim 1, wherein the arrangement processing unit arranges the virtual three-dimensional object at a position on the arrangement target corresponding to the shooting position of the image.
5. The information processing device according to claim 4, wherein a grid is set in the area where the image is captured, the arrangement processing unit arranges the virtual three-dimensional object in a grid cell that contains the shooting position, and the plot processing unit plots the information acquired from the image whose shooting position lies within the grid cell on the virtual three-dimensional object arranged in that cell.
6. The information processing device according to claim 1, wherein the information is an index indicating characteristics of the environment within the shooting range of the image and is a score represented by a numerical value.
7. The information processing device according to claim 6, wherein the information is the reliability of the score.
8. The information processing device according to claim 6, wherein the information acquisition unit calculates the score as the information from the image.
9. The information processing device according to claim 8, wherein the information acquisition unit estimates a score outside the shooting range of the image from the score calculated from the image.
10. The information processing device according to claim 1, wherein the virtual three-dimensional object is a sphere.
11. The information processing device according to claim 1, wherein the arrangement target is spatial data.
12. The information processing device according to claim 11, wherein the arrangement target is two-dimensional and/or three-dimensional map data.
13. The information processing device according to claim 11, wherein the arrangement target is an image and/or a video.
14. The information processing device according to claim 11, wherein the arrangement target is a virtual space.
15. The information processing device according to claim 1, wherein the plot processing unit plots the information on the virtual three-dimensional object using the HSV color space.
16. The information processing device according to claim 15, wherein the plot processing unit plots the score on the virtual three-dimensional object using hue and plots the reliability on the virtual three-dimensional object using saturation.
17. The information processing device according to claim 1, wherein the arrangement processing unit sets a grid in a real space as the area where the image is captured and arranges the sphere as the virtual three-dimensional object at a specific position in spatial data as the arrangement target corresponding to the grid, and the plot processing unit plots the score calculated from the image on the surface of the sphere.
18. An information processing method comprising: acquiring information within a shooting range from an image; plotting the information on a surface of a virtual three-dimensional object; and arranging the virtual three-dimensional object on an arrangement target.
19. An information processing program that causes a computer to execute an information processing method comprising: acquiring information within a shooting range from an image; plotting the information on a surface of a virtual three-dimensional object; and arranging the virtual three-dimensional object on an arrangement target.
PCT/JP2020/037806 2019-10-15 2020-10-06 Information processing device, information processing method, and information processing program WO2021075307A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019188342 2019-10-15
JP2019-188342 2019-10-15

Publications (1)

Publication Number Publication Date
WO2021075307A1 true WO2021075307A1 (en) 2021-04-22

Family

ID=75537590

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/037806 WO2021075307A1 (en) 2019-10-15 2020-10-06 Information processing device, information processing method, and information processing program

Country Status (1)

Country Link
WO (1) WO2021075307A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023162646A1 (en) * 2022-02-24 2023-08-31 ソニーセミコンダクタソリューションズ株式会社 Video display device, video processing system, and video processing method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11259673A (en) * 1998-01-08 1999-09-24 Nippon Telegr & Teleph Corp <Ntt> Space stroll video display method, in-space object retrieving method, and in-space object extracting method, device therefor, and recording medium where the methods are recorded
JP2010049313A (en) * 2008-08-19 2010-03-04 Sony Corp Image processor, image processing method, program
JP2018088065A (en) * 2016-11-28 2018-06-07 株式会社Nttファシリティーズ Information visualization system, information visualization method, and program

Similar Documents

Publication Publication Date Title
CN110073313B (en) Interacting with an environment using a parent device and at least one companion device
CN107836012B (en) Projection image generation method and device, and mapping method between image pixel and depth value
US9256986B2 (en) Automated guidance when taking a photograph, using virtual objects overlaid on an image
JP6329343B2 (en) Image processing system, image processing apparatus, image processing program, and image processing method
JP2018535402A (en) System and method for fusing outputs of sensors having different resolutions
US20160210785A1 (en) Augmented reality system and method for positioning and mapping
CN111344644B (en) Techniques for motion-based automatic image capture
US9662583B2 (en) Portable type game device and method for controlling portable type game device
CN111353930B (en) Data processing method and device, electronic equipment and storage medium
JP6310149B2 (en) Image generation apparatus, image generation system, and image generation method
CN109668545B (en) Positioning method, positioner and positioning system for head-mounted display device
WO2019221800A1 (en) System and method for spatially registering multiple augmented reality devices
US20120154377A1 (en) Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method
US20230244227A1 (en) Data processing method, control apparatus and storage medium
US20180158171A1 (en) Display apparatus and controlling method thereof
US20120218259A1 (en) Computer-readable storage medium having image processing program stored therein, image processing apparatus, image processing method, and image processing system
CN114549766B (en) Real-time AR visualization method, device, equipment and storage medium
WO2021075307A1 (en) Information processing device, information processing method, and information processing program
JP2019101563A (en) Information processing apparatus, information processing system, information processing method, and program
JP7006810B2 (en) 3D measuring device, mobile robot, push wheel type moving device and 3D measurement processing method
KR101620983B1 (en) System and Method for realtime 3D tactical intelligence display
WO2021134715A1 (en) Control method and device, unmanned aerial vehicle and storage medium
CN112053444B (en) Method for superposing virtual objects based on optical communication device and corresponding electronic equipment
CN112053451B (en) Method for superposing virtual objects based on optical communication device and corresponding electronic equipment
WO2023162504A1 (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20876894

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20876894

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP