WO2021125190A1 - Information processing device, information processing system, and information processing method - Google Patents

Information processing device, information processing system, and information processing method

Info

Publication number
WO2021125190A1
Authority
WO
WIPO (PCT)
Prior art keywords
map
information processing
processing device
data
evaluation
Prior art date
Application number
PCT/JP2020/046823
Other languages
English (en)
Japanese (ja)
Inventor
秀憲 青木
富士夫 荒井
智彦 後藤
京二郎 永野
遼 深澤
春香 藤澤
Original Assignee
ソニー株式会社 (Sony Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニー株式会社 (Sony Corporation)
Publication of WO2021125190A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis

Definitions

  • This disclosure relates to an information processing device, an information processing system, and an information processing method.
  • The information processing device may further include an evaluation data generation unit that generates first evaluation data based on the first map and a third map used for managing a service, and a map selection unit that selects, based on the first evaluation data, the first map or the second map that the object generation unit uses to generate the virtual object.
  • The information processing device may further include a self-position estimation unit that estimates the self-position; the map selection unit may acquire a first evaluation value corresponding to the self-position from the first evaluation data and, based on the first evaluation value, select the first map or the second map that the object generation unit uses to generate the virtual object.
  • The map selection unit may select the first map when the first evaluation value is equal to or less than a threshold value.
  • The map selection unit may select the first map when the second evaluation value is equal to or less than the first evaluation value.
  • The information processing system includes a first information processing device that generates a first map based on a first image and a second information processing device that generates a second map based on a second image.
  • The first information processing device may generate a virtual object based on the first map or on the second map acquired from the second information processing device, and the second information processing device may generate the virtual object based on the second map or on the first map acquired from the first information processing device.
  • A diagram showing an example of the evaluation data, and a flowchart showing an example of the map selection process.
  • The information processing device may be a device other than a wearable computer.
  • The information processing device according to the present disclosure may be a device that a user can carry, such as a game device, a smartphone, a tablet, or a laptop computer, or it may be a device mounted in the cockpit (driver's seat) or console of a moving body such as an automobile, an aircraft, a ship, or a railway vehicle.
  • The information processing system 100 of FIG. 1 includes two wearable computers (information processing devices). However, the information processing system according to the present disclosure may include more information processing devices. Details of the information processing device according to the present disclosure will be described later.
  • Point cloud data generated in advance can be used.
  • Point cloud data updated over time may also be used.
  • The global map data 36 may be generated by the same method as the local map data described later, or by a different method.
  • The global map data 36 of the space may be generated by a technique such as SfM (Structure from Motion) or LiDAR.
  • The global map data 36 may be generated using an automobile equipped with a laser scanner or an unmanned aerial vehicle (UAV) equipped with a laser scanner.
  • FIG. 2 shows an example of a global map.
  • The global map of FIG. 2 shows the shape and position of objects along the streets of the city.
  • A global map covering the wide area where the service is expected to be used is prepared in advance. Referring to the global map therefore shows the buildings continuing along the road far into the distance.
  • FIG. 3 shows an example of global map data including the global map of FIG. 2.
  • As the global map data 36, data expressing the shape and position of objects in space in a three-dimensional Cartesian coordinate system can be used (a minimal data-layout sketch follows below).
  • The data format of the global map data 36 is not limited.
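  • The following is a minimal sketch of one possible layout; since the patent does not fix a format, the N x 3 array representation and the sample values are assumptions for illustration only:

```python
import numpy as np

# Hypothetical layout: global map data as an N x 3 array of (x, y, z) points,
# each expressing a sampled position on an object surface, in meters.
global_map = np.array([
    [12.4, 3.1, 0.0],   # base of a building facade
    [12.4, 3.1, 2.5],   # point higher up the same facade
    [15.0, 3.2, 0.0],   # base of the neighboring building
])
print(global_map.shape)  # -> (3, 3)
```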
  • The block diagram of FIG. 4 shows an example of the information processing device according to the present disclosure.
  • The information processing device of FIG. 4 corresponds to the wearable computer 1A or 1B of FIG. 1.
  • The information processing device 1 includes an imaging unit 2, a sensor unit 3, a processing circuit 4, a storage unit 5, a communication circuit 6, a display unit 7, and an audio circuit 8.
  • As the processing circuit 4, for example, a hardware circuit including a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof can be used.
  • The processing circuit 4 includes, as internal components, a local map generation unit 10, a self-position estimation unit 11, a map acquisition unit 12, a map selection unit 13, an evaluation data generation unit 14, and an object generation unit 15.
  • The block diagram of FIG. 5 shows an example of the data stored in the storage unit.
  • The storage unit 5 stores the image data 20, the first local map data 21, the second local map data 22, the global map data 23, the first evaluation data 24, the second evaluation data 25, and the like, as well as the map processing program 26 and the application program 27.
  • As the storage unit 5, a volatile memory (for example, DRAM or SRAM), a non-volatile memory (for example, NAND flash memory, NOR flash memory, resistance-change memory, or magnetoresistive memory), a hard disk, an SSD (Solid State Drive), or a combination of these can be used.
  • The type of memory or storage used as the storage unit 5 does not matter.
  • FIG. 6 shows an example of a local map.
  • The local map of FIG. 6 shows the shape and position of objects along the road in the same city as FIG. 2.
  • The local map generation unit 10 generates a local map only for areas where a sufficient number of images can be captured. Therefore, as with the area 28 of FIG. 6, information on the shape and position of objects may not be sufficiently obtained, and areas for which no local map has been created may remain in the space.
  • In the local map, information on the shape and position of objects near the user is updated in real time, so detailed information can be acquired about the areas of the space that the user has been observing for a long time.
  • The self-position estimation unit 11 estimates the position of the information processing device 1 in space.
  • The self-position estimation unit 11 can estimate the self-position based on the images captured by the imaging unit 2, for example.
  • The self-position estimation unit 11 may also estimate the self-position based on the measured values of the sensors in the sensor unit 3.
  • When the self-position is estimated, the coordinates in the first local map indicating the current position of the information processing device 1 are updated.
  • The self-position estimation unit 11 may also calculate the coordinates in at least one of the second local map and the global map when estimating the self-position.
  • As the coordinates, for example, offset values (x_l, y_l, z_l) from the origin of the Cartesian coordinate system can be used.
  • The method of expressing the coordinates is not limited; other types of coordinate systems, such as spherical or polar coordinates, may be used.
  • Next, the method of coordinate conversion between the first local coordinates and the global coordinates will be described.
  • Specifically, the case where the first local coordinates or the global coordinates corresponding to the same position in space are given and conversion to the other coordinate system is performed will be described.
  • FIG. 8 shows an example in which a position c is expressed in each coordinate system, where a denotes the global coordinate system and b denotes the first local coordinate system.
  • ${}^{a}P_{c}$ is a vector indicating the position of c in coordinate system a, and ${}^{b}P_{c}$ is a vector indicating the position of c in coordinate system b.
  • ${}^{a}R_{b}$ is a rotation matrix that converts a vector represented in coordinate system b into a vector represented in coordinate system a, and ${}^{a}P_{b}$ is a vector indicating the position of b in coordinate system a.
  • ${}^{a}T_{b}$ is the $4 \times 4$ homogeneous transformation matrix combining the rotation matrix ${}^{a}R_{b}$ and the vector ${}^{a}P_{b}$, so that ${}^{a}P_{c} = {}^{a}R_{b}\,{}^{b}P_{c} + {}^{a}P_{b}$, i.e. ${}^{a}T_{b} = \begin{pmatrix} {}^{a}R_{b} & {}^{a}P_{b} \\ 0 & 1 \end{pmatrix}$ applied to homogeneous coordinates (a code sketch of this conversion follows below).
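  • The following Python sketch illustrates this conversion with NumPy; the function names and the example frame are illustrative assumptions, not part of the patent:

```python
import numpy as np

def make_transform(R_ab: np.ndarray, p_ab: np.ndarray) -> np.ndarray:
    """Build the 4x4 homogeneous matrix aTb from rotation aRb and offset aPb."""
    T = np.eye(4)
    T[:3, :3] = R_ab
    T[:3, 3] = p_ab
    return T

def local_to_global(T_ab: np.ndarray, p_c_local: np.ndarray) -> np.ndarray:
    """Convert bPc (position of c in local system b) into aPc (global system a)."""
    p_h = np.append(p_c_local, 1.0)   # switch to homogeneous coordinates
    return (T_ab @ p_h)[:3]

# Example: local frame rotated 90 degrees about the z-axis, origin at (1, 2, 0).
theta = np.pi / 2
R_ab = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                 [np.sin(theta),  np.cos(theta), 0.0],
                 [0.0,            0.0,           1.0]])
T_ab = make_transform(R_ab, np.array([1.0, 2.0, 0.0]))
print(local_to_global(T_ab, np.array([1.0, 0.0, 0.0])))  # ~ [1. 3. 0.]
```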
  • The object generation unit 15 executes AR-related processing, such as generating virtual objects and handling the service provided to the user.
  • For example, the application program 27 in the storage unit 5 can be executed on the processing circuit 4 to provide the functions of the object generation unit 15 to the user.
  • The object generation unit 15 executes the client-side processing of the service.
  • The service program 34 of the management server 30 executes the server-side processing of the service.
  • When the service program 34 of the management server 30 transmits a command requesting generation of a virtual object at predetermined global coordinates, the object generation unit 15 generates the virtual object at the first local coordinates obtained by converting those global coordinates. The user can then visually recognize the virtual object through the display unit 7.
  • The map selection unit 13 selects, based on the evaluation data described later, the map that the object generation unit 15 uses to display the virtual object. When the map selection unit 13 selects a map, the coordinate system used for displaying the virtual object is also determined.
  • View V1 and view V2 of FIG. 9 may be realized by a combination of an image or video displayed on the display unit 7 and a background transmitted through the display unit 7. Further, the view V1 and the view V2 may be realized only by the image or the video displayed on the display unit 7.
  • The service program 34 of the management server 30 transmits a command to generate the virtual object A1 at predetermined global coordinates to the wearable computers 1A and 1B.
  • The virtual object A1 is a character that appears in the service.
  • The virtual object A1 may be a still image or an animation.
  • Each of the wearable computers 1A and 1B converts the global coordinates into local coordinates and displays the virtual object A1, which is a character, on the display unit 7.
  • Because the virtual object A1 is displayed at a position shifted away from the top of the object OBJ, the user U2 cannot recognize that the character is emerging from the opening of the jar.
  • The body shapes and heights of the users of the service are diverse, and each user behaves differently, so the areas of the space toward which each user's line of sight is directed will differ. Therefore, a different local map is generated for each wearable computer. If different wearable computers use different local maps, the appearance of a virtual object may differ from user to user, as in the example of FIG. 9, and the quality of the service may deteriorate. For example, when the virtual object is an enemy character, the enemy character looks different to each user. In that case, it becomes difficult for multiple users playing an action game to cooperate in fighting the enemy character, and user satisfaction drops. It is therefore desirable to suppress the divergence in the appearance of a virtual object among multiple users, in accordance with the circumstances of the service.
  • The evaluation data generation unit 14 generates the evaluation data.
  • The flowchart of FIG. 10 shows an example of the evaluation data generation process. Hereinafter, the process will be described with reference to the flowchart of FIG. 10.
  • The evaluation data generation unit 14 calculates the similarity between the point cloud data in the vicinity of the first local coordinates in the first local map and the point cloud data in the vicinity of the corresponding global coordinates in the global map (step S104).
  • In step S104, for example, a similarity value can be calculated for each mesh into which the space is divided based on the first local coordinate system (a code sketch of such a per-mesh similarity follows below).
  • Data D1 in FIG. 11 shows an example of the similarity between the global map and the local map calculated in step S104. In the example of FIG. 11, a similarity value is shown for each two-dimensional region; however, the actually calculated similarity may be associated with coordinates in three-dimensional space or with meshes in three-dimensional space.
  • The evaluation data generation unit 14 then acquires the attribute information in the vicinity of the first local coordinates from the program that uses the map (step S105). For example, the evaluation data generation unit 14 may acquire the attribute information from the object generation unit 15 in step S105. As with the similarity data calculated in step S104, data in which a value is set for each mesh dividing the space based on the first local coordinate system can be used as the attribute information.
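  • A minimal sketch of one way to compute such a per-mesh similarity, assuming the point clouds are N x 3 NumPy arrays; the nearest-neighbor matching criterion and all parameter values are assumptions, since the patent does not prescribe a similarity measure:

```python
import numpy as np
from scipy.spatial import cKDTree

def mesh_similarity(local_pts, global_pts, mesh_size=1.0, tol=0.1):
    """For each mesh (voxel) of the local map, score how well the local points
    agree with the global map: the fraction of local points in that voxel
    that have a global-map point within `tol` meters."""
    dists, _ = cKDTree(global_pts).query(local_pts)  # nearest global point per local point
    matched = dists <= tol
    voxels = np.floor(local_pts / mesh_size).astype(int)
    counts = {}
    for v, m in zip(map(tuple, voxels), matched):
        hit, n = counts.get(v, (0, 0))
        counts[v] = (hit + int(m), n + 1)
    return {v: hit / n for v, (hit, n) in counts.items()}
```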
  • The flowchart of FIG. 15 shows an example of the map selection process.
  • Hereinafter, the process will be described with reference to the flowchart of FIG. 15.
  • In the explanation of the flowchart, a statement that data is acquired covers not only acquisition of the entire data concerned but also acquisition of a part of it.
  • The process to be executed branches according to the determination result in step S203.
  • When the determination in step S203 is negative, the object generation unit 15 uses the first local map (first local map data 21) to generate the virtual object (step S208). If the first evaluation value is equal to or less than the threshold value, it is assumed either that there is little need to generate the virtual object using the same local map as other users, or that even if the first local map is used, the deviation in the appearance of the virtual object between users will not grow large.
  • When the first evaluation value of the self-position in the evaluation data is larger than the threshold value (YES in step S203), the map acquisition unit 12 acquires the second local map (second local map data 22) and the second evaluation data (second evaluation data 25) from another information processing device (step S204). If the first evaluation value is larger than the threshold value, it is assumed either that there is a large deviation in the appearance of the virtual object between users, or that there is a strong need to generate the virtual object using the same local map as other users.
  • The information processing device from which the second local map data 22 and the second evaluation data 25 are acquired is not particularly limited.
  • For example, in step S204 the map acquisition unit 12 can acquire the second local map data 22 and the second evaluation data 25 from at least one of the information processing devices in the information processing system 100, or from a plurality of information processing devices.
  • The information processing device 1 may select the other information processing device from which the second local map data 22 and the second evaluation data 25 are acquired.
  • For example, the map acquisition unit 12 may acquire the second local map data 22 and the second evaluation data 25 from the information processing device whose beacon signal, received via the communication circuit 6, is strongest, or from one or more information processing devices selected based on beacon signal strength, or from an information processing device designated by the user or by the object generation unit 15. This makes it possible to select the acquisition source of the local map based on each user's position and on the processing in the object generation unit 15.
  • The map selection unit 13 uses the second local map (second local map data 22) and the second evaluation data (second evaluation data 25) to obtain the second evaluation value of the self-position (step S205).
  • The evaluation value obtained based on the second local coordinates is referred to as the second evaluation value.
  • First, the map selection unit 13 obtains the second local coordinates corresponding to the self-position based on the first local coordinates corresponding to the self-position.
  • For example, the map selection unit 13 may perform the coordinate transformation described above to calculate the second local coordinates corresponding to the self-position, or it may calculate them based on the shapes and positions of objects included in the first local map data 21 and the second local map data 22; for example, pattern matching based on feature amounts may be used to search the second local map data 22 for the second local coordinates corresponding to the self-position. The method by which the map selection unit 13 calculates the second local coordinates does not matter.
  • Next, the map selection unit 13 searches the second evaluation data 25 for the second local coordinates corresponding to the self-position and thereby obtains the second evaluation value of the self-position. The map selection unit 13 then determines whether the second evaluation value of the self-position is larger than the first evaluation value of the self-position (step S206).
  • When the second local map data 22 and the second evaluation data 25 have been acquired from a plurality of information processing devices, the map selection unit 13 can execute the processes of steps S205 and S206 for each of them.
  • In that case, in step S207 the object generation unit 15 may generate the virtual object using the second local map data 22 of the information processing device for which the value obtained by subtracting the first evaluation value from the second evaluation value is largest. Alternatively, once an information processing device for which the determination in step S206 is affirmative is detected, the virtual object may be generated using the second local map data 22 acquired from that device (the selection logic is sketched below).
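  • The following sketch condenses steps S202 to S208, reusing per-mesh evaluation dictionaries like those produced by the similarity sketch above; `mesh_key`, `select_map`, the peer interface, and the threshold value are hypothetical, not the patent's API:

```python
import numpy as np

def mesh_key(pos, mesh_size=1.0):
    """Quantize a position to the key of the mesh (voxel) containing it."""
    return tuple(np.floor(np.asarray(pos) / mesh_size).astype(int))

def select_map(self_pos, first_eval, peers, threshold=0.5):
    """Pick the local map used to generate a virtual object.
    first_eval: {mesh key: first evaluation value} for this device's map.
    peers: iterable of (second_map, second_eval) pairs from other devices."""
    v1 = first_eval.get(mesh_key(self_pos), 0.0)       # S202
    if v1 <= threshold:                                # S203 negative
        return "first local map"                       # S208
    best, best_gain = "first local map", 0.0
    for second_map, second_eval in peers:              # S204
        v2 = second_eval.get(mesh_key(self_pos), 0.0)  # S205
        if v2 > v1 and (v2 - v1) > best_gain:          # S206
            best, best_gain = second_map, v2 - v1      # S207: largest v2 - v1
    return best
```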
  • FIG. 16 shows an example of how a virtual object looks in the information processing system according to the present disclosure.
  • The user U1 wearing the wearable computer 1A and the user U2 wearing the wearable computer 1B are facing the object OBJ in the same positions and postures as in FIG. 9. As in FIG. 9, the object OBJ is a jar with an opening at the top.
  • View V1 shows an example of the view from user U1.
  • View V2 shows an example of the view from user U2.
  • The service program 34 of the management server 30 transmits a command to generate the virtual object A1 at predetermined global coordinates to the wearable computers 1A and 1B, as in FIG. 9.
  • The wearable computer 1A and the wearable computer 1B execute the processing of the flowchart of FIG. 15 and select the local map (local coordinates) to be used when generating the virtual object A1. Each of them therefore generates the virtual object A1 using either the first local map or the second local map.
  • With this information processing device and information processing system, it is possible to eliminate differences in the appearance of virtual objects between users and to improve the quality of services using AR. Multiple users can share experiences in an augmented reality space. For example, when the virtual object is an enemy character, multiple users playing an action game can cooperate in fighting it.
  • The information processing device may include a map generation unit that generates a first map based on an image, a map acquisition unit that acquires a second map from another device, and an object generation unit that generates a virtual object using the first map or the second map.
  • The first local map data 21 is an example of the first map.
  • The second local map data 22 is an example of the second map.
  • The wearable computer 1B corresponds to the other device.
  • The information processing device may further include an evaluation data generation unit that generates the first evaluation data based on the first map and a third map used for service management, and a map selection unit that selects, based on the first evaluation data, the first map or the second map that the object generation unit uses to generate the virtual object.
  • The global map data 23 and 36 described above are examples of the third map.
  • The evaluation data generation unit may generate the first evaluation data based on the similarity between the first map and the third map.
  • The map acquisition unit may acquire the third map from the server.
  • The management server 30 described above is an example of the server.
  • The map selection unit may select the first map when the first evaluation value is equal to or less than the threshold value. The map acquisition unit may acquire the second evaluation data corresponding to the second map from the other device; the map selection unit may then acquire the second evaluation value corresponding to the self-position from the second evaluation data, select the second map when the second evaluation value is larger than the first evaluation value, and select the first map when the second evaluation value is equal to or less than the first evaluation value.
  • The information processing system includes a first information processing device that generates a first map based on a first image and a second information processing device that generates a second map based on a second image.
  • The first information processing device generates a virtual object based on the first map or on the second map acquired from the second information processing device.
  • The second information processing device generates the virtual object based on the second map or on the first map acquired from the first information processing device.
  • The first information processing device corresponds to the wearable computer 1A.
  • The second information processing device corresponds to the wearable computer 1B.
  • The first map is, for example, the first local map data 21 generated by the local map generation unit 10 of the first information processing device.
  • The second map is, for example, the second local map data 22 stored in the storage unit 5 of the first information processing device.
  • The second map also corresponds to the first local map data 21 generated by the local map generation unit 10 of the second information processing device.
  • The first information processing device may estimate a first self-position, acquire a first evaluation value corresponding to the first self-position from the first evaluation data, and select, based on the first evaluation value, the first map or the second map used to generate the virtual object.
  • Likewise, the second information processing device may estimate a second self-position, acquire a third evaluation value corresponding to the second self-position from the second evaluation data, and select, based on the third evaluation value, the first map or the second map used to generate the virtual object.
  • The third evaluation value corresponds to the first evaluation value in steps S202 and S203 of FIG. 15.
  • The first self-position and the second self-position are estimated by, for example, the self-position estimation unit 11 in each information processing device.
  • The flowchart of FIG. 17 shows an example of the virtual object display process in Modification 1. Hereinafter, the process will be described with reference to the flowchart of FIG. 17.
  • First, the self-position estimation unit 11 estimates the self-position of the information processing device 1 (step S211). For example, the self-position estimation unit 11 can obtain the first local coordinates corresponding to the self-position of the information processing device 1 in step S211.
  • Next, the map selection unit 13 obtains the first evaluation value of the self-position from the first evaluation data (first evaluation data 24) (step S212). For example, the map selection unit 13 searches the first evaluation data 24 for the mesh containing the first local coordinates corresponding to the self-position and thereby obtains the evaluation value set for that mesh.
  • The map selection unit 13 then determines whether the first evaluation value of the self-position in the evaluation data is larger than the threshold value (step S213). For example, when the data D4 (FIG. 14) is used as the first evaluation data, a value larger than 0 and smaller than 1 can be used as the threshold value; a different threshold may also be used.
  • FIG. 18 shows an example of how the virtual object looks in Modification 1.
  • The user U1 wearing the wearable computer 1A and the user U2 wearing the wearable computer 1B are facing the object OBJ in the same positions and postures as in FIGS. 9 and 16.
  • As in those figures, the object OBJ is a jar with an opening at the top.
  • View V1 shows an example of the view from user U1.
  • View V2 shows an example of the view from user U2.
  • The service program 34 of the management server 30 transmits a command to generate the virtual object A1 at predetermined global coordinates to the wearable computers 1A and 1B, as in FIGS. 9 and 16.
  • The wearable computer 1A and the wearable computer 1B execute the processing of the flowchart of FIG. 17 and determine whether to display the generated virtual object larger than the specified size.
  • In view V1, the virtual object A1 is displayed at the specified size.
  • The display deviation is small, and the virtual object A1, which is a character, appears at a position corresponding to the top of the jar. Therefore, the user U1 can see the character popping out of the opening of the jar, as the service intends.
  • In view V2, the virtual object A1 is displayed larger than the specified size.
  • Here, the virtual object A1 is enlarged to 1.5 times the specified size (the size in FIG. 9). As in FIG. 9, the display position of the character virtual object A1 is shifted, but it is difficult for the user U2 to notice this shift.
  • In this example, the virtual object A1 is enlarged in the three directions of the x-axis, the y-axis, and the z-axis.
  • However, the object generation unit 15 may enlarge the virtual object along only some of the axes, or may execute predetermined arithmetic processing on the virtual object to change its shape before displaying it.
  • In this way, the information processing device may control at least one of the size and the shape of the virtual object displayed on the display unit 7 according to the first evaluation value of the self-position in the first evaluation data.
  • The processing of the flowchart of FIG. 17 also makes it possible to eliminate deviations in the appearance of virtual objects between users and to improve the quality of services using AR.
  • That is, the object generation unit may generate a virtual object with a larger size when the first evaluation value is larger than the threshold value than when the first evaluation value is equal to or less than the threshold value.
  • Similarly, the first information processing device may generate a virtual object with a larger size when the first evaluation value is larger than the threshold value than when it is equal to or less than the threshold value, and the second information processing device may generate a virtual object with a larger size when the third evaluation value is larger than the threshold value than when it is equal to or less than the threshold value (a sketch of this rule follows below).
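  • A minimal sketch of this enlargement rule; the 1.5x factor follows the FIG. 18 example, while the threshold, the example evaluation value, and the base size are assumed values:

```python
def display_scale(eval_value: float, threshold: float = 0.5, enlarge: float = 1.5) -> float:
    """Return the display scale factor for the virtual object: enlarge it when
    the first evaluation value of the self-position exceeds the threshold, so
    that any display misalignment is harder for the user to notice."""
    return enlarge if eval_value > threshold else 1.0

# Usage: scale a hypothetical character size (in meters) along all three axes.
base_size = (1.0, 1.0, 2.0)
s = display_scale(eval_value=0.8)        # evaluation value above the threshold
size = tuple(d * s for d in base_size)   # -> (1.5, 1.5, 3.0)
```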
  • The information processing device 1 may execute the processing of the flowchart of FIG. 15 in combination with that of FIG. 17. That is, the information processing device according to the present disclosure may combine the process of selecting the local map used to display a virtual object with the process of controlling at least one of the size and the shape of the displayed virtual object.
  • The transmissive member 17 may be coated with a coating that blocks a specific wavelength band of electromagnetic waves, such as infrared or ultraviolet components, or a film that blocks a specific wavelength band may be attached to the transmissive member 17. The head-mounted display 1C of FIG. 19 also includes an earphone 16, which reproduces audio based on the audio signal output by the audio circuit 8.
  • The configuration of the head-mounted display shown in FIG. 19 is only an example, so a head-mounted display with a different shape may be used, and a head-mounted display without an audio device such as an earphone may also be used. For example, the user may use external earphones or headphones that receive the audio signal transmitted by the head-mounted display via wireless communication (e.g., Bluetooth).
  • FIG. 20 shows an example of the configuration of the display unit.
  • The display unit 7 of FIG. 20 includes an illuminance sensor 3A, a dimming element 18, and a display 19.
  • The illuminance sensor 3A detects the ambient brightness around the head-mounted display 1C.
  • The dimming element 18 is, for example, a plate-shaped element containing an electrochromic material.
  • The voltage applied to the dimming element 18 is adjusted according to the measured value of the illuminance sensor 3A. For example, the applied voltage can be increased when the measured value of the illuminance sensor 3A rises and decreased when it falls (a sketch of such a mapping follows below).
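  • One possible mapping is sketched below; the linear relationship and the sensor and voltage ranges are assumptions, not values from the patent:

```python
def dimming_voltage(lux: float, lux_min: float = 50.0, lux_max: float = 10000.0,
                    v_max: float = 3.0) -> float:
    """Map the illuminance sensor 3A reading to a drive voltage for the dimming
    element 18: brighter surroundings -> higher voltage -> darker element."""
    lux = min(max(lux, lux_min), lux_max)   # clamp to the assumed sensor range
    return v_max * (lux - lux_min) / (lux_max - lux_min)

print(dimming_voltage(5025.0))  # mid-range brightness -> 1.5 V
```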
  • The processing for superimposed display includes, for example, image generation processing, delay compensation processing, and sound image generation processing.
  • The information processing device may include a display unit that superimposes the virtual object generated by the object generation unit on the background.
  • The display unit may include a transmissive display.
  • The information processing device may be a wearable computer.
  • With the information processing device, information processing system, and information processing method according to the present disclosure, it is possible to reduce deviations in the appearance of virtual objects among multiple users. As a result, the quality of services using AR and the satisfaction of users can be improved, and multiple users can share experiences in an augmented reality space; in an action game, for example, multiple users can cooperate in fighting an enemy character. Since the map to be used can be selected for each position in space, maps can be used in accordance with the circumstances of the service, and since the map in use can be switched at the timing the service requires, the impact on processing load and response speed can be suppressed.
  • The present technology can also have the following configurations.
  • (1) An information processing device including: a map generation unit that generates a first map based on an image; a map acquisition unit that acquires a second map from another device; and an object generation unit that generates a virtual object using the first map or the second map.
  • (2) The information processing device according to (1), further including: an evaluation data generation unit that generates first evaluation data based on the first map and a third map used for managing a service; and a map selection unit that selects, based on the first evaluation data, the first map or the second map that the object generation unit uses to generate the virtual object.
  • (3) The information processing device according to (2), wherein the map selection unit acquires a first evaluation value corresponding to the self-position from the first evaluation data and selects, based on the first evaluation value, the first map or the second map that the object generation unit uses to generate the virtual object.
  • (4) The information processing device according to (3), wherein the map selection unit selects the first map when the first evaluation value is equal to or less than a threshold value.
  • (5) The information processing device according to (3) or (4), wherein the map acquisition unit acquires second evaluation data corresponding to the second map from the other device, and the map selection unit acquires a second evaluation value corresponding to the self-position from the second evaluation data and selects the second map when the second evaluation value is larger than the first evaluation value.
  • (10) The information processing device according to any one of (1) to (9).
  • (11) The information processing device according to (10), further including a sensor unit that measures at least one of the direction in which the imaging unit faces and the inclination angle of the imaging unit, wherein the map generation unit generates the first map using the measured values of the sensor unit.
  • (12) A display unit that superimposes the virtual object generated by the object generation unit on the background is further provided.
  • The display unit includes a transmissive display.
  • (13) The information processing device according to (12), which is a wearable computer.
  • (14) An information processing system including: a first information processing device that generates a first map based on a first image; and a second information processing device that generates a second map based on a second image,
  • wherein the first information processing device generates a virtual object based on the first map or on the second map acquired from the second information processing device,
  • and the second information processing device generates the virtual object based on the second map or on the first map acquired from the first information processing device.
  • (15) The information processing system according to (14), further including a server that manages a service using a third map,
  • wherein the first information processing device generates first evaluation data based on the first map and the third map, selects the first map or the second map based on the first evaluation data, and uses the selected map to generate the virtual object,
  • and the second information processing device generates second evaluation data based on the second map and the third map, selects the second map or the first map based on the second evaluation data, and uses the selected map to generate the virtual object.
  • (19) An information processing method that causes a computer to execute: a step of generating a first map based on an image; a step of acquiring a second map from another device; and a step of generating a virtual object using the first map or the second map.
  • (20) The information processing method according to (19), further including: a step of generating first evaluation data based on the first map and a third map used for managing a service; and a step of selecting, based on the first evaluation data, the first map or the second map used for generating the virtual object.
  • 1 Information processing device; 1A, 1B Wearable computer; 1C Head-mounted display; 2 Imaging unit; 3 Sensor unit; 3A Illuminance sensor; 4 Processing circuit; 5 Storage unit; 6, 32 Communication circuit; 7 Display unit; 8 Audio circuit; 10 Local map generation unit; 11 Self-position estimation unit; 12 Map acquisition unit; 13 Map selection unit; 14 Evaluation data generation unit; 15 Object generation unit; 16 Earphone; 17 Transmissive member; 18 Dimming element; 19 Display; 30 Management server; 31 Bus; 33 Processor; 34 Service program; 35 Storage; 36 Global map data; 40 Network; 41 Base station

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An object of the present invention is to provide an information processing device, an information processing system, and an information processing method capable of improving the quality of a service using augmented reality and the level of user satisfaction. To this end, an information processing device according to the present invention comprises: a map generation unit that generates a first map on the basis of an image; a map acquisition unit that acquires a second map from another device; an object generation unit that uses the first map or the second map to generate a virtual object; and a display unit that displays the virtual object generated by the object generation unit, superimposed on a background.
PCT/JP2020/046823 2019-12-19 2020-12-15 Information processing device, information processing system, and information processing method WO2021125190A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-229580 2019-12-19
JP2019229580 2019-12-19

Publications (1)

Publication Number Publication Date
WO2021125190A1 (fr)

Family

ID=76478796

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/046823 WO2021125190A1 (fr) 2019-12-19 2020-12-15 Information processing device, information processing system, and information processing method

Country Status (1)

Country Link
WO (1) WO2021125190A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011186808A (ja) * 2010-03-09 2011-09-22 Sony Corp Information processing device, map update method, program, and information processing system
US20170243403A1 (en) * 2014-11-11 2017-08-24 Bent Image Lab, Llc Real-time shared augmented reality experience


Similar Documents

Publication Publication Date Title
US20210407160A1 (en) Method and sytem for presenting a digital information related to a real object
US11157070B2 (en) Massive simultaneous remote digital presence world
CN108780578B (zh) 增强现实系统和操作增强现实系统的方法
US10482662B2 (en) Systems and methods for mixed reality transitions
CN107209950B (zh) 从现实世界材料自动生成虚拟材料
EP3666352B1 Method and device for augmented and virtual reality
US11204639B2 (en) Artificial reality system having multiple modes of engagement
US10607403B2 (en) Shadows for inserted content
US11069075B2 (en) Machine learning inference on gravity aligned imagery
CN116778368A Plane detection using semantic segmentation
CN111602104A Method and device for presenting synthesized reality content in association with a recognized object
EP4134917A1 Imaging systems and methods for facilitating local lighting
US11699412B2 (en) Application programming interface for setting the prominence of user interface elements
WO2021125190A1 (fr) Information processing device, information processing system, and information processing method
JP2015118578A (ja) Augmented reality information details
JP7400810B2 (ja) Information processing device, information processing method, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20901550

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20901550

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP