WO2018116377A1 - Processing device, method, program, and processing system for superimposing an object on an image obtained by capturing a real space - Google Patents

Processing device, method, program, and processing system for superimposing an object on an image obtained by capturing a real space

Info

Publication number
WO2018116377A1
Authority
WO
WIPO (PCT)
Prior art keywords
terminal device
processing
unit
receiving
object information
Prior art date
Application number
PCT/JP2016/087958
Other languages
English (en)
Japanese (ja)
Inventor
謙太朗 茂出木
Original Assignee
株式会社キッズプレート
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社キッズプレート filed Critical 株式会社キッズプレート
Priority to PCT/JP2016/087958 priority Critical patent/WO2018116377A1/fr
Publication of WO2018116377A1 publication Critical patent/WO2018116377A1/fr


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 13/00 Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 11/00 Telephonic communication systems specially adapted for combination with other electrical systems

Definitions

  • the present invention relates to a processing device, a method, a program, and a processing system for performing processing for superimposing an object on a video obtained by imaging a real space.
  • AR (augmented reality) is broadly divided into a location-based type, which superimposes additional information on video using position information that can be acquired from GPS or the like, and a vision-based type, which superimposes additional information on video by directly recognizing and analyzing the environment in front of the user with technologies such as image recognition and space recognition. VR (virtual reality) presents the user with a virtual space instead of the real one.
  • Patent Document 1 discloses a portable information terminal that uses location-based AR to display a message corresponding to a real-space situation that changes over time. Specifically, the portable information terminal includes a GPS module and a position information calculation module that detect the current position of the terminal; a detailed information acquisition module that obtains the imaging range of the camera module and measures the distance from the absolute position of the terminal to each object; a module that acquires message data and its absolute position information from a message server; and a control unit that, based on the distance to each object and the absolute position information of the message, determines the display range of each object and the drawing range of the message, judges their front-back relationship and overlap, and displays the result.
  • However, the information that the portable information terminal described in Patent Document 1 adds to the video is static: it is a message previously associated by a user with various objects in the real space, and it provides no effect that lets users act on it and enjoy its changes.
  • An object of the present invention is to provide a processing device, a method, a program, and a processing system for handling a dynamic object that users can act on and whose changes they can enjoy, by linking augmented reality and virtual reality with each other at a predetermined place in the real space.
  • A processing device according to one aspect of the present invention includes an object storage unit that stores object information, a first control unit that communicates with a first terminal device that performs processing related to augmented reality, and a second control unit that communicates with a second terminal device that performs processing related to virtual reality.
  • The first control unit includes a first receiving unit that receives a request from the first terminal device, and a first transmission unit that transmits the object information stored in the object storage unit to the first terminal device in response to the request.
  • The second control unit includes a second receiving unit that receives an operation signal for the object from the second terminal device, a second object processing unit that performs processing corresponding to the operation signal received by the second receiving unit on the object and reflects the result in the object information stored in the object storage unit, and a second transmission unit that transmits the object information processed by the second object processing unit to the second terminal device. The first transmission unit also transmits the object information after the processing by the second object processing unit to the first terminal device.
  • A processing method according to one aspect of the present invention includes a first processing step of communicating with a first terminal device that performs processing related to augmented reality, and a second processing step of communicating with a second terminal device that performs processing related to virtual reality.
  • The first processing step includes a first reception step of receiving a request from the first terminal device, and a first transmission step of transmitting object information stored in an object storage unit to the first terminal device in response to the request.
  • The second processing step includes a second reception step of receiving an operation signal for the object from the second terminal device, a second object processing step of performing processing corresponding to the operation signal received in the second reception step on the object and reflecting the result in the object information stored in the object storage unit, and a second transmission step of transmitting the object information after the processing by the second object processing step to the second terminal device. In the first transmission step, the object information after the processing by the second object processing step is transmitted to the first terminal device.
  • A processing program according to one aspect of the present invention includes a first processing step of communicating with a first terminal device that performs processing related to augmented reality, and a second processing step of communicating with a second terminal device that performs processing related to virtual reality.
  • The first processing step includes a first reception step of receiving a request from the first terminal device, and a first transmission step of transmitting object information stored in an object storage unit to the first terminal device in response to the request.
  • The second processing step includes a second reception step of receiving an operation signal for the object from the second terminal device, a second object processing step of performing processing corresponding to the operation signal on the object and reflecting the result in the object information stored in the object storage unit, and a second transmission step of transmitting the processed object information to the second terminal device. In the first transmission step, the object information after the processing by the second object processing step is transmitted to the first terminal device.
  • A processing system according to one aspect of the present invention includes a processing device having an object storage unit that stores object information, a first control unit that communicates with a first terminal device that performs processing related to augmented reality, and a second control unit that communicates with a second terminal device that performs processing related to virtual reality, together with a plurality of terminal devices connected via a network.
  • The first control unit includes a first receiving unit that receives a request from the first terminal device, and a first transmission unit that transmits the object information stored in the object storage unit to the first terminal device in response to the request; the first transmission unit also transmits the object information processed by the second object processing unit to the first terminal device.
  • The processing system 1 links augmented reality (AR) technology and virtual reality (VR) technology, and provides a service in which an object superimposed on the real space at a predetermined place by AR or VR can be affected, for example by changing the shape of the object.
  • the processing system 1 performs the following processing.
  • “Process 1” In one terminal device, an application that can receive a service using AR (hereinafter, AR application) is activated, video of the real space acquired by the camera is displayed on the display, and an object acquired from the processing device is superimposed on the video.
  • “Process 2” In another terminal device, an application that can receive a service using VR (hereinafter, VR application) is activated, the object acquired from the processing device is displayed on the display, and a predetermined operation is performed. For example, the shape of the object changes according to the predetermined operation.
  • “Process 3” In the first terminal device, the shape and position of the displayed object change according to the change made to the object in “Process 2”.
  • The processing system 1 includes a processing device 2, a first terminal device 5, and a second terminal device 7, which are connected via a network N.
  • The first terminal device 5 is a device, for example a smartphone or a tablet, that includes a camera 52 capable of imaging a subject and a display 51 that displays the subject imaged by the camera 52.
  • FIG. 2A illustrates the front surface of the first terminal device 5, and FIG. 2B illustrates the back surface of the first terminal device 5.
  • The first terminal device 5 incorporates a sensor that detects the position of the device (for example, a GPS sensor), a sensor that detects the orientation and inclination of the device (for example, an acceleration sensor), and a sensor that detects the direction (for example, a gyro sensor).
  • An AR application that can receive an AR service from the processing device 2 is installed on the first terminal device 5.
  • The AR application can grasp the posture (tilt) of the camera 52 and the direction in which its optical axis points based on the signals from the sensors built into the first terminal device 5.
  • Using the AR application, the first terminal device 5 continuously acquires and displays video of the real space with the camera 52 while superimposing objects provided from the processing device 2 on the continuously displayed video.
  • the second terminal device 7 has the same configuration as the first terminal device 5 although not shown.
  • A VR application that can receive a VR service from the processing device 2 is installed on the second terminal device 7.
  • The second terminal device 7 can receive a VR service (a game or the like) provided from the processing device 2 using the VR application by being set in a dedicated VR viewer or by attaching VR goggles.
  • Both the AR application and the VR application may be installed on a single terminal device.
  • For example, when an area where an object appears by AR is near the user's home, the user can participate in the game outdoors using the AR application; when such an area is far from home, the user can participate in the game at home using the VR application.
  • In the following description, a terminal device is referred to as the first terminal device 5 when the AR application is used, and as the second terminal device 7 when the VR application is used.
  • The first terminal device 5 may mean an individual first terminal device or the entire plurality of first terminal devices; likewise, the second terminal device 7 may mean an individual second terminal device or the entire plurality of second terminal devices.
  • the processing device 2 includes an object storage unit 10, a first control unit 20, and a second control unit 30.
  • the object storage unit 10 stores object information.
  • The first control unit 20 communicates with the first terminal device 5, which performs processing related to augmented reality (AR).
  • The second control unit 30 communicates with the second terminal device 7, which performs processing related to virtual reality (VR).
  • The object information includes information on the position on the map where the object virtually appears (hereinafter, position information), information on the shape of the object (hereinafter, shape information), and the like.
  • the object is data formed in a predetermined format, and various forms are conceivable depending on the use of the service.
  • the predetermined format is, for example, a 3D file format such as FBX or OBJ, but may be another format (VRML, X3D, DXF, etc.).
  • When the service is a game (AR game or VR game), the object is, for example, a spaceship or a fighter plane as shown in the figures. When the service is an advertisement (AR advertisement or VR advertisement), the object is, for example, an ad balloon, an airship, or a billboard that displays advertising text (for example, "XX department store sale!" as shown in FIGS. 3C, 3D, and 3E). Services are not limited to games and advertisements.
  • The position information is information indicating the position on the map where the object virtually appears (the position may be anywhere in the world, not only in Japan). For example, by using a service that provides map information, the place where an object virtually appears by AR can be specified on the map.
  • the object storage unit 10 stores a table in which information (latitude, longitude, height) of a place where an object virtually appears is associated with an object ID.
  • The shape information is information indicating the three-dimensional shape or state of the object. Although details will be described later, the shape and state of the object change according to operations by the user of the first terminal device 5 or the second terminal device 7. That is, the shape information is a concept that includes the shape or state of the original object before a change and the shape or state of the object after the change.
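  • As a concrete (non-limiting) illustration, the object information described above can be pictured as one record per object ID. The Python sketch below uses hypothetical field names; the description only fixes that a position (latitude, longitude, height), an appearance area, and shape data in a 3D format such as FBX or OBJ are stored.

```python
from dataclasses import dataclass

@dataclass
class ObjectInfo:
    """One entry of the object storage unit 10 (field names are illustrative)."""
    object_id: str
    latitude: float        # appearance position on the map
    longitude: float
    height_m: float        # height at which the object virtually appears
    radius_m: float        # radius r of the appearance area A
    model_path: str        # shape data, e.g. an FBX or OBJ file
    state: str = "intact"  # changeable state, e.g. "intact", "smoking", "tilted"

# Table associating the appearance place (latitude, longitude, height)
# with an object ID, as described for the object storage unit 10.
object_storage: dict[str, ObjectInfo] = {
    "ufo-001": ObjectInfo("ufo-001", 40.7580, -73.9855, 120.0, 100.0, "ufo.fbx"),
}
```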
  • the first control unit 20 includes a first reception unit 21, a first transmission unit 22, and a first object processing unit 23.
  • the second control unit 30 includes a second reception unit 31, a second transmission unit 32, and a second object processing unit 33.
  • the first receiving unit 21 receives a request from the first terminal device 5.
  • the first transmission unit 22 transmits the object information stored in the object storage unit 10 to the first terminal device 5 in response to the request.
  • the second receiving unit 31 receives an operation signal for the object from the second terminal device 7.
  • Here, it is assumed that the second terminal device 7 has activated the VR application, is participating in a game provided by the processing device 2, and performs an operation (for example, an attack) on an object in the game.
  • The second object processing unit 33 performs processing corresponding to the operation signal received by the second receiving unit 31 on the object and reflects the result in the object information stored in the object storage unit 10. Specifically, the second object processing unit 33 changes the position or the state of the object based on the processing corresponding to the operation signal. For example, when the operation signal is a signal for attacking the object, the second object processing unit 33 performs processing in which a part of the object is damaged by the attack (for example, processing in which white smoke rises).
  • The second transmission unit 32 transmits the object information (information on the shape and position of the object) after processing by the second object processing unit 33 to the second terminal device 7. That is, the display 71 of the second terminal device 7 changes from the state in which the object X2 is displayed in the game space X1, as shown in FIG. 5A, to the state in which the object X2', now emitting white smoke in response to the attack, is displayed, as shown in FIG. 5B.
  • FIG. 5 shows an example in which the displayed image is composed of a left-eye screen and a right-eye screen so that it can be viewed stereoscopically in VR.
  • The first transmission unit 22 likewise transmits the object information (information on the shape and position of the object) after processing by the second object processing unit 33 to the first terminal device 5. That is, the display 51 of the first terminal device 5 changes from the state in which the object X2 is superimposed on the real-space video X3, as shown in FIG. 6A, to the state in which the attacked object X2', emitting white smoke, is displayed, as shown in FIG. 6B.
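  • The round trip just described (receive an operation signal, update the stored object, push the updated object information to both the VR side and the AR side) could be sketched as below. This is a minimal illustration reusing the hypothetical ObjectInfo record from the earlier sketch; the terminal objects and their send method are assumptions, since the description does not prescribe any wire format or API.

```python
def handle_operation_signal(signal: dict, storage: dict,
                            ar_terminals: list, vr_terminals: list) -> None:
    """Sketch of the flow: second receiving unit 31 -> second object
    processing unit 33 -> second transmission unit 32 / first transmission
    unit 22 (unit names from the description; the code is illustrative)."""
    obj = storage[signal["object_id"]]

    # Second object processing unit 33: apply the operation to the object
    # and reflect the result in the object storage unit 10.
    if signal["action"] == "attack":
        obj.state = "smoking"  # e.g. white smoke rises from part of the object

    # Second transmission unit 32: send the updated object info to VR terminals.
    for terminal in vr_terminals:
        terminal.send(obj)

    # First transmission unit 22: send the same updated info to AR terminals,
    # so the change appears on both sides in real time.
    for terminal in ar_terminals:
        terminal.send(obj)
```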
  • the first receiving unit 21 receives an operation signal for the virtually displayed object from the first terminal device 5.
  • The first object processing unit 23 performs processing corresponding to the operation signal received by the first receiving unit 21 on the object and reflects the result in the object information stored in the object storage unit 10. Specifically, the first object processing unit 23 changes the position or the state of the object based on the processing corresponding to the operation signal. For example, when the operation signal is a signal for attacking the object, the first object processing unit 23 performs processing in which the object is tilted by the attack.
  • The first transmission unit 22 transmits the object information (information on the shape and position of the object) after processing by the first object processing unit 23 to the first terminal device 5. That is, the display 51 of the first terminal device 5 changes from the state in which the object X2 is superimposed on the real-space video X3, as shown in FIG. 7A, to the state in which the tilted object X2'' is displayed, as shown in FIG. 7B.
  • The second transmission unit 32 transmits the object information (information on the shape and position of the object) after processing by the first object processing unit 23 to the second terminal device 7. That is, as on the first terminal device 5, the display 71 of the second terminal device 7 changes from the state in which the object X2 is displayed in the game space X1, as shown in FIG. 8A, to the state in which the object X2'' tilted by the attack is displayed, as shown in FIG. 8B.
  • FIG. 8 shows an example in which the displayed image is composed of a left-eye screen and a right-eye screen so that it can be viewed stereoscopically in VR.
  • In this way, in a service that links AR and VR, when the first terminal device 5 performs an operation on an object using AR, the processing system 1 reflects the processing corresponding to the operation in real time in the object displayed using VR on the second terminal device 7; conversely, when the second terminal device 7 performs an operation on the object using VR, the processing corresponding to the operation is reflected in real time in the object displayed using AR on the first terminal device 5.
  • Furthermore, since the processing system 1 transmits the object information to the first terminal device 5 and the processing related to AR is performed on the first terminal device 5 side, data is not frequently transmitted and received between the first terminal device 5 and the processing device 2, and the processing load on the processing device 2 can be reduced.
  • The processing related to AR includes, for example, arithmetic processing for determining whether the terminal is in an area where an object virtually appears by AR, and processing for recalculating the position of the object when the terminal moves.
  • In the processing system 1, a person who participates in the game using the Web with a PC or the like in a first environment 100 (a home, an event venue, or the like) (hereinafter, Web player), a person who participates in the game using AR with the first terminal device 5 in a second environment 101 (an area where an object appears by AR) (hereinafter, AR player), and a person who participates in the game using VR with the second terminal device 7 in a third environment 102 (a home, an event venue, or the like) (hereinafter, VR player) advance the game while their data are linked with one another.
  • A predetermined condition may be set for participation in the game.
  • For example, the system may be configured such that when a user participates in the game using AR and reaches a predetermined level, the user is given the authority to participate in the game using VR.
  • The Web player may be configured such that the number of objects that can be used increases according to the game level.
  • The Web player manages, on the map, the locations where objects appear.
  • The AR player can virtually display the object selected by the Web player on the display of the first terminal device 5 at a specific place (an area where the object appears by AR).
  • The VR player can display the object selected by the Web player in VR by connecting to a specific link destination (a URL or the like).
  • the first terminal device 5 downloads object information from the processing device 2 when the AR application is activated.
  • The object information includes information such as the shape of the object, information on the place where the object virtually appears (latitude, longitude, height), and information on the area where the object virtually appears (hereinafter, area information).
  • The area information is, for example, information on an area defined by a radius r centered on the appearance position (latitude, longitude) of the object.
  • the radius r is, for example, several tens of meters or hundreds of meters, and can be arbitrarily set by an administrator.
  • Here the shape of the area is described as circular; however, it is not limited to a circle and may be, for example, an ellipse or a rectangle.
  • For example, the area A in which an object appears is defined by a circle with radius r centered on the appearance position of the object (C in FIG. 10) on the map M, as shown in FIG. 10.
  • the AR player operates the first terminal device 5 to activate the AR application.
  • the AR application calculates the current position (latitude, longitude) based on the signal from the built-in GPS sensor.
  • The first terminal device 5 refers to the area information based on the calculated position and determines whether or not the terminal is in the area A on the map M. When the terminal is determined to be in the area A, the user is notified; the notification method may be displaying a predetermined message or image on the display, notification by sound or vibration, or a combination of these.
  • The first terminal device 5 also calculates the orientation and direction of the camera based on the signals from the built-in sensors.
  • When the first terminal device 5 refers to the object information (the latitude and longitude at which the object virtually appears) and determines that the camera 52 is facing the appearance position of the object, the first terminal device 5 superimposes the object on the video of the real space acquired by the camera 52.
  • the size and sound of the object may be changed depending on the distance to the appearance position of the object.
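  • A minimal sketch of the in-area determination, the camera-direction test, and a distance-dependent display size follows. It assumes a spherical-earth haversine distance (adequate for areas tens to hundreds of meters across), reuses the hypothetical ObjectInfo record from the earlier sketch, and ignores camera elevation; none of the function names come from the description.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (latitude, longitude) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def in_area(user_lat, user_lon, obj):
    """Is the terminal inside area A (a circle of radius r around position C)?"""
    return haversine_m(user_lat, user_lon, obj.latitude, obj.longitude) <= obj.radius_m

def camera_facing_object(user_lat, user_lon, camera_azimuth_deg, obj, fov_deg=60.0):
    """Does the camera's optical axis point toward the appearance position?"""
    p1, p2 = math.radians(user_lat), math.radians(obj.latitude)
    dl = math.radians(obj.longitude - user_lon)
    # Initial bearing from the terminal to the appearance position.
    bearing = math.degrees(math.atan2(
        math.sin(dl) * math.cos(p2),
        math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)))
    diff = (bearing - camera_azimuth_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2

def display_scale(user_lat, user_lon, obj):
    """Shrink the displayed object as the distance to its appearance position grows."""
    d = haversine_m(user_lat, user_lon, obj.latitude, obj.longitude)
    return 1.0 / (1.0 + d / obj.radius_m)
```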
  • The information on the place where the object virtually appears is not limited to latitude and longitude; it may instead be information indicating the range in which the object virtually appears.
  • In this case, an appearance range of the object (A1 in FIG. 11) is defined on the map M.
  • the object is displayed at the center position of the appearance range.
  • the AR player operates the first terminal device 5 to activate the AR application.
  • the AR application calculates the current position (latitude, longitude) of the terminal itself based on a signal from the built-in GPS sensor.
  • the first terminal device 5 determines whether or not it is within the appearance range of the object based on the calculated position.
  • the first terminal device 5 calculates the center position (C1 in FIG. 11) of the appearance range from the current position (latitude, longitude) of the terminal itself.
  • the first terminal device 5 performs processing so that the object is superimposed on the calculated center position.
  • In this case, the object information does not need latitude and longitude information; it is sufficient for it to contain information indicating the range in which the object virtually appears.
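  • When only a range is stored, the display position can be derived as its center. The sketch below assumes the range is given as a latitude/longitude bounding box; the description leaves the actual representation open.

```python
def range_center(lat_min: float, lat_max: float,
                 lon_min: float, lon_max: float) -> tuple[float, float]:
    """Center position C1 of a rectangular appearance range A1 on the map M.

    Assumes the range does not cross the 180th meridian, which is safe for
    the block-scale ranges discussed here.
    """
    return ((lat_min + lat_max) / 2.0, (lon_min + lon_max) / 2.0)

# Usage: once the terminal determines that it is inside the appearance range,
# it superimposes the object at the returned center coordinates.
center_lat, center_lon = range_center(35.6585, 35.6605, 139.7440, 139.7460)
```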
  • As a means of communication, the Web player can send a message to his or her team.
  • For example, the Web player operates the PC and transmits a message such as "UFO appears at XX in NY!". Then, "Web: UFO appears at XX in NY!" is displayed on the displays of the other players (AR players, VR players) on the same team as the Web player.
  • A friendly AR player can confirm the message, move to the vicinity of XX (a place name, a building name, or the like) given in the message, and participate in the game by AR using the first terminal device 5.
  • A friendly VR player can confirm the message, connect to a predetermined site using the VR application of the second terminal device 7, and participate in the game by VR.
  • Friendly AR players and VR players can reply to a message from the Web player using the message function.
  • The Web player can also send a message to the enemy team.
  • For example, the Web player operates the PC and transmits a message such as "Now, invade XX in NY!". Then, "Enemy Web: Invading XX in NY!" is displayed on the displays of the enemy team's players.
  • An enemy AR player can confirm the message, move to the vicinity of XX (a place name, a building name, or the like) given in the message, and prepare for the invasion using the first terminal device 5.
  • An enemy VR player can confirm the message, connect to a predetermined site using the VR application of the second terminal device 7, and prepare for the invasion by VR. Note that AR players and VR players cannot reply to a message from an enemy Web player.
  • On the terminal devices operated by the Web player, the AR player, and the VR player, settings for publishing a message to an SNS (Social Networking Service) may be configured.
  • This allows the excitement of the game to be conveyed through the SNS to other users who have not participated in the game.
  • A Twitter (registered trademark) hashtag or an application link may be added to the message.
  • A user who learns of the game through the SNS can then be expected to participate as a new player.
  • Messages between players may be composed freely with a sentence creation function, or, to save time, a fixed message may be selected from a predetermined command-format menu. The fixed messages can be freely changed.
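  • Such a command-format menu could be held as a small, editable table. The entries below are invented for illustration; only the two messages quoted above appear in the description.

```python
# Hypothetical command menu of fixed messages; entries are freely changeable.
FIXED_MESSAGES: dict[str, str] = {
    "appear": "UFO appears at {place} in NY!",
    "invade": "Now, invade {place} in NY!",
    "rally":  "Assemble at {place}!",
}

def build_message(command: str, place: str) -> str:
    """Expand one command-format menu entry into a message for the team feed."""
    return FIXED_MESSAGES[command].format(place=place)

print(build_message("appear", "Times Square"))  # UFO appears at Times Square in NY!
```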
  • Each player can also communicate by operating his or her own terminal device and making a voice call in real time.
  • The AR player can attack the object X2 by operating the first terminal device 5 (for example, directly tapping the object X2 displayed on the display, or shaking the terminal device).
  • the processing device 2 changes the shape and position of the object X2 according to the operation of the first terminal device 5.
  • The Web player can watch on the PC how the object X2 is changed by the AR player's attack.
  • The VR player can watch in VR on the second terminal device 7 how the object X2 is changed by the AR player's attack.
  • The VR player operates the second terminal device 7 (for example, shaking the terminal device or operating control buttons displayed on the display) to make his or her own object (the fighter plane in FIG. 13) attack the target object (the UFO in FIG. 13).
  • the processing device 2 changes the shape and position of the target object according to the operation of the second terminal device 7.
  • The Web player can watch on the PC how the target object is changed by the VR player's attack.
  • The AR player can watch on the first terminal device 5 how the target object is changed by the VR player's attack.
  • An object displayed by AR may be rendered more realistically using a technique based on AI (artificial intelligence).
  • For example, a predetermined algorithm that can recognize the sky contained in an image is incorporated into the AR application.
  • the predetermined algorithm may be an algorithm that recognizes the sky by machine learning or an algorithm that recognizes the sky by deep learning.
  • When the AR application is activated and the real-space video X1 acquired from the camera is displayed on the display 51, the AR application recognizes the sky with the predetermined algorithm.
  • The AR application then superimposes the object X2 only on the sky portion. That is, the AR application can virtually superimpose the object X2 between the sky and a building, as shown in FIG. 14.
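  • The description fixes neither the recognition algorithm (machine learning and deep learning are both mentioned) nor the compositing step. The sketch below assumes a segmentation stage that has already produced a per-pixel sky mask, and shows only the compositing that confines the object X2 to the recognized sky portion.

```python
import numpy as np

def composite_on_sky(frame: np.ndarray, obj_rgba: np.ndarray,
                     sky_mask: np.ndarray) -> np.ndarray:
    """Superimpose a rendered RGBA object layer on a camera frame, but only
    where the sky was recognized, so the object appears behind buildings.

    frame:    H x W x 3 uint8 camera image (real-space video X1)
    obj_rgba: H x W x 4 uint8 pre-positioned rendering of the object X2
    sky_mask: H x W bool array, True where the sky was recognized
    """
    alpha = obj_rgba[..., 3:4].astype(np.float32) / 255.0
    alpha = alpha * sky_mask[..., None]  # zero out the object outside the sky
    blended = (frame.astype(np.float32) * (1.0 - alpha)
               + obj_rgba[..., :3].astype(np.float32) * alpha)
    return blended.astype(np.uint8)
```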
  • The processing system 1 can also be used for purposes other than games, for example, for advertisements.
  • The user operates the first terminal device 5 and activates the AR application at a first position in an area where an AR object appears (hereinafter, appearance area).
  • As shown in FIG. 15A, the AR application superimposes the advertising object X2 on the real-space video X1. In this state the advertisement is not visible, so the user moves to a second position in the appearance area.
  • At the second position, the AR application superimposes the advertising object X2 on the real-space video X1'. In this state, the text of the advertisement on the ad balloon can be recognized.
  • Further, when a VR user performs an operation, an object X3 imitating the VR user appears, and the resulting change of the advertising object X2 is displayed in AR.
  • In this way, by applying the service that links AR and VR to advertising, the processing system 1 lets the AR user not only gather information while moving from place to place but also enjoy changes of the object caused by the VR user's operations, which increases the chances that many AR users see the advertisement.
  • the first control unit 20 communicates with the first terminal device that performs processing related to augmented reality (first processing step).
  • The first processing step includes the following steps S1 to S3.
  • the second control unit 30 communicates with the second terminal device that performs processing related to virtual reality (second processing step).
  • the second processing step includes the following steps S11 to S13.
  • In step S1, the first receiving unit 21 receives a request from the first terminal device 5 (first reception step).
  • In step S2, the first transmission unit 22 transmits the object information stored in the object storage unit 10 to the first terminal device 5 in response to the request (first transmission step).
  • Based on the object information, the first terminal device 5 uses the AR application to superimpose the object on the continuously displayed video while continuously acquiring video of the real space with the camera 52 and displaying it on the display 51.
  • In step S11, the second receiving unit 31 receives an operation signal for the object from the second terminal device 7 (second reception step).
  • Here, it is assumed that the second terminal device 7 has activated the VR application, is participating in a game provided by the processing device 2, and performs an operation (for example, an attack) on an object in the game.
  • In step S12, the second object processing unit 33 performs processing corresponding to the operation signal received in step S11 on the object and reflects the result in the object information stored in the object storage unit 10 (second object processing step).
  • In step S13, the second transmission unit 32 transmits the object information after the processing in step S12 to the second terminal device 7 (second transmission step).
  • In step S3, the first transmission unit 22 transmits the object information after the processing in step S12 to the first terminal device 5.
  • the program includes a first processing step that communicates with the first terminal device 5 that performs processing related to augmented reality, and a second processing step that communicates with the second terminal device 7 that performs processing related to virtual reality.
  • The first processing step includes a first reception step of receiving a request from the first terminal device 5, and a first transmission step of transmitting the object information stored in the object storage unit 10 to the first terminal device 5 in response to the request.
  • The second processing step includes a second reception step of receiving an operation signal for the object from the second terminal device 7, a second object processing step of performing processing corresponding to the operation signal received in the second reception step on the object and reflecting the result in the object information stored in the object storage unit 10, and a second transmission step of transmitting the processed object information to the second terminal device 7.
  • In the first transmission step, the object information after the processing in the second object processing step is transmitted to the first terminal device 5.
  • the “computer system” here includes an OS and hardware such as peripheral devices.
  • The “computer-readable recording medium” refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk incorporated in a computer system.
  • The “computer-readable recording medium” also includes a medium that dynamically holds the program for a short period of time, such as a communication line used when the program is transmitted via a network such as the Internet or a communication line such as a telephone line, and a medium that holds the program for a certain period of time in that case, such as a volatile memory inside a computer system serving as a server or a client.
  • Further, the program may realize only a part of the functions described above, or may realize the functions described above in combination with a program already recorded in the computer system.
  • 1 processing system, 2 processing device, 5 first terminal device, 7 second terminal device, 10 object storage unit, 20 first control unit, 21 first receiving unit, 22 first transmission unit, 23 first object processing unit, 30 second control unit, 31 second receiving unit, 32 second transmission unit, 33 second object processing unit, 51, 71 display, 52 camera, 100 first environment, 101 second environment, 102 third environment, N network

Abstract

According to the present invention, a first control unit 20 includes: a first receiving unit 21 that communicates with a first terminal device 5 performing processing related to augmented reality and receives a request from the first terminal device 5; and a first transmission unit that transmits, to the first terminal device, object information stored in an object storage unit in response to the request. A second control unit 30 includes: a second receiving unit 31 that communicates with a second terminal device 7 performing processing related to virtual reality and receives, from the second terminal device 7, an operation signal for an object; a second object processing unit 33 that performs processing corresponding to the operation signal on the object and reflects the processing result in the object information to be stored in the object storage unit 10; and a second transmission unit 32 that transmits, to the second terminal device, the object information processed by the second object processing unit 33. The first transmission unit transmits, to the first terminal device, the object information processed by the second object processing unit.
PCT/JP2016/087958 2016-12-20 2016-12-20 Processing device, method, program, and processing system for superimposing an object on an image obtained by capturing a real space WO2018116377A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/087958 WO2018116377A1 (fr) 2016-12-20 2016-12-20 Processing device, method, program, and processing system for superimposing an object on an image obtained by capturing a real space

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/087958 WO2018116377A1 (fr) 2016-12-20 2016-12-20 Processing device, method, program, and processing system for superimposing an object on an image obtained by capturing a real space

Publications (1)

Publication Number Publication Date
WO2018116377A1 2018-06-28

Family

ID=62626091

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/087958 WO2018116377A1 (fr) 2016-12-20 2016-12-20 Processing device, method, program, and processing system for superimposing an object on an image obtained by capturing a real space

Country Status (1)

Country Link
WO (1) WO2018116377A1 (fr)



Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011175439A (ja) * 2010-02-24 2011-09-08 Sony Corp 画像処理装置、画像処理方法、プログラム及び画像処理システム
JP2016522463A (ja) * 2013-03-11 2016-07-28 マジック リープ, インコーポレイテッド 拡張現実および仮想現実のためのシステムおよび方法

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7369259B1 (ja) Information synchronization system, information synchronization program, and information synchronization method
WO2024071172A1 (fr) * 2022-09-27 2024-04-04 雅史 高尾 Information synchronization system, information synchronization program, and information synchronization method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 16924254
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 16924254
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: JP