WO2022118652A1 - Program, information processing method, information processing device, and system - Google Patents
- Publication number
- WO2022118652A1 (PCT/JP2021/042180)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- The present disclosure relates to programs, information processing methods, information processing devices, and systems.
- Patent Documents 1 and 2 disclose techniques related to AR (Augmented Reality).
- One aspect of the present disclosure aims to provide a program, an information processing method, an information processing device, and a system capable of improving recognition of the contestants of a real-world race through a virtual race, corresponding to that race, displayed on a computer.
- The program causes a processor to execute a step of superimposing a virtual object on a real image of the surroundings of a first computer captured by an imaging unit, and displaying the result.
- First information includes at least contestant information regarding at least one of a contestant and a moving body in a predetermined race.
- The virtual object includes at least a moving object corresponding to the contestant or the moving body.
- The display step includes displaying at least a portion of the contestant information in association with a predetermined avatar corresponding to the moving object, in response to the user's first operation input to that moving object.
- Such a program is provided.
- FIG. 1 is a diagram showing the configuration of the system 1 according to the present embodiment.
- The system 1 can display, for example, a predetermined race held in the real world as a virtual race, using virtual objects, on an information processing device used by a user.
- The "predetermined race" is not particularly limited as long as it is a race carried out in the real world; examples include boat races (actual or exhibition races), horse races, bicycle races, auto races, car races such as F1, drone races, dog races, marathons, and ekiden relay races.
- The system 1 includes a plurality of user terminals 10, such as a user terminal 10A, a user terminal 10B, and a user terminal 10C (hereinafter collectively referred to as "user terminal 10"), which are information processing devices (first computers) used by respective users, a first server device (second computer) 20, a second server device 40, and a network 30.
- The user terminal 10A and the user terminal 10B connect to the network 30 by communicating with a radio base station 31.
- The user terminal 10C connects to the network 30 by communicating with a wireless router 32 installed in a facility such as a house.
- The user terminal 10 is, for example, a portable terminal provided with a touch screen, such as a smartphone, a phablet, or a tablet.
- The user terminal 10 executes, for example, a program installed via a platform for distributing applications, or a program that includes pre-installed website browsing software.
- By executing the above program, the user terminal 10 communicates with the first server device 20, transmits and receives data related to a predetermined race, data related to the user, and the like to and from the first server device 20, and can thereby display a virtual race on the user terminal 10.
- The first server device 20 receives data related to a predetermined race from the second server device 40.
- The first server device 20 appropriately transmits data related to the predetermined race to the user terminal 10.
- The first server device 20 stores and manages data related to the predetermined race and data related to each user.
- The first server device 20 includes a communication IF (Interface) 22, an input / output IF 23, a memory 25, a storage 26, and a processor (second processor) 29 as its hardware configuration, and these are connected to each other via a communication bus.
- The communication IF 22 supports various communication standards such as the LAN (Local Area Network) standard, and functions as an interface for transmitting and receiving data to and from the user terminal 10 and the second server device 40.
- The input / output IF 23 functions as an interface for accepting input of information to the first server device 20 and for outputting information to the outside of the first server device 20.
- The input / output IF 23 may include an input receiving unit that accepts the connection of information input devices such as a mouse and a keyboard, and an output unit that accepts the connection of information output devices such as a display for displaying images.
- The memory 25 is a storage device for storing data and the like used for processing.
- The memory 25 provides the processor 29 with a work area for temporary use, for example, when the processor 29 performs processing.
- The memory 25 includes storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory).
- The storage 26 is a storage device for storing various programs and data for the processor 29 to read and execute.
- The information stored in the storage 26 includes data related to the predetermined race, data related to each user, and the like.
- The storage 26 may be configured to include a storage device such as an HDD (Hard Disk Drive) or a flash memory.
- The storage is not limited to a form included in the server device; a cloud service can also be used.
- The processor 29 controls the operation of the first server device 20 by reading and executing the programs and the like stored in the storage 26.
- The processor 29 may be configured to include, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), and the like.
- The second server device 40 stores and manages data related to a predetermined race.
- The second server device 40 is, for example, a server device managed by the organizer of the predetermined race, or by an organization that transmits information about the predetermined race to the outside (such as the publisher of a race magazine, a race video distributor, or a radio distributor).
- The second server device 40 appropriately transmits data related to the predetermined race to the first server device 20.
- The second server device 40 may also transmit data related to the predetermined race directly to the user terminal 10.
- The hardware configuration of the second server device 40 may be the same as that of the first server device 20 as long as there is no contradiction. There may be a plurality of second server devices 40.
- FIG. 2 is a block diagram showing an example of the functional configuration of the user terminal 10.
- The user terminal 10 includes an antenna 110, a wireless communication IF 120, a touch screen 130, an input / output IF 140, a storage unit 150, a voice processing unit 160, a microphone 161, a speaker 162, an image pickup unit 170, and a control unit (first processor) 190.
- The antenna 110 radiates a signal emitted by the user terminal 10 into space as a radio wave. The antenna 110 also receives radio waves from space and gives the received signal to the wireless communication IF 120.
- Since the user terminal 10 communicates with other communication devices, the wireless communication IF 120 performs modulation / demodulation processing for transmitting and receiving signals via the antenna 110 and the like.
- The wireless communication IF 120 is a communication module for wireless communication including a tuner, a high-frequency circuit, and the like; it performs modulation / demodulation and frequency conversion of the wireless signals transmitted and received by the user terminal 10, and supplies received signals to the control unit 190.
- The touch screen 130 receives input from the user and outputs information to the user on the display 132.
- The touch screen 130 includes a touch panel 131 for receiving the user's operation input, and the display 132.
- The touch panel 131 detects the approach of the user's finger or the like by using, for example, a capacitance type touch panel.
- The display 132 is realized by, for example, an LCD (Liquid Crystal Display), an organic EL (electroluminescence) display, or another display device.
- The input / output IF 140 functions as an interface for accepting input of information to the user terminal 10 and for outputting information to the outside of the user terminal 10.
- The storage unit 150 is composed of a flash memory, an HDD, or the like, and stores the program used by the user terminal 10 and various data that the user terminal 10 receives from the first server device 20 and the like.
- The voice processing unit 160 performs modulation / demodulation of voice signals.
- The voice processing unit 160 modulates the signal given from the microphone 161 and gives the modulated signal to the control unit 190. The voice processing unit 160 also gives voice signals to the speaker 162.
- The voice processing unit 160 is realized by, for example, a processor for voice processing.
- The microphone 161 functions as an audio input unit for receiving input of an audio signal and outputting it to the control unit 190.
- The speaker 162 functions as an audio output unit for outputting audio signals to the outside of the user terminal 10.
- The image pickup unit 170 is a camera that captures real images of the surroundings of the user terminal 10.
- The images captured by the image pickup unit 170 are image-processed by the control unit 190 and output to the display 132.
- The control unit 190 controls the operation of the user terminal 10 by reading and executing the program stored in the storage unit 150.
- The control unit 190 is realized by, for example, an application processor.
- The storage unit 150 stores the application program 151, application information 152, and user information 153.
- The user terminal 10 downloads the application program 151 from the first server device 20 and stores it in the storage unit 150. The user terminal 10 also communicates with the first server device 20 to send and receive various data, such as the application information 152 and the user information 153, to and from the first server device 20.
- The application program 151 is a program for displaying a virtual race on the user terminal 10.
- The application information 152 includes various data referred to by the application program 151.
- The application information 152 includes first information 152A, event information 152B, and avatar information 152C.
- The first information 152A is information regarding a predetermined race, transmitted from the first server device 20 or the second server device 40.
- The first information 152A includes, for example, race time information indicating the race times of the contestants or moving bodies (hereinafter also referred to as "contestants and the like") of a predetermined race, position information of the contestants and the like during the running of the predetermined race, and time information corresponding to that position information.
- The term "contestant" is a concept that includes not only humans but also animals such as horses and dogs.
- The "moving body" is the main body of movement in a predetermined race, such as an animal or a craft on which a contestant rides, or a craft remotely controlled by a contestant. In marathons, dog races, and the like, the "contestant" and the "moving body" are the same.
- The first information 152A may also include, for example, the name of the predetermined race, its date and time, race field data, contestant data, moving body data, odds information, race predictions, the race start table, pit reports with information from immediately before the race, race results, race videos, race still images, past race information, and other information that may be published in information magazines or on information sites about the predetermined race.
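As a concrete illustration of how the first information 152A might be organized in code, the following is a minimal Python sketch. The class and field names (FirstInformation, ContestantInfo, PositionSample, and so on) are hypothetical choices for this example and are not taken from the disclosure.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PositionSample:
    """A contestant's position at one capture time (seconds from race start)."""
    time: float
    x: float
    y: float

@dataclass
class ContestantInfo:
    """Per-contestant entry: identification, race time, and position history."""
    number: int
    name: str
    race_time: Optional[float] = None               # finishing time, if known
    positions: list = field(default_factory=list)   # list of PositionSample

@dataclass
class FirstInformation:
    """First information for one predetermined race (race name, date, contestants, odds)."""
    race_name: str
    date_time: str
    contestants: list = field(default_factory=list)  # list of ContestantInfo
    odds: dict = field(default_factory=dict)         # e.g. bet code -> odds value
```

A terminal-side component could populate such a structure from data received from the server and hand it to the object generation and display control units.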
- The event information 152B is information about events that can occur, or have occurred, in a predetermined race, and includes, for example, information about an effect display and an effect sound corresponding to each event.
- The event is not particularly limited, but may be, for example, one or more of: the first contestant or the like reaching the goal; the ranking of the predetermined race being confirmed; the ranking of contestants or the like changing; the difference in positions of two or more contestants or the like at a predetermined timing becoming less than or equal to a predetermined threshold value (two or more contestants or the like being in a close battle); and the occurrence of a rule violation.
- The event information 152B is information received from the first server device 20 after an event occurs in the predetermined race.
- The event information 152B includes, for example, information about a two-dimensional or three-dimensional image corresponding to a predetermined event.
- The event information 152B may include information about a text image for notifying the user of the content of the event, a predetermined character image, an effect image added to a virtual object described later, and the like.
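The close-battle condition mentioned above (two or more contestants whose positions differ by no more than a threshold at a given timing) can be sketched as follows. This is an illustrative Python fragment; the function name and the flat (x, y) coordinate representation are assumptions for the example.

```python
import math

def detect_close_battle(positions, threshold):
    """Return pairs of contestant numbers whose mutual distance is <= threshold.

    positions: dict mapping contestant number -> (x, y) at one timestamp.
    """
    pairs = []
    nums = sorted(positions)
    for i, a in enumerate(nums):
        for b in nums[i + 1:]:
            ax, ay = positions[a]
            bx, by = positions[b]
            # Euclidean distance between the two contestants at this instant
            if math.hypot(bx - ax, by - ay) <= threshold:
                pairs.append((a, b))
    return pairs
```

An event detector could run such a check on each batch of position samples and emit a close-battle event for every pair returned.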
- The avatar information 152C includes information about an object that serves as the user's avatar (hereinafter referred to as an avatar object). Specifically, the avatar information 152C may include information such as the gender, type (occupation), hairstyle, skin color, and actions of the avatar object (an example of an avatar). The avatar information 152C can be edited by the user via the avatar editing screen described later.
- The user information 153 includes information about the user of the user terminal 10.
- The user information 153 may include, for example, information for identifying the user, position information of the user terminal 10, and the user's race purchase history and hit rate (for example, in the case of a boat race, the history of boat tickets purchased and the hit rate of those tickets).
- By reading and executing the application program 151, the control unit 190 exhibits the functions of an operation input reception unit 191, a transmission / reception unit 192, an object generation unit 193, a display control unit 194, a flat surface detection unit 195, an event detection unit 196, and a selection information acquisition unit 197.
- The operation input reception unit 191 accepts the user's operation input based on the output of the touch screen 130. Specifically, the operation input reception unit 191 detects the touch or approach of the user's finger or the like to the touch panel 131 as coordinates in a coordinate system consisting of the horizontal and vertical axes of the surface constituting the touch screen 130.
- The operation input reception unit 191 determines the user's operation on the touch screen 130.
- The operation input reception unit 191 determines, for example, user operations such as an "approach operation", "release operation", "tap operation", "double tap operation", "long press operation (long touch operation)", "drag operation (swipe operation)", "move operation", "flick operation", "pinch-in operation", and "pinch-out operation".
- The operation input reception unit 191 may also accept movement of the user terminal 10, detected by an acceleration sensor or the like mounted on the user terminal 10, as an operation input.
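A rough way to discriminate some of the operations above from a single touch's raw measurements is sketched below. The function name and all thresholds are illustrative assumptions, not values from the disclosure; real platforms expose ready-made gesture recognizers for this.

```python
def classify_touch(duration_s, distance_px, speed_px_s,
                   long_press_s=0.5, move_threshold_px=10, flick_speed_px_s=1000):
    """Roughly classify one single-finger touch from its duration, travel, and release speed."""
    if distance_px < move_threshold_px:
        # The finger barely moved: tap or long press, depending on how long it stayed down.
        return "long press" if duration_s >= long_press_s else "tap"
    if speed_px_s >= flick_speed_px_s:
        # Fast release while moving reads as a flick.
        return "flick"
    return "drag"
```

Double taps and pinch operations would need additional state (a short inter-tap timer, or two simultaneous touch points), which is omitted here for brevity.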
- The transmission / reception unit 192 transmits and receives various information to and from external communication devices, such as the first server device 20 and the second server device 40, via the wireless communication IF 120 and the network 30.
- The transmission / reception unit 192 receives, for example, the first information 152A from the first server device 20 or the second server device 40. The transmission / reception unit 192 also transmits, for example, information corresponding to the operation input accepted by the operation input reception unit 191, and information stored in the user information 153, to the first server device 20 or the second server device 40.
- The object generation unit 193 generates virtual objects for presenting information about the predetermined race to the user, based on the first information.
- The object generation unit 193 generates, as virtual objects, a race field object representing the race field and moving objects representing the contestants and the like.
- The object generation unit 193 may generate a virtual display board for displaying, as text, second information generated based on the first information.
- The object generation unit 193 may generate landscape objects that constitute scenery, such as a virtual screen for displaying images of the second information generated based on the first information, various building objects, and trees.
- The object generation unit 193 may generate an avatar object based on the avatar information 152C.
- The object generation unit 193 may generate, as an avatar object, for example, a three-dimensional model of a cartoon or animation character, a mascot character of various sports competitions, or a famous person including a celebrity.
- The avatar object is not limited to a three-dimensional model and may be a two-dimensional image.
- The object generation unit 193 may replace the portion of a moving object that represents the contestant with the generated avatar object.
- The display control unit 194 displays, on the display 132, an image (hereinafter also referred to as a "superimposed image") in which a virtual object generated by the object generation unit 193 is superimposed on a real image of the surroundings of the user terminal 10 captured by the image pickup unit 170.
- The display control unit 194 moves each moving object on the race field object based on the race time information included in the first information 152A, and displays, on the display 132, a virtual race that virtually reproduces the predetermined race. It is preferable that the display control unit 194 reproduces the virtual race based not only on the race time information but also on the position information of the contestants and the like included in the first information 152A and the time information corresponding to that position information.
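Reproducing the race from position information and its corresponding time information amounts to estimating each moving object's position at the current playback time, typically by interpolating between the nearest samples. A minimal sketch, assuming time-stamped (time, x, y) tuples sorted by time:

```python
def interpolate_position(samples, t):
    """Linearly interpolate an (x, y) position at time t from sorted (time, x, y) samples."""
    if t <= samples[0][0]:
        return samples[0][1:]    # before the first sample: clamp to start
    if t >= samples[-1][0]:
        return samples[-1][1:]   # after the last sample: clamp to end
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)   # fraction of the way between the two samples
            return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))
```

Calling this once per rendered frame, per moving object, yields smooth motion over the race field object even when position samples arrive at a lower rate than the display refresh rate.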
- The display control unit 194 can change the viewpoint in the superimposed image according to the operation input accepted by the operation input reception unit 191.
- The display control unit 194 displays various menu screens and GUIs (Graphical User Interfaces) on the display 132, and changes the display contents of the display 132, in response to the operation input accepted by the operation input reception unit 191.
- The display control unit 194 displays an effect corresponding to an event in response to the event detection unit 196 detecting the occurrence of that event.
- The content of the effect display is not particularly limited, but may be, for example, one or more of: displaying a predetermined avatar object; displaying an effect on at least one of the race field object and a moving object; displaying a two-dimensional or three-dimensional image corresponding to a predetermined event; and animating these images according to the progress of the race.
- The display control unit 194 may simultaneously execute two or more effect displays corresponding to different events.
- The flat surface detection unit 195 detects flat surfaces in the real image captured by the image pickup unit 170.
- The detection of flat surfaces is realized by conventionally known image recognition techniques. For example, when the user performs an operation of selecting a flat surface detected by the flat surface detection unit 195, a superimposed image in which the race field object is arranged on that flat surface is displayed on the display 132.
- The flat surface is preferably a horizontal surface.
- The angle formed by the flat surface and the bottom surface of the race field object may be 0 degrees, but is preferably an acute angle, for example in the range of 15 to 45 degrees. This angle may be adjusted by accepting a user operation. Even if there is a convex part on a portion of the flat surface in the real world, or an object resting on that surface, the location may still be detected as a flat surface on which the race field object can be placed, provided the convex part or object is of a size that can be hidden by the race field object.
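Accepting a user operation that adjusts the tilt while keeping it within the preferred range can be sketched as a simple clamp. The function name is hypothetical; the default bounds follow the 15-to-45-degree example given above.

```python
def adjust_tilt(current_deg, delta_deg, min_deg=15.0, max_deg=45.0):
    """Apply a user adjustment to the race field object's tilt, clamped to the preferred range."""
    return max(min_deg, min(max_deg, current_deg + delta_deg))
```

The display control unit could call this with a delta derived from, say, a drag operation's vertical travel, then re-render the race field object at the resulting angle.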
- The event detection unit 196 detects the occurrence of a predetermined event in the predetermined race.
- The event detection unit 196 detects the occurrence of an event based on, for example, the transmission / reception unit 192 receiving event information transmitted from the first server device 20 when a predetermined event occurs in the real-world race.
- The event detection unit 196 may also detect the occurrence of a predetermined event based on, for example, the first information 152A and the user information 153. Specifically, based on the position information of the contestants and the like and the time information corresponding to that position information, the event detection unit 196 can detect events that can be determined from the positions of the contestants and the like during the race, such as a change in their ranking. The event detection unit 196 can also detect an event such as a contestant or the like purchased by the user being at the head of the race, based on, for example, information on the race results and the user's race purchase history.
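Detecting a ranking-change event from successive orderings derived from the position information can be sketched as follows. Representing an ordering as a list of contestant numbers, from first place down, is an assumption made for this example.

```python
def detect_ranking_change(prev_order, curr_order):
    """Return the contestants whose rank changed between two orderings.

    Each argument is a list of contestant numbers ordered from first place down.
    """
    prev_rank = {c: i for i, c in enumerate(prev_order)}
    # A contestant whose index differs from last time has moved up or down.
    return [c for i, c in enumerate(curr_order) if prev_rank.get(c) != i]
```

The event detection unit could compute the ordering from each batch of position samples (for example, by distance along the course) and trigger an effect display whenever this returns a non-empty list.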
- The selection information acquisition unit 197 acquires one or more pieces of selection information indicating one or more contestants or the like selected by the user.
- The "selection information" is, for example, information indicating a contestant or the like that the user has predicted to be the winner or the like of the predetermined race.
- The "selection information" may be information indicating a boat ticket or betting ticket purchased by the user.
- The selection information acquisition unit 197 acquires the selection information based on, for example, the user's operation input on a screen for selecting a contestant or the like.
- The selection information acquired by the selection information acquisition unit 197 is transmitted to, for example, the first server device 20 or the second server device 40.
- When the predetermined race is a boat race or a horse race, the purchase process for the boat ticket or betting ticket may be performed in response to the selection information being transmitted to the second server device 40.
- FIG. 3 is a block diagram showing the functional configuration of the first server device 20. The detailed configuration of the first server device 20 will be described with reference to FIG. 3.
- The first server device 20 exhibits functions as a communication unit 220, a storage unit 250, and a control unit 290 by operating according to a program.
- The communication unit 220 functions as an interface for the first server device 20 to communicate with external communication devices, such as the user terminal 10 and the second server device 40, via the network 30.
- The storage unit 250 stores various programs and data for realizing the system 1.
- The storage unit 250 stores the program 251, race information 252, and user information 253.
- The program 251 is a program for the first server device 20 to communicate with the user terminal 10 and the second server device 40 to realize the system 1.
- The program 251 is executed by the control unit 290 to cause the first server device 20 to send and receive data to and from the user terminal 10 and the second server device 40, to perform processing according to the operations performed by the user of the user terminal 10, and to update the race information 252, the user information 253, and the like.
- The race information 252 contains various data related to the predetermined race.
- The race information 252 includes, for example, first information 252A and event information 252B.
- The first information 252A is the source of the first information 152A; the first information 152A can be a part of the first information 252A.
- The first information 252A is, for example, information acquired from the second server device 40.
- The event information 252B is the source of the event information 152B; the event information 152B may be a part of the event information 252B.
- The event information 252B is, for example, information acquired from the second server device 40.
- The user information 253 is information about the users of the user terminal 10.
- The user information 253 includes a user management table 253A.
- The user management table 253A stores, for each user, for example, information for identifying the user, position information of the user terminal 10, the user's race purchase history, hit rate, and the like.
- The control unit 290 is realized by the processor 29; by executing the program 251, it exhibits the functions of a transmission / reception unit 291, a first information acquisition unit 292A, an event information acquisition unit 292B, a selection information acquisition unit 293, a data management unit 294, and a timekeeping unit 295.
- The transmission / reception unit 291 transmits and receives various information to and from external communication devices, such as the user terminal 10 and the second server device 40, via the communication unit 220 and the network 30.
- The transmission / reception unit 291 transmits, for example, at least a part of the first information 252A and the event information 252B to the user terminal 10. The transmission / reception unit 291 also receives, for example, the first information 252A and the event information 252B from the second server device 40.
- The first information acquisition unit 292A acquires the first information 252A from the second server device 40 via the transmission / reception unit 291.
- The event information acquisition unit 292B acquires the event information 252B from the second server device 40 via the transmission / reception unit 291.
- The selection information acquisition unit 293 acquires the user's selection information from the user terminal 10 or the second server device 40 via the transmission / reception unit 291.
- The data management unit 294 updates the various data stored in the storage unit 250 according to the processing results of the first information acquisition unit 292A, the event information acquisition unit 292B, the selection information acquisition unit 293, and the like.
- The timekeeping unit 295 performs processing to measure time. Various times displayed on the user terminal 10 (for example, the time until the start of a race) can be controlled based on the time measured by the timekeeping unit 295.
- FIG. 4 is a schematic diagram showing an example of a boat race field in the real world.
- Two turn marks 403 are installed in the boat race field 401, and a race is carried out by boats 402a to 402f, each ridden by a boat racer.
- Race time information indicating the race times of the boats 402a to 402f is transmitted from the second server device 40 to the first server device 20, and from the first server device 20 to the user terminal 10.
- The boat race field 401 is provided with image pickup devices (cameras) 404a and 404b.
- The image pickup device 404a captures the boats 402a to 402f in its field of view from above the boat race field 401.
- The image pickup device 404b keeps the boats 402a to 402f in its field of view from the side of the boat race field 401.
- The images of the boats 402a to 402f captured by the image pickup devices 404a and 404b are transmitted to the second server device 40.
- Each image is analyzed, and position information indicating the position of each of the boats 402a to 402f at the shooting time of that image is calculated.
- The calculated position information, and the time information for the shooting time corresponding to that position information, are transmitted to the first server device 20, and from the first server device 20 to the user terminal 10.
- The position information may instead be calculated by the first server device 20.
- A position sensor such as a GPS sensor may be installed on each of the boats 402a to 402f, instead of or in addition to the image pickup devices 404a and 404b.
- In that case, the position information of the boats 402a to 402f acquired by the position sensors, and the time information indicating the times at which the position information was acquired, are ultimately transmitted to the user terminal 10.
- FIG. 5 is a schematic diagram showing an example of virtual objects displayed on the user terminal 10.
- A race field object 501, moving objects 502a to 502f, two turn mark objects 503, and a virtual display board 505 are shown.
- The race field object 501 is an object that virtually displays the boat race field 401.
- The race field object 501 and the turn mark objects 503 are preferably created based on race field data such as course information of the boat race field 401, and preferably have shapes corresponding to the boat race field 401.
- The moving objects 502a to 502f are objects that virtually display the boats 402a to 402f, respectively, and have shapes imitating boats.
- The moving objects 502a to 502f move over the race field object 501 based on the race time information, the position information of the boats 402a to 402f, and the time information corresponding to that position information. That is, the race field object 501 and the moving objects 502a to 502f display the real-world race as a virtual race on the user terminal 10.
- The virtual display board 505 is an object that displays text information.
- The virtual display board 505 has, for example, no corresponding object in the boat race field 401.
- The text information displayed on the virtual display board 505 is not particularly limited, and may be, for example, ranking information, odds information, or the like. The text information displayed on the virtual display board 505 may also be changeable based on the user's operation input.
- FIGS. 6 and 7 are flowcharts showing an example of the display control processing. Note that the order of the processes constituting the flowcharts described below may be changed as long as there is no contradiction or inconsistency in the processing contents. A process executed by one device may also be executed by another device, within a range that causes no contradiction.
- The processes shown in FIGS. 6 and 7 are realized by the control unit 190 executing the application program 151 and by the control unit 290 executing the program 251.
- In step S610, the control unit 190 generates a base avatar that serves as the base of the avatar object.
- In step S620, the control unit 190 acquires the editing information of the avatar object based on the operation input of the user.
- In step S630, the control unit 190 updates the avatar information 152C stored in the storage unit 150 based on the editing information acquired in step S620.
- FIG. 8 is a diagram showing an example of an avatar editing screen 800 for a user to edit an avatar object based on the generated base avatar.
- The avatar editing screen 800 shown in FIG. 8 includes a base avatar 801, a gender button 802 for changing the gender of the base avatar 801, and an attribute change column 803 for changing various attributes of the base avatar 801 other than gender.
- the base avatar 801 is generated in the form of a male, but can be changed to a female by tapping the gender button 802 or the like.
- In the attribute change column 803, for example, a plurality of options are displayed for each attribute of the avatar object, such as "type (occupation)", "skin color", "winning action", and "losing action".
- FIG. 9 is a diagram showing a selection table 900 showing options for each attribute.
- the selection table 900 shown in FIG. 9 may be included in, for example, the avatar information 152C stored in the storage unit 150.
- The selection table 900 includes, for example, "type", "skin color", "winning action", "losing action", and "vehicle" as attributes, and a plurality of options (examples 1 to 4) are registered for each attribute.
- The control unit 190 displays the attribute change column 803 of the avatar editing screen 800 based on the selection table 900. As a result, the user can select a favorite option from the plurality of options for each attribute.
- Based on the user's operation input to the gender button 802 and the attribute change column 803, the control unit 190 acquires information such as the gender, type, skin color, and actions of the avatar object as the editing information of the avatar object. For example, the user taps the gender button 802 to select "male" as the gender of the base avatar 801, and taps the attribute change column 803 to select the type "hero", the winning action "guts pose", and the losing action "boat blast". In step S620, the control unit 190 acquires information for changing the avatar object as editing information based on these operation inputs of the user.
- In step S630, the control unit 190 updates the avatar information 152C so that the avatar object takes the form of a "male" "hero", executes the "guts pose" action when the avatar object wins the race, and executes the "boat blast" action when it loses the race.
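The update of steps S620 to S630 can be sketched as follows. This is an illustrative sketch only, not the disclosed implementation; the names (`SELECTION_TABLE`, `apply_edits`) and the options other than those quoted above ("hero", "guts pose", "boat blast") are assumptions.

```python
# Illustrative sketch of steps S620-S630: validating the user's selections
# against the selection table 900 and updating the stored avatar information.
# All names and most option values here are hypothetical.

SELECTION_TABLE = {
    "type": ["hero", "monster", "robot", "animal"],
    "skin color": ["light", "tan", "dark", "green"],
    "winning action": ["guts pose", "jump", "wave", "backflip"],
    "losing action": ["boat blast", "cry", "kneel", "shrug"],
}

def apply_edits(avatar_info: dict, edits: dict) -> dict:
    """Return updated avatar information, accepting only options
    registered in the selection table (cf. step S630)."""
    updated = dict(avatar_info)
    for attribute, choice in edits.items():
        if attribute == "gender":
            updated["gender"] = choice  # toggled via the gender button 802
        elif choice in SELECTION_TABLE.get(attribute, []):
            updated[attribute] = choice
        # unregistered options are ignored rather than stored
    return updated

avatar = apply_edits({}, {"gender": "male", "type": "hero",
                          "winning action": "guts pose",
                          "losing action": "boat blast"})
```

In this sketch, invalid selections are silently dropped; a real implementation might instead reject the edit and keep the editing screen open.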
- In steps S620 and S630, the process of editing and registering a single avatar object is described, but the present invention is not limited to this example. In these steps, it is also possible to edit and register a plurality of avatar objects.
- The options for editing the avatar object are not limited to the options displayed on the avatar editing screen 800 shown in FIG. 8 and the selection table 900 shown in FIG. 9. For example, the user may capture a favorite image (such as an image of the user himself or herself) and edit the avatar object based on the captured image.
- In step S640, the control unit 190 acquires information regarding the contestants or moving bodies of a predetermined race (hereinafter referred to as contestant information) from the first server device 20.
- the process of step S640 may be executed at a predetermined frequency or when a specific event occurs. That is, the contestant information can be updated every time the process of step S640 is executed.
- The specific event is not particularly limited, but examples include a change in the contestant information (for example, a change in the odds), the start or end of the race, and the confirmation of the ranking.
- In step S650, the control unit 190 registers the avatar information in association with the acquired contestant information. That is, the control unit 190 registers the contestant information, which is included in the first information 152A acquired from the first server device 20 and stored in the storage unit 150, in association with the avatar information 152C.
- The association between the contestant information and the avatar information 152C may be performed, for example, based on selection information acquired from the user's operation input on a screen for selecting a contestant or the like. That is, the control unit 190 may associate the contestant information of the contestant for whom the user purchased a boat ticket or betting ticket with the avatar information 152C.
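The registration of step S650 amounts to keying avatar information to a contestant record. A minimal sketch, under the assumption that contestants carry an identifier (the `id` key and `register_association` name are hypothetical, not part of the disclosure):

```python
# Hypothetical sketch of step S650: associating acquired contestant
# information with stored avatar information, keyed by a contestant ID.

def register_association(registry: dict, contestant: dict, avatar_info: dict) -> None:
    """Store the contestant record and its associated avatar together,
    so a later tap on the corresponding moving object can find both."""
    registry[contestant["id"]] = {"contestant": contestant, "avatar": avatar_info}

registry = {}
register_association(registry, {"id": "402a", "name": "Racer A"}, {"type": "hero"})
```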
- In step S660, the control unit 190 activates the image pickup unit 170, which is a camera.
- the image pickup unit 170 captures a real image around the user terminal 10.
- In step S670, the control unit 190 detects a flat surface in the image captured by the image pickup unit 170.
- In step S680, the control unit 190 arranges the virtual objects on the detected flat surface.
- FIG. 10 is a schematic diagram showing an example of a real image captured by the imaging unit 170.
- In the example of FIG. 10, a keyboard 1002 and a monitor device 1003 are placed on a flat desk 1001.
- When the image pickup unit 170 is activated in step S660, the real image captured by the image pickup unit 170 is displayed on the display 132.
- In step S670, the control unit 190 detects a flat surface in the image captured by the image pickup unit 170, that is, in the image displayed on the display 132.
- In the example of FIG. 10, the control unit 190 detects the area 1004 as a flat surface.
- The position of the area 1004 can also be changed.
- the area 1004 is displayed on the display 132 so as to be distinguishable from other parts, for example, by adding a predetermined color.
- In step S680, the control unit 190 arranges virtual objects such as the race field object 501 and the moving objects 502a to 502f on the area 1004.
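The placement of step S680 requires fitting the race field object into whatever flat area was detected. A hedged sketch of that scaling step, representing the plane as a simple width/height pair (real AR frameworks return an anchor and transform instead; `fit_to_plane` is a name invented for illustration):

```python
# Hedged sketch of steps S670-S680: fitting the race field object onto a
# detected flat area such as area 1004. The plane is modeled here as an
# axis-aligned rectangle in metres.

def fit_to_plane(field_w: float, field_h: float,
                 plane_w: float, plane_h: float) -> float:
    """Return a uniform scale factor so the race field object fits
    entirely inside the detected flat area while keeping its aspect."""
    return min(plane_w / field_w, plane_h / field_h)

# e.g. a 300 m x 100 m boat race field placed on a 1.5 m x 1.0 m desk area
scale = fit_to_plane(300.0, 100.0, 1.5, 1.0)
```

Using the smaller of the two ratios keeps the whole course visible rather than cropping one axis.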
- FIG. 11 is a schematic diagram showing an example of a screen in which a virtual object is superimposed on a real image and displayed.
- the area with the dot pattern including the monitor device 1003 is the real image, and the other area is the area where the virtual object is displayed.
- an advertisement image may be displayed in an area where the virtual object is not displayed.
- As virtual objects, a race field object 501, a plurality of moving objects 502, two turn mark objects 503, a large monitor object 506, building objects 507a to 507b, and a large number of other objects to which reference numerals are not attached (tree objects, clock objects, etc.) are displayed. These objects are created, for example, based on the first information 152A received from the first server device 20.
- FIG. 12 is a schematic view showing an example of a screen in which a virtual object is superimposed on a real image, and shows another aspect of the race field object 501 shown in FIG. Specifically, FIG. 12 is an example in which a predetermined race is horse racing.
- In FIG. 12 as well, the area with the dot pattern including the monitor device 1003 is the real image, and the other areas are the areas where the virtual objects are displayed.
- As virtual objects, a race field object 511, a plurality of moving objects 512, a large monitor object 513, a pond object 514, and a plurality of tree objects 515 are displayed on the display 132. These objects are also created, for example, based on the first information 152A received from the first server device 20.
- The race field object 511, the large monitor object 513, the pond object 514, and the plurality of tree objects 515 are preferably created based on racetrack data such as, for example, course information of a predetermined racetrack in the real world.
- the plurality of moving objects 512 are, for example, objects that virtually display horses and jockeys running in horse racing.
- In step S690, the control unit 190 associates the contestant information with the plurality of moving objects 502.
- The avatar information 152C is also associated with the moving objects 502 together with the contestant information and registered in the storage unit 150.
- In step S710 of FIG. 7, the control unit 190 acquires the position information of the boats 402a to 402f in the boat race field 401 in the real world. That is, when the race by the boats 402a to 402f is started in the real world, the control unit 190 acquires the position information of the boats 402a to 402f and the corresponding time information from the first server device 20.
- the method of acquiring the position information and the time information is as described with reference to FIG.
- In step S720, the control unit 190 controls the moving objects in conjunction with the position information acquired in step S710. Specifically, using the time information and the position information, the movements of the moving objects 502a to 502f on the race field object 501 are controlled so as to correspond to the movements of the boats 402a to 402f on the boat race field 401.
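Driving a moving object from timestamped position samples, as step S720 describes, typically involves interpolating between the received samples at the display frame time. A minimal sketch under the assumption that each sample is a `(time, x, y)` tuple (the sample format and the name `position_at` are illustrative, not from the disclosure):

```python
# Sketch of step S720: interpolating a moving object's position on the
# race field object from timestamped position samples received from the
# first server device. Samples are assumed sorted by time.

from bisect import bisect_right

def position_at(samples, t):
    """Linearly interpolate (x, y) at time t from sorted (t, x, y) samples;
    clamp to the first/last sample outside the covered interval."""
    times = [s[0] for s in samples]
    i = bisect_right(times, t)
    if i == 0:
        return samples[0][1], samples[0][2]
    if i == len(samples):
        return samples[-1][1], samples[-1][2]
    t0, x0, y0 = samples[i - 1]
    t1, x1, y1 = samples[i]
    a = (t - t0) / (t1 - t0)
    return x0 + a * (x1 - x0), y0 + a * (y1 - y0)

# three samples: start, first straight, after the turn mark
track = [(0.0, 0.0, 0.0), (10.0, 100.0, 0.0), (20.0, 100.0, 50.0)]
```

Linear interpolation is the simplest choice; a smoother curve (e.g. Catmull-Rom) would better match a boat's actual turning arc at the cost of extra samples.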
- In step S730, when the control unit 190 accepts the user's selection operation for a specific moving object among the plurality of moving objects 502a to 502f, the process of step S740 is executed.
- In step S740, the control unit 190 displays detailed contestant information corresponding to the moving object 502 for which the user's selection operation was performed in step S730, together with the avatar information.
- In step S750, the detailed display of this information is ended.
- the control unit 190 ends a series of display control processes in response to receiving an operation input for terminating the application program 151.
- FIG. 13 is a schematic diagram showing an example of a method of displaying detailed contestant information together with avatar information.
- FIG. 13 is a screen displayed on the display 132 when, for example, the user performs an operation (for example, a tap operation) to select the moving object 502a in the state of FIG. 11.
- FIG. 13 is a schematic diagram showing an example of a screen in a state where detailed information is displayed.
- In FIG. 13, moving objects 502a to 502b are displayed on the race field object 501. Further, the detailed information object 1310 is displayed around the moving object 502a. When the moving object 502a moves, the detailed information object 1310 may also move so as to follow the moving object 502a.
- the detailed information object 1310 includes a name field 1320, an avatar display field 1330, a chart display field 1340, and an end button 1350.
- In the name field 1320, the name of the racer boarding the moving object 502a is displayed.
- In the avatar display field 1330, for example, an image of the avatar object registered in association with the racer (contestant information) boarding the moving object 502a is displayed.
- the image of the avatar object displayed in the avatar display field 1330 may be a two-dimensional image or a three-dimensional image.
- The display mode of the avatar object displayed in the avatar display field 1330 may be changed according to, in particular, the current ranking. For example, a smiling avatar object may be displayed in the case of a higher rank, and a crying-face avatar object may be displayed in the case of a lower rank.
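The rank-dependent display mode just described can be sketched as a simple mapping from the current rank to an expression tag. The threshold (upper half smiles, lower half cries) and the name `expression_for_rank` are assumptions made for illustration:

```python
# Hypothetical mapping from current ranking to the avatar's facial
# expression in the avatar display field 1330. The half-field threshold
# is an illustrative choice, not specified in the disclosure.

def expression_for_rank(rank: int, field_size: int = 6) -> str:
    """Return an expression tag: smiling for upper ranks, crying for lower."""
    return "smiling" if rank <= field_size // 2 else "crying"
```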
- In the chart display field 1340, a graph (for example, a radar chart) of the racer's characteristic data for a plurality of items such as "win rate", "double rate", "start", "speed", and "turn" is displayed.
- The information displayed in the chart display field 1340 may be a graph that the control unit 190 generates from contestant information obtained as text information, or may be graph information included from the beginning in the contestant information acquired in step S640.
- the end button 1350 is a virtual button for hiding the detailed information object 1310 in step S750.
- the detailed information object 1310 may be hidden by tapping the outside of the detailed information object 1310.
- In FIG. 13, the detailed information object 1310 is provided with a chart display field 1340 in which a radar chart is displayed, but the present invention is not limited to this example.
- FIG. 14 is a schematic diagram showing an example of a screen in which another example of the detailed information object is shown.
- In the detailed information object shown in FIG. 14, a detailed information display field 1440 is displayed instead of the chart display field 1340.
- In the detailed information display field 1440, detailed contestant information is displayed as text information.
- The chart display field 1340 and the detailed information display field 1440 may both be displayed on the detailed information object by combining the example of FIG. 13 and the example of FIG. 14.
- The processing of steps S710 to S750 is repeated at least from the start time to the end time of the race in the real world, but may also be repeated before the start and after the end of the race in the real world.
- the position information of the boats 402a to 402f from the start to the end of the race may be collectively acquired before the start of the virtual race. Further, the virtual race may be displayed by acquiring only the race time information without acquiring the position information and the like.
- the race displayed as a virtual race may be an exhibition race.
- In step S760, the control unit 190 executes the result display process, and ends the series of display control processes in response to receiving an operation input for terminating the application program 151.
- FIG. 15 is a flowchart showing an example of the display control process in the second operation example. Since steps S710 and S720 in FIG. 15 are the same as steps S710 and S720 shown in FIG. 7, detailed description thereof will be omitted.
- In step S1510, the control unit 190 acquires selection information. Specifically, the control unit 190 acquires information indicating the boat ticket or betting ticket purchased by the user as the selection information.
- In step S1520, the control unit 190 registers the avatar information in the moving object that is the target of the selection, based on the acquired selection information.
- In step S1530, the control unit 190 changes the display of the moving object based on the registered avatar information.
- FIG. 16 is a schematic diagram showing an example of the display screen in step S1530.
- For example, the control unit 190 acquires, as selection information, information indicating that the user has purchased a boat ticket for the boat 402a, and registers the user's avatar information in association with the moving object 502a corresponding to the boat 402a.
- the registered avatar information is, for example, the avatar information updated in step S630 based on the editing information acquired in step S620.
- the control unit 190 changes the display of the racer image displayed on the moving object 502a to an avatar object based on the avatar information registered in association with the moving object 502a.
- the racer on the moving object 502a is changed to the avatar object 1620 (an example of the first avatar object) in the form of a monster based on the editing information updated by the user and displayed.
- an avatar object 1610 (an example of a second avatar object) in the form of a normal racer is displayed on the moving object 502b corresponding to the boat 402b for which the user has not purchased a boat ticket.
- the moving object 502a representing the contestants selected by the user is displayed in a display mode different from the moving object 502b representing the contestants not selected by the user.
- the avatar object 1620 on the moving object 502a and the avatar object 1610 on the moving object 502b move together with the moving objects 502a and 502b.
- In step S1540, the control unit 190 detects the occurrence of a predetermined event in the predetermined race.
- When a predetermined event occurs in a predetermined race in the real world, the control unit 290 detects the occurrence of the event based on, for example, image analysis of the race, information received from the second server device 40, and information stored in the event information 252B, and transmits event information indicating that the occurrence of the event has been detected to the user terminal 10.
- the control unit 190 detects the occurrence of an event, for example, based on receiving the event information during the race.
- Alternatively, the control unit 190 may detect the occurrence of a predetermined event based on, for example, the first information 152A and the user information 153, without depending on the transmission of event information from the first server device 20 during the race.
- the control unit 190 may detect the occurrence of a predetermined event based on the position information of the contestants and the like and the time information corresponding to the position information.
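Detecting an event locally from position and time information, as just described, can be sketched as a threshold check on course progress. The event type ("final straight"), the progress representation (a fraction of the course completed), and the function name are all assumptions for illustration:

```python
# Hedged sketch of local event detection from position/time information:
# a "final straight" event fires exactly once when a contestant's course
# progress crosses a threshold. The 0-to-1 progress model is hypothetical.

def detect_final_straight(progress: float, already_fired: bool,
                          threshold: float = 0.9) -> bool:
    """Return True exactly once, when course progress first reaches
    the threshold; False before that and after the event has fired."""
    return progress >= threshold and not already_fired
```

The caller would set `already_fired` on the first True result so the event (and its action object) is triggered only once per race.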
- In step S1550, the control unit 190 adds a specific action to the avatar information registered in the moving object 502.
- When the event information is received from the first server device 20 in step S1540 and avatar information corresponding to the event information is registered, the display of an action object based on the avatar information is executed in step S1550, for example. It should be noted that only information identifying the event, such as an event ID, may be received from the first server device 20; in that case, the information corresponding to the acquired event ID or the like is acquired from the event information 152B, and the action object may be displayed based on the event information 152B.
- FIG. 17 is a schematic diagram showing an example of the display screen in step S1550.
- an action object (action image) is displayed for the moving object 502a corresponding to the boat 402a for which the user has purchased a boat ticket (voting right).
- Specifically, the display 132 displays an action object 1710 imitating a flame for the avatar object 1620 in the form of a monster displayed on the moving object 502a.
- This action object 1710 is added to the contestant who is the target of the voting right purchased by the user, for example, in response to the occurrence of a final straight event.
- The addition of the action object 1710 is not limited to the above example; it may be added in response to the detection of the occurrence of an event other than the final straight event (for example, the contestant who is the target of the voting right winning or losing, the contestant's acceleration increasing, or the contestant's speed being faster than that of other contestants). For example, the action attached to the avatar object may be changed according to the current ranking.
- The action object added when an event occurs can be selected by the user on the avatar editing screen 800 of FIG. 8.
- For example, when the contestant who is the target of the voting right wins, an action object representing the winning action that the user selected on the avatar editing screen 800 of FIG. 8 from the plurality of options displayed in the "winning action" column of the selection table 900 shown in FIG. 9 is displayed.
- Similarly, when the contestant who is the target of the voting right loses, an action object representing the losing action that the user selected on the avatar editing screen 800 from the plurality of options displayed in the "losing action" column of the selection table 900 is displayed.
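The win/lose dispatch described above reduces to looking up the user's pre-selected action for the detected event. A minimal sketch; the event tags and the name `action_for_event` are illustrative assumptions, while the keys mirror the selection table 900:

```python
# Sketch of choosing the action object from the user's registered
# selections when a race-result event occurs (cf. FIG. 17).

def action_for_event(avatar_info: dict, event: str):
    """Map a win/lose event to the user's pre-selected action, or None
    for events with no registered action."""
    if event == "win":
        return avatar_info.get("winning action")
    if event == "lose":
        return avatar_info.get("losing action")
    return None

avatar = {"winning action": "guts pose", "losing action": "boat blast"}
```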
- the action object may be attached to another virtual object such as the race field object 501 instead of the moving object 502.
- In FIG. 17, the action object 1710 is displayed only when an event occurs; however, depending on the type of action, the action object may always be displayed near the moving object of the contestant who is the target of the voting right purchased by the user. That is, in step S730, the occurrence of an event may be detected based on the user purchasing the voting right of a predetermined contestant.
- the association between the moving object 502 and the action object is not limited to the above example, and may be added to, for example, a contestant who is not subject to the voting right purchased by the user.
- the association between the moving object 502 and the action object may be set in advance by the administrator of the first server device 20 or the user of the user terminal 10.
- the user of the user terminal 10 may display the action object corresponding to the moving object set as "favorite".
- the action object displayed corresponding to the moving object 502 may be configured so that the user can select it from a plurality of objects set as selection candidates. Further, the types of objects that are candidates for selection may be increased according to the user's billing and the like.
- The processing of steps S710 to S1550 in FIG. 15 is repeated at least from the start time to the end time of the race in the real world, but may also be repeated before the start and after the end of the race in the real world. Since the process of step S760 in FIG. 15 is the same as the process of step S760 in FIG. 7, detailed description thereof will be omitted.
- The program causes the processor to execute a step of superimposing the virtual object on a real image around the first computer captured by the imaging unit and displaying the virtual object.
- the first information includes at least the contestant information regarding at least one of the contestants and the moving body in the predetermined race.
- the virtual object includes at least a moving object corresponding to the contestant or the moving object.
- The display step includes displaying at least a part of the contestant information in association with a predetermined avatar corresponding to the moving object, in response to the user's first operation input to the corresponding moving object. The program.
- the first information further includes position information of the contestant or the moving body during the execution of the predetermined race, and time information corresponding to the position information.
- The display step includes displaying a virtual race in which the moving object is moved based on the position information and the time information to virtually represent the predetermined race, and displaying at least a part of the moving objects so as to include the predetermined avatar in the virtual race. The program described in item 1. As a result, the visibility of and interest in the virtual race can be improved, and the user's satisfaction can be further improved.
- The program further causes the processor to execute a step of acquiring one or more pieces of selection information indicating one or more contestants or moving bodies selected by the user.
- The display step includes displaying, based on the selection information, a first avatar representing a contestant or moving body selected by the user in a display mode different from that of a second avatar representing a contestant or moving body not selected by the user.
- The program according to item 1 or item 2.
- the display step comprises changing the display mode of the predetermined avatar according to the current ranking in the predetermined race.
- The first computer further comprises a storage unit that stores a database relating to the predetermined avatar.
- The program further causes the processor to execute a step of editing the predetermined avatar based on a plurality of items preset in the database.
- The program according to any one of items 1 to 4.
- As a result, the user can edit the avatar according to his or her preference, thereby improving the user's familiarity with the contestants and the like.
- The contestant information displayed in the display step includes at least a characteristic chart of the contestant or the moving body.
- The program according to any one of items 1 to 5. This allows the user to understand the characteristics of each moving object. Further, since characteristic information is usually information in which the user is highly interested, the user's satisfaction can be improved by displaying it together with the moving object.
- The receiving step is executed at a predetermined frequency or when a specific event occurs.
- The information processing method is executed by the processor.
- the first information includes at least the contestant information regarding at least one of the contestants and the moving body in the predetermined race.
- the virtual object includes at least a moving object corresponding to the contestant or the moving object.
- the display step comprises displaying at least a portion of the contestant information in association with a predetermined avatar corresponding to the moving object in response to the user's first operation input to the corresponding moving object.
- The information processing method. As a result, through the virtual race displayed on the computer, it is possible to improve the recognition of the contestants of the real-world race corresponding to the virtual race. In addition, since the user's sense of familiarity with the contestants can be enhanced, the satisfaction of users who are beginners at race watching can be improved, and the fan base of race watching can be expanded to younger demographics.
- An information processing device including a processor and an image pickup unit, wherein the processor: receives the first information about a predetermined race in the real world from the second computer; and generates, based on the first information, a virtual object for presenting the second information regarding the predetermined race to the user of the first computer.
- the virtual object is superimposed and displayed on the real image around the first computer captured by the imaging unit.
- the first information includes at least the contestant information regarding at least one of the contestants and the moving body in the predetermined race.
- the virtual object includes at least a moving object corresponding to the contestant or the moving object.
- the display step comprises displaying at least a portion of the contestant information in association with a predetermined avatar corresponding to the moving object in response to the user's first operation input to the corresponding moving object.
- The second processor acquires the first information about a predetermined race in the real world and transmits the first information to the first computer. The first processor receives the first information from the second computer and generates, based on the first information, a virtual object for presenting the second information regarding the predetermined race to the user of the first computer.
- the virtual object is superimposed and displayed on the real image around the first computer captured by the imaging unit.
- the first information includes at least the contestant information regarding at least one of the contestants and the moving body in the predetermined race.
- the virtual object includes at least a moving object corresponding to the contestant or the moving object.
- The superimposed display includes displaying at least a part of the contestant information in association with a predetermined avatar corresponding to the moving object, in response to the user's first operation input to the corresponding moving object.
- The system. As a result, through the virtual race displayed on the computer, it is possible to improve the recognition of the contestants of the real-world race corresponding to the virtual race.
- In addition, the satisfaction of users who are beginners at race watching can be improved, and the fan base of race watching can be expanded to younger demographics.
- The first processor further acquires one or more pieces of selection information indicating the one or more contestants or moving bodies selected by the user.
- the first information includes the position information of the contestant or the moving body during the execution of the predetermined race, and the time information corresponding to the position information.
- The display step includes displaying a virtual race that virtually represents the predetermined race by moving the moving objects, and displaying at least a part of the moving objects so as to include the predetermined avatar in the virtual race. The position information is acquired based on a position sensor provided on the contestant or the moving body, or on an image of the predetermined race captured by an image pickup device during execution of the predetermined race.
- The system according to item 10. This makes it easy to obtain the position information of the contestants and the like. As a result, the rankings and the offense and defense during the real-world race can be reproduced in the virtual race, so that the user's satisfaction can be improved.
- Storage unit (of the first server device), 290: Control unit (of the first server device), 401: Boat race field, 402a to 402f: Boat, 501: Race field object, 502 (502a to 502f): Moving object, 1310: Detailed information object, 1330: Avatar display field, 1340: Chart display field, 1610, 1620: Avatar object, 1710: Action object
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-201882 | 2020-12-04 | ||
JP2020201882A JP6857774B1 (ja) | 2020-12-04 | 2020-12-04 | プログラム、情報処理方法、情報処理装置、及びシステム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022118652A1 true WO2022118652A1 (ja) | 2022-06-09 |
Family
ID=75378005
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/042180 WO2022118652A1 (ja) | 2020-12-04 | 2021-11-17 | プログラム、情報処理方法、情報処理装置、及びシステム |
Country Status (2)
Country | Link |
---|---|
- JP (2) | JP6857774B1
- WO (1) | WO2022118652A1
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7576511B2 (ja) * | 2021-05-12 | 2024-10-31 | 株式会社コロプラ | プログラム、情報処理方法、情報処理装置、及びシステム |
CN117635891A (zh) * | 2022-08-12 | 2024-03-01 | 腾讯科技(深圳)有限公司 | 虚拟场景中的模型展示方法、装置、设备及存储介质 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001251671A (ja) * | 2000-03-03 | 2001-09-14 | Matsushita Electric Ind Co Ltd | 相互位置情報通知システム及び相互位置情報通知方法 |
JP2003204481A (ja) * | 2001-08-16 | 2003-07-18 | Space Tag Inc | 画像情報配信システム |
JP2012157510A (ja) * | 2011-01-31 | 2012-08-23 | Fujitsu Frontech Ltd | 通過位置表示システム、その端末装置、プログラム |
JP2019003346A (ja) * | 2017-06-13 | 2019-01-10 | 株式会社Mgrシステム企画 | 活動応援方法および活動応援装置 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4059417B2 (ja) * | 1999-04-27 | 2008-03-12 | 株式会社バンダイナムコゲームス | ゲーム装置、ホスト装置および情報記憶媒体 |
JP2002210060A (ja) * | 2001-01-16 | 2002-07-30 | Nippon Hoso Kyokai <Nhk> | 競技参加型運動装置及び該競技参加型運動装置のセンタ装置 |
JP2007089648A (ja) * | 2005-09-27 | 2007-04-12 | Nec Corp | 競技情報の配信システム、携帯機器、配信方法、及びプログラム |
JP7010292B2 (ja) * | 2017-07-25 | 2022-01-26 | 富士通株式会社 | 映像生成プログラム、映像生成方法および映像生成装置 |
US10872493B2 (en) * | 2018-04-30 | 2020-12-22 | Igt | Augmented reality systems and methods for sports racing |
- 2020-12-04 JP JP2020201882A patent/JP6857774B1/ja active Active
- 2021-03-18 JP JP2021044313A patent/JP7434202B2/ja active Active
- 2021-11-17 WO PCT/JP2021/042180 patent/WO2022118652A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JP7434202B2 (ja) | 2024-02-20 |
JP2022089736A (ja) | 2022-06-16 |
JP2022089466A (ja) | 2022-06-16 |
JP6857774B1 (ja) | 2021-04-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022118652A1 (ja) | | Program, information processing method, information processing device, and system |
JP6903805B1 (ja) | | Program, information processing method, information processing device, and system |
WO2022107639A1 (ja) | | Program, information processing method, information processing device, and system |
WO2022239403A1 (ja) | | Program, information processing method, information processing device, and system |
JP7212721B2 (ja) | | Program, information processing method, information processing device, and system |
JP6857785B1 (ja) | | Program, information processing method, information processing device, and system |
JP6857776B1 (ja) | | Program, information processing method, information processing device, and system |
WO2022137376A1 (ja) | | Method, computer-readable medium, and information processing device |
JP7303845B2 (ja) | | Program, information processing method, information processing device, and system |
JP6934547B1 (ja) | | Program, information processing method, information processing device, and system |
WO2022014303A1 (ja) | | Program, information processing method, information processing device, and system |
JP7324801B2 (ja) | | Program, information processing method, information processing device, and system |
JP6866540B1 (ja) | | Program, information processing method, information processing device, and system |
WO2022107638A1 (ja) | | Program, information processing method, information processing device, and system |
JP7303846B2 (ja) | | Program, information processing method, information processing device, and system |
JP7581132B2 (ja) | | Program, information processing method, information processing device, and system |
JP2022073026A (ja) | | Program, information processing method, information processing device, and system |
JP7641143B2 (ja) | | Program, information processing method, information processing device, and system |
JP2022080829A (ja) | | Program, information processing method, information processing device, and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | EP: The EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 21900407 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | EP: PCT application non-entry in European phase |
Ref document number: 21900407 Country of ref document: EP Kind code of ref document: A1 |