WO2022107639A1 - Program, information processing method, information processing device, and system - Google Patents

Program, information processing method, information processing device, and system

Info

Publication number
WO2022107639A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
race
color
predetermined
moving object
Prior art date
Application number
PCT/JP2021/041114
Other languages
French (fr)
Japanese (ja)
Inventor
功淳 馬場
聡志 松山
Original Assignee
株式会社コロプラ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社コロプラ
Publication of WO2022107639A1 publication Critical patent/WO2022107639A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 13/00 Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics

Definitions

  • The present disclosure relates to programs, information processing methods, information processing devices, and systems.
  • Patent Documents 1 and 2 disclose techniques related to AR (Augmented Reality).
  • However, Patent Documents 1 and 2 superimpose and display virtual objects unrelated to the real world on an image of the real world, and do not present any means for solving the above problem.
  • An object of one aspect of the present disclosure is to provide a program, an information processing method, an information processing device, and a system that make it easy to identify the moving object corresponding to a race participant or moving body when a real-world race is displayed as a virtual race using virtual objects linked to the real-world race.
  • The program according to one aspect of the present disclosure causes a processor of a first computer to execute: a step of receiving, from a second computer, first information about a predetermined race in the real world; a step of generating, based on the first information, virtual objects including a race field object representing the race field and moving objects representing the contestants or moving bodies of the predetermined race; and a step of superimposing the virtual objects on a real image around the first computer captured by an imaging unit and displaying them.
  • The first information includes identification information of the contestants or moving bodies and color information associated with the identification information, and the generating step includes generating each moving object at least partially colored with a predetermined color based on the color information.
  • As a result, when the real-world race is displayed as a virtual race using virtual objects linked to the real-world race, the moving object corresponding to each race participant or moving body can be easily identified.
  • FIGS. (a) and (b) are schematic diagrams showing examples of moving objects displayed on a user terminal according to an embodiment. Further figures are schematic diagrams showing, according to an embodiment, an example of a real image captured by the imaging unit, examples of screens in which virtual objects are superimposed on the real image, and examples of virtual objects displayed on the user terminal.
  • FIG. 1 is a diagram showing a configuration of a system 1 according to the present embodiment.
  • The system 1 can display, for example, a predetermined race held in the real world as a virtual race using virtual objects on an information processing device used by a user.
  • The "predetermined race" is not particularly limited as long as it is a race carried out in the real world; examples include boat races (actual races or exhibition races), horse races, bicycle races, auto races, car races such as F1, drone races, dog races, marathons, and ekiden (road relays).
  • The system 1 includes a plurality of user terminals 10, such as a user terminal 10A, a user terminal 10B, and a user terminal 10C (hereinafter collectively referred to as the "user terminal 10"), which are information processing devices (first computers) used by respective users, a first server device (second computer) 20, a second server device 40, and a network 30.
  • the user terminal 10A and the user terminal 10B are connected to the network 30 by communicating with the radio base station 31.
  • the user terminal 10C connects to the network 30 by communicating with a wireless router 32 installed in a facility such as a house.
  • the user terminal 10 is, for example, a portable terminal provided with a touch screen, and may be a smartphone, a phablet, a tablet, or the like.
  • the user terminal 10 executes, for example, a program installed via a platform for distributing an application or the like, or a program including pre-installed website browsing software.
  • By executing the above program, the user terminal 10 communicates with the first server device 20, transmits and receives data related to the predetermined race, data related to the user, and the like to and from the first server device 20, and can thereby display a virtual race on the user terminal 10.
  • the first server device 20 receives data related to a predetermined race from the second server device 40.
  • the first server device 20 appropriately transmits data related to a predetermined race to the user terminal 10.
  • the first server device 20 stores and manages data related to a predetermined race and data related to each user.
  • The first server device 20 includes, as its hardware configuration, a communication IF (Interface) 22, an input/output IF 23, a memory 25, a storage 26, and a processor (second processor) 29, which are connected to each other via a communication bus.
  • the communication IF 22 supports various communication standards such as LAN (Local Area Network) standards, and functions as an interface for transmitting and receiving data to and from the user terminal 10 and the second server device 40.
  • the input / output IF 23 functions as an interface for accepting input of information to the first server device 20 and outputting information to the outside of the first server device 20.
  • the input / output IF 23 may include an input receiving unit that accepts the connection of an information input device such as a mouse and a keyboard, and an output unit that accepts the connection of an information output device such as a display for displaying an image or the like.
  • the memory 25 is a storage device for storing data or the like used for processing.
  • the memory 25 provides the processor 29 with a work area for temporary use, for example, when the processor 29 performs processing.
  • The memory 25 includes storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory).
  • the storage 26 is a storage device for storing various programs and data for the processor 29 to read and execute.
  • the information stored in the storage 26 includes data related to a predetermined race (including identification information and color information of a moving object), data related to each user, and the like.
  • the storage 26 may be configured to include a storage device such as an HDD (Hard Disk Drive) or a flash memory.
  • the storage is not limited to the form included in the server device, and a cloud service can also be used.
  • the processor 29 controls the operation of the first server device 20 by reading and executing a program or the like stored in the storage 26.
  • the processor 29 may be configured to include, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), and the like.
  • the second server device 40 stores and manages data related to a predetermined race.
  • The second server device 40 is, for example, a server device managed by the organizer of the predetermined race, or a server device managed by an organization that transmits information about the predetermined race to the outside (a publisher of a race magazine, a race video distributor, a radio distributor, or the like).
  • the second server device 40 appropriately transmits data related to a predetermined race to the first server device 20.
  • the second server device 40 may transmit data related to a predetermined race to the user terminal 10.
  • the hardware configuration of the second server device 40 may be the same as that of the first server device 20 as long as there is no contradiction. There may be a plurality of second server devices 40.
  • FIG. 2 is a block diagram showing an example of the functional configuration of the user terminal 10.
  • The user terminal 10 includes an antenna 110, a wireless communication IF 120, a touch screen 130, an input/output IF 140, a storage unit 150, a voice processing unit 160, a microphone 161, a speaker 162, an image pickup unit 170, and a control unit (first processor) 190.
  • the antenna 110 radiates a signal emitted by the user terminal 10 into space as a radio wave. Further, the antenna 110 receives radio waves from the space and gives a received signal to the wireless communication IF 120.
  • Since the user terminal 10 communicates with other communication devices, the wireless communication IF 120 performs modulation and demodulation processing for transmitting and receiving signals via the antenna 110 and the like.
  • the wireless communication IF 120 is a communication module for wireless communication including a tuner, a high frequency circuit, etc., performs modulation / demodulation and frequency conversion of the wireless signal transmitted / received by the user terminal 10, and supplies the received signal to the control unit 190.
  • the touch screen 130 receives input from the user and outputs information to the user on the display 132.
  • the touch screen 130 includes a touch panel 131 for receiving user operation input and a display 132.
  • the touch panel 131 detects that a user's finger or the like has approached by using, for example, a capacitance type touch panel.
  • the display 132 is realized by, for example, an LCD (Liquid Crystal Display), an organic EL (electroluminescence), or other display device.
  • the input / output IF 140 functions as an interface for accepting input of information to the user terminal 10 and outputting information to the outside of the user terminal 10.
  • the storage unit 150 is composed of a flash memory, an HDD, or the like, and stores a program used by the user terminal 10 and various data received by the user terminal 10 from the first server device 20 or the like.
  • the voice processing unit 160 performs modulation / demodulation of the voice signal.
  • the voice processing unit 160 modulates the signal given from the microphone 161 and gives the modulated signal to the control unit 190. Further, the voice processing unit 160 gives a voice signal to the speaker 162.
  • the voice processing unit 160 is realized by, for example, a processor for voice processing.
  • the microphone 161 functions as an audio input unit for receiving an input of an audio signal and outputting it to the control unit 190.
  • the speaker 162 functions as an audio output unit for outputting an audio signal to the outside of the user terminal 10.
  • the image pickup unit 170 is a camera that captures a real image around the user terminal 10.
  • the image captured by the image pickup unit 170 is image-processed by the control unit 190 and output to the display 132.
  • the control unit 190 controls the operation of the user terminal 10 by reading and executing a program stored in the storage unit 150.
  • the control unit 190 is realized by, for example, an application processor.
  • the storage unit 150 stores the application program 151, the application information 152, and the user information 153.
  • the user terminal 10 downloads the application program 151 from the first server device 20 and stores it in the storage unit 150. Further, the user terminal 10 communicates with the first server device 20 to send and receive various data such as application information 152 and user information 153 to and from the first server device 20.
  • the application program 151 is a program for displaying a virtual race on the user terminal 10.
  • the application information 152 includes various data referred to by the application program 151.
  • the application information 152 includes the first information 152A and the event information 152B.
  • the first information 152A is information regarding a predetermined race transmitted from the first server device 20 and the second server device 40.
  • The first information 152A includes, for example, race time information indicating the race times of the contestants or moving bodies (hereinafter also referred to as "contestants etc.") of the predetermined race, position information of the contestants etc. during the predetermined race, and time information corresponding to the position information.
  • The first information further includes identification information of the contestants etc., color information associated with the identification information, and color position information designating the position of color registration on the moving object described later.
  • the term "participant” is a concept that includes not only humans but also animals such as horses and dogs.
  • the "moving body” is a main body of movement in a predetermined race, and is an animal or an aircraft on which a participant rides, an aircraft remotely controlled by a contestant, or the like. In marathons, dog races, etc., "participants” and “moving bodies” are the same.
  • The first information 152A may also include, for example, the name of the predetermined race, its date and time, race field data, contestant data, moving body data, odds information, race predictions, the race start table, information immediately before the race, pit reports, race results, race videos, race still images, past race information, and other information that may be published in information magazines or on information sites about the predetermined race.
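  • The disclosure does not specify a wire format for the first information 152A; the following Kotlin sketch is only one plausible in-memory representation of the fields listed above (race time information, timestamped position information, identification information, color information, and color position information). All type and field names are illustrative assumptions, not part of the disclosed system.

```kotlin
import java.time.Instant

// Hypothetical representation of the "first information" (152A) described above.
// Field names are assumptions for illustration only.
data class PositionSample(
    val time: Instant,          // time information corresponding to the position
    val x: Double,              // position on the race field (e.g., meters east)
    val y: Double               // position on the race field (e.g., meters north)
)

data class ParticipantInfo(
    val identification: Int,              // identification information (e.g., frame/boat number)
    val colorArgb: Long,                  // color information associated with the identification
    val raceTimeSeconds: Double?,         // race time information, if already available
    val positions: List<PositionSample>   // position information during the race
)

data class FirstInformation(
    val raceName: String,
    val raceFieldCourseData: String,        // race field data used to build the race field object
    val colorPositionParts: List<String>,   // color position information: which parts of a moving
                                            //   object receive the registered color
    val participants: List<ParticipantInfo>
)
```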
  • The event information 152B is information about an event that can occur or has occurred in the predetermined race, and includes, for example, information about an effect display and an effect sound corresponding to the event.
  • The event is not particularly limited, and may be, for example, one or more of: the leading contestant etc. reaching the goal, the ranking of the predetermined race being confirmed, a change in the ranking of the contestants etc., the difference in position between two or more contestants etc. at a predetermined timing becoming equal to or less than a predetermined threshold value (two or more contestants etc. being in a close battle), and the occurrence of a rule violation.
  • the event information 152B is information received from the first server device 20 after an event occurs in a predetermined race.
  • the event information 152B includes, for example, information about a two-dimensional image or a three-dimensional image corresponding to a predetermined event.
  • the event information 152B may include information about a text image for notifying the content of the event, a predetermined character image, an effect image to be added to a virtual object described later, and the like.
  • the user information 153 includes information about the user of the user terminal 10.
  • The user information 153 may include, for example, information for identifying the user, position information of the user terminal 10, and the user's race purchase history and hit rate (for example, in the case of a boat race, the history of boat tickets purchased and the hit rate of the purchased boat tickets).
  • By reading and executing the application program 151, the control unit 190 functions as an operation input reception unit 191, a transmission/reception unit 192, a race attribute selection unit 193, an object generation unit 194, an identification number assigning unit 195, a color information registration unit 196, a display control unit 197, a flat surface detection unit 198A, an event detection unit 198B, and a selection information acquisition unit 199.
  • the operation input receiving unit 191 accepts the user's operation input based on the output of the touch screen 130. Specifically, the operation input receiving unit 191 detects that the user's finger or the like touches or approaches the touch panel 131 as the coordinates of the coordinate system consisting of the horizontal axis and the vertical axis of the surface constituting the touch screen 130.
  • the operation input receiving unit 191 determines the user's operation on the touch screen 130.
  • The operation input reception unit 191 determines user operations such as an "approach operation", a "release operation", a "tap operation", a "double tap operation", a "long press operation (long touch operation)", a "drag operation (swipe operation)", a "move operation", a "flick operation", a "pinch-in operation", and a "pinch-out operation".
  • the operation input receiving unit 191 may accept the movement of the user terminal 10 detected by the acceleration sensor or the like mounted on the user terminal 10 as an operation input.
  • the transmission / reception unit 192 transmits and receives various information to and from external communication devices such as the first server device 20 and the second server device 40 via the wireless communication IF 120 and the network 30.
  • The transmission/reception unit 192 receives, for example, the first information 152A from the first server device 20 or the second server device 40. The transmission/reception unit 192 also transmits, for example, information corresponding to the operation input received by the operation input reception unit 191 and information stored in the user information 153 to the first server device 20 or the second server device 40.
  • the race attribute selection unit 193 selects the race attribute (specifically, the type of race) according to the operation input received by the operation input reception unit 191.
  • Examples of race types include boat races (actual races and exhibition races), horse races, bicycle races, auto races, car races such as F1, drone races, dog races, marathons, and relay races.
  • the object generation unit 194 generates a virtual object for presenting information about a predetermined race selected by the race attribute selection unit 193 to the user based on the first information.
  • the object generation unit 194 generates, as virtual objects, a race field object representing a race field and a moving object representing a contestant or the like.
  • the object generation unit 194 may generate a virtual display board for displaying the second information generated based on the first information as text.
  • The object generation unit 194 may also generate landscape objects constituting scenery, such as a virtual screen for displaying video of the second information generated based on the first information, various building objects, and trees, as well as an object serving as the user's avatar.
  • the identification number assigning unit 195 assigns an identification number to each of the moving objects generated by the object generation unit 194.
  • the identification number assigning unit 195 assigns identification numbers in the order of arrangement (for example, frame order) of the moving objects according to the number of moving objects that differ depending on the attributes of the race.
  • The color information registration unit 196 acquires the color information of the contestants etc. of the predetermined race selected by the race attribute selection unit 193.
  • the color information registration unit 196 further associates the acquired color information with the identification number assigned to each moving object by the identification number assigning unit 195, and executes color registration to the moving object.
  • Specifically, the color information registration unit 196 acquires the identification information and the color information from the first information 152A received from the second server device 40, and, based on the acquired identification information and color information, registers a predetermined color in a predetermined portion of each moving object numbered by the identification number assigning unit 195.
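  • As an illustration of the identification number assigning unit 195 and the color information registration unit 196 described above, the following sketch assigns identification numbers in frame order for however many moving objects the selected race attribute requires and then stores the color associated with each number. Class and function names are hypothetical, not taken from the disclosure.

```kotlin
// Hypothetical moving object model; only the fields needed for color registration are shown.
data class MovingObject(
    val identificationNumber: Int,
    var registeredColorArgb: Long? = null
)

// Assign identification numbers in arrangement (frame) order. The number of moving objects
// differs per race attribute (e.g., 6 for a boat race, more for horse racing).
fun assignIdentificationNumbers(movingObjectCount: Int): List<MovingObject> =
    (1..movingObjectCount).map { frame -> MovingObject(identificationNumber = frame) }

// Register the color associated with each identification number, as the color information
// registration unit 196 is described as doing from the received identification/color information.
fun registerColors(objects: List<MovingObject>, colorByIdentification: Map<Int, Long>) {
    for (obj in objects) {
        obj.registeredColorArgb = colorByIdentification[obj.identificationNumber]
    }
}
```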
  • The display control unit 197 displays on the display 132 an image (hereinafter also referred to as a "superimposed image") in which the virtual objects generated by the object generation unit 194 are superimposed on a real image around the user terminal 10 captured by the image pickup unit 170. Based on the race time information included in the first information 152A, the display control unit 197 moves each moving object, whose predetermined portion has been colored by the color information registration unit 196, on the race field object, and displays on the display 132 a virtual race that virtually reproduces the predetermined race. The display control unit 197 preferably reproduces the virtual race based not only on the race time information but also on the position information of the contestants etc. included in the first information 152A and the time information corresponding to the position information.
  • the display control unit 197 can change the viewpoint in the superimposed image according to the operation input received by the operation input reception unit 191.
  • the display control unit 197 displays various menu screens and GUIs (Graphical User Interfaces) on the display 132, and changes the display contents of the display 132, in response to the operation input received by the operation input reception unit 191.
  • the display control unit 197 displays an effect corresponding to the event in response to the event detection unit 198B detecting the occurrence of the event.
  • The content of the effect display is not particularly limited, and may be, for example, one or more of: an effect display on at least one of the race field object and a moving object, display of a two-dimensional or three-dimensional image corresponding to the predetermined event, display of a predetermined character image, and animating these images according to the progress of the race.
  • the display control unit 197 may simultaneously execute two or more effect displays corresponding to different events.
  • the flat surface detection unit 198A detects a flat surface in a real image captured by the image pickup unit 170.
  • the detection of a flat surface is realized by a conventionally known image recognition technique. For example, when the user performs an operation of selecting a flat surface detected by the flat surface detection unit 198A, a superimposed image in which the race field object is arranged on the flat surface is displayed on the display 132.
  • the flat surface is preferably a horizontal surface.
  • The angle formed by the flat surface and the bottom surface of the race field object may be 0 degrees, but is preferably an acute angle, for example in the range of 15 to 45 degrees. This angle may be adjusted by accepting a user operation. Even if there is a convex portion on part of the flat surface in the real world, or an object is present on the flat surface, the location where the race field object can be placed may still be detected as a flat surface, provided the convex portion or object is of a size that can be hidden by the race field object.
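  • The following minimal sketch illustrates the placement rules just described, assuming the preferred 15 to 45 degree range is enforced by clamping a user-adjusted tilt and that a protrusion is tolerated when its footprint fits under the race field object; both assumptions are illustrative and no particular AR framework is implied.

```kotlin
import kotlin.math.max
import kotlin.math.min

// Clamp a user-adjusted tilt between the detected flat surface and the bottom surface of the
// race field object into the preferred acute range of 15 to 45 degrees (assumed clamping rule).
fun clampedTiltDegrees(requested: Double): Double = min(45.0, max(15.0, requested))

// Treat a surface with a small protrusion or object on it as "flat enough" when the
// protrusion's footprint can be hidden underneath the race field object's footprint.
fun canTreatAsFlat(protrusionWidth: Double, protrusionDepth: Double,
                   fieldWidth: Double, fieldDepth: Double): Boolean =
    protrusionWidth <= fieldWidth && protrusionDepth <= fieldDepth

fun main() {
    println(clampedTiltDegrees(10.0))            // 15.0: below the preferred range
    println(canTreatAsFlat(0.1, 0.1, 0.6, 0.4))  // true: hidden by the race field object
}
```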
  • the event detection unit 198B detects the occurrence of a predetermined event in a predetermined race.
  • For example, the event detection unit 198B detects the occurrence of a predetermined event when the transmission/reception unit 192 receives event information transmitted from the first server device 20 upon the occurrence of the predetermined event in the predetermined race in the real world.
  • Alternatively, the event detection unit 198B may detect the occurrence of a predetermined event based on, for example, the first information 152A and the user information 153. Specifically, the event detection unit 198B can detect events that can be determined from the positions of the contestants etc. during the race, such as a change in the ranking of the contestants etc., based on the position information of the contestants etc. and the time information corresponding to the position information. The event detection unit 198B can also detect an event such as a contestant etc. for whom the user purchased a ticket taking the lead in the race, based on, for example, information on the race result and information on the user's race purchase history.
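  • As an illustration of event detection from the position information and corresponding time information, the sketch below checks for the two examples named above: a close battle (positional gap at or below a predetermined threshold) and a ranking change. The threshold value, data shapes, and function names are assumptions.

```kotlin
import kotlin.math.hypot

// Position of one contestant (or moving body) at a single sampled time.
data class ContestantPosition(val identification: Int, val x: Double, val y: Double)

// Detect a "close battle": two or more contestants whose distance at this sample
// is at or below a predetermined threshold (the threshold value is an assumed example).
fun detectCloseBattles(sample: List<ContestantPosition>, thresholdMeters: Double = 3.0): List<Pair<Int, Int>> {
    val pairs = mutableListOf<Pair<Int, Int>>()
    for (i in sample.indices) {
        for (j in i + 1 until sample.size) {
            val d = hypot(sample[i].x - sample[j].x, sample[i].y - sample[j].y)
            if (d <= thresholdMeters) pairs += sample[i].identification to sample[j].identification
        }
    }
    return pairs
}

// Detect a ranking change between two consecutive ranking lists
// (each list holds identification numbers ordered from first place downward).
fun rankingChanged(previous: List<Int>, current: List<Int>): Boolean = previous != current
```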
  • the selection information acquisition unit 199 acquires one or more selection information indicating one or more contestants selected by the user.
  • the "selection information" is, for example, information indicating a contestant or the like predicted by the user as a winner or the like of a predetermined race.
  • the "selection information" may be information indicating a boat ticket or a betting ticket purchased by the user.
  • the selection information acquisition unit 199 acquires selection information based on, for example, a user's operation input on the screen for selecting a contestant or the like.
  • the selection information acquired by the selection information acquisition unit 199 is transmitted to, for example, the first server device 20 or the second server device 40.
  • When the predetermined race is a boat race or a horse race, the purchase process for the boat ticket or betting ticket may be performed in response to the selection information being transmitted to the second server device 40.
  • FIG. 3 is a block diagram showing a functional configuration of the first server device 20. A detailed configuration of the first server device 20 will be described with reference to FIG.
  • the first server device 20 exerts functions as a communication unit 220, a storage unit 250, and a control unit 290 by operating according to a program.
  • the communication unit 220 functions as an interface for the first server device 20 to communicate with an external communication device such as the user terminal 10 or the second server device 40 via the network 30.
  • the storage unit 250 stores various programs and data for realizing the system 1.
  • the storage unit 250 stores the program 251 and the race information 252 and the user information 253.
  • the program 251 is a program for the first server device 20 to communicate with the user terminal 10 and the second server device 40 to realize the system 1.
  • By being executed by the control unit 290, the program 251 causes the first server device 20 to transmit and receive data to and from the user terminal 10 and the second server device 40, to perform processing according to operations performed by the user of the user terminal 10, and to update the race information 252, the user information 253, and the like.
  • Race information 252 contains various data related to a given race.
  • the race information 252 includes, for example, the first information 252A and the event information 252B.
  • the first information 252A is the source information of the first information 152A, and the first information 152A can be a part of the first information 252A.
  • the first information 252A is, for example, information acquired from the second server device 40.
  • the event information 252B is the source information of the event information 152B, and the event information 152B may be a part of the event information 252B.
  • the event information 252B is, for example, information acquired from the second server device 40.
  • the user information 253 is information about the user of the user terminal 10.
  • the user information 253 includes a user management table 253A.
  • the user management table 253A stores, for example, information for identifying the user, position information of the user terminal 10, a race purchase history of the user, a hit rate, and the like for each user.
  • The control unit 290 is realized by the processor 29 and, by executing the program 251, functions as a transmission/reception unit 291, a first information acquisition unit 292A, an event information acquisition unit 292B, a selection information acquisition unit 293, a data management unit 294, and a timekeeping unit 295.
  • the transmission / reception unit 291 transmits and receives various information to and from an external communication device such as a user terminal 10 and a second server device 40 via the communication unit 220 and the network 30.
  • the transmission / reception unit 291 transmits, for example, at least a part of the first information 252A and the event information 252B to the user terminal 10. Further, the transmission / reception unit 291 receives, for example, the first information 252A and the event information 252B from the second server device 40.
  • the first information acquisition unit 292A acquires the first information 252A from the second server device 40 via the transmission / reception unit 291.
  • the event information acquisition unit 292B acquires the event information 252B from the second server device 40 via the transmission / reception unit 291.
  • the selection information acquisition unit 293 acquires the user's selection information from the user terminal 10 or the second server device 40 via the transmission / reception unit 291.
  • the data management unit 294 performs a process of updating various data stored in the storage unit 250 according to the processing results of the first information acquisition unit 292A, the event information acquisition unit 292B, the selection information acquisition unit 293, and the like.
  • The timekeeping unit 295 performs a process of measuring time. Various times displayed on the user terminal 10 (for example, the time until the start of the race) can be controlled based on the time measured by the timekeeping unit 295.
  • FIG. 4 is a schematic diagram showing an example of a boat race field in the real world.
  • Two turn marks 403 are installed in the boat race field 401, and a race is carried out by boats 402a to 402f, each ridden by a boat racer.
  • the race time information indicating the race times of the boats 402a to 402f is transmitted from the second server device 40 to the first server device 20, and is transmitted from the first server device 20 to the user terminal 10.
  • The boat race field 401 is provided with image pickup devices (cameras) 404a and 404b.
  • The image pickup device 404a captures the boats 402a to 402f within its field of view from above the boat race field 401.
  • The image pickup device 404b captures the boats 402a to 402f within its field of view from the side of the boat race field 401.
  • The images of the boats 402a to 402f captured by the image pickup devices 404a and 404b are transmitted to the second server device 40.
  • Image analysis is performed on each transmitted image, and position information indicating the position of each of the boats 402a to 402f at the shooting time of each image is calculated.
  • the calculated position information and the time information related to the shooting time corresponding to the position information are transmitted to the first server device 20, and are transmitted from the first server device 20 to the user terminal 10.
  • the location information may be calculated by the first server device 20.
  • a position sensor such as a GPS sensor may be installed on the boats 402a to 402f instead of or in addition to the image pickup devices 404a to 404b.
  • the position information of the boats 402a to 402f acquired by the position sensor and the time information indicating the time when the position information was acquired are finally transmitted to the user terminal 10.
  • FIG. 5 is a schematic diagram showing an example of a virtual object displayed on the user terminal 10.
  • In FIG. 5, a race field object 501, moving objects 502a to 502f, two turn mark objects 503, and a virtual display board 505 are shown.
  • the race field object 501 is an object that virtually displays the boat race field 401.
  • the race field object 501 and the turn mark object 503 are preferably created based on race field data such as course information of the boat race field 401, and preferably have a shape corresponding to the boat race field 401.
  • the moving objects 502a to 502f are objects that virtually display the boats 402a to 402f, respectively, and have a shape imitating a boat.
  • The moving objects 502a to 502f move on the race field object 501 based on the race time information, the position information of the boats 402a to 402f, and the time information corresponding to the position information. That is, the race field object 501 and the moving objects 502a to 502f display the real-world race as a virtual race on the user terminal 10.
  • The moving objects 502a to 502f, which are virtual representations of the boats 402a to 402f, are displayed on the user terminal 10 with predetermined portions colored by the color information registration unit 196 in accordance with the boats 402a to 402f in the real world. As shown in FIGS. 4 and 5, the moving object 502a is colored white like the real-world boat 402a (boat 1), the moving object 502b is colored black like the real-world boat 402b (boat 2), the moving object 502c is colored red like the real-world boat 402c (boat 3), the moving object 502d is colored blue like the real-world boat 402d (boat 4), the moving object 502e is colored yellow like the real-world boat 402e (boat 5), and the moving object 502f is colored green like the real-world boat 402f (boat 6).
  • the virtual display board 505 is an object that displays text information.
  • The virtual display board 505 is, for example, an object that has no corresponding real object in the boat race field 401.
  • the text information displayed on the virtual display board 505 is not particularly limited, and may be, for example, ranking information, odds information, or the like. Further, the text information displayed on the virtual display board 505 may be changeable based on the operation input of the user.
  • FIGS. 6 and 7 are flowcharts showing an example of the display control process. The order of the processes constituting the flowcharts described below may be changed as long as no contradiction or inconsistency arises in the processing content. Processing executed by one device may also be executed by another device to the extent that no contradiction arises.
  • The processes shown in FIGS. 6 and 7 are realized by the control unit 190 executing the application program 151 and by the control unit 290 executing the program 251.
  • In step S610, the control unit 190 selects the race type (race attribute) to be displayed on the user terminal 10 as virtual objects, based on the user's operation input.
  • FIG. 8 is a diagram showing a race selection screen 800 displayed on the user terminal 10 by executing the application program 151. As shown in FIG. 8, the race selection screen 800 includes a plurality of selection buttons 801a to 801d corresponding to a plurality of types of races (for example, boat race, horse race, bicycle race, and auto race) for which a virtual race can be displayed on the user terminal 10 by the application program 151. The control unit 190 selects the race type based on the user performing an operation input on any of the plurality of selection buttons 801a to 801d displayed on the race selection screen 800.
  • In step S620, the control unit 190 generates a race field object as a virtual object according to the selected race type.
  • In the present embodiment, a race field object for a boat race is generated based on the user having selected the selection button 801a corresponding to the boat race.
  • In step S630, the control unit 190 generates moving objects as virtual objects according to the selected race type.
  • In the present embodiment, a plurality of moving objects 502a to 502f corresponding to the plurality of boats 402a to 402f of the boat race are generated.
  • In step S640, the control unit 190 assigns an identification number to each of the plurality of generated moving objects 502a to 502f. Specifically, the control unit 190 assigns identification numbers to the moving objects 502a to 502f in correspondence with boats 1 to 6 (boats 402a to 402f) in the real-world boat race.
  • In step S650, the control unit 190 associates the identification numbers assigned to the moving objects 502a to 502f with color information. Specifically, the control unit 190 associates color information matching the designated colors of boats 1 to 6 in the real-world boat race with the identification numbers of the moving objects 502a to 502f.
  • That is, because the plurality of moving objects 502a to 502f generated by the object generation unit 194 are objects corresponding to the boats 402a to 402f of the real-world boat race, the control unit 190 associates the color information matching the designated colors of boats 1 to 6 in the real-world boat race with the identification numbers of the respective moving objects 502a to 502f.
  • In the real-world boat race, the designated color of boat 1 is white, the designated color of boat 2 is black, the designated color of boat 3 is red, the designated color of boat 4 is blue, the designated color of boat 5 is yellow, and the designated color of boat 6 is green.
  • Therefore, the control unit 190 associates white color information with the identification number of the moving object 502a corresponding to boat 402a (boat 1), black color information with the identification number of the moving object 502b corresponding to boat 402b (boat 2), and red color information with the identification number of the moving object 502c corresponding to boat 402c (boat 3).
  • Likewise, the control unit 190 associates blue color information with the identification number of the moving object 502d corresponding to boat 402d (boat 4), yellow color information with the identification number of the moving object 502e corresponding to boat 402e (boat 5), and green color information with the identification number of the moving object 502f corresponding to boat 402f (boat 6).
  • In step S660, the control unit 190 performs color registration at designated locations on the moving objects 502a to 502f based on the correspondence between the identification numbers of the moving objects 502a to 502f and the color information. Specifically, the control unit 190 performs color registration at predetermined positions on each of the moving objects 502a to 502f, based on the color information associated with the identification numbers of the moving objects 502a to 502f and the color position information included in the first information. That is, each of the moving objects 502a to 502f is color-registered according to the frame number of the corresponding boat in the real world.
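  • Steps S640 to S660 can be read as a simple mapping from frame numbers to the designated boat-race colors, which is then used as the color registered on each moving object; the sketch below reproduces that mapping for the six-boat example. The ARGB values and function names are illustrative assumptions.

```kotlin
// Steps S640 to S660, sketched for the six-boat example above: each identification number
// (frame number) is associated with the designated boat color, and that color is treated as
// the color registered at the moving object's predetermined portions.
val designatedBoatColors: Map<Int, Long> = mapOf(
    1 to 0xFFFFFFFF, // boat 1: white  -> moving object 502a
    2 to 0xFF000000, // boat 2: black  -> moving object 502b
    3 to 0xFFFF0000, // boat 3: red    -> moving object 502c
    4 to 0xFF0000FF, // boat 4: blue   -> moving object 502d
    5 to 0xFFFFFF00, // boat 5: yellow -> moving object 502e
    6 to 0xFF00FF00  // boat 6: green  -> moving object 502f
)

// Look up the color registered for a moving object from its identification number (step S660).
fun registeredColorFor(identificationNumber: Int): Long? = designatedBoatColors[identificationNumber]

fun main() {
    // Moving object 502c (identification number 3) is colored red, matching boat 402c.
    println(registeredColorFor(3) == 0xFFFF0000) // true
}
```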
  • FIG. 9(a) is a diagram showing an example of a moving object 502 imitating a boat and a boat-race contestant (boat racer), and FIG. 9(b) is a diagram showing an example of a moving object 950 imitating a racehorse and a horse-racing contestant (jockey).
  • As shown in FIG. 9(a), in the case of a boat race, the color position information specifies, for example, that color registration is performed on the side surface 901A of the boat portion 901 and on the helmet 902A of the athlete portion 902 of the moving object 502.
  • As shown in FIG. 9(b), in the case of horse racing, the color position information specifies, for example, that color registration is performed on the cap 952A of the jockey portion 952 of the moving object 950. Color registration may also be performed on the number cloth 951A of the racehorse portion 951 and on the racing silks 952B. In this way, the color position information included in the first information changes according to the type (attribute) of the race and the shape of the moving object.
  • The color position information may also be changed by the user's operation input. That is, the user may be able to select or change the color position information so that the color is registered at the position of the moving object desired by the user.
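  • Because the color position information depends on the race type (boat side 901A and helmet 902A for a boat race; cap 952A, and optionally number cloth 951A and racing silks 952B, for horse racing) and may be changed by the user, one way to model it is as a per-attribute default that a user selection can override, as sketched below. The identifiers are illustrative labels only.

```kotlin
enum class RaceAttribute { BOAT_RACE, HORSE_RACE }

// Default color position information per race attribute, following the FIG. 9 examples.
// Part identifiers are illustrative labels, not identifiers from the disclosure.
fun defaultColorPositions(attribute: RaceAttribute): List<String> = when (attribute) {
    RaceAttribute.BOAT_RACE -> listOf("boatSide_901A", "helmet_902A")
    RaceAttribute.HORSE_RACE -> listOf("cap_952A") // number cloth 951A / silks 952B are optional additions
}

// The user may select or change where the color is registered; otherwise fall back to the default.
fun effectiveColorPositions(attribute: RaceAttribute, userSelection: List<String>?): List<String> =
    userSelection ?: defaultColorPositions(attribute)
```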
  • In step S670, the control unit 190 activates the image pickup unit 170, which is a camera.
  • The image pickup unit 170 captures a real image around the user terminal 10.
  • In step S680, the control unit 190 detects a flat surface in the image captured by the image pickup unit 170.
  • In step S690, the control unit 190 arranges the virtual objects on the detected flat surface.
  • FIG. 10 is a schematic diagram showing an example of a real image captured by the imaging unit 170.
  • a keyboard 1002 and a monitoring device 1003 are placed on a flat desk 1001.
  • When the image pickup unit 170 is activated in step S670, the real image captured by the image pickup unit 170 is displayed on the display 132.
  • In step S680, the control unit 190 detects a flat surface in the image captured by the image pickup unit 170, that is, in the image displayed on the display 132.
  • In the present embodiment, the control unit 190 detects the region 1004 as a flat surface.
  • The position of the region 1004 can also be changed.
  • The region 1004 is displayed on the display 132 so as to be distinguishable from other parts, for example by being given a predetermined color.
  • In step S690, the control unit 190 arranges the virtual objects, such as the race field object 501 and the moving objects 502a to 502f, on the region 1004.
  • FIG. 11 is a schematic diagram showing an example of a screen in which a virtual object is superimposed on a real image and displayed.
  • the area with the dot pattern including the monitor device 1003 is the real image, and the other area is the area where the virtual object is displayed.
  • an advertisement image may be displayed in an area where the virtual object is not displayed.
  • In FIG. 11, a race field object 501, a plurality of moving objects 502, two turn mark objects 503, a virtual display board 505, a large monitor object 506, building objects 507a and 507b, and a large number of other objects to which no reference numerals are given (tree objects, clock objects, and the like) are displayed as virtual objects. These objects are created, for example, based on the first information 152A received from the first server device 20.
  • FIG. 12 is a schematic view showing an example of a screen in which a virtual object is superimposed on a real image, and shows another aspect of the race field object 501 shown in FIG. Specifically, FIG. 12 is an example in which a predetermined race is horse racing.
  • the area with the dot pattern including the monitoring device 1003 is the real image, and the other areas are the areas where the virtual objects are displayed.
  • As virtual objects, a race field object 511, a plurality of moving objects 512, a large monitor object 513, a pond object 514, and a plurality of tree objects 515 are displayed on the display 132. These objects are also created, for example, based on the first information 152A received from the first server device 20.
  • the racetrack object 511, the large monitor object 513, the pond object 514, and the plurality of tree objects 515 are preferably created based on racetrack data such as, for example, course information of a predetermined racetrack in the real world.
  • the plurality of moving objects 512 are, for example, objects that virtually display horses and jockeys running in horse racing.
  • In step S710 of FIG. 7, the control unit 190 acquires the position information of the boats 402a to 402f in the boat race field 401 in the real world. That is, when the race by the boats 402a to 402f is started in the real world, the control unit 190 acquires the position information and the time information of the boats 402a to 402f from the first server device 20.
  • The method of acquiring the position information and the time information is as described with reference to FIG. 4.
  • In step S720, the control unit 190 links the position information acquired in step S710 with the moving objects. Specifically, using the time information and the position information, the control unit 190 controls the movements of the moving objects 502a to 502f on the race field object 501 so that they match the movements of the boats 402a to 402f in the boat race field 401.
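  • Step S720 links the received position information and time information to the moving objects so that their movement on the race field object 501 matches the movement of the boats 402a to 402f. One common way to do this, sketched below, is to interpolate linearly between the two samples that bracket the current playback time; linear interpolation is an assumed implementation detail, not something stated in the disclosure.

```kotlin
// One timed position sample for a boat, as received in the first information.
data class TimedPosition(val timeSeconds: Double, val x: Double, val y: Double)

// Interpolate a moving object's position on the race field object for the current
// playback time, assuming samples are sorted by time. The disclosure only requires
// that the moving objects reproduce the real-world movement.
fun positionAt(samples: List<TimedPosition>, tSeconds: Double): Pair<Double, Double> {
    if (tSeconds <= samples.first().timeSeconds) return samples.first().x to samples.first().y
    if (tSeconds >= samples.last().timeSeconds) return samples.last().x to samples.last().y
    val next = samples.first { it.timeSeconds >= tSeconds }
    val prev = samples.last { it.timeSeconds <= tSeconds }
    if (next.timeSeconds == prev.timeSeconds) return prev.x to prev.y
    val ratio = (tSeconds - prev.timeSeconds) / (next.timeSeconds - prev.timeSeconds)
    return (prev.x + ratio * (next.x - prev.x)) to (prev.y + ratio * (next.y - prev.y))
}
```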
  • In step S730, the control unit 190 detects the occurrence of a predetermined event in the predetermined race.
  • Specifically, in the first server device 20, the control unit 290 detects the occurrence of the event based on image analysis of the race, information received from the second server device 40, and information stored in the event information 252B, and transmits to the user terminal 10 event information indicating that the occurrence of the event has been detected.
  • The control unit 190 detects the occurrence of an event, for example, based on receiving this event information during the race.
  • In step S730, the control unit 190 may also detect the occurrence of a predetermined event based on, for example, the first information 152A and the user information 153, without relying on the transmission of the event information 252B from the first server device 20 during the race.
  • For example, the control unit 190 may detect the occurrence of a predetermined event based on the position information of the contestants etc. and the time information corresponding to the position information.
  • In step S740, the control unit 190 registers the display color of the effect image corresponding to the event that has occurred, based on the identification numbers and the color information associated with each other in step S650. For example, in the case of a boat race, the control unit 190 registers the display color of the effect image based on the color registration corresponding to each identification number of the moving objects 502a to 502f.
  • In step S750, the control unit 190 displays the effect image for which registration of the display color has been completed.
  • When event information is received from the first server device 20 in step S730 and the event information includes information related to the effect display, the effect image is displayed in step S750 based on, for example, that information related to the effect display.
  • Alternatively, only event identification information such as an event ID may be received from the first server device 20; in that case, information about the effect image corresponding to the acquired event ID or the like may be acquired from the event information 152B, and the effect image may be displayed based on that information.
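  • A minimal sketch of steps S740 and S750 as just described follows: when only an event ID is received, the effect details are looked up in the locally held event information 152B, and the effect's display color is taken from the color registered for the target moving object's identification number. All names and the example event IDs are hypothetical.

```kotlin
// Hypothetical event notification carrying only an event ID and the identification
// number of the moving object the effect should be attached to.
data class EventNotification(val eventId: String, val targetIdentification: Int)

data class EffectImage(val eventId: String, val description: String, var displayColorArgb: Long? = null)

// Effect definitions held locally (event information 152B); contents are illustrative.
val localEffectInfo = mapOf(
    "FINAL_STRAIGHT" to "aura around the target moving object",
    "CLOSE_BATTLE" to "highlight on the race field object"
)

// Step S740: register the effect's display color from the color registered for the target
// identification number. Step S750: the colored effect image is then displayed.
fun prepareEffect(event: EventNotification, registeredColors: Map<Int, Long>): EffectImage? {
    val description = localEffectInfo[event.eventId] ?: return null
    return EffectImage(event.eventId, description, registeredColors[event.targetIdentification])
}
```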
  • FIGS. 13 and 14 are schematic views showing an example of the display screen in step S750.
  • the effect image 1201 is displayed differently depending on the type of event detected in step S730.
  • In FIG. 13, an aura-like effect image 1201 is displayed around the moving object 502c, which is the target of the effect display.
  • This effect image 1201 is added, for example, to the contestant who is the target of a voting right purchased by the user, upon the occurrence of a final-straight event.
  • The effect image 1201 displayed around the moving object 502c is registered in step S740 with red, the same color as the display color of the moving object 502c, and is displayed colored red in step S750.
  • The addition of the effect image 1201 is not limited to the above example; it may also be added in response to detection of the occurrence of an event other than the final-straight event (for example, when increased acceleration or a higher speed than other contestants is detected).
  • the effect display may be added to other virtual objects such as the race field object 501 instead of the moving object 502.
  • For example, a movement locus image 1301 showing the path along which the event-target moving object 502c, among the moving objects 502a to 502f, has passed may be added to the race field object 501 as an effect display.
  • The movement locus image 1301 is preferably registered in step S740 with red, the same color as the display color of the target moving object 502c, and displayed colored red in step S750.
  • the moving objects 502d, 502e, and 502f are omitted from the moving objects 502a to 502f for the sake of simplification.
  • the effect image 1201 and the movement locus image 1301 are displayed only when an event occurs.
  • However, the effect image may be displayed at all times. That is, in step S730, the occurrence of an event may be detected based on the user having purchased a voting right for a predetermined contestant etc.
  • Examples of the effect image that is always displayed include a crown-shaped two-dimensional image displayed on the upper part of the moving object 502 that is the target of the voting right purchased by the user.
  • The position and shape of the effect image are not limited to the illustrated example as long as the effect image is arranged in the vicinity of the corresponding moving object, and the effect image may be a three-dimensional image.
  • The association between the moving object 502 and the effect image is not limited to the above example; the effect image may also be added to, for example, a contestant who is not the target of the voting right purchased by the user.
  • The association between the moving object 502 and the effect image may be set in advance by the administrator of the first server device 20 or by the user of the user terminal 10.
  • For example, an effect image may be displayed for a moving object that the user of the user terminal 10 has set as a "favorite".
  • the effect image displayed corresponding to the moving object 502 may be configured so that the user can select it from a plurality of objects set as selection candidates. Further, the types of objects that are candidates for selection may be increased according to the user's billing and the like.
  • The processing of steps S710 to S750 is repeated at least from the start to the end of the race in the real world, but may also be repeated before the start and after the end of the race in the real world.
  • In addition, the position information of the boats 402a to 402f from the start to the end of the race may be acquired collectively before the start of the virtual race. Furthermore, the virtual race may be displayed by acquiring only the race time information, without acquiring the position information and the like.
  • the race displayed as a virtual race may be an exhibition race.
  • In step S760, the control unit 190 executes the result display process, and ends the series of display control processes in response to receiving an operation input to terminate the application program 151.
  • The program causes a processor of a first computer to execute: a step of receiving, from a second computer, first information about a predetermined race in the real world; a step of generating, based on the first information, virtual objects including a race field object representing the race field and moving objects representing the contestants or moving bodies of the predetermined race; and a step of superimposing the virtual objects on a real image around the first computer captured by an imaging unit and displaying them.
  • The first information includes identification information of the contestants or moving bodies and color information associated with the identification information, and the generating step includes generating each moving object at least partially colored with a predetermined color based on the color information.
  • The first information further includes color position information designating the position of color registration on the moving object.
  • The color position information is changed according to at least one of the shape of the moving object and the attribute of the predetermined race.
  • The program according to any one of items 1 to 3.
  • Since the shape of the moving object changes according to the type of race in the real world, registering the color on the moving object in a manner suited to the race can further improve the visibility of the virtual race and the user's interest.
  • The moving objects are moved on the race field object based on the first information, the predetermined race is virtually displayed, and an effect is displayed on at least one of the race field object and a moving object. Based on the color information, at least a part of the effect display is colored with the predetermined color.
  • The program according to any one of items 1 to 4. As a result, the target moving object can be made more conspicuous to the user during the virtual race, and the user can easily grasp the progress of the race, which improves the convenience of the user.
  • The information processing method causes the processor to execute: a step of receiving, from a second computer, first information about a predetermined race in the real world; a step of generating, based on the first information, virtual objects including a race field object representing the race field and moving objects representing the contestants or moving bodies of the predetermined race; and a step of superimposing the virtual objects on a real image around the first computer captured by the imaging unit and displaying them.
  • The first information includes identification information of the moving bodies and color information associated with the identification information, and the generating step includes generating each moving object at least partially colored with a predetermined color based on the color information.
  • Information processing method is applied to the processor.
  • the step of receiving the first information about a given race in the real world from the second computer A step of generating a virtual object including a race field object representing a race field and a moving object representing a contestant or a moving body of
  • An information processing device including a processor and an imaging unit, wherein the processor receives first information about a predetermined race in the real world from a second computer and generates, based on the first information, a virtual object including a race field object representing a race field and a moving object representing a contestant or a moving body of the predetermined race.
  • The virtual object is superimposed and displayed on a real image of the surroundings of the information processing device captured by the imaging unit.
  • The first information includes identification information of the contestant or the moving body and color information associated with the identification information.
  • Generating the virtual object includes generating the moving object, at least a part of which is colored with a predetermined color, based on the color information. Information processing device.
  • The second processor acquires the first information about a predetermined race in the real world and transmits the first information to the first computer.
  • The first processor receives the first information from the second computer and generates, based on the first information, a virtual object including a race field object representing a race field and a moving object representing a contestant or a moving body of the predetermined race.
  • The virtual object is superimposed and displayed on a real image of the surroundings of the first computer captured by the imaging device.
  • The first information includes identification information of the contestant or the moving body and color information associated with the identification information.
  • Generating the virtual object includes generating the moving object, at least a part of which is colored with a predetermined color, based on the color information. System. This makes it easier to identify the moving objects corresponding to the contestants or moving bodies of a race when a real-world race is displayed as a virtual race using virtual objects linked to the real-world race, so the visibility of the race can be improved. As a result, the satisfaction and convenience of users, especially users who are new to watching races, can be improved.

Abstract

A program executed by a first computer causes a processor to execute: a step for receiving first information about a race in the real world from a second computer; a step for generating a virtual object including a racetrack object that represents a racetrack and moving objects that represent moving bodies or participants in a prescribed race on the basis of the first information; and a step for displaying the virtual object superimposed on a real image captured by an imaging unit. The first information includes identification information for the moving bodies or participants, and color information associated with the identification information. In the step for generating the virtual object, a moving object 502 that is at least partially colored with a prescribed color is generated on the basis of the color information.

Description

Program, information processing method, information processing device, and system
The present disclosure relates to programs, information processing methods, information processing devices, and systems.
Patent Documents 1 and 2 disclose techniques related to AR (Augmented Reality).
Japanese Unexamined Patent Publication No. 2020-58658; Japanese Unexamined Patent Publication No. 2020-77187
In the real world, various races such as boat races and horse races are held. Some people go to the race track to watch these races in person, but doing so is sometimes impossible because of time or geographical constraints.
Given this situation, it would be beneficial if a real-world race could be displayed as a virtual race using virtual objects linked to the real-world race, because the race could then be watched in a simulated manner without actually going to the race track.
However, when various races such as boat races and horse races are displayed as virtual races, there is the problem that beginners at watching races have difficulty identifying the contestants they are supporting.
The techniques disclosed in Patent Documents 1 and 2 superimpose and display a virtual object unrelated to the real world on an image of the real world, and do not present any means for solving the above problem.
One aspect of the present disclosure aims to provide a program, an information processing method, an information processing device, and a system that make it easy to identify the moving objects corresponding to the contestants or moving bodies of a race when a real-world race is displayed as a virtual race using virtual objects linked to the real-world race.
According to one embodiment shown in the present disclosure, there is provided a program executed in a first computer including a processor and an imaging unit. The program causes the processor to execute: a step of receiving first information about a predetermined race in the real world from a second computer; a step of generating, based on the first information, a virtual object including a race field object representing a race field and a moving object representing a contestant or a moving body of the predetermined race; and a step of superimposing the virtual object on a real image of the surroundings of the first computer captured by the imaging unit and displaying it. The first information includes identification information of the contestant or the moving body and color information associated with the identification information, and the generating step includes generating the moving object, at least a part of which is colored with a predetermined color, based on the color information.
According to one embodiment shown in the present disclosure, when a real-world race is displayed as a virtual race using virtual objects linked to the real-world race, the moving objects corresponding to the contestants or moving bodies of the race can be made easy to identify.
FIG. 1 is a diagram showing a configuration example of a system according to an embodiment. FIG. 2 is a block diagram showing an example of the functional configuration of a user terminal according to an embodiment. FIG. 3 is a block diagram showing an example of the functional configuration of a server according to an embodiment. FIG. 4 is a schematic diagram showing an example of a real-world race field according to an embodiment. FIG. 5 is a schematic diagram showing an example of virtual objects displayed on a user terminal according to an embodiment. FIGS. 6 and 7 are flowcharts each showing an example of display control processing according to an embodiment. FIG. 8 is a diagram showing a race selection screen displayed on a user terminal according to an embodiment. FIGS. 9(a) and 9(b) are schematic diagrams showing an example of a moving object displayed on a user terminal according to an embodiment. FIG. 10 is a schematic diagram showing an example of a real image captured by an imaging unit according to an embodiment. FIGS. 11 and 12 are schematic diagrams each showing an example of a screen in which virtual objects are superimposed and displayed on a real image according to an embodiment. FIGS. 13 and 14 are schematic diagrams each showing an example of virtual objects displayed on a user terminal according to an embodiment.
Hereinafter, embodiments of this technical idea will be described in detail with reference to the drawings. In the following description, the same elements are designated by the same reference numerals, and duplicate descriptions are omitted as appropriate. In the one or more embodiments shown in the present disclosure, the elements included in the respective embodiments can be combined with each other, and the combined results also form part of the embodiments shown in the present disclosure.
(System configuration)
FIG. 1 is a diagram showing the configuration of a system 1 according to the present embodiment. The system 1 can display, for example, a predetermined race held in the real world as a virtual race using virtual objects on an information processing device used by a user. In the present specification, the "predetermined race" is not particularly limited as long as it is a race held in the real world; examples include boat races (actual races and exhibition races), horse races, bicycle races, auto races, car races such as F1, drone races, dog races, marathons, and ekiden (relay road races).
As shown in FIG. 1, the system 1 includes a plurality of user terminals 10, such as a user terminal 10A, a user terminal 10B, and a user terminal 10C (hereinafter, user terminals such as the user terminals 10A, 10B, and 10C are also collectively referred to as the "user terminal 10"), which are information processing devices (first computers) used by respective users, a first server device (second computer) 20, a second server device 40, and a network 30.
The user terminal 10A and the user terminal 10B connect to the network 30 by communicating with a radio base station 31. The user terminal 10C connects to the network 30 by communicating with a wireless router 32 installed in a facility such as a house. The user terminal 10 is, for example, a portable terminal provided with a touch screen, and may be a smartphone, a phablet, a tablet, or the like.
The user terminal 10 executes, for example, a program installed via a platform for distributing applications or the like, or a program including pre-installed website browsing software. By executing the program, the user terminal 10 communicates with the first server device 20 and sends and receives data related to the predetermined race, data related to the user, and the like to and from the first server device 20, thereby making it possible to display a virtual race on the user terminal 10.
The first server device 20 receives data related to the predetermined race from the second server device 40. The first server device 20 transmits data related to the predetermined race to the user terminal 10 as appropriate. The first server device 20 stores and manages data related to the predetermined race and data related to each user.
The first server device 20 includes, as a hardware configuration, a communication IF (Interface) 22, an input/output IF 23, a memory 25, a storage 26, and a processor (second processor) 29, and these are connected to each other via a communication bus.
The communication IF 22 supports various communication standards such as the LAN (Local Area Network) standard, and functions as an interface for transmitting and receiving data to and from the user terminal 10, the second server device 40, and the like.
The input/output IF 23 functions as an interface for accepting input of information to the first server device 20 and for outputting information to the outside of the first server device 20. The input/output IF 23 may include an input receiving unit that accepts the connection of information input devices such as a mouse and a keyboard, and an output unit that accepts the connection of information output devices such as a display for displaying images and the like.
The memory 25 is a storage device for storing data and the like used for processing. The memory 25 provides the processor 29 with a work area to be used temporarily, for example, when the processor 29 performs processing. The memory 25 includes storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory).
The storage 26 is a storage device for storing various programs and data that the processor 29 reads and executes. The information stored in the storage 26 includes data related to the predetermined race (including identification information and color information of moving bodies), data related to each user, and the like. The storage 26 may include storage devices such as an HDD (Hard Disk Drive) and a flash memory. The storage is not limited to one included in the server device; a cloud service can also be used.
The processor 29 controls the operation of the first server device 20 by reading and executing programs and the like stored in the storage 26. The processor 29 may include, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), and the like.
The second server device 40 stores and manages data related to the predetermined race. The second server device 40 is, for example, a server device managed by the organizer of the predetermined race, or a server device managed by an organization that transmits information about the predetermined race to the outside (such as the publisher of a racing magazine, a race video distributor, or a radio broadcaster). The second server device 40 transmits data related to the predetermined race to the first server device 20 as appropriate. In one aspect, the second server device 40 may transmit data related to the predetermined race to the user terminal 10. The hardware configuration of the second server device 40 may be the same as that of the first server device 20 to the extent that no contradiction arises. There may be a plurality of second server devices 40.
(User terminal)
FIG. 2 is a block diagram showing an example of the functional configuration of the user terminal 10. As shown in FIG. 2, the user terminal 10 includes an antenna 110, a wireless communication IF 120, a touch screen 130, an input/output IF 140, a storage unit 150, a voice processing unit 160, a microphone 161, a speaker 162, an imaging unit 170, and a control unit (first processor) 190.
The antenna 110 radiates a signal emitted by the user terminal 10 into space as a radio wave. The antenna 110 also receives radio waves from space and supplies the received signal to the wireless communication IF 120.
The wireless communication IF 120 performs modulation/demodulation processing and the like for transmitting and receiving signals via the antenna 110 and the like so that the user terminal 10 can communicate with other communication devices. The wireless communication IF 120 is a communication module for wireless communication including a tuner, a high-frequency circuit, and the like; it performs modulation/demodulation and frequency conversion of the wireless signals transmitted and received by the user terminal 10 and supplies the received signal to the control unit 190.
The touch screen 130 receives input from the user and outputs information to the user on the display 132. The touch screen 130 includes a touch panel 131 for receiving the user's operation input, and the display 132. The touch panel 131 detects the approach of the user's finger or the like by using, for example, a capacitance type panel. The display 132 is realized by, for example, an LCD (Liquid Crystal Display), an organic EL (electroluminescence) display, or another display device.
The input/output IF 140 functions as an interface for accepting input of information to the user terminal 10 and for outputting information to the outside of the user terminal 10.
The storage unit 150 is composed of a flash memory, an HDD, or the like, and stores programs used by the user terminal 10 and various data that the user terminal 10 receives from the first server device 20 and the like.
The voice processing unit 160 performs modulation/demodulation of voice signals. The voice processing unit 160 modulates the signal supplied from the microphone 161 and supplies the modulated signal to the control unit 190. The voice processing unit 160 also supplies voice signals to the speaker 162. The voice processing unit 160 is realized by, for example, a processor for voice processing. The microphone 161 functions as a voice input unit for receiving input of a voice signal and outputting it to the control unit 190. The speaker 162 functions as a voice output unit for outputting voice signals to the outside of the user terminal 10.
The imaging unit 170 is a camera that captures real images of the surroundings of the user terminal 10. The images captured by the imaging unit 170 are image-processed by the control unit 190 and output to the display 132.
The control unit 190 controls the operation of the user terminal 10 by reading and executing programs stored in the storage unit 150. The control unit 190 is realized by, for example, an application processor.
The processing by which the control unit 190 executes the application program 151 will now be described in more detail. The storage unit 150 stores the application program 151, application information 152, and user information 153.
The user terminal 10, for example, downloads the application program 151 from the first server device 20 and stores it in the storage unit 150. The user terminal 10 also communicates with the first server device 20 to send and receive various data, such as the application information 152 and the user information 153, to and from the first server device 20.
The application program 151 is a program for displaying a virtual race on the user terminal 10. The application information 152 includes various data referred to by the application program 151. The application information 152 includes first information 152A and event information 152B.
The first information 152A is information about the predetermined race transmitted from the first server device 20 or the second server device 40. The first information 152A includes, for example, race time information indicating the race times of the contestants or moving bodies (hereinafter also referred to as "contestants or the like") of the predetermined race, position information of the contestants or the like during the predetermined race, and time information corresponding to the position information. The first information further includes identification information of the contestants or the like, color information associated with the identification information, and color position information for designating the position at which a color is to be registered on a moving object described later.
In the present specification, a "contestant" is a concept that includes not only humans but also animals such as horses and dogs. A "moving body" is the main moving entity in the predetermined race, such as an animal or machine that a contestant rides, or a machine remotely operated by a contestant. In a marathon, a dog race, or the like, the "contestant" and the "moving body" are the same.
In addition to the above, the first information 152A may also include, for example, the name of the predetermined race, its date and time, race field data, contestant data, moving body data, odds information, race predictions, the race entry list, information immediately before the race, pit reports, race results, race videos, race still images, past race information, and other information about the predetermined race of the kind that can be published in information magazines or on information sites.
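As a concrete illustration only, the first information 152A described above could be held on the terminal side as a set of plain records. The following Kotlin sketch shows one possible shape; the type and field names (FirstInformation, EntrantInfo, PositionSample, and so on) are assumptions made for this sketch and are not prescribed by the present disclosure.

```kotlin
// Hypothetical terminal-side model of the first information 152A.
// Names and units are illustrative; the disclosure does not fix a data format.
data class PositionSample(
    val entrantId: Int,   // identification information of the contestant or moving body
    val timeMillis: Long, // time information corresponding to the position information
    val x: Double,        // position on the race course (course-local coordinates)
    val y: Double
)

data class EntrantInfo(
    val entrantId: Int,               // identification information
    val color: String,                // color information associated with the identification information
    val colorPosition: String? = null // color position information (e.g. "hull side", "helmet")
)

data class FirstInformation(
    val raceName: String,
    val raceTimesMillis: Map<Int, Long>,  // race time information per identification number
    val entrants: List<EntrantInfo>,
    val positions: List<PositionSample>,  // positions during the race and their times
    val odds: Map<String, Double> = emptyMap()
)
```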
The event information 152B is information about events that can occur or have occurred in the predetermined race, and includes, for example, information about the effect display and the effect sound corresponding to each event.
The event is not particularly limited, and may be, for example, one or more of the following: the leading contestant or moving body reaching the goal, the finishing order of the predetermined race being fixed, a change in the order of the contestants or the like, the difference between the positions of two or more contestants or the like becoming equal to or less than a predetermined threshold at a predetermined timing (two or more contestants or the like being in a close battle), and the occurrence of a rule violation. In this example, the event information 152B is information received from the first server device 20 after an event has occurred in the predetermined race.
The event information 152B also includes, for example, information about a two-dimensional image or a three-dimensional image corresponding to a predetermined event. Specifically, the event information 152B can include information about a text image announcing the content of the event, a predetermined character image, an effect image to be added to a virtual object described later, and the like.
The user information 153 includes information about the user of the user terminal 10. The user information 153 may include, for example, information identifying the user, position information of the user terminal 10, and the user's race purchase history and hit rate (for example, in the case of boat racing, the history of purchased betting tickets and their hit rate).
By reading and executing the application program 151, the control unit 190 functions as an operation input receiving unit 191, a transmission/reception unit 192, a race attribute selection unit 193, an object generation unit 194, an identification number assigning unit 195, a color information registration unit 196, a display control unit 197, a flat surface detection unit 198A, an event detection unit 198B, and a selection information acquisition unit 199.
The operation input receiving unit 191 receives the user's operation input based on the output of the touch screen 130. Specifically, the operation input receiving unit 191 detects contact or approach of the user's finger or the like to the touch panel 131 as coordinates in a coordinate system consisting of the horizontal and vertical axes of the surface constituting the touch screen 130.
The operation input receiving unit 191 determines the user's operation on the touch screen 130. The operation input receiving unit 191 determines user operations such as an "approach operation", a "release operation", a "tap operation", a "double tap operation", a "long press operation (long touch operation)", a "drag operation (swipe operation)", a "move operation", a "flick operation", a "pinch-in operation", and a "pinch-out operation".
The operation input receiving unit 191 may also accept, as an operation input, movement of the user terminal 10 detected by an acceleration sensor or the like mounted on the user terminal 10.
The transmission/reception unit 192 transmits and receives various information to and from external communication devices such as the first server device 20 and the second server device 40 via the wireless communication IF 120 and the network 30. The transmission/reception unit 192 receives, for example, the first information 152A from the first server device 20 or the second server device 40. The transmission/reception unit 192 also transmits, for example, information corresponding to the operation input received by the operation input receiving unit 191 and information stored in the user information 153 to the first server device 20 or the second server device 40.
The race attribute selection unit 193 selects a race attribute (specifically, a race type) according to the operation input received by the operation input receiving unit 191. As described above, examples of race types include boat races (actual races and exhibition races), horse races, bicycle races, auto races, car races such as F1, drone races, dog races, marathons, and ekiden.
The object generation unit 194 generates, based on the first information, virtual objects for presenting information about the predetermined race selected by the race attribute selection unit 193 to the user. The object generation unit 194 generates, as virtual objects, a race field object representing the race field and moving objects representing the contestants or the like. The object generation unit 194 may generate a virtual display board for displaying, as text, second information generated based on the first information. In addition to the above objects, the object generation unit 194 may also generate a virtual screen for displaying, as an image, the second information generated based on the first information, various building objects, landscape objects constituting scenery such as trees, an object serving as the user's avatar, and the like.
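A minimal sketch of the kind of processing the object generation unit 194 performs is shown below, reusing the FirstInformation and EntrantInfo records from the earlier sketch. Geometry and rendering are abstracted away, and the course length is an assumed placeholder value, not a figure taken from the disclosure.

```kotlin
// Hypothetical virtual-object types; actual geometry and rendering are omitted.
sealed interface VirtualObject
data class RaceFieldObject(val raceType: String, val courseLengthMeters: Double) : VirtualObject
data class MovingObject(
    val identificationNumber: Int,
    var color: String? = null,
    var position: Pair<Double, Double> = 0.0 to 0.0
) : VirtualObject

// Sketch of object generation: one race field object for the selected race type,
// plus one moving object per contestant or moving body listed in the first information.
fun generateVirtualObjects(info: FirstInformation, raceType: String): List<VirtualObject> {
    val field = RaceFieldObject(raceType, courseLengthMeters = 600.0) // placeholder course data
    val movers = info.entrants.map { MovingObject(identificationNumber = it.entrantId) }
    return listOf<VirtualObject>(field) + movers
}
```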
The identification number assigning unit 195 assigns an identification number to each of the moving objects generated by the object generation unit 194. The identification number assigning unit 195 assigns the identification numbers in the placement order of the moving objects (for example, lane order), according to the number of moving objects, which differs depending on the attribute of the race.
The color information registration unit 196 acquires the color information of the contestants or the like in the predetermined race selected by the race attribute selection unit 193. The color information registration unit 196 further associates the acquired color information with the identification numbers assigned to the moving objects by the identification number assigning unit 195, and performs color registration on the moving objects. For example, the color information registration unit 196 acquires the identification information and the color information from the first information 152A received from the second server device 40, and, based on the acquired identification information and color information, registers a predetermined color on a predetermined portion of each moving object to which an identification number has been assigned by the identification number assigning unit 195.
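For example, the association between identification numbers and colors could look like the following sketch, which reuses the MovingObject and EntrantInfo types from the earlier sketches; it is an illustration of the color registration described above, not the actual implementation.

```kotlin
// Sketch of color registration (color information registration unit 196): the color
// information in the first information is looked up by identification number and applied
// to the corresponding moving object, so that at least a part of it is drawn in that color.
fun registerColors(movers: List<MovingObject>, entrants: List<EntrantInfo>) {
    val colorById = entrants.associate { it.entrantId to it.color }
    for (mover in movers) {
        mover.color = colorById[mover.identificationNumber]
    }
}
```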
The display control unit 197 causes the display 132 to show an image (hereinafter also referred to as a "superimposed image") in which the virtual objects generated by the object generation unit 194 are superimposed on a real image of the surroundings of the user terminal 10 captured by the imaging unit 170. Based on the race time information included in the first information 152A, the display control unit 197 moves each moving object, whose predetermined portion has been colored by the color information registration unit 196, on the race field object, and causes the display 132 to show a virtual race that virtually reproduces the predetermined race. The display control unit 197 preferably reproduces the virtual race based not only on the race time information but also on the position information of the contestants or the like included in the first information 152A and the time information corresponding to that position information.
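One way a terminal could replay the race from the position information and its corresponding time information, as described above, is to interpolate between consecutive samples. The sketch below reuses the PositionSample record from the earlier sketch; linear interpolation is an assumption of this sketch, not something the disclosure specifies.

```kotlin
// Sketch of playback: for a given playback time, estimate an entrant's position by
// linearly interpolating between the two surrounding position samples.
fun positionAt(samples: List<PositionSample>, entrantId: Int, timeMillis: Long): Pair<Double, Double>? {
    val own = samples.filter { it.entrantId == entrantId }.sortedBy { it.timeMillis }
    if (own.isEmpty()) return null
    val after = own.firstOrNull { it.timeMillis >= timeMillis } ?: return own.last().x to own.last().y
    val before = own.lastOrNull { it.timeMillis <= timeMillis } ?: return after.x to after.y
    if (after.timeMillis == before.timeMillis) return before.x to before.y
    val t = (timeMillis - before.timeMillis).toDouble() / (after.timeMillis - before.timeMillis)
    return (before.x + t * (after.x - before.x)) to (before.y + t * (after.y - before.y))
}
```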
The display control unit 197 preferably can change the viewpoint of the superimposed image according to the operation input received by the operation input receiving unit 191. The display control unit 197 also causes the display 132 to show various menu screens and GUIs (Graphical User Interfaces), and changes the contents displayed on the display 132, according to the operation input received by the operation input receiving unit 191.
Further, in response to the event detection unit 198B detecting the occurrence of an event, the display control unit 197 performs an effect display corresponding to that event. The content of the effect display is not particularly limited; it may be, for example, an effect applied to at least one of the race field object and the moving objects, or one or more of displaying a two-dimensional or three-dimensional image corresponding to the predetermined event, displaying a predetermined character image, and animating these images as the race progresses. The display control unit 197 may execute two or more effect displays corresponding to different events at the same time.
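Purely as an illustration of how detected events might be tied to effect displays, the following sketch maps event kinds to named effects. Both the event names and the effect names are invented for this sketch.

```kotlin
// Illustrative mapping from detected events to effect displays; all names are placeholders.
enum class RaceEvent { LEADER_FINISHED, RESULT_FIXED, RANK_CHANGE, CLOSE_BATTLE, RULE_VIOLATION }

fun effectsFor(event: RaceEvent): List<String> = when (event) {
    RaceEvent.LEADER_FINISHED -> listOf("goal-banner", "confetti-on-race-field")
    RaceEvent.RESULT_FIXED    -> listOf("result-text-image")
    RaceEvent.RANK_CHANGE     -> listOf("highlight-moving-object")
    RaceEvent.CLOSE_BATTLE    -> listOf("spark-effect-between-objects")
    RaceEvent.RULE_VIOLATION  -> listOf("warning-character-image")
}
```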
The flat surface detection unit 198A detects a flat surface in the real image captured by the imaging unit 170. The detection of the flat surface is realized by a conventionally known image recognition technique. For example, when the user performs an operation of selecting a flat surface detected by the flat surface detection unit 198A, a superimposed image in which the race field object is arranged on that flat surface is displayed on the display 132.
The flat surface is preferably a horizontal surface. The angle between the flat surface and the base of the race field object may be 0 degrees, but is preferably an acute angle and can be, for example, in the range of 15 to 45 degrees. This angle may also be adjustable in response to user operations. Further, even when part of the real-world flat surface has a protrusion, or when an object is placed on the flat surface, the surface may still be detected as a flat surface on which the race field object can be arranged, as long as the protrusion or the placed object is small enough to be hidden by the race field object.
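The flat-surface handling above can be illustrated with two small helpers. Plane detection itself would come from an AR framework and is not shown; the size check and the 15-45 degree clamp below are a sketch under that assumption.

```kotlin
import kotlin.math.PI

// A detected plane reduced to its usable size in meters (detection itself is assumed).
data class DetectedPlane(val widthMeters: Double, val depthMeters: Double)

// The plane can host the race field object if the object, at the chosen display scale,
// fits on it; small bumps hidden underneath the object are ignored by this check.
fun canHostRaceField(plane: DetectedPlane, fieldWidthM: Double, fieldDepthM: Double): Boolean =
    plane.widthMeters >= fieldWidthM && plane.depthMeters >= fieldDepthM

// The angle between the plane and the base of the race field object is kept within the
// 15-45 degree range mentioned above (adjustable by the user inside that range).
fun tiltRadians(requestedDegrees: Double): Double =
    requestedDegrees.coerceIn(15.0, 45.0) * PI / 180.0
```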
The event detection unit 198B detects the occurrence of a predetermined event in the predetermined race. The event detection unit 198B detects the occurrence of a predetermined event based on, for example, the transmission/reception unit 192 receiving the event information 252B transmitted from the first server device 20 when the predetermined event has occurred in the predetermined race in the real world.
The event detection unit 198B may also detect the occurrence of a predetermined event based on, for example, the first information 152A or the user information 153. Specifically, the event detection unit 198B can detect events that can be determined from the positions of the contestants or the like during the race, such as a change in their order, based on the position information of the contestants or the like and the time information corresponding to that position information. The event detection unit 198B can also detect events such as a contestant or the like selected by the user taking the lead in the race, based on, for example, information about the race results and information about the user's race purchase history.
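One event check that could be run locally from the position information is sketched below: at a given playback time, any two contestants whose positions are within a threshold distance are flagged as being in a close battle. The coordinate units and the threshold value are assumptions of this sketch.

```kotlin
import kotlin.math.hypot

// Sketch of a local "close battle" check on per-entrant positions at one point in time.
data class EntrantPosition(val entrantId: Int, val x: Double, val y: Double)

fun detectCloseBattles(
    positions: List<EntrantPosition>,
    thresholdMeters: Double = 3.0  // assumed threshold; the disclosure only says "predetermined"
): List<Pair<Int, Int>> {
    val pairs = mutableListOf<Pair<Int, Int>>()
    for (i in positions.indices) {
        for (j in i + 1 until positions.size) {
            val a = positions[i]; val b = positions[j]
            if (hypot(a.x - b.x, a.y - b.y) <= thresholdMeters) pairs += a.entrantId to b.entrantId
        }
    }
    return pairs
}
```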
The selection information acquisition unit 199 acquires one or more pieces of selection information indicating one or more contestants or the like selected by the user. "Selection information" is, for example, information indicating the contestant or the like that the user has predicted to be the winner or the like of the predetermined race. When the predetermined race is a boat race or a horse race, the "selection information" can be information indicating the boat ticket or betting ticket purchased by the user.
The selection information acquisition unit 199 acquires the selection information based on, for example, the user's operation input on a screen for selecting a contestant or the like. The selection information acquired by the selection information acquisition unit 199 is transmitted to, for example, the first server device 20 or the second server device 40. When the predetermined race is a boat race or a horse race, the purchase processing of the boat ticket or betting ticket can be performed in response to the selection information being transmitted to the second server device 40.
(First server device)
FIG. 3 is a block diagram showing the functional configuration of the first server device 20. The detailed configuration of the first server device 20 will be described with reference to FIG. 3. By operating according to a program, the first server device 20 functions as a communication unit 220, a storage unit 250, and a control unit 290.
The communication unit 220 functions as an interface through which the first server device 20 communicates with external communication devices such as the user terminal 10 and the second server device 40 via the network 30.
The storage unit 250 stores various programs and data for realizing the system 1. In one aspect, the storage unit 250 stores a program 251, race information 252, and user information 253.
The program 251 is a program by which the first server device 20 communicates with the user terminal 10 and the second server device 40 to realize the system 1. When executed by the control unit 290, the program 251 causes the first server device 20 to perform processing such as sending and receiving data to and from the user terminal 10 and the second server device 40, processing corresponding to the operations performed by the user of the user terminal 10, and processing for updating the race information 252 and the user information 253.
The race information 252 includes various data related to the predetermined race. The race information 252 includes, for example, first information 252A and event information 252B. The first information 252A is the source of the first information 152A, and the first information 152A can be a part of the first information 252A. The first information 252A is, for example, information acquired from the second server device 40. Similarly, the event information 252B is the source of the event information 152B, and the event information 152B can be a part of the event information 252B. The event information 252B is, for example, information acquired from the second server device 40.
The user information 253 is information about the users of the user terminals 10. The user information 253 includes a user management table 253A. The user management table 253A stores, for each user, information such as information identifying the user, position information of the user terminal 10, and the user's race purchase history and hit rate.
The control unit 290 is realized by the processor 29 and, by executing the program 251, functions as a transmission/reception unit 291, a first information acquisition unit 292A, an event information acquisition unit 292B, a selection information acquisition unit 293, a data management unit 294, and a timekeeping unit 295.
The transmission/reception unit 291 transmits and receives various information to and from external communication devices such as the user terminal 10 and the second server device 40 via the communication unit 220 and the network 30. The transmission/reception unit 291 transmits, for example, at least a part of the first information 252A and the event information 252B to the user terminal 10. The transmission/reception unit 291 also receives, for example, the first information 252A and the event information 252B from the second server device 40.
The first information acquisition unit 292A acquires the first information 252A from the second server device 40 via the transmission/reception unit 291. The event information acquisition unit 292B acquires the event information 252B from the second server device 40 via the transmission/reception unit 291. The selection information acquisition unit 293 acquires the user's selection information from the user terminal 10 or the second server device 40 via the transmission/reception unit 291. The data management unit 294 updates the various data stored in the storage unit 250 according to the processing results of the first information acquisition unit 292A, the event information acquisition unit 292B, the selection information acquisition unit 293, and the like. The timekeeping unit 295 performs processing for measuring time. Based on the time measured by the timekeeping unit 295, various times displayed on the user terminal 10 (for example, the time until the start of a race) can be controlled.
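The relay role of the first server device 20 described above might be summarized as the following sketch, which reuses the FirstInformation record from the earlier sketch. The two interfaces stand in for transport details that the disclosure does not specify.

```kotlin
// Placeholder transports; protocols and endpoints are not specified by the disclosure.
interface OrganizerClient { fun fetchFirstInformation(raceId: String): FirstInformation }
interface TerminalConnection { fun send(info: FirstInformation) }

// Sketch of the relay: acquire the first information on the second server device 40 side
// and forward it to a user terminal 10.
fun relayFirstInformation(raceId: String, organizer: OrganizerClient, terminal: TerminalConnection) {
    val info = organizer.fetchFirstInformation(raceId)
    terminal.send(info)
}
```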
(Operation example)
Next, an operation example in the system 1 will be described with reference to FIGS. 4 to 13. In the following, the case where the predetermined race is a boat race is described as an example, but the following description is also applicable when the predetermined race is another type of race. In addition, the following description assumes that data is sent and received between the user terminal 10 and the first server device 20 and between the first server device 20 and the second server device 40; in one aspect, however, the user terminal 10 and the second server device 40 may be configured to send and receive data directly.
FIG. 4 is a schematic diagram showing an example of a boat race field in the real world. Two turn marks 403 are installed in the boat race field 401, and a race is carried out by boats 402a to 402f, each ridden by a boat racer. When the race ends, race time information indicating the race time of each of the boats 402a to 402f is transmitted from the second server device 40 to the first server device 20, and from the first server device 20 to the user terminal 10.
The boat race field 401 is provided with imaging devices (cameras) 404a and 404b. The imaging device 404a captures the boats 402a to 402f in its field of view from above the boat race field 401. The imaging device 404b captures the boats 402a to 402f in its field of view from the side of the boat race field 401. The images of the boats 402a to 402f captured by the imaging devices 404a and 404b are transmitted to the second server device 40. The second server device 40, for example, performs image analysis on each image and calculates position information indicating the position of each of the boats 402a to 402f at the capture time of each image. The calculated position information and the time information relating to the capture time corresponding to that position information are transmitted to the first server device 20, and from the first server device 20 to the user terminal 10. The position information may instead be calculated by the first server device 20.
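How the per-frame analysis results could become the position/time pairs described above is sketched below. The image analysis itself (locating each boat within a frame) is assumed to exist and is represented only by its output; the record names are invented for this sketch.

```kotlin
// Output of a hypothetical per-frame boat detector, in course-local coordinates.
data class FrameDetection(val boatId: Int, val x: Double, val y: Double)
data class AnalyzedFrame(val capturedAtMillis: Long, val detections: List<FrameDetection>)
data class BoatSample(val boatId: Int, val timeMillis: Long, val x: Double, val y: Double)

// Each detection is stamped with its frame's capture time, giving position information
// together with the corresponding time information.
fun toPositionSamples(frames: List<AnalyzedFrame>): List<BoatSample> =
    frames.flatMap { frame ->
        frame.detections.map { d -> BoatSample(d.boatId, frame.capturedAtMillis, d.x, d.y) }
    }
```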
Further, instead of or in addition to the imaging devices 404a and 404b, position sensors such as GPS sensors may be installed on the boats 402a to 402f. The position information of the boats 402a to 402f acquired by the position sensors, and time information indicating the time at which the position information was acquired, are ultimately transmitted to the user terminal 10.
FIG. 5 is a schematic diagram showing an example of the virtual objects displayed on the user terminal 10. In the example of FIG. 5, a race field object 501, moving objects 502a to 502f, two turn mark objects 503, and a virtual display board 505 are shown as virtual objects.
The race field object 501 is an object that virtually represents the boat race field 401. The race field object 501 and the turn mark objects 503 are preferably created based on race field data such as the course information of the boat race field 401, and preferably have shapes corresponding to the boat race field 401.
The moving objects 502a to 502f are objects that virtually represent the boats 402a to 402f, respectively, and are shaped like boats. The moving objects 502a to 502f move on the race field object 501 based on the race time information, the position information of the boats 402a to 402f, and the time information corresponding to that position information. In other words, the race field object 501 and the moving objects 502a to 502f display the real-world race as a virtual race on the user terminal 10.
The moving objects 502a to 502f, which virtually represent the boats 402a to 402f, are displayed on the user terminal 10 with predetermined portions colored by the color information registration unit 196 so as to match the real-world boats 402a to 402f. As shown in FIGS. 4 and 5, the moving object 502a is colored white like the real-world boat 402a (boat No. 1), the moving object 502b is colored black like the real-world boat 402b (boat No. 2), the moving object 502c is colored red like the real-world boat 402c (boat No. 3), the moving object 502d is colored blue like the real-world boat 402d (boat No. 4), the moving object 502e is colored yellow like the real-world boat 402e (boat No. 5), and the moving object 502f is colored green like the real-world boat 402f (boat No. 6).
Even without the position information and time information of the boats 402a to 402f, it is possible to display a virtual race using only the race time information. In that case, however, although the final finishing order will be the same as in the real-world race, it is difficult to reproduce the progress of the race, such as the order during the race.
The virtual display board 505 is an object that displays text information. The virtual display board 505 is, for example, an object that has no corresponding existence at the boat race field 401. The text information displayed on the virtual display board 505 is not particularly limited and may be, for example, ranking information, odds information, or the like. The text information displayed on the virtual display board 505 may also be changeable based on the user's operation input.
FIGS. 6 and 7 are flowcharts showing an example of the display control processing. The order of the processes constituting the flowcharts described below may be changed as long as no contradiction or inconsistency arises in the processing content. A process executed by one device may also be executed by another device to the extent that no contradiction arises.
The processing shown in FIGS. 6 and 7 is realized by the control unit 190 executing the application program 151 and the control unit 290 executing the program 251.
まず、ステップS610において、制御部190は、ユーザの操作入力に基づいて、仮想オブジェクトとしてユーザ端末10上に表示すべきレースの種類(レース属性)を選択する。図8は、アプリケーションプログラム151を実行することによりユーザ端末10上に表示されるレース選択画面800を示す図である。図8に示すように、レース選択画面800には、アプリケーションプログラム151により仮想レースをユーザ端末10上に表示可能な複数種のレース(例えば、ボートレース、競馬、競輪、オートレース)に対応する複数の選択ボタン801a~801dが含まれる。ユーザが、レース選択画面800に表示される複数の選択ボタン801a~801dのいずれかに対して操作入力を行ったことに基づき、制御部190は、レースの種類を選択する。 First, in step S610, the control unit 190 selects a race type (race attribute) to be displayed on the user terminal 10 as a virtual object based on the user's operation input. FIG. 8 is a diagram showing a race selection screen 800 displayed on the user terminal 10 by executing the application program 151. As shown in FIG. 8, on the race selection screen 800, a plurality of races (for example, boat race, horse race, bicycle race, auto race) corresponding to a plurality of types of races (for example, boat race, horse race, bicycle race, auto race) in which a virtual race can be displayed on the user terminal 10 by the application program 151 are displayed. Selection buttons 801a to 801d are included. The control unit 190 selects a race type based on the user inputting an operation to any of the plurality of selection buttons 801a to 801d displayed on the race selection screen 800.
In step S620, the control unit 190 generates a race field object as a virtual object according to the selected race type. In this example, a boat race field object is generated based on, for example, the user selecting the selection button 801a corresponding to boat racing.
In step S630, the control unit 190 generates moving objects as virtual objects according to the selected race type. In this example, based on the user selecting the selection button 801a corresponding to boat racing, a plurality of moving objects 502a to 502f corresponding to the plurality of boats 402a to 402f of the boat race are generated.
In step S640, the control unit 190 assigns an identification number to each of the generated moving objects 502a to 502f. Specifically, the control unit 190 assigns an identification number to each of the moving objects 502a to 502f so as to correspond to the boats 402a to 402f, that is, boats No. 1 to No. 6 in the real-world boat race.
In step S650, the control unit 190 associates the identification numbers assigned to the moving objects 502a to 502f with color information. Specifically, the control unit 190 associates color information matching the designated colors of boats No. 1 to No. 6 in the real-world boat race with the identification numbers of the moving objects 502a to 502f. The control unit 190 performs this association based on the fact that the plurality of moving objects 502a to 502f generated by the object generation unit 194 are objects corresponding to the boats 402a to 402f of the real-world boat race.
As described above, in real-world boat racing, the designated colors are defined according to the frame number: white for boat No. 1, black for boat No. 2, red for boat No. 3, blue for boat No. 4, yellow for boat No. 5, and green for boat No. 6. Accordingly, the control unit 190 associates white color information with the identification number of the moving object 502a corresponding to the boat 402a (No. 1), black color information with the identification number of the moving object 502b corresponding to the boat 402b (No. 2), and red color information with the identification number of the moving object 502c corresponding to the boat 402c (No. 3). Further, the control unit 190 associates blue color information with the identification number of the moving object 502d corresponding to the boat 402d (No. 4), yellow color information with the identification number of the moving object 502e corresponding to the boat 402e (No. 5), and green color information with the identification number of the moving object 502f corresponding to the boat 402f (No. 6).
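As a concrete illustration of the association in step S650, the following Python sketch holds the frame-number-to-designated-color mapping for a boat race and attaches the color information to each moving object by its identification number. The dictionary and function names are illustrative assumptions; the disclosure does not specify any particular data structure.

```python
# Hypothetical sketch of the step S650 association between identification
# numbers and color information for a boat race (frame numbers 1 to 6).

BOAT_RACE_DESIGNATED_COLORS = {
    1: "white",   # boat No. 1
    2: "black",   # boat No. 2
    3: "red",     # boat No. 3
    4: "blue",    # boat No. 4
    5: "yellow",  # boat No. 5
    6: "green",   # boat No. 6
}

def associate_color_info(moving_objects):
    """Attach color information to each moving object by its identification number."""
    for obj in moving_objects:
        # obj["id"] corresponds to the identification number assigned in step S640
        obj["color"] = BOAT_RACE_DESIGNATED_COLORS[obj["id"]]
    return moving_objects

# Example usage with six generated moving objects.
boats = [{"id": i} for i in range(1, 7)]
print(associate_color_info(boats)[2])  # {'id': 3, 'color': 'red'}
```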
In step S660, the control unit 190 registers colors at designated locations of the moving objects 502a to 502f based on the association between the identification numbers of the moving objects 502a to 502f and the color information. Specifically, the control unit 190 performs color registration at predetermined positions of the moving objects 502a to 502f based on the color information associated with the identification numbers of the moving objects 502a to 502f and the color position information included in the first information. That is, color registration is performed on each of the moving objects 502a to 502f according to the frame number of the corresponding boat in the real world.
FIG. 9(a) is a diagram showing an example of a moving object 502 imitating a boat and a racer (boat racer) of a boat race, and FIG. 9(b) is a diagram showing an example of a moving object 950 imitating a racehorse and a rider (jockey) in horse racing. As shown in FIG. 9(a), in the case of a boat race, the color position information includes, for example, registering colors on the side surface 901A of the boat portion 901 and the helmet 902A of the racer portion 902 of the moving object 502. On the other hand, as shown in FIG. 9(b), in the case of horse racing, the color position information includes, for example, registering a color on the cap 952A of the rider 952 of the moving object 950. It is also possible to register colors on the number cloth 951A of the racehorse 951 and the racing silks 952B. In this way, the color position information included in the first information is changed according to the type (attribute) of the race and the shape of the moving object. The color position information may also be changeable by the user's operation input; that is, the user may be allowed to select or change the color position information so that colors are registered at predetermined positions of the moving object desired by the user.
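The color registration of step S660 can be sketched as applying the registered color to the parts named by the color position information, which differs per race attribute. The part names below follow FIG. 9; the data layout is an assumption for illustration only.

```python
# Hypothetical sketch of step S660: coloring the positions designated by the
# color position information, which varies with the race attribute.

COLOR_POSITIONS_BY_RACE = {
    "boat_race":  ["boat_side", "helmet"],  # 901A and 902A in FIG. 9(a)
    "horse_race": ["jockey_cap"],           # 952A in FIG. 9(b)
}

def register_colors(moving_object, race_attribute):
    """Color the designated parts of a moving object with its registered color."""
    for part in COLOR_POSITIONS_BY_RACE[race_attribute]:
        moving_object["parts"][part] = moving_object["color"]
    return moving_object

# Example usage for boat No. 3 (registered color: red).
obj = {"id": 3, "color": "red", "parts": {"boat_side": None, "helmet": None}}
print(register_colors(obj, "boat_race")["parts"])  # {'boat_side': 'red', 'helmet': 'red'}
```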
In step S670, the control unit 190 activates the imaging unit 170, which is a camera. The imaging unit 170 captures a real image of the surroundings of the user terminal 10.
In step S680, the control unit 190 detects a flat surface in the image captured by the imaging unit 170. In step S690, the control unit 190 arranges the virtual objects on the detected flat surface.
Here, the processing of steps S670, S680, and S690 will be specifically described with reference to FIGS. 10 to 12. FIG. 10 is a schematic diagram showing an example of a real image captured by the imaging unit 170. In the example of FIG. 10, a keyboard 1002 and a monitor device 1003 are placed on a flat desk 1001.
When the imaging unit 170 is activated in step S670, the real image being captured by the imaging unit 170 is displayed on the display 132. Next, in step S680, the control unit 190 detects a flat surface in the image captured by the imaging unit 170, that is, in the image displayed on the display 132.
In FIG. 10, a region 1004 is detected as a flat surface. Although the keyboard 1002 is within the region 1004, the keyboard 1002 is small enough to be hidden by the race field object 501, so the control unit 190 detects the region 1004 as a flat surface.
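A minimal, self-contained sketch of the placement check implied by FIG. 10 is shown below: a detected region is treated as usable when the footprint of the race field object fits within it and covers any small obstacle such as the keyboard 1002. The rectangle representation is an assumption for illustration, not part of the disclosure.

```python
# Sketch of the "obstacle hidden by the race field object" criterion of FIG. 10.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, other: "Rect") -> bool:
        return (self.x <= other.x and self.y <= other.y and
                self.x + self.w >= other.x + other.w and
                self.y + self.h >= other.y + other.h)

def is_placeable(region: Rect, obstacles: list[Rect], object_footprint: Rect) -> bool:
    """Region is usable if the race field object fits in it and covers all obstacles."""
    if not region.contains(object_footprint):
        return False
    return all(object_footprint.contains(o) for o in obstacles)

# Example: a keyboard inside the region but covered by the race field object.
region = Rect(0, 0, 100, 60)
keyboard = Rect(30, 20, 20, 8)
race_field = Rect(10, 5, 80, 50)
print(is_placeable(region, [keyboard], race_field))  # True
```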
In the state shown in FIG. 10, if the position imaged by the imaging unit 170 is changed, the position of the region 1004 can also change. The region 1004 is displayed on the display 132 so as to be distinguishable from other portions, for example, by being given a predetermined color. When the user performs a tap operation or the like on the region 1004, in step S690 the control unit 190 arranges virtual objects such as the race field object 501 and the moving objects 502a to 502f on the region 1004.
FIG. 11 is a schematic diagram showing an example of a screen in which virtual objects are superimposed on a real image. In FIG. 11, the region with the dot pattern including the monitor device 1003 is the real image, and the other regions are regions in which virtual objects are displayed. An advertisement image, for example, may be displayed in regions where no virtual object is displayed.
In FIG. 11, the virtual objects displayed include the race field object 501, a plurality of moving objects 502, two turn mark objects 503, the virtual display board 505, a large monitor object 506, building objects 507a and 507b, and many other objects without reference numerals (tree objects, clock objects, and the like). These objects are created, for example, based on the first information 152A received from the first server device 20.
FIG. 12 is a schematic diagram showing an example of a screen in which virtual objects are superimposed on a real image, and shows another form of the race field object 501 shown in FIG. 11. Specifically, FIG. 12 is an example in which the predetermined race is a horse race.
Also in FIG. 12, the region with the dot pattern including the monitor device 1003 is the real image, and the other regions are regions in which virtual objects are displayed. In FIG. 12, a race field object 511, a plurality of moving objects 512, a large monitor object 513, a pond object 514, and a plurality of tree objects 515 are displayed on the display 132 as virtual objects. These objects are also created, for example, based on the first information 152A received from the first server device 20.
The race field object 511, the large monitor object 513, the pond object 514, and the plurality of tree objects 515 are preferably created based on race venue data such as course information of a predetermined racetrack in the real world. The plurality of moving objects 512 are, for example, objects that virtually represent the horses and jockeys running in the horse race.
Next, in step S710 of FIG. 7, the control unit 190 acquires the position information of the boats 402a to 402f at the boat race stadium 401 in the real world. That is, when the race by the boats 402a to 402f starts in the real world, the control unit 190 acquires the position information and the time information of the boats 402a to 402f from the first server device 20. The method of acquiring the position information and the time information is as described with reference to FIG. 4.
In step S720, the control unit 190 performs control so that the moving objects are linked with the position information acquired in step S710. Specifically, using the time information and the position information, the movements of the moving objects 502a to 502f on the race field object 501 are controlled to match the movements of the boats 402a to 402f at the boat race stadium 401.
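One way to realize the linkage of step S720 is to interpolate the received (time, position) samples at the display time and map them onto the race field object. The sketch below reduces the venue-to-object coordinate conversion to a single scale factor, which is an assumption for illustration; the disclosure does not specify the mapping.

```python
# Hypothetical sketch of step S720: driving a moving object from (time, position)
# samples by linear interpolation.
from bisect import bisect_right

def interpolate_position(samples, t, scale=1.0):
    """samples: list of (time, (x, y)) sorted by time; returns scaled (x, y) at t."""
    times = [s[0] for s in samples]
    i = bisect_right(times, t)
    if i == 0:
        x, y = samples[0][1]
    elif i == len(samples):
        x, y = samples[-1][1]
    else:
        (t0, (x0, y0)), (t1, (x1, y1)) = samples[i - 1], samples[i]
        a = (t - t0) / (t1 - t0)
        x, y = x0 + a * (x1 - x0), y0 + a * (y1 - y0)
    return (x * scale, y * scale)

# Example: position of a boat halfway between two samples, scaled to the object.
samples = [(0.0, (0.0, 0.0)), (2.0, (10.0, 4.0))]
print(interpolate_position(samples, 1.0, scale=0.01))  # (0.05, 0.02)
```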
In step S730, the control unit 190 detects the occurrence of a predetermined event in the predetermined race. When a predetermined event occurs in the predetermined race in the real world, for example, the control unit 290 detects the occurrence of the event based on image analysis of the race or information received from the second server device 40, together with the information stored in the event information 252B, and transmits event information indicating that the occurrence of the event has been detected to the user terminal 10. In step S730, the control unit 190 detects the occurrence of the event based on, for example, receiving this event information during the race.
Also, in step S730, the control unit 190 may detect the occurrence of a predetermined event based on, for example, the first information 152A or the user information 153, without relying on the transmission of the event information 252B from the first server device 20 during the race. For example, the control unit 190 may detect the occurrence of a predetermined event based on the position information of the contestants or the like and the time information corresponding to the position information.
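A hedged sketch of this client-side variant of step S730 is given below: events are inferred from consecutive position and time samples. The "final straight" boundary and the speed threshold are invented example values, not values taken from the disclosure.

```python
# Illustrative sketch of detecting events from position and time information.
import math

def detect_events(prev, curr, final_straight_x=900.0, speed_threshold=18.0):
    """prev/curr: dicts with 't' (seconds) and 'pos' (x, y); returns event names."""
    events = []
    # Crossing an assumed boundary is treated as entering the final straight.
    if prev["pos"][0] < final_straight_x <= curr["pos"][0]:
        events.append("final_straight")
    dt = curr["t"] - prev["t"]
    if dt > 0:
        dist = math.dist(prev["pos"], curr["pos"])
        if dist / dt > speed_threshold:
            events.append("high_speed")
    return events

print(detect_events({"t": 10.0, "pos": (890.0, 5.0)},
                    {"t": 11.0, "pos": (905.0, 5.0)}))  # ['final_straight']
```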
In step S740, the control unit 190 registers the display color of the effect image corresponding to the event that has occurred, based on the identification numbers and the color information associated in step S650. For example, in the case of a boat race, the control unit 190 registers the display color of the effect image based on the color registration corresponding to the identification number of each of the moving objects 502a to 502f.
In step S750, the control unit 190 displays the effect image for which the registration of the display color has been completed. When event information has been received from the first server device 20 in step S730 and the event information includes information on the effect display, in step S750 the effect image is displayed, for example, based on that information on the effect display. Alternatively, only information identifying the event, such as an event ID, may be received from the first server device 20; in that case, for example, information on the effect image corresponding to the acquired event ID or the like may be acquired from the event information 152B, and the effect image may be displayed based on that information.
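Steps S740 and S750 can be summarized as building an effect whose display color is inherited from the color registered for the target moving object and rendering it near that object. The Effect structure below is an assumed placeholder, not a disclosed data format.

```python
# Minimal sketch of steps S740-S750: the effect image takes the display color
# of the moving object it decorates.
from dataclasses import dataclass

@dataclass
class Effect:
    kind: str       # e.g. "aura" for a final-straight event
    color: str      # same color as the target moving object
    target_id: int  # identification number of the moving object

def build_effect(event_kind, moving_object):
    # Step S740: register the effect display color from the object's color.
    return Effect(kind=event_kind, color=moving_object["color"],
                  target_id=moving_object["id"])

boat3 = {"id": 3, "color": "red"}
print(build_effect("aura", boat3))  # Effect(kind='aura', color='red', target_id=3)
```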
Hereinafter, the effect display processing in step S750 will be specifically described with reference to FIGS. 13 and 14. FIGS. 13 and 14 are schematic diagrams showing examples of the display screen in step S750.
The effect image 1201 is displayed differently depending on the type of event detected in step S730. In the example of FIG. 13, in addition to virtual objects such as the race field object 501 and the moving objects 502a to 502f, an aura-like effect image 1201 is displayed on the display 132 around the moving object 502c, which is the target of the effect display. This effect image 1201 is added, for example, to the contestant covered by the voting right purchased by the user, in response to the occurrence of a final-straight event. The effect image 1201 displayed around the moving object 502c is registered in step S740 with red, the same color as the display color of the moving object 502c, and is displayed colored red in step S750.
The addition of the effect image 1201 is not limited to the above example; for example, the effect image may be added in response to detection of the occurrence of an event other than the final-straight event (for example, detection of increased acceleration, or of a speed higher than that of other contestants). Further, the effect display may be added not to the moving object 502 but to another virtual object such as the race field object 501. For example, as shown in FIG. 14, a movement trajectory image 1301 showing the path traveled by the event-target moving object 502c among the moving objects 502a to 502f may be added to the race field object 501 as an effect display. Like the effect image 1201 described above, the movement trajectory image 1301 is preferably registered in step S740 with red, the same color as the display color of the target moving object 502c, and displayed colored red in step S750. In the example of FIG. 14, the moving objects 502d, 502e, and 502f among the moving objects 502a to 502f are omitted for simplicity of illustration.
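The movement trajectory effect of FIG. 14 can be sketched as accumulating the positions of the event-target moving object and drawing them in its registered color over the race field object. The data layout below is an assumption for illustration only.

```python
# Sketch of the movement trajectory effect: a polyline in the object's color.
from dataclasses import dataclass, field

@dataclass
class TrajectoryEffect:
    color: str
    points: list = field(default_factory=list)

    def add_point(self, position):
        # Append the latest interpolated position of the target moving object.
        self.points.append(position)

    def as_polyline(self):
        # A renderer would draw this polyline over the race field object.
        return {"color": self.color, "points": list(self.points)}

trail = TrajectoryEffect(color="red")  # same color as moving object 502c
for p in [(0.0, 0.0), (0.05, 0.02), (0.1, 0.05)]:
    trail.add_point(p)
print(trail.as_polyline())
```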
In the examples of FIGS. 13 and 14, the effect image 1201 and the movement trajectory image 1301 are displayed only when an event occurs. However, for example, an effect image may be displayed at all times near the moving object of the contestant covered by the voting right purchased by the user. That is, in step S730, the occurrence of an event may be detected based on the user having purchased a voting right for a predetermined contestant. An example of an effect image that is always displayed is a crown-shaped two-dimensional image displayed above the moving object 502 covered by the voting right purchased by the user. As long as the effect image is arranged near the corresponding moving object, its position and shape are not limited to the illustrated examples, and it may be a three-dimensional image.
The association between the moving object 502 and the effect image is not limited to the above example; for example, an effect image may be added to a contestant not covered by the voting right purchased by the user. In this case, the association between the moving object 502 and the effect image may be set in advance by an administrator of the first server device 20 or the like, or may be set by the user of the user terminal 10. For example, the user of the user terminal 10 may have an effect image displayed for a moving object the user has set as a "favorite".
The effect image displayed for the moving object 502 may also be configured so that the user can select it from a plurality of objects set as selection candidates. The types of objects available as selection candidates may be increased according to the user's payments or the like.
The processing of steps S710 to S750 is repeated at least from the start to the end of the race in the real world, but may also be repeated before the start and after the end of the real-world race.
When the race to be displayed as a virtual race is a past race, the position information and the like of the boats 402a to 402f from the start to the end of the race may be acquired collectively before the start of the virtual race. It is also possible to display the virtual race by acquiring only the race time information without acquiring the position information and the like. The race displayed as a virtual race may be an exhibition race.
After the end of the race in the real world, in step S760, the control unit 190 executes result display processing and then ends the series of display control processes in response to, for example, receiving an operation input for terminating the application program 151.
The above embodiment is merely an example for facilitating understanding of the present invention and is not intended to limit the interpretation of the present invention. It goes without saying that the present invention can be modified and improved without departing from its spirit, and that the present invention includes equivalents thereof.
[Additional Notes]
The contents of the present disclosure are listed below.
(Item 1)
A program executed by a first computer including a processor and an imaging unit, the program causing the processor to execute:
a step of receiving, from a second computer, first information about a predetermined race in the real world;
a step of generating, based on the first information, virtual objects including a race field object representing a race field and a moving object representing a contestant or a moving body of the predetermined race; and
a step of displaying the virtual objects superimposed on a real image of the surroundings of the first computer captured by the imaging unit,
wherein the first information includes identification information of the contestant or the moving body and color information associated with the identification information, and
the generating step includes generating the moving object at least partially colored with a predetermined color based on the color information.
This makes it easier to identify the moving object corresponding to each contestant or moving body of the race when displaying a real-world race as a virtual race using virtual objects linked to the real-world race, improving the visibility of the virtual race. As a result, the satisfaction and convenience of users, particularly users who are new to watching races, can be improved.
(Item 2)
The program according to Item 1, wherein the correspondence between the identification information and the color information is changed based on the designated color corresponding to the frame number of the predetermined race.
This allows the color coding of the real-world race to be reproduced in the virtual race, further improving user satisfaction.
(Item 3)
The program according to Item 2, wherein the designated color is changed according to the attribute of the predetermined race.
Since the colors assigned to frame numbers differ for each type of real-world race, changing the color registration of the moving objects in the virtual race according to the race type improves the visibility of the virtual race.
(Item 4)
The program according to any one of Items 1 to 3, wherein the first information further includes color position information for designating positions of color registration on the moving object, and
the color position information is changed according to at least one of the shape of the moving object and the attribute of the predetermined race.
Since the shape of the moving object also changes according to the type of real-world race, performing color registration of the moving object according to the race further improves the visibility and appeal of the virtual race.
(Item 5)
The program according to any one of Items 1 to 4, wherein in the displaying step, based on the first information, the moving object is moved on the race field object to virtually display the predetermined race, and an effect display is applied to at least one of the race field object and the moving object, and
at least a part of the effect display is colored with the predetermined color based on the color information.
This allows the target moving object to stand out more during the virtual race, enabling the user to easily grasp the progress of the race. As a result, user convenience can be improved.
(Item 6)
The program according to Item 5, wherein the effect display is a display representing the movement trajectory of the moving object on the race field object.
This further improves the visibility of the virtual race.
(Item 7)
An information processing method executed by a first computer including a processor and an imaging unit, the information processing method causing the processor to execute:
a step of receiving, from a second computer, first information about a predetermined race in the real world;
a step of generating, based on the first information, virtual objects including a race field object representing a race field and a moving object representing a contestant or a moving body of the predetermined race; and
a step of displaying the virtual objects superimposed on a real image of the surroundings of the first computer captured by the imaging unit,
wherein the first information includes identification information of the moving object and color information associated with the identification information, and
the generating step includes generating the moving object at least partially colored with a predetermined color based on the color information.
This makes it easier to identify the moving object corresponding to each contestant or moving body of the race when displaying a real-world race as a virtual race using virtual objects linked to the real-world race, improving the visibility of the virtual race. As a result, the satisfaction and convenience of users, particularly users who are new to watching races, can be improved.
(Item 8)
An information processing device including a processor and an imaging unit, wherein the processor:
receives, from a second computer, first information about a predetermined race in the real world;
generates, based on the first information, virtual objects including a race field object representing a race field and a moving object representing a contestant or a moving body of the predetermined race; and
displays the virtual objects superimposed on a real image of the surroundings of the information processing device captured by the imaging unit,
wherein the first information includes identification information of the moving object and color information associated with the identification information, and
generating the virtual objects includes generating the moving object at least partially colored with a predetermined color based on the color information.
This makes it easier to identify the moving object corresponding to each contestant or moving body of the race when displaying a real-world race as a virtual race using virtual objects linked to the real-world race, improving the visibility of the virtual race. As a result, the satisfaction and convenience of users, particularly users who are new to watching races, can be improved.
(Item 9)
A system including a first computer including a first processor and an imaging device, and a second computer including a second processor and communicably connectable to the first computer, wherein
the second processor acquires first information about a predetermined race in the real world and transmits the first information to the first computer, and
the first processor:
receives the first information from the second computer;
generates, based on the first information, virtual objects including a race field object representing a race field and a moving object representing a contestant or a moving body of the predetermined race; and
displays the virtual objects superimposed on a real image of the surroundings of the first computer captured by the imaging device,
wherein the first information includes identification information of the moving object and color information associated with the identification information, and
generating the virtual objects includes generating the moving object at least partially colored with a predetermined color based on the color information.
This makes it easier to identify the moving object corresponding to each contestant or moving body of the race when displaying a real-world race as a virtual race using virtual objects linked to the real-world race, improving the visibility of the virtual race. As a result, the satisfaction and convenience of users, particularly users who are new to watching races, can be improved.
1: system, 10: user terminal, 20: first server device, 30: network, 40: second server device, 130: touch screen, 150: storage unit (of the user terminal), 190: control unit (of the user terminal), 191: operation input reception unit, 192: transmission/reception unit, 193: race attribute selection unit, 194: object generation unit, 195: identification number assignment unit, 196: color information registration unit, 197: display control unit, 198A: flat surface detection unit, 198B: event detection unit, 199: selection information acquisition unit, 250: storage unit (of the first server device), 290: control unit (of the first server device), 401: boat race stadium, 402a to 402f: boats, 501: race field object, 502 (502a to 502f): moving objects, 1201: effect image, 1301: movement trajectory image
 

Claims (9)

  1. A program executed by a first computer including a processor and an imaging unit, the program causing the processor to execute:
    a step of receiving, from a second computer, first information about a predetermined race in the real world;
    a step of generating, based on the first information, virtual objects including a race field object representing a race field and a moving object representing a contestant or a moving body of the predetermined race; and
    a step of displaying the virtual objects superimposed on a real image of the surroundings of the first computer captured by the imaging unit,
    wherein the first information includes identification information of the contestant or the moving body and color information associated with the identification information, and
    the generating step includes generating the moving object at least partially colored with a predetermined color based on the color information.
  2. The program according to claim 1, wherein the correspondence between the identification information and the color information is changed based on the designated color corresponding to the frame number of the predetermined race.
  3. The program according to claim 2, wherein the designated color is changed according to the attribute of the predetermined race.
  4. The program according to any one of claims 1 to 3, wherein the first information further includes color position information for designating positions of color registration on the moving object, and
    the color position information is changed according to at least one of the shape of the moving object and the attribute of the predetermined race.
  5. The program according to any one of claims 1 to 4, wherein in the displaying step, based on the first information, the moving object is moved on the race field object to virtually display the predetermined race, and an effect display is applied to at least one of the race field object and the moving object, and
    at least a part of the effect display is colored with the predetermined color based on the color information.
  6. The program according to claim 5, wherein the effect display is a display representing the movement trajectory of the moving object on the race field object.
  7. An information processing method executed by a first computer including a processor and an imaging unit, the information processing method causing the processor to execute:
    a step of receiving, from a second computer, first information about a predetermined race in the real world;
    a step of generating, based on the first information, virtual objects including a race field object representing a race field and a moving object representing a contestant or a moving body of the predetermined race; and
    a step of displaying the virtual objects superimposed on a real image of the surroundings of the first computer captured by the imaging unit,
    wherein the first information includes identification information of the moving object and color information associated with the identification information, and
    the generating step includes generating the moving object at least partially colored with a predetermined color based on the color information.
  8. An information processing device including a processor and an imaging unit, wherein the processor:
    receives, from a second computer, first information about a predetermined race in the real world;
    generates, based on the first information, virtual objects including a race field object representing a race field and a moving object representing a contestant or a moving body of the predetermined race; and
    displays the virtual objects superimposed on a real image of the surroundings of the information processing device captured by the imaging unit,
    wherein the first information includes identification information of the moving object and color information associated with the identification information, and
    generating the virtual objects includes generating the moving object at least partially colored with a predetermined color based on the color information.
  9. A system including a first computer including a first processor and an imaging device, and a second computer including a second processor and communicably connectable to the first computer, wherein
    the second processor acquires first information about a predetermined race in the real world and transmits the first information to the first computer, and
    the first processor:
    receives the first information from the second computer;
    generates, based on the first information, virtual objects including a race field object representing a race field and a moving object representing a contestant or a moving body of the predetermined race; and
    displays the virtual objects superimposed on a real image of the surroundings of the first computer captured by the imaging device,
    wherein the first information includes identification information of the moving object and color information associated with the identification information, and
    generating the virtual objects includes generating the moving object at least partially colored with a predetermined color based on the color information.
PCT/JP2021/041114 2020-11-20 2021-11-09 Program, information processing method, information processing device, and system WO2022107639A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-193639 2020-11-20
JP2020193639A JP6896932B1 (en) 2020-11-20 2020-11-20 Programs, information processing methods, information processing devices, and systems

Publications (1)

Publication Number Publication Date
WO2022107639A1 true WO2022107639A1 (en) 2022-05-27

Family

ID=76540450

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/041114 WO2022107639A1 (en) 2020-11-20 2021-11-09 Program, information processing method, information processing device, and system

Country Status (2)

Country Link
JP (1) JP6896932B1 (en)
WO (1) WO2022107639A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001251671A (en) * 2000-03-03 2001-09-14 Matsushita Electric Ind Co Ltd Mutual position information notice system and mutual position information notice method
JP2003204481A (en) * 2001-08-16 2003-07-18 Space Tag Inc Image-information distribution system
JP2012157510A (en) * 2011-01-31 2012-08-23 Fujitsu Frontech Ltd Passage position display system, terminal device for the same, and program
JP2019003346A (en) * 2017-06-13 2019-01-10 株式会社Mgrシステム企画 Activity support method and activity support apparatus

Also Published As

Publication number Publication date
JP6896932B1 (en) 2021-06-30
JP2022082211A (en) 2022-06-01

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21894513

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21894513

Country of ref document: EP

Kind code of ref document: A1