WO2022118652A1 - Program, information processing method, information processing device, and system - Google Patents

Program, information processing method, information processing device, and system

Info

Publication number
WO2022118652A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
race
user
predetermined
contestant
Prior art date
Application number
PCT/JP2021/042180
Other languages
French (fr)
Japanese (ja)
Inventor
功淳 馬場 (Naruatsu Baba)
Original Assignee
株式会社コロプラ (COLOPL, Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社コロプラ (COLOPL, Inc.)
Publication of WO2022118652A1 publication Critical patent/WO2022118652A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics

Definitions

  • the present disclosure relates to programs, information processing methods, information processing devices, and systems.
  • Patent Documents 1 and 2 disclose techniques related to AR (Augmented Reality).
  • An object of one aspect of the present disclosure is to provide a program, information processing method, information processing device, and system capable of improving recognition of the contestants of a real-world race corresponding to a virtual race, through the virtual race displayed on a computer.
  • The program causes the processor to execute a step of superimposing the virtual object on the real image around the first computer captured by the imaging unit and displaying the result.
  • The first information includes at least contestant information regarding at least one of a contestant and a moving body in the predetermined race.
  • The virtual object includes at least a moving object corresponding to the contestant or the moving body.
  • The display step comprises displaying at least a portion of the contestant information in association with a predetermined avatar corresponding to the moving object, in response to the user's first operation input on that moving object.
  • Such a program is provided.
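The display step described above (a first operation input, such as a tap on a moving object, causing part of the contestant information to appear with that object's avatar) can be sketched as a simple screen-space hit test. This is only an illustrative sketch: the names `MovingObject`, `find_tapped_object`, and `contestant_label`, and the circular hit area, are assumptions, not the disclosed implementation.

```python
from dataclasses import dataclass

@dataclass
class MovingObject:
    """One virtual moving object shown on screen (hypothetical model)."""
    contestant_id: int
    x: float          # screen position of the object
    y: float
    hit_radius: float # tap hit-test radius in screen units

def find_tapped_object(objects, tap_x, tap_y):
    """Return the moving object whose hit area contains the tap, or None."""
    for obj in objects:
        if (obj.x - tap_x) ** 2 + (obj.y - tap_y) ** 2 <= obj.hit_radius ** 2:
            return obj
    return None

def contestant_label(contestant_info, obj):
    """Compose the text shown next to the avatar of the tapped object."""
    info = contestant_info[obj.contestant_id]
    return f"{info['name']} (rank {info['rank']})"
```

In a real terminal the tap coordinates would come from the touch panel 131 and the label would be rendered beside the avatar object; here they are plain function arguments.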
  • FIG. 1 is a diagram showing a configuration of a system 1 according to the present embodiment.
  • the system 1 can display, for example, a predetermined race performed in the real world as a virtual race using a virtual object on an information processing device used by a user.
  • The "predetermined race" is not particularly limited as long as it is a race carried out in the real world; examples include boat races (actual races or exhibition races), horse races, bicycle races, auto races, car races such as F1, drone races, dog races, marathons, and ekiden relays.
  • The system 1 includes a plurality of user terminals 10 such as a user terminal 10A, a user terminal 10B, and a user terminal 10C (hereinafter collectively referred to as "user terminal 10"), which are information processing devices (first computers) used by each user, a first server device (second computer) 20, a second server device 40, and a network 30.
  • the user terminal 10A and the user terminal 10B are connected to the network 30 by communicating with the radio base station 31.
  • the user terminal 10C connects to the network 30 by communicating with a wireless router 32 installed in a facility such as a house.
  • the user terminal 10 is, for example, a portable terminal provided with a touch screen, and may be a smartphone, a phablet, a tablet, or the like.
  • the user terminal 10 executes, for example, a program installed via a platform for distributing an application or the like, or a program including pre-installed website browsing software.
  • By executing the above program, the user terminal 10 communicates with the first server device 20 and transmits/receives data related to the predetermined race, data related to the user, and the like to/from the first server device 20, making it possible to display a virtual race on the user terminal 10.
  • the first server device 20 receives data related to a predetermined race from the second server device 40.
  • the first server device 20 appropriately transmits data related to a predetermined race to the user terminal 10.
  • the first server device 20 stores and manages data related to a predetermined race and data related to each user.
  • The first server device 20 includes, as hardware components, a communication IF (Interface) 22, an input/output IF 23, a memory 25, a storage 26, and a processor (second processor) 29, which are connected to each other via a communication bus.
  • the communication IF 22 supports various communication standards such as a LAN (Local Area Network) standard, and functions as an interface for transmitting and receiving data to and from a user terminal 10 and a second server device 40.
  • the input / output IF 23 functions as an interface for accepting input of information to the first server device 20 and outputting information to the outside of the first server device 20.
  • the input / output IF 23 may include an input receiving unit that accepts the connection of an information input device such as a mouse and a keyboard, and an output unit that accepts the connection of an information output device such as a display for displaying an image or the like.
  • the memory 25 is a storage device for storing data or the like used for processing.
  • the memory 25 provides the processor 29 with a work area for temporary use, for example, when the processor 29 performs processing.
  • the memory 25 includes a storage device such as a ROM (Read Only Memory) and a RAM (Random Access Memory).
  • the storage 26 is a storage device for storing various programs and data for the processor 29 to read and execute.
  • the information stored in the storage 26 includes data related to a predetermined race, data related to each user, and the like.
  • the storage 26 may be configured to include a storage device such as an HDD (Hard Disk Drive) or a flash memory.
  • the storage is not limited to the form included in the server device, and a cloud service can also be used.
  • the processor 29 controls the operation of the first server device 20 by reading and executing a program or the like stored in the storage 26.
  • the processor 29 may be configured to include, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), and the like.
  • the second server device 40 stores and manages data related to a predetermined race.
  • The second server device 40 is, for example, a server device managed by the organizer of the predetermined race, or a server device managed by an organization that transmits information about the predetermined race to the outside (such as the publisher of a race magazine, a race video distributor, or a radio distributor).
  • the second server device 40 appropriately transmits data related to a predetermined race to the first server device 20.
  • the second server device 40 may transmit data related to a predetermined race to the user terminal 10.
  • the hardware configuration of the second server device 40 may be the same as that of the first server device 20 as long as there is no contradiction. There may be a plurality of second server devices 40.
  • FIG. 2 is a block diagram showing an example of the functional configuration of the user terminal 10.
  • The user terminal 10 includes an antenna 110, a wireless communication IF 120, a touch screen 130, an input/output IF 140, a storage unit 150, a voice processing unit 160, a microphone 161, a speaker 162, an image pickup unit 170, and a control unit (first processor) 190.
  • the antenna 110 radiates a signal emitted by the user terminal 10 into space as a radio wave. Further, the antenna 110 receives radio waves from the space and gives a received signal to the wireless communication IF 120.
  • Because the user terminal 10 communicates with other communication devices, the wireless communication IF 120 performs modulation/demodulation processing for transmitting and receiving signals via the antenna 110 and the like.
  • the wireless communication IF 120 is a communication module for wireless communication including a tuner, a high frequency circuit, etc., performs modulation / demodulation and frequency conversion of the wireless signal transmitted / received by the user terminal 10, and supplies the received signal to the control unit 190.
  • the touch screen 130 receives input from the user and outputs information to the user on the display 132.
  • the touch screen 130 includes a touch panel 131 for receiving user operation input and a display 132.
  • The touch panel 131 detects the approach of the user's finger or the like using, for example, a capacitive touch panel.
  • the display 132 is realized by, for example, an LCD (Liquid Crystal Display), an organic EL (electroluminescence), or other display device.
  • the input / output IF 140 functions as an interface for accepting input of information to the user terminal 10 and outputting information to the outside of the user terminal 10.
  • the storage unit 150 is composed of a flash memory, an HDD, or the like, and stores a program used by the user terminal 10 and various data received by the user terminal 10 from the first server device 20 or the like.
  • the voice processing unit 160 performs modulation / demodulation of the voice signal.
  • the voice processing unit 160 modulates the signal given from the microphone 161 and gives the modulated signal to the control unit 190. Further, the voice processing unit 160 gives a voice signal to the speaker 162.
  • the voice processing unit 160 is realized by, for example, a processor for voice processing.
  • the microphone 161 functions as an audio input unit for receiving an input of an audio signal and outputting it to the control unit 190.
  • the speaker 162 functions as an audio output unit for outputting an audio signal to the outside of the user terminal 10.
  • the image pickup unit 170 is a camera that captures a real image around the user terminal 10.
  • the image captured by the image pickup unit 170 is image-processed by the control unit 190 and output to the display 132.
  • the control unit 190 controls the operation of the user terminal 10 by reading and executing a program stored in the storage unit 150.
  • the control unit 190 is realized by, for example, an application processor.
  • the storage unit 150 stores the application program 151, the application information 152, and the user information 153.
  • the user terminal 10 downloads the application program 151 from the first server device 20 and stores it in the storage unit 150. Further, the user terminal 10 communicates with the first server device 20 to send and receive various data such as application information 152 and user information 153 to and from the first server device 20.
  • the application program 151 is a program for displaying a virtual race on the user terminal 10.
  • the application information 152 includes various data referred to by the application program 151.
  • the application information 152 includes the first information 152A, the event information 152B, and the avatar information 152C.
  • the first information 152A is information regarding a predetermined race transmitted from the first server device 20 and the second server device 40.
  • The first information 152A includes, for example, race time information indicating the race times of the contestants or moving bodies (hereinafter also referred to as "contestants, etc.") of the predetermined race, position information of the contestants, etc. during the predetermined race, and time information corresponding to the position information.
  • The term "contestant" is a concept that includes not only humans but also animals such as horses and dogs.
  • The "moving body" is the main body of movement in the predetermined race, such as an animal or machine on which a contestant rides, or a machine remotely operated by a contestant. In marathons, dog races, and the like, the "contestant" and the "moving body" are one and the same.
  • The first information 152A may also include, for example, the name of the predetermined race, its date and time, race field data, contestant data, moving body data, odds information, race predictions, race start tables, pit reports containing information from immediately before the race, race results, race videos, race still images, past race information, and other information that may be published in information magazines or on information sites about the predetermined race.
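As a rough illustration only, the first information could be modeled as a record holding race time information together with time-stamped position samples. The class and field names below (`FirstInformation`, `PositionSample`) are assumptions made for this sketch, not structures defined in the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class PositionSample:
    """One position of a contestant, etc. plus its corresponding time information."""
    contestant_id: int
    t: float   # time information corresponding to the position (seconds from start)
    x: float   # position of the contestant, etc. on the course
    y: float

@dataclass
class FirstInformation:
    """Sketch of first information 152A as received by the user terminal."""
    race_name: str
    race_times: dict                               # contestant_id -> race time (s)
    positions: list = field(default_factory=list)  # PositionSample records
    odds: dict = field(default_factory=dict)       # optional odds information
```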
  • The event information 152B is information about events that can occur, or have occurred, in the predetermined race, and includes, for example, information about the effect display and the effect sound corresponding to each event.
  • The event is not particularly limited, but may be, for example, one or more of: the first contestant, etc. reaching the goal; the ranking of the predetermined race being confirmed; a change in the ranking of the contestants, etc.; the difference in position between two or more contestants, etc. falling below a predetermined threshold at a predetermined timing (two or more contestants, etc. in a close battle); and the occurrence of a rule violation.
  • the event information 152B is information received from the first server device 20 after an event occurs in a predetermined race.
  • the event information 152B includes, for example, information about a two-dimensional image or a three-dimensional image corresponding to a predetermined event.
  • the event information 152B may include information about a text image for notifying the content of the event, a predetermined character image, an effect image added to a virtual object described later, and the like.
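A minimal sketch of how event information might associate event types with effect displays and effect sounds. The event names and asset file names here are invented for illustration and do not appear in the disclosure.

```python
# Hypothetical event-to-effect lookup, loosely mirroring event information 152B.
EVENT_EFFECTS = {
    "goal_reached":   {"image": "goal_banner.png", "sound": "fanfare.wav"},
    "ranking_change": {"image": "swap_arrow.png",  "sound": "whoosh.wav"},
    "close_battle":   {"image": "sparks.png",      "sound": "crowd.wav"},
}

def effect_for(event_type):
    """Return the effect display/sound entry for an event, or None if unknown."""
    return EVENT_EFFECTS.get(event_type)
```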
  • the avatar information 152C includes information about an object that becomes a user's avatar (hereinafter, referred to as an avatar object). Specifically, the avatar information 152C may include information such as gender, type (occupation), hairstyle, skin color, and action of an avatar object (an example of an avatar). The avatar information 152C can be edited by the user via the avatar editing screen described later.
  • the user information 153 includes information about the user of the user terminal 10.
  • The user information 153 may include, for example, information for identifying the user, position information of the user terminal 10, and the user's race purchase history and hit rate (for example, in the case of boat racing, the history of boat tickets the user has purchased and the hit rate of those purchased tickets).
  • By reading and executing the application program 151, the control unit 190 exhibits the functions of an operation input reception unit 191, a transmission/reception unit 192, an object generation unit 193, a display control unit 194, a flat surface detection unit 195, an event detection unit 196, and a selection information acquisition unit 197.
  • the operation input receiving unit 191 accepts the user's operation input based on the output of the touch screen 130. Specifically, the operation input receiving unit 191 detects that the user's finger or the like touches or approaches the touch panel 131 as the coordinates of the coordinate system consisting of the horizontal axis and the vertical axis of the surface constituting the touch screen 130.
  • the operation input receiving unit 191 determines the user's operation on the touch screen 130.
  • The operation input reception unit 191 determines user operations such as an "approach operation", "release operation", "tap operation", "double tap operation", "long press operation (long touch operation)", "drag operation (swipe operation)", "move operation", "flick operation", "pinch-in operation", and "pinch-out operation".
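As a toy illustration of how some of these operations might be distinguished, a single-touch gesture can be classified from its start/end coordinates and duration. The thresholds below are arbitrary assumptions, not values from the disclosure.

```python
def classify_gesture(down_pos, up_pos, duration_s,
                     move_threshold=10.0, long_press_s=0.5, flick_speed=300.0):
    """Very rough single-touch gesture classifier (illustrative thresholds only).

    down_pos/up_pos are (x, y) screen coordinates; duration_s is the time the
    finger stayed on the touch panel."""
    dx = up_pos[0] - down_pos[0]
    dy = up_pos[1] - down_pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist < move_threshold:
        # Finger barely moved: tap vs. long press depends on duration.
        return "long press" if duration_s >= long_press_s else "tap"
    # Finger moved: drag vs. flick depends on speed.
    speed = dist / duration_s if duration_s > 0 else float("inf")
    return "flick" if speed >= flick_speed else "drag"
```

Pinch operations would additionally require tracking two touch points; they are omitted to keep the sketch short.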
  • the operation input receiving unit 191 may accept the movement of the user terminal 10 detected by the acceleration sensor or the like mounted on the user terminal 10 as an operation input.
  • the transmission / reception unit 192 transmits and receives various information to and from external communication devices such as the first server device 20 and the second server device 40 via the wireless communication IF 120 and the network 30.
  • The transmission/reception unit 192 receives, for example, the first information 152A from the first server device 20 or the second server device 40. Further, the transmission/reception unit 192 transmits, for example, information corresponding to the operation input received by the operation input reception unit 191 and information stored in the user information 153 to the first server device 20 or the second server device 40.
  • the object generation unit 193 generates a virtual object for presenting information about a predetermined race to the user based on the first information.
  • the object generation unit 193 generates, as virtual objects, a race field object representing a race field and a moving object representing a contestant or the like.
  • the object generation unit 193 may generate a virtual display board for displaying the second information generated based on the first information as text.
  • The object generation unit 193 may also generate landscape objects that constitute scenery, such as a virtual screen for displaying video of the second information generated based on the first information, various building objects, and trees.
  • the object generation unit 193 may generate an avatar object based on the avatar information 152C.
  • the object generation unit 193 may generate, for example, a three-dimensional model of a cartoon or animation character, a mascot character of various sports competitions, a celebrity including a celebrity, or the like as an avatar object.
  • The avatar object is not limited to a three-dimensional model and may be a two-dimensional image.
  • The object generation unit 193 may replace the portion of a moving object that represents the contestant with the generated avatar object.
  • The display control unit 194 displays on the display 132 an image (hereinafter also referred to as a "superimposed image") in which the virtual objects generated by the object generation unit 193 are superimposed on the real image around the user terminal 10 captured by the image pickup unit 170.
  • The display control unit 194 moves each moving object on the race field object based on the race time information included in the first information 152A, and displays on the display 132 a virtual race that virtually reproduces the predetermined race. It is preferable that the display control unit 194 reproduces the virtual race based not only on the race time information but also on the position information of the contestants, etc. included in the first information 152A and the time information corresponding to that position information.
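Reproducing the race from position information and its corresponding time information amounts to interpolating between time-stamped samples during playback. The following is a minimal sketch under that assumption; the function name and the `(time, x, y)` tuple layout are illustrative, not the disclosed format.

```python
import bisect

def position_at(samples, t):
    """Linearly interpolate a contestant's position at playback time t.

    `samples` is a list of (time, x, y) tuples sorted by time, taken from the
    position information and corresponding time information."""
    times = [s[0] for s in samples]
    i = bisect.bisect_right(times, t)
    if i == 0:
        return samples[0][1], samples[0][2]    # before the first sample
    if i == len(samples):
        return samples[-1][1], samples[-1][2]  # after the last sample
    t0, x0, y0 = samples[i - 1]
    t1, x1, y1 = samples[i]
    a = (t - t0) / (t1 - t0)
    return x0 + a * (x1 - x0), y0 + a * (y1 - y0)
```

A display loop would call this once per frame for each moving object and place the object at the returned coordinates on the race field object.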
  • the display control unit 194 can change the viewpoint in the superimposed image according to the operation input received by the operation input reception unit 191.
  • the display control unit 194 displays various menu screens and GUIs (Graphical User Interfaces) on the display 132, and changes the display contents of the display 132, in response to the operation input received by the operation input reception unit 191.
  • the display control unit 194 displays an effect corresponding to the event in response to the event detection unit 196 detecting the occurrence of the event.
  • The content of the effect display is not particularly limited, but may be, for example, one or more of: display of a predetermined avatar object; an effect display applied to at least one of the race field object and the moving objects; display of a two-dimensional or three-dimensional image corresponding to a predetermined event; and animating these images according to the progress of the race.
  • the display control unit 194 may simultaneously execute two or more effect displays corresponding to different events.
  • the flat surface detection unit 195 detects a flat surface in the real image captured by the image pickup unit 170.
  • the detection of a flat surface is realized by a conventionally known image recognition technique. For example, when the user performs an operation of selecting a flat surface detected by the flat surface detection unit 195, a superimposed image in which the race field object is arranged on the flat surface is displayed on the display 132.
  • the flat surface is preferably a horizontal surface.
  • The angle formed between the flat surface and the bottom face of the race field object may be 0 degrees, but is preferably an acute angle, for example in the range of 15 to 45 degrees. This angle may be adjusted by accepting a user operation. Further, even if part of the flat surface in the real world has a protrusion, or an object rests on the flat surface, the location may still be detected as a flat surface on which the race field object can be placed, provided the protrusion or object is small enough to be hidden by the race field object.
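The two placement rules above (a preferred tilt range of 15 to 45 degrees, and tolerating small protrusions the race field object can hide) can be sketched as two small checks. The function names and the rectangular-footprint simplification are assumptions for illustration.

```python
def placement_angle(requested_deg, min_deg=15.0, max_deg=45.0):
    """Clamp the user-adjusted tilt between the detected flat surface and the
    bottom face of the race field object into the preferred acute range."""
    return max(min_deg, min(max_deg, requested_deg))

def treat_as_flat(field_w, field_d, obstacle_w, obstacle_d):
    """Treat a surface with a small protrusion or object as flat when the race
    field object's footprint is large enough to hide it (rectangular model)."""
    return obstacle_w <= field_w and obstacle_d <= field_d
```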
  • the event detection unit 196 detects the occurrence of a predetermined event in a predetermined race.
  • The event detection unit 196 detects the occurrence of an event based, for example, on the transmission/reception unit 192 receiving the event information transmitted from the first server device 20 when a predetermined event occurs in the predetermined race in the real world.
  • The event detection unit 196 may also detect the occurrence of a predetermined event based on, for example, the first information 152A and the user information 153. Specifically, based on the position information of the contestants, etc. and the time information corresponding to that position information, the event detection unit 196 can detect events that are determinable from the positions of the contestants, etc. during the race, such as a change in their ranking. The event detection unit 196 can also detect events such as a contestant, etc. covered by the user's purchase taking the lead in the race, based on, for example, race result information and the user's race purchase history.
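Two of the detectable events, a close battle (positional difference below a threshold) and the user's purchased contestant taking the lead, can be sketched as follows. The function names, the `contestant_id -> (x, y)` mapping, and the use of straight-line distance are assumptions for this sketch.

```python
def detect_close_battle(positions, threshold):
    """Return pairs of contestants whose straight-line distance is at most
    `threshold`. `positions` maps contestant_id -> (x, y) at one timestamp."""
    ids = sorted(positions)
    pairs = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            ax, ay = positions[a]
            bx, by = positions[b]
            if ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 <= threshold:
                pairs.append((a, b))
    return pairs

def user_pick_leads(ranking, purchased):
    """True if a contestant the user purchased is at the head of the race.
    `ranking` is a list of contestant ids ordered from first place."""
    return ranking[0] in purchased
```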
  • the selection information acquisition unit 197 acquires one or more selection information indicating one or more contestants selected by the user.
  • the "selection information" is, for example, information indicating a contestant or the like predicted by the user as a winner or the like of a predetermined race.
  • the "selection information" may be information indicating a boat ticket or a betting ticket purchased by the user.
  • the selection information acquisition unit 197 acquires selection information based on, for example, a user's operation input on the screen for selecting a contestant or the like.
  • the selection information acquired by the selection information acquisition unit 197 is transmitted to, for example, the first server device 20 or the second server device 40.
  • When the predetermined race is a boat race or a horse race, purchase processing for a boat ticket or betting ticket may be performed in response to the selection information being transmitted to the second server device 40.
  • FIG. 3 is a block diagram showing a functional configuration of the first server device 20. A detailed configuration of the first server device 20 will be described with reference to FIG.
  • the first server device 20 exerts functions as a communication unit 220, a storage unit 250, and a control unit 290 by operating according to a program.
  • the communication unit 220 functions as an interface for the first server device 20 to communicate with an external communication device such as the user terminal 10 or the second server device 40 via the network 30.
  • the storage unit 250 stores various programs and data for realizing the system 1.
  • the storage unit 250 stores the program 251 and the race information 252 and the user information 253.
  • the program 251 is a program for the first server device 20 to communicate with the user terminal 10 and the second server device 40 to realize the system 1.
  • The program 251 is executed by the control unit 290 to cause the first server device 20 to send and receive data to and from the user terminal 10 and the second server device 40, to perform processing according to operations performed by the user of the user terminal 10, and to update the race information 252, the user information 253, and the like.
  • Race information 252 contains various data related to the predetermined race.
  • the race information 252 includes, for example, the first information 252A and the event information 252B.
  • the first information 252A is the source information of the first information 152A, and the first information 152A can be a part of the first information 252A.
  • the first information 252A is, for example, information acquired from the second server device 40.
  • the event information 252B is the source information of the event information 152B, and the event information 152B may be a part of the event information 252B.
  • the event information 252B is, for example, information acquired from the second server device 40.
  • the user information 253 is information about the user of the user terminal 10.
  • the user information 253 includes a user management table 253A.
  • the user management table 253A stores, for example, information for identifying the user, position information of the user terminal 10, a race purchase history of the user, a hit rate, and the like for each user.
  • the control unit 290 is realized by the processor 29, and by executing the program 251, the transmission / reception unit 291, the first information acquisition unit 292A, the event information acquisition unit 292B, the selection information acquisition unit 293, the data management unit 294, and the timekeeping unit 295. Demonstrate the function as.
  • the transmission / reception unit 291 transmits and receives various information to and from an external communication device such as a user terminal 10 and a second server device 40 via the communication unit 220 and the network 30.
  • the transmission / reception unit 291 transmits, for example, at least a part of the first information 252A and the event information 252B to the user terminal 10. Further, the transmission / reception unit 291 receives, for example, the first information 252A and the event information 252B from the second server device 40.
  • the first information acquisition unit 292A acquires the first information 252A from the second server device 40 via the transmission / reception unit 291.
  • the event information acquisition unit 292B acquires the event information 252B from the second server device 40 via the transmission / reception unit 291.
  • the selection information acquisition unit 293 acquires the user's selection information from the user terminal 10 or the second server device 40 via the transmission / reception unit 291.
  • the data management unit 294 performs a process of updating various data stored in the storage unit 250 according to the processing results of the first information acquisition unit 292A, the event information acquisition unit 292B, the selection information acquisition unit 293, and the like.
  • The timekeeping unit 295 performs processing to measure time. Various times displayed on the user terminal 10 (for example, the time until the start of a race) can be controlled based on the time measured by the timekeeping unit 295.
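The countdown display driven by the timekeeping unit can be reduced to one clamped subtraction; the function below is a trivial sketch with an assumed name, shown only to make the data flow concrete.

```python
def seconds_until_start(now_s, start_s):
    """Remaining time until the race start, clamped at zero for display
    on the user terminal (both arguments in seconds on the same clock)."""
    return max(0.0, start_s - now_s)
```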
  • FIG. 4 is a schematic diagram showing an example of a boat race field in the real world.
  • Two turn marks 403 are installed in the boat race field 401, and races are carried out by boats 402a to 402f on which each boat racer rides.
  • the race time information indicating the race times of the boats 402a to 402f is transmitted from the second server device 40 to the first server device 20, and is transmitted from the first server device 20 to the user terminal 10.
  • the boat race field 401 is provided with image pickup devices (cameras) 404a to 404b.
  • the image pickup apparatus 404a captures the boats 402a to 402f in the field of view from above the boat race field 401.
  • the image pickup apparatus 404b keeps the boats 402a to 402f in the field of view from the side of the boat race field 401.
  • the images of the boats 402a to 402f captured by the image pickup devices 404a to 404b are transmitted to the second server device 40.
  • Image analysis is performed on each image, and position information indicating the position of each of the boats 402a to 402f at the shooting time of each image is calculated.
  • the calculated position information and the time information related to the shooting time corresponding to the position information are transmitted to the first server device 20, and are transmitted from the first server device 20 to the user terminal 10.
  • the location information may be calculated by the first server device 20.
  • a position sensor such as a GPS sensor may be installed on the boats 402a to 402f instead of or in addition to the image pickup devices 404a to 404b.
  • the position information of the boats 402a to 402f acquired by the position sensor and the time information indicating the time when the position information was acquired are finally transmitted to the user terminal 10.
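Since position/time samples may originate from both the image-analysis path and optional position sensors such as GPS, the terminal (or a server) would need one time-ordered stream. A minimal sketch, assuming each sample is a `(time, contestant_id, x, y)` tuple and each source stream is already sorted by time:

```python
import heapq

def merge_position_streams(*streams):
    """Merge time-sorted (t, contestant_id, x, y) streams, e.g. one from the
    camera image analysis and one from GPS sensors, into a single sequence
    ordered by time information."""
    return list(heapq.merge(*streams, key=lambda s: s[0]))
```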
  • FIG. 5 is a schematic diagram showing an example of a virtual object displayed on the user terminal 10.
  • A race field object 501, moving objects 502a to 502f, two turn mark objects 503, and a virtual display board 505 are shown.
  • the race field object 501 is an object that virtually displays the boat race field 401.
  • the race field object 501 and the turn mark object 503 are preferably created based on race field data such as course information of the boat race field 401, and preferably have a shape corresponding to the boat race field 401.
  • the moving objects 502a to 502f are objects that virtually display the boats 402a to 402f, respectively, and have a shape imitating a boat.
  • The moving objects 502a to 502f move on the race field object 501 based on the race time information, the position information of the boats 402a to 402f, and the time information corresponding to the position information. That is, the race field object 501 and the moving objects 502a to 502f display the real-world race as a virtual race on the user terminal 10.
  • the virtual display board 505 is an object that displays text information.
  • The virtual display board 505 is, for example, an object that has no corresponding real object in the boat race field 401.
  • the text information displayed on the virtual display board 505 is not particularly limited, and may be, for example, ranking information, odds information, or the like. Further, the text information displayed on the virtual display board 505 may be changeable based on the operation input of the user.
  • FIGS. 6 and 7 are flowcharts showing an example of the display control process. It should be noted that the order of the processes constituting the flowcharts described below may be changed as long as no contradiction or inconsistency occurs in the processing contents. Further, the processing executed by each device may be executed by another device to the extent that no contradiction occurs.
  • The processes shown in FIGS. 6 and 7 are realized by the control unit 190 executing the application program 151 and by the control unit 290 executing the program 251.
  • In step S610, the control unit 190 generates a base avatar that serves as the base of the avatar object.
  • In step S620, the control unit 190 acquires editing information for the avatar object based on the user's operation input.
  • In step S630, the control unit 190 updates the avatar information 152C stored in the storage unit 150 based on the editing information acquired in step S620.
  • FIG. 8 is a diagram showing an example of an avatar editing screen 800 for a user to edit an avatar object based on the generated base avatar.
  • The avatar editing screen 800 shown in FIG. 8 includes a base avatar 801, a gender button 802 for changing the gender of the base avatar 801, and an attribute change column 803 for changing various attributes other than the gender of the base avatar 801.
  • the base avatar 801 is generated in the form of a male, but can be changed to a female by tapping the gender button 802 or the like.
  • In the attribute change column 803, for example, a plurality of options are displayed for each attribute of the avatar object, such as "type (occupation)", "skin color", "winning action", and "losing action".
  • FIG. 9 is a diagram showing a selection table 900 showing options for each attribute.
  • the selection table 900 shown in FIG. 9 may be included in, for example, the avatar information 152C stored in the storage unit 150.
  • The selection table 900 includes, for example, "type", "skin color", "winning action", "losing action", and "vehicle" as attributes, and choices 1 to 4 are registered for each attribute.
  • the control unit 190 displays the attribute change field 803 of the avatar edit screen 800 based on the selection table 900. As a result, the user can select a favorite option from a plurality of options for each attribute.
  • Based on the user's operation input to the gender button 802 and the attribute change column 803, the control unit 190 acquires information such as the gender, type, skin color, and actions of the avatar object as the editing information of the avatar object. For example, the user taps the gender button 802 to select "male" as the gender of the base avatar 801, and taps the attribute change column 803 to select the type "hero", the winning action "guts pose", and the losing action "boat blast". In step S620, the control unit 190 acquires information for changing the avatar object as editing information based on these operation inputs of the user.
  • In step S630, the control unit 190 updates the avatar information 152C so that the avatar object takes the form of a "male" "hero", executes the "guts pose" action when it wins the race, and executes the "boat blast" action when it loses the race.
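Steps S620 and S630 can be modeled as validating the user's choices against the selection table and writing them into the avatar information. A hedged sketch: the option lists below mix examples named in the text ("hero", "guts pose", "boat blast") with invented fillers, and the dict stands in for the avatar information 152C.

```python
SELECTION_TABLE = {
    "type": ["hero", "warrior", "wizard", "racer"],
    "skin color": ["light", "tan", "dark", "green"],
    "winning action": ["guts pose", "jump", "wave", "bow"],
    "losing action": ["boat blast", "cry", "shrug", "kneel"],
}

def edit_avatar(avatar, choices):
    """Apply the user's attribute choices (step S620) and write them into
    the avatar information (step S630); unknown options are rejected."""
    for attribute, option in choices.items():
        if option not in SELECTION_TABLE.get(attribute, []):
            raise ValueError(f"{option!r} is not an option for {attribute!r}")
        avatar[attribute] = option
    return avatar
```

The same structure extends naturally to registering several avatar objects, as the text notes is possible.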
  • In the above description, the process of editing and registering a single avatar object is described in steps S620 and S630, but the present invention is not limited to this example.
  • In these steps, it is also possible to edit and register a plurality of avatar objects.
  • The options for editing the avatar object are not limited to those displayed on the avatar editing screen 800 shown in FIG. 8 and in the selection table 900 shown in FIG. 9; for example, the user may be able to capture a favorite image (such as an image of the user himself or herself) and edit the avatar object based on the captured image.
  • In step S640, the control unit 190 acquires information regarding the contestants or moving bodies of a predetermined race (hereinafter referred to as contestant information) from the first server device 20.
  • the process of step S640 may be executed at a predetermined frequency or when a specific event occurs. That is, the contestant information can be updated every time the process of step S640 is executed.
  • The specific event is not particularly limited, but examples include when the contestant information is changed (for example, when the odds change), when the race starts or ends, and when the ranking is confirmed.
  • In step S650, the control unit 190 registers the avatar information in association with the acquired contestant information. That is, the control unit 190 registers the contestant information, included in the first information 152A acquired from the first server device 20 and stored in the storage unit 150, in association with the avatar information 152C.
  • The association between the contestant information and the avatar information 152C may be performed, for example, based on selection information acquired through the user's operation input on a screen for selecting a contestant or the like. That is, the control unit 190 may associate the contestant information of the contestant for whom the user purchased a boat ticket or betting ticket with the avatar information 152C.
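The association of step S650 is essentially a keyed registration of (contestant information, avatar information) pairs. A toy model; the dict stands in for the storage unit 150, and the "id" key is an assumption for illustration:

```python
def register_association(store, contestant, avatar):
    """Register contestant information in association with avatar
    information (step S650); `store` models the avatar information 152C."""
    store[contestant["id"]] = {"contestant": contestant, "avatar": avatar}
    return store
```

Later lookups (for example, when the user taps a moving object in step S730) can then retrieve both records with a single key.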
  • In step S660, the control unit 190 activates the image pickup unit 170, which is a camera.
  • the image pickup unit 170 captures a real image around the user terminal 10.
  • In step S670, the control unit 190 detects a flat surface in the image captured by the image pickup unit 170.
  • In step S680, the control unit 190 arranges the virtual objects on the detected flat surface.
  • FIG. 10 is a schematic diagram showing an example of a real image captured by the imaging unit 170.
  • a keyboard 1002 and a monitoring device 1003 are placed on a flat desk 1001.
  • When the image pickup unit 170 is activated, the real image captured by the image pickup unit 170 is displayed on the display 132.
  • In step S670, the control unit 190 detects a flat surface in the image captured by the image pickup unit 170, that is, in the image displayed on the display 132.
  • In the example of FIG. 10, the control unit 190 detects the area 1004 as a flat surface.
  • the position of the region 1004 can also be changed.
  • the area 1004 is displayed on the display 132 so as to be distinguishable from other parts, for example, by adding a predetermined color.
  • the control unit 190 arranges virtual objects such as the race field object 501 and the moving objects 502a to 502f on the area 1004.
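Production AR frameworks (ARKit, ARCore, and the like) provide plane detection directly, so the following is only a toy stand-in for steps S670/S680: given a grid of estimated surface heights, it returns the cells belonging to the dominant flat level, which would correspond to the area 1004 on which the virtual objects are anchored. The height-map input and tolerance are assumptions for illustration.

```python
from collections import Counter

def find_flat_region(height_map, tolerance=0.01):
    """Return the grid cells whose estimated height matches the dominant
    (most common) height level, i.e. the largest flat candidate area."""
    rounded = {(r, c): round(h / tolerance) * tolerance
               for r, row in enumerate(height_map) for c, h in enumerate(row)}
    dominant, _ = Counter(rounded.values()).most_common(1)[0]
    return sorted(cell for cell, h in rounded.items() if h == dominant)
```

In FIG. 10's terms, the desk surface would dominate the height histogram, and the keyboard and monitor cells would fall outside the returned region.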
  • FIG. 11 is a schematic diagram showing an example of a screen in which a virtual object is superimposed on a real image and displayed.
  • the area with the dot pattern including the monitor device 1003 is the real image, and the other area is the area where the virtual object is displayed.
  • an advertisement image may be displayed in an area where the virtual object is not displayed.
  • As virtual objects, a race field object 501, a plurality of moving objects 502, two turn mark objects 503, a large monitor object 506, building objects 507a and 507b, and a large number of other objects to which reference numerals are not attached (tree objects, clock objects, etc.) are displayed. These objects are created, for example, based on the first information 152A received from the first server device 20.
  • FIG. 12 is a schematic view showing an example of a screen in which a virtual object is superimposed on a real image, and shows another aspect of the race field object 501 shown in FIG. Specifically, FIG. 12 is an example in which a predetermined race is horse racing.
  • the area with the dot pattern including the monitoring device 1003 is the real image, and the other areas are the areas where the virtual objects are displayed.
  • virtual objects a race field object 511, a plurality of moving objects 512, a large monitor object 513, a pond object 514, and a plurality of tree objects 515 are displayed on the display 132. These objects are also created, for example, based on the first information 152A received from the first server device 20.
  • the racetrack object 511, the large monitor object 513, the pond object 514, and the plurality of tree objects 515 are preferably created based on racetrack data such as, for example, course information of a predetermined racetrack in the real world.
  • the plurality of moving objects 512 are, for example, objects that virtually display horses and jockeys running in horse racing.
  • In step S690, the control unit 190 associates the contestant information with the plurality of moving objects 502.
  • the avatar information 152C is associated with the moving object 502 together with the contestant information and registered in the storage unit 150.
  • In step S710 of FIG. 7, the control unit 190 acquires the position information of the boats 402a to 402f in the boat race field 401 in the real world. That is, when the race by the boats 402a to 402f is started in the real world, the control unit 190 acquires the position information and the time information of the boats 402a to 402f from the first server device 20.
  • the method of acquiring the position information and the time information is as described with reference to FIG.
  • In step S720, the control unit 190 performs control to link the position information acquired in step S710 with the moving objects. Specifically, using the time information and the position information, the movements of the moving objects 502a to 502f on the race field object 501 are controlled so as to match those of the boats 402a to 402f on the boat race field 401.
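Step S720 then reduces to applying, for each boat, the newest position record to its moving object on every update. A minimal sketch; the tuple format and the dict-based scene model are assumptions, not the patent's data structures:

```python
def update_moving_objects(objects, records):
    """Keep only the newest record per boat and apply it to the scene.
    `records` holds (boat_id, x, y, t) tuples; `objects` maps each boat_id
    to a dict modeling the corresponding moving object 502."""
    latest = {}
    for boat_id, x, y, t in records:
        if boat_id not in latest or t > latest[boat_id][2]:
            latest[boat_id] = (x, y, t)
    for boat_id, (x, y, t) in latest.items():
        obj = objects.setdefault(boat_id, {})
        obj["pos"] = (x, y)
        obj["updated_at"] = t
    return objects
```

Interpolating between successive records (rather than snapping to the newest one) would give smoother motion at the cost of a small display delay.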
  • In step S730, when the control unit 190 accepts the user's selection operation for a specific moving object among the plurality of moving objects 502a to 502f, the process of step S740 is executed.
  • In step S740, the control unit 190 displays detailed contestant information corresponding to the moving object 502 selected by the user in step S730, together with the avatar information.
  • In step S750, the detailed display of this information is ended.
  • the control unit 190 ends a series of display control processes in response to receiving an operation input for terminating the application program 151.
  • FIG. 13 is a schematic diagram showing an example of a method of displaying detailed contestant information together with avatar information.
  • FIG. 13 is a screen displayed on the display 132 when, for example, the user performs an operation (for example, a tap operation) to select the moving object 502a in the state of FIG. 11.
  • FIG. 13 is a schematic diagram showing an example of a screen in a state where detailed information is displayed.
  • In FIG. 13, moving objects 502a and 502b are displayed on the race field object 501. Further, the detailed information object 1310 is displayed around the moving object 502a. When the moving object 502a moves, the detailed information object 1310 may also move so as to follow it.
  • the detailed information object 1310 includes a name field 1320, an avatar display field 1330, a chart display field 1340, and an end button 1350.
  • In the name field 1320, the name of the racer boarding the moving object 502a is displayed.
  • In the avatar display field 1330, for example, an image of the avatar object registered in association with the racer (contestant information) boarding the moving object 502a is displayed.
  • the image of the avatar object displayed in the avatar display field 1330 may be a two-dimensional image or a three-dimensional image.
  • The display mode of the avatar object displayed in the avatar display field 1330 may be changed according to the current ranking. For example, a smiling avatar object may be displayed for a higher rank, and a crying-faced avatar object may be displayed for a lower rank.
  • In the chart display field 1340, a graph (for example, a radar chart) of the racer's characteristic data in a plurality of items such as "win rate", "double rate", "start", "speed", and "turn" is displayed.
  • The information displayed in the chart display field 1340 may be a graph generated by the control unit 190 from contestant information obtained as text information, or may be contestant information acquired already in graph form in step S640.
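Whether the chart data arrives as text or ready-made, drawing the radar chart requires normalizing each characteristic item to a common scale. A small sketch, assuming (purely for illustration) that the raw values lie on a 0-100 scale:

```python
def chart_points(stats, max_value=100.0):
    """Normalize the racer's characteristic values ("win rate", "start",
    "turn", ...) to the 0-1 range for plotting in the chart display
    field 1340; values above max_value are clamped."""
    return {item: min(value / max_value, 1.0) for item, value in stats.items()}
```

The normalized values would then be mapped to radii on the radar chart's axes, one axis per item.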
  • the end button 1350 is a virtual button for hiding the detailed information object 1310 in step S750.
  • the detailed information object 1310 may be hidden by tapping the outside of the detailed information object 1310.
  • In FIG. 13, the detailed information object 1310 is provided with the chart display field 1340 in which a radar chart is displayed, but the present invention is not limited to this example.
  • FIG. 14 is a schematic diagram showing an example of a screen in which another example of the detailed information object is shown.
  • In FIG. 14, the detailed information display field 1440 is displayed instead of the chart display field 1340.
  • In the detailed information display field 1440, detailed contestant information is displayed as text information.
  • The chart display field 1340 and the detailed information display field 1440 may be displayed together on the detailed information object by combining the example of FIG. 13 and the example of FIG. 14.
  • The processing of steps S710 to S750 is repeated at least from the start time to the end time of the race in the real world, but may also be repeated before the start and after the end of the race in the real world.
  • the position information of the boats 402a to 402f from the start to the end of the race may be collectively acquired before the start of the virtual race. Further, the virtual race may be displayed by acquiring only the race time information without acquiring the position information and the like.
  • the race displayed as a virtual race may be an exhibition race.
  • In step S760, the control unit 190 executes the result display process and ends the series of display control processes in response to receiving an operation input for terminating the application program 151.
  • FIG. 15 is a flowchart showing an example of the display control process in the second operation example. Since steps S710 and S720 in FIG. 15 are the same as steps S710 and S720 shown in FIG. 7, detailed description thereof will be omitted.
  • In step S1510, the control unit 190 acquires selection information. Specifically, the control unit 190 acquires information indicating a boat ticket or betting ticket purchased by the user as the selection information.
  • In step S1520, the control unit 190 registers the avatar information in association with the moving object identified by the acquired selection information.
  • In step S1530, the control unit 190 changes the display of the moving object based on the registered avatar information.
  • FIG. 16 is a schematic diagram showing an example of the display screen in step S1530.
  • the control unit 190 acquires the fact that the user has purchased the boat ticket of the boat 402a as selection information, and registers the user's avatar information in association with the moving object 502a corresponding to the boat 402a.
  • the registered avatar information is, for example, the avatar information updated in step S630 based on the editing information acquired in step S620.
  • the control unit 190 changes the display of the racer image displayed on the moving object 502a to an avatar object based on the avatar information registered in association with the moving object 502a.
  • the racer on the moving object 502a is changed to the avatar object 1620 (an example of the first avatar object) in the form of a monster based on the editing information updated by the user and displayed.
  • an avatar object 1610 (an example of a second avatar object) in the form of a normal racer is displayed on the moving object 502b corresponding to the boat 402b for which the user has not purchased a boat ticket.
  • the moving object 502a representing the contestants selected by the user is displayed in a display mode different from the moving object 502b representing the contestants not selected by the user.
  • the avatar object 1620 on the moving object 502a and the avatar object 1610 on the moving object 502b move together with the moving objects 502a and 502b.
  • In step S1540, the control unit 190 detects the occurrence of a predetermined event in the predetermined race.
  • When a predetermined event occurs in a predetermined race in the real world, the control unit 290 detects the occurrence of the event based on, for example, image analysis of the race, information received from the second server device 40, and information stored in the event information 252B, and transmits event information indicating that the occurrence of the event has been detected to the user terminal 10.
  • the control unit 190 detects the occurrence of an event, for example, based on receiving the event information during the race.
  • The control unit 190 may detect the occurrence of a predetermined event based on, for example, the first information 152A and the user information 153, without depending on transmission of the event information from the first server device 20 during the race.
  • the control unit 190 may detect the occurrence of a predetermined event based on the position information of the contestants and the like and the time information corresponding to the position information.
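As one concrete possibility for detecting an event from position information alone, a "final straight" event could be flagged when a contestant's coordinate comes within some distance of the goal line. The coordinate convention, threshold, and function name are illustrative assumptions, not the patent's method:

```python
def detect_final_straight(pos, goal_x, threshold=100.0):
    """Flag the 'final straight' event when the contestant's x coordinate
    is within `threshold` units of the goal line at goal_x."""
    return abs(goal_x - pos[0]) <= threshold
```

Analogous predicates over successive (position, time) samples could detect other events, such as an acceleration increase or an overtake.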
  • In step S1550, the control unit 190 adds a specific action to the avatar information registered for the moving object 502.
  • When the event information is received from the first server device 20 in step S1540 and avatar information corresponding to the event information is registered, in step S1550, display of an action object, for example, is executed based on the avatar information. When only information identifying the event, such as an event ID, is received from the first server device 20, information corresponding to the acquired event ID may be acquired from the event information 152B, and the action object may be displayed based on the event information 152B.
  • FIG. 17 is a schematic diagram showing an example of the display screen in step S1550.
  • an action object (action image) is displayed for the moving object 502a corresponding to the boat 402a for which the user has purchased a boat ticket (voting right).
  • The display 132 displays an action object 1710 imitating a flame for the monster-form avatar object 1620 displayed on the moving object 502a.
  • This action object 1710 is added, for example, upon occurrence of a final straight event, for the contestant who is the target of the voting right purchased by the user.
  • The addition of the action object 1710 is not limited to the above example; for example, it may be added in response to detection of an event other than the final straight event (for example, the contestant who is the target of the voting right winning or losing, the contestant's acceleration increasing, or the contestant being faster than the other contestants). For example, the action attached to the avatar object may be changed according to the current ranking.
  • The action object added when an event occurs can be selected in advance by the user on the avatar editing screen 800 of FIG. 8.
  • For example, when the contestant who is the target of the voting right wins, an action object representing the winning action that the user selected on the avatar editing screen 800 of FIG. 8, from among the plurality of options displayed in the "winning action" column of the selection table 900 shown in FIG. 9, is displayed.
  • When the contestant who is the target of the voting right loses, an action object representing the losing action selected on the avatar editing screen 800, from among the plurality of options displayed in the "losing action" column of the selection table 900 shown in FIG. 9, is displayed.
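The win/lose branching described above can be sketched as a simple lookup against the actions registered in the avatar information. The result encoding ("win"/"lose") and the fallback value are assumptions for illustration:

```python
def pick_action(avatar, result):
    """Select the action object to display when the result is confirmed:
    the registered winning action on a win, the losing action otherwise."""
    key = "winning action" if result == "win" else "losing action"
    return avatar.get(key, "none")
```

The returned action name would then select the corresponding action object (for example, a "guts pose" animation) to attach near the moving object.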
  • the action object may be attached to another virtual object such as the race field object 501 instead of the moving object 502.
  • In the above example, the action object 1710 is displayed only when an event occurs, but depending on the type of action, the action object may always be displayed near the moving object of the contestant who is the target of the voting right purchased by the user. That is, in step S730, the occurrence of an event may be detected based on the user having purchased the voting right of a predetermined contestant.
  • the association between the moving object 502 and the action object is not limited to the above example, and may be added to, for example, a contestant who is not subject to the voting right purchased by the user.
  • the association between the moving object 502 and the action object may be set in advance by the administrator of the first server device 20 or the user of the user terminal 10.
  • For example, the action object may be displayed corresponding to a moving object that the user of the user terminal 10 has set as a "favorite".
  • the action object displayed corresponding to the moving object 502 may be configured so that the user can select it from a plurality of objects set as selection candidates. Further, the types of objects that are candidates for selection may be increased according to the user's billing and the like.
  • The processing of steps S710 to S1550 in FIG. 15 is repeated at least from the start time to the end time of the race in the real world, but may also be repeated before the start and after the end of the race. Since the process of step S760 in FIG. 15 is the same as the process of step S760 in FIG. 7, detailed description thereof will be omitted.
  • The program causes the processor to execute a step of superimposing and displaying the virtual object on a real image around the first computer captured by the imaging unit.
  • the first information includes at least the contestant information regarding at least one of the contestants and the moving body in the predetermined race.
  • the virtual object includes at least a moving object corresponding to the contestant or the moving object.
  • the display step comprises displaying at least a portion of the contestant information in association with a predetermined avatar corresponding to the moving object in response to the user's first operation input to the corresponding moving object. program.
  • The first information further includes position information of the contestant or the moving body during execution of the predetermined race, and time information corresponding to the position information.
  • The display step includes displaying a virtual race in which the moving object is moved based on the position information and the time information to virtually represent the predetermined race, and displaying at least a part of the moving objects so as to include the predetermined avatar in the virtual race. The program according to item 1. As a result, the visibility of and interest in the virtual race can be improved, and the user's satisfaction can be further improved.
  • The program further causes the processor to execute a step of acquiring selection information indicating one or more contestants or moving bodies selected by the user.
  • The display step includes displaying, based on the selection information, a first avatar representing a contestant or moving body selected by the user in a display mode different from that of a second avatar representing a contestant or moving body not selected by the user.
  • The program according to item 1 or item 2.
  • the display step comprises changing the display mode of the predetermined avatar according to the current ranking in the predetermined race.
  • the first computer further comprises a storage unit for storing a database relating to the predetermined avatar.
  • The program further causes the processor to execute a step of editing the predetermined avatar based on a plurality of items preset in the database.
  • the program according to any one of items 1 to 4.
  • the user can edit the avatar according to his / her preference, thereby improving the familiarity of the user with the contestants and the like.
  • The contestant information displayed in the display step includes at least a characteristic chart of the contestant or the moving body.
  • the program according to any one of items 1 to 5. This allows the user to understand the characteristics of each moving object. Further, since the characteristic information is usually one of the information that the user is highly interested in, the satisfaction of the user can be improved by displaying the characteristic information together with the moving object.
  • The receiving step is executed at a predetermined frequency or when a specific event occurs.
  • The information processing method is executed by the processor.
  • the first information includes at least the contestant information regarding at least one of the contestants and the moving body in the predetermined race.
  • the virtual object includes at least a moving object corresponding to the contestant or the moving object.
  • the display step comprises displaying at least a portion of the contestant information in association with a predetermined avatar corresponding to the moving object in response to the user's first operation input to the corresponding moving object.
  • The information processing method. As a result, through the virtual race displayed on the computer, the recognition of the contestants of the real-world race corresponding to the virtual race can be improved. In addition, since the user's sense of familiarity with the contestants can be enhanced, the satisfaction of users who are beginners at watching races can be improved, and the fan base for race watching can be expanded to younger people.
  • An information processing device including a processor and an image pickup unit, wherein the processor receives the first information about a predetermined race in the real world from the second computer, and generates, based on the first information, a virtual object for presenting the second information regarding the predetermined race to the user of the first computer.
  • the virtual object is superimposed and displayed on the real image around the first computer captured by the imaging unit.
  • the first information includes at least the contestant information regarding at least one of the contestants and the moving body in the predetermined race.
  • the virtual object includes at least a moving object corresponding to the contestant or the moving object.
  • the display step comprises displaying at least a portion of the contestant information in association with a predetermined avatar corresponding to the moving object in response to the user's first operation input to the corresponding moving object.
  • The second processor acquires the first information about a predetermined race in the real world and transmits the first information to the first computer. The first processor receives the first information from the second computer, and generates, based on the first information, a virtual object for presenting the second information regarding the predetermined race to the user of the first computer.
  • the virtual object is superimposed and displayed on the real image around the first computer captured by the imaging unit.
  • the first information includes at least the contestant information regarding at least one of the contestants and the moving body in the predetermined race.
  • the virtual object includes at least a moving object corresponding to the contestant or the moving object.
  • the superimposed display is to display at least a part of the contestant information in association with a predetermined avatar corresponding to the moving object in response to the user's first operation input to the corresponding moving object.
  • The system. As a result, through the virtual race displayed on the computer, the recognition of the contestants of the real-world race corresponding to the virtual race can be improved.
  • In addition, the satisfaction of users who are beginners at watching races can be improved, and the fan base for race watching can be expanded to younger people.
  • The first processor further acquires selection information indicating one or more contestants or moving bodies selected by the user.
  • the first information includes the position information of the contestant or the moving body during the execution of the predetermined race, and the time information corresponding to the position information.
  • The display includes displaying a virtual race that virtually represents the predetermined race by moving the moving object, and displaying at least a part of the moving objects so as to include the predetermined avatar in the virtual race. The position information is acquired based on a position sensor provided on the contestant or the moving body, or on images of the predetermined race captured by an image pickup device during execution of the predetermined race.
  • the system according to item 10. This makes it easy to obtain the position information of the contestants and the like. As a result, the ranking and offense and defense during the race in the real world can be reproduced on the virtual race, so that the user's satisfaction can be improved.
  • Storage unit (of first server device), 290: Control unit (of first server device), 401: Boat race field, 402a to 402f: Boat, 501: Race field object, 502 (502a to 502f): Moving object, 1310: Detailed information object, 1330: Avatar display field, 1340: Chart display field, 1610, 1620: Avatar object, 1710: Action object


Abstract

This program, executed by a first computer, causes a processor to execute: a step of receiving first information relating to a race in the real world from a second computer; a step of generating, on the basis of the first information, a virtual object including a racetrack object representing a racetrack, and a moving object representing a participant or a moving body in a prescribed race; and a step of displaying the virtual object superimposed on a real image captured by an imaging unit. The first information includes, at least, participant information relating to at least one of the participant and the moving object in the prescribed race, and the virtual object includes, at least, a moving object 502 corresponding to the participant information. In response to an operation input by a user with respect to the moving object, at least a portion of the participant information is displayed in association with a prescribed avatar 1330 corresponding to the moving object.

Description

Program, information processing method, information processing device, and system
The present disclosure relates to a program, an information processing method, an information processing device, and a system.
Patent Documents 1 and 2 disclose techniques related to AR (Augmented Reality).
Japanese Unexamined Patent Application Publication No. 2020-58658; Japanese Unexamined Patent Application Publication No. 2020-77187
In the real world, various races such as boat races and horse races are held. Some people travel to the racetrack to watch these races in person, but time or geographical constraints can make this impossible. Given this situation, if a real-world race could be displayed as a virtual race using virtual objects linked to the real-world race, spectators could experience the race in a simulated manner without actually visiting the racetrack, which would be beneficial.
Meanwhile, in various races such as boat races and horse races, there is room for improvement in raising public awareness of the race contestants.
One aspect of the present disclosure aims to provide a program, an information processing method, an information processing device, and a system capable of improving the recognition of the contestants in a real-world race corresponding to a virtual race, through the virtual race displayed on a computer.
According to one embodiment shown in the present disclosure, there is provided a program executed on a first computer including a processor and an imaging unit.
The program causes the processor to execute: a step of receiving, from a second computer, first information relating to a predetermined race in the real world; a step of generating, based on the first information, a virtual object for presenting second information relating to the predetermined race to a user of the first computer; and a step of displaying the virtual object superimposed on a real image of the surroundings of the first computer captured by the imaging unit.
The first information includes at least contestant information relating to at least one of a contestant and a moving body in the predetermined race.
The virtual object includes at least a moving object corresponding to the contestant or the moving body.
The displaying step includes displaying, in response to a first operation input by the user on the corresponding moving object, at least a part of the contestant information in association with a predetermined avatar corresponding to the moving object.
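Purely as an illustrative sketch (not part of the claimed subject matter), the receive/generate/display steps above can be expressed as a single update cycle in which each stage is injected as a callable; every name below is hypothetical:

```python
def run_display_step(receive_first_info, generate_objects, capture_frame, render):
    """One receive -> generate -> superimpose cycle of the first computer.

    receive_first_info: returns first information from the second computer.
    generate_objects:   builds virtual objects (racetrack, moving objects) from it.
    capture_frame:      returns the current real image from the imaging unit.
    render:             superimposes the virtual objects on the real image.
    """
    first_info = receive_first_info()
    virtual_objects = generate_objects(first_info)
    frame = capture_frame()
    return render(frame, virtual_objects)
```

In a real application each callable would wrap, respectively, the transmitting/receiving unit, the object generation unit, the camera, and the display control unit described below.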
According to one embodiment shown in the present disclosure, the recognition of the contestants in a real-world race corresponding to a virtual race can be improved through the virtual race displayed on the first computer.
A diagram showing a configuration example of a system according to an embodiment.
A block diagram showing an example of the functional configuration of a user terminal according to an embodiment.
A block diagram showing an example of the functional configuration of a server according to an embodiment.
A schematic diagram showing an example of a real-world racetrack according to an embodiment.
A schematic diagram showing an example of virtual objects displayed on a user terminal according to an embodiment.
A flowchart showing an example of display control processing according to an embodiment.
A flowchart showing an example of display control processing according to an embodiment.
A diagram showing an example of an avatar editing screen according to an embodiment.
A diagram showing a selection table of the choices for each attribute of an avatar object according to an embodiment.
A schematic diagram showing an example of a real image captured by an imaging unit according to an embodiment.
A schematic diagram showing an example of a screen on which virtual objects are displayed superimposed on a real image according to an embodiment.
A schematic diagram showing an example of a screen on which virtual objects are displayed superimposed on a real image according to an embodiment.
A schematic diagram showing an example of a method of displaying detailed contestant information together with avatar information on a virtual object according to an embodiment.
A schematic diagram showing an example of a screen showing another example of a method of displaying detailed contestant information according to an embodiment.
A flowchart showing an example of display control processing in a second operation example according to an embodiment.
A schematic diagram showing an example of a display screen of an avatar object according to an embodiment.
A schematic diagram showing an example of a display screen of an avatar object according to an embodiment.
Hereinafter, embodiments of this technical idea will be described in detail with reference to the drawings. In the following description, identical elements are given identical reference numerals, and duplicate descriptions are omitted as appropriate. In one or more embodiments shown in the present disclosure, the elements included in each embodiment can be combined with one another, and the combined result also forms part of the embodiments shown in the present disclosure.
(System configuration)
FIG. 1 is a diagram showing the configuration of a system 1 according to the present embodiment. The system 1 can display, for example, a predetermined race held in the real world as a virtual race using virtual objects on an information processing device used by a user. In this specification, the "predetermined race" is not particularly limited as long as it is a race held in the real world; examples include boat races (actual races and exhibition races), horse races, bicycle races, auto races, car races such as F1, drone races, dog races, marathons, and ekiden relay races.
As shown in FIG. 1, the system 1 includes a plurality of user terminals 10 such as a user terminal 10A, a user terminal 10B, and a user terminal 10C (hereinafter, user terminals such as 10A, 10B, and 10C are also collectively referred to as "user terminals 10"), which are information processing devices (first computers) used by respective users, a first server device (second computer) 20, a second server device 40, and a network 30.
The user terminal 10A and the user terminal 10B connect to the network 30 by communicating with a radio base station 31. The user terminal 10C connects to the network 30 by communicating with a wireless router 32 installed in a facility such as a house. The user terminal 10 is, for example, a portable terminal provided with a touch screen, and may be a smartphone, a phablet, a tablet, or the like.
The user terminal 10 executes, for example, a program installed via a platform that distributes applications, or a program including pre-installed website browsing software. By executing the program, the user terminal 10 communicates with the first server device 20 and transmits and receives data related to a predetermined race, data related to the user, and the like to and from the first server device 20, thereby making it possible to display a virtual race on the user terminal 10.
The first server device 20 receives data related to a predetermined race from the second server device 40, and transmits data related to the predetermined race to the user terminal 10 as appropriate. The first server device 20 stores and manages data related to the predetermined race and data related to each user.
As a hardware configuration, the first server device 20 includes a communication IF (interface) 22, an input/output IF 23, a memory 25, a storage 26, and a processor (second processor) 29, which are connected to one another via a communication bus.
The communication IF 22 supports various communication standards such as the LAN (Local Area Network) standard, and functions as an interface for transmitting and receiving data to and from the user terminal 10, the second server device 40, and the like.
The input/output IF 23 functions as an interface for accepting input of information to the first server device 20 and for outputting information to the outside of the first server device 20. The input/output IF 23 may include an input receiving unit that accepts connection of information input devices such as a mouse and a keyboard, and an output unit that accepts connection of information output devices such as a display for displaying images and the like.
The memory 25 is a storage device for storing data and the like used for processing. For example, the memory 25 provides the processor 29 with a work area for temporary use when the processor 29 performs processing. The memory 25 includes storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory).
The storage 26 is a storage device for storing various programs and data to be read and executed by the processor 29. The information stored in the storage 26 includes data related to a predetermined race, data related to each user, and the like. The storage 26 may include storage devices such as an HDD (Hard Disk Drive) and a flash memory. Note that the storage is not limited to a form included in the server device; a cloud service may also be used.
The processor 29 controls the operation of the first server device 20 by reading and executing programs and the like stored in the storage 26. The processor 29 may include, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), and the like.
The second server device 40 stores and manages data related to a predetermined race. The second server device 40 is, for example, a server device managed by the organizer of the predetermined race, or a server device managed by an organization that transmits information about the predetermined race to the outside (such as a publisher of a race magazine, a race video distributor, or a radio broadcaster). The second server device 40 transmits data related to the predetermined race to the first server device 20 as appropriate. In certain aspects, the second server device 40 may transmit data related to the predetermined race to the user terminal 10. The hardware configuration of the second server device 40 may be the same as that of the first server device 20 to the extent that no contradiction arises. There may be a plurality of second server devices 40.
(User terminal)
FIG. 2 is a block diagram showing an example of the functional configuration of the user terminal 10. As shown in FIG. 2, the user terminal 10 includes an antenna 110, a wireless communication IF 120, a touch screen 130, an input/output IF 140, a storage unit 150, a voice processing unit 160, a microphone 161, a speaker 162, an imaging unit 170, and a control unit (first processor) 190.
The antenna 110 radiates signals emitted by the user terminal 10 into space as radio waves. The antenna 110 also receives radio waves from space and supplies the received signals to the wireless communication IF 120.
The wireless communication IF 120 performs modulation/demodulation processing and the like for transmitting and receiving signals via the antenna 110 so that the user terminal 10 can communicate with other communication devices. The wireless communication IF 120 is a communication module for wireless communication including a tuner, a high-frequency circuit, and the like; it performs modulation/demodulation and frequency conversion of the wireless signals transmitted and received by the user terminal 10, and supplies the received signals to the control unit 190.
The touch screen 130 receives input from the user and outputs information to the user on the display 132. The touch screen 130 includes a touch panel 131 for receiving the user's operation input, and the display 132. The touch panel 131 detects the approach of the user's finger or the like by using, for example, a capacitive panel. The display 132 is realized by, for example, an LCD (Liquid Crystal Display), an organic EL (electroluminescence) display, or another display device.
The input/output IF 140 functions as an interface for accepting input of information to the user terminal 10 and for outputting information to the outside of the user terminal 10.
The storage unit 150 is composed of a flash memory, an HDD, or the like, and stores programs used by the user terminal 10 and various data that the user terminal 10 receives from the first server device 20 and the like.
The voice processing unit 160 performs modulation/demodulation of audio signals. The voice processing unit 160 modulates the signal supplied from the microphone 161 and supplies the modulated signal to the control unit 190. The voice processing unit 160 also supplies audio signals to the speaker 162. The voice processing unit 160 is realized by, for example, a processor for voice processing. The microphone 161 functions as an audio input unit for receiving input of audio signals and outputting them to the control unit 190. The speaker 162 functions as an audio output unit for outputting audio signals to the outside of the user terminal 10.
The imaging unit 170 is a camera that captures real images of the surroundings of the user terminal 10. The images captured by the imaging unit 170 are image-processed by the control unit 190 and output to the display 132.
The control unit 190 controls the operation of the user terminal 10 by reading and executing programs stored in the storage unit 150. The control unit 190 is realized by, for example, an application processor.
The process by which the control unit 190 executes the application program 151 will now be described in more detail. The storage unit 150 stores the application program 151, application information 152, and user information 153.
For example, the user terminal 10 downloads the application program 151 from the first server device 20 and stores it in the storage unit 150. The user terminal 10 also communicates with the first server device 20 to transmit and receive various data, such as the application information 152 and the user information 153, to and from the first server device 20.
The application program 151 is a program for displaying a virtual race on the user terminal 10. The application information 152 includes various data referred to by the application program 151. The application information 152 includes first information 152A, event information 152B, and avatar information 152C.
The first information 152A is information relating to a predetermined race transmitted from the first server device 20 or the second server device 40. The first information 152A includes, for example, race time information indicating the race times of contestants or moving bodies (hereinafter also referred to as "contestants and the like") in the predetermined race, position information of the contestants and the like during the predetermined race, and time information corresponding to the position information.
As used herein, "contestant" is a concept that includes not only humans but also animals such as horses and dogs. A "moving body" is an entity that is the main subject of movement in a predetermined race, such as an animal or machine ridden by a contestant, or a machine remotely controlled by a contestant. In a marathon, dog race, or the like, the "contestant" and the "moving body" are the same.
In addition to the above, the first information 152A may include, for example, the name of the predetermined race, the date and time of the race, racetrack data, contestant data, moving body data, odds information, race predictions, race entry tables, information immediately before the race, pit reports, race results, race videos, race still images, past race information, and other information about the predetermined race that could be published in information magazines or on information websites.
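A minimal sketch of how the first information 152A might be held in memory, assuming simple Python dataclasses; the field names and types below are illustrative, not taken from the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class ContestantSample:
    contestant_id: str
    position: tuple   # (x, y) position on the course during the race
    timestamp: float  # time information corresponding to the position

@dataclass
class FirstInformation:
    race_name: str
    race_times: dict = field(default_factory=dict)  # contestant_id -> race time (s)
    samples: list = field(default_factory=list)     # ContestantSample entries
    odds: dict = field(default_factory=dict)        # optional odds information
```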
The event information 152B is information about events that can occur or have occurred in the predetermined race, and includes, for example, information about effect displays and effect sounds corresponding to those events.
The events are not particularly limited; examples include the leading contestant or the like reaching the goal, the final ranking of the predetermined race being decided, a change in the ranking of the contestants and the like, the difference between the positions of two or more contestants and the like falling to or below a predetermined threshold at a predetermined timing (that is, two or more contestants and the like being in a close battle), and the occurrence of a rule violation. In this example, the event information 152B is information received from the first server device 20 after an event has occurred in the predetermined race.
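One of the events above, two or more contestants whose positional gap falls at or below a threshold, could be detected with a simple pairwise distance check. The sketch below assumes 2D course coordinates and is only illustrative:

```python
def detect_close_battle(positions, threshold):
    """Return pairs of contestant IDs whose positional gap is at or below threshold.

    positions: dict mapping contestant_id -> (x, y) course coordinates.
    """
    ids = sorted(positions)
    close = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            ax, ay = positions[a]
            bx, by = positions[b]
            gap = ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
            if gap <= threshold:
                close.append((a, b))
    return close
```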
The event information 152B also includes, for example, information about two-dimensional or three-dimensional images corresponding to predetermined events. Specifically, the event information 152B may include information about text images announcing the content of an event, predetermined character images, effect images to be added to the virtual objects described later, and the like.
The avatar information 152C includes information about an object serving as the user's avatar (hereinafter referred to as an avatar object). Specifically, the avatar information 152C may include information such as the gender, type (occupation), hairstyle, skin color, and actions of the avatar object (an example of an avatar). The avatar information 152C can be edited by the user via the avatar editing screen described later.
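An edit made on the avatar editing screen could be validated against a selection table of allowed attribute values, like the table of choices described later. The attribute names and choice values below are invented for illustration only:

```python
# Hypothetical selection table: attribute -> allowed choices.
AVATAR_CHOICES = {
    "gender": ["female", "male", "unspecified"],
    "type": ["racer", "reporter", "fan"],
    "hairstyle": ["short", "long", "ponytail"],
    "skin_color": ["light", "medium", "dark"],
}

def edit_avatar(avatar, attribute, choice):
    """Apply one edit from the avatar editing screen, rejecting values outside the table."""
    if choice not in AVATAR_CHOICES.get(attribute, []):
        raise ValueError(f"{choice!r} is not a valid {attribute}")
    updated = dict(avatar)  # leave the original avatar information untouched
    updated[attribute] = choice
    return updated
```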
The user information 153 includes information about the user of the user terminal 10. The user information 153 may include, for example, information identifying the user, position information of the user terminal 10, and the user's race ticket purchase history and hit rate (for example, in the case of boat racing, the history of purchased boat race tickets and the hit rate of those tickets).
By reading and executing the application program 151, the control unit 190 functions as an operation input receiving unit 191, a transmitting/receiving unit 192, an object generation unit 193, a display control unit 194, a flat surface detection unit 195, an event detection unit 196, and a selection information acquisition unit 197.
The operation input receiving unit 191 accepts the user's operation input based on the output of the touch screen 130. Specifically, the operation input receiving unit 191 detects the contact or approach of the user's finger or the like to the touch panel 131 as coordinates in a coordinate system consisting of the horizontal and vertical axes of the surface constituting the touch screen 130.
The operation input receiving unit 191 determines the user's operation on the touch screen 130. For example, the operation input receiving unit 191 distinguishes user operations such as an "approach operation", "release operation", "tap operation", "double-tap operation", "long-press operation (long-touch operation)", "drag operation (swipe operation)", "move operation", "flick operation", "pinch-in operation", and "pinch-out operation".
The operation input receiving unit 191 may also accept, as operation input, movement of the user terminal 10 detected by an acceleration sensor or the like mounted on the user terminal 10.
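The basic distinction among a tap, a long press, and a drag can be made from a touch's duration and travel distance alone. The thresholds below are arbitrary example values, not values from the disclosure:

```python
def classify_touch(duration, start, end, move_threshold=10.0, long_press_threshold=0.5):
    """Classify a single touch from its duration (s) and start/end coordinates (px)."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    moved = (dx ** 2 + dy ** 2) ** 0.5
    if moved >= move_threshold:
        return "drag"        # swipe/drag: the contact point travelled
    if duration >= long_press_threshold:
        return "long_press"  # long touch without significant movement
    return "tap"
```

Double taps, flicks, and pinch gestures would additionally require the time between touches, the release velocity, and a second contact point, respectively.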
The transmitting/receiving unit 192 transmits and receives various information to and from external communication devices, such as the first server device 20 and the second server device 40, via the wireless communication IF 120 and the network 30. For example, the transmitting/receiving unit 192 receives the first information 152A from the first server device 20 or the second server device 40. The transmitting/receiving unit 192 also transmits, for example, information corresponding to operation input received by the operation input receiving unit 191 and information stored in the user information 153 to the first server device 20 or the second server device 40.
The object generation unit 193 generates, based on the first information, virtual objects for presenting information about the predetermined race to the user. As virtual objects, the object generation unit 193 generates a racetrack object representing the racetrack and moving objects representing the contestants and the like. The object generation unit 193 may also generate a virtual display board for displaying, as text, second information generated based on the first information. In addition to the above objects, the object generation unit 193 may generate a virtual screen for displaying images of the second information generated based on the first information, various building objects, landscape objects such as trees that constitute scenery, and the like.
In addition to the above objects, the object generation unit 193 may generate an avatar object based on the avatar information 152C. As the avatar object, the object generation unit 193 may generate, for example, a three-dimensional model of a cartoon or animation character, a mascot character of various sports, or a celebrity such as an entertainer. The avatar object is not limited to a three-dimensional model and may be a two-dimensional image. The object generation unit 193 may substitute the generated avatar object for the portion of a moving object that represents the contestant.
The display control unit 194 causes the display 132 to display an image (hereinafter also referred to as a "superimposed image") in which the virtual objects generated by the object generation unit 193 are superimposed on the real image of the surroundings of the user terminal 10 captured by the imaging unit 170. Based on the race time information included in the first information 152A, the display control unit 194 moves each moving object on the racetrack object, and causes the display 132 to display a virtual race that virtually reproduces the predetermined race. In addition to the race time information, the display control unit 194 preferably reproduces the virtual race based on the position information of the contestants and the like included in the first information 152A and the time information corresponding to that position information.
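Reproducing the race from discrete position samples implies interpolating each moving object's position between timestamps. A minimal linear-interpolation sketch (assuming samples sorted by time; names are illustrative) might look like:

```python
from bisect import bisect_right

def interpolate_position(samples, t):
    """Linearly interpolate a contestant's course position at time t.

    samples: list of (timestamp, (x, y)) tuples sorted by timestamp.
    """
    times = [ts for ts, _ in samples]
    i = bisect_right(times, t)
    if i == 0:
        return samples[0][1]   # before the first sample: hold the start position
    if i == len(samples):
        return samples[-1][1]  # after the last sample: hold the final position
    (t0, (x0, y0)), (t1, (x1, y1)) = samples[i - 1], samples[i]
    ratio = (t - t0) / (t1 - t0)
    return (x0 + (x1 - x0) * ratio, y0 + (y1 - y0) * ratio)
```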
表示制御部194は、操作入力受付部191が受け付けた操作入力に応じて、重畳画像における視点を変更可能であることが好ましい。表示制御部194は、操作入力受付部191が受け付けた操作入力に応じて、ディスプレイ132に各種のメニュー画面やGUI(Graphical User Interface)を表示させたり、ディスプレイ132の表示内容を変更したりする。 It is preferable that the display control unit 194 can change the viewpoint in the superimposed image according to the operation input received by the operation input reception unit 191. The display control unit 194 displays various menu screens and GUIs (Graphical User Interfaces) on the display 132, and changes the display contents of the display 132, in response to the operation input received by the operation input reception unit 191.
また、表示制御部194は、イベント検出部196がイベントの発生を検出したことに応じて、該イベントに対応する演出表示をする。演出表示の内容としては、特に制限はされないが、例えば、所定のアバターオブジェクトの表示であってもよく、レース場オブジェクト及び移動オブジェクトの少なくとも一方へのエフェクト表示、所定のイベントに対応する二次元画像又は三次元画像の表示、これらの画像をレース進行に応じて動作させることのうちの1以上であってもよい。表示制御部194は、異なるイベントに対応する2以上の演出表示を同時に実行してもよい。 Further, the display control unit 194 displays an effect corresponding to the event in response to the event detection unit 196 detecting the occurrence of the event. The content of the effect display is not particularly limited, but may be, for example, a display of a predetermined avatar object, an effect display on at least one of a race field object and a moving object, and a two-dimensional image corresponding to a predetermined event. Alternatively, it may be one or more of displaying three-dimensional images and operating these images according to the progress of the race. The display control unit 194 may simultaneously execute two or more effect displays corresponding to different events.
The flat surface detection unit 195 detects a flat surface in the real image captured by the image pickup unit 170. The detection of a flat surface is realized by a conventionally known image recognition technique. For example, when the user performs an operation of selecting a flat surface detected by the flat surface detection unit 195, a superimposed image in which the race track object is arranged on that flat surface is displayed on the display 132.
The flat surface is preferably a horizontal surface. The angle formed between the flat surface and the bottom surface of the race track object may be 0 degrees, but is preferably an acute angle, for example in the range of 15 to 45 degrees. This angle may also be adjustable in response to a user operation. Even when a part of the real-world flat surface has a protrusion, or when an object is placed on the flat surface, the area may still be detected as a flat surface on which the race track object can be arranged, provided the protrusion or placed object is small enough to be hidden by the race track object.
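The placement rule just described can be expressed as a simple geometric check: a candidate plane is usable if the track's footprint fits on it and every obstruction's footprint is small enough to be covered by the track. The sketch below illustrates only that rule, under assumed names and units; actual plane detection would come from the device's AR framework:

```python
def usable_as_track_plane(plane_size, track_size, obstructions):
    """Return True if a race track object of footprint track_size
    (width, depth) can be placed on a detected plane of plane_size,
    hiding every obstruction footprint listed in obstructions."""
    pw, pd = plane_size
    tw, td = track_size
    if tw > pw or td > pd:
        return False  # the track itself does not fit on the plane
    # each obstruction must be small enough to disappear under the track
    return all(ow <= tw and od <= td for ow, od in obstructions)

# A keyboard-sized obstruction hidden under a desk-scale track object:
print(usable_as_track_plane((1.4, 0.7), (1.0, 0.5), [(0.45, 0.15)]))  # True
```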
The event detection unit 196 detects the occurrence of a predetermined event in the predetermined race. For example, the event detection unit 196 detects the occurrence of a predetermined event based on the transmission / reception unit 192 receiving event information that is transmitted from the first server device 20 when the predetermined event occurs in the predetermined race in the real world.
The event detection unit 196 may also detect the occurrence of a predetermined event based on, for example, the first information 152A and the user information 153. Specifically, based on the position information of the contestants and the like and the time information corresponding to that position information, the event detection unit 196 can detect events determinable from the contestants' positions during the race, such as a change in their ranking. Further, based on, for example, information on the race result and information on the user's race purchase history, the event detection unit 196 can detect events such as a contestant on whom the user has placed a bet taking the lead in the race.
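As a hedged sketch of the position-based detection described here, one can derive a ranking from each contestant's progress along the course and flag an event whenever the ranking changes or a contestant the user bet on takes the lead. The names and data shapes below are assumptions for illustration:

```python
def ranking(progress):
    """Order contestant ids from leader down, given progress along the course."""
    return sorted(progress, key=progress.get, reverse=True)

def detect_events(prev_progress, curr_progress, user_picks):
    """Emit simple event records from two consecutive position snapshots."""
    events = []
    prev, curr = ranking(prev_progress), ranking(curr_progress)
    if prev != curr:
        events.append(("rank_change", curr))
    if curr[0] in user_picks:
        events.append(("user_pick_leading", curr[0]))
    return events

before = {"boat1": 120.0, "boat2": 135.0, "boat3": 110.0}
after_ = {"boat1": 150.0, "boat2": 148.0, "boat3": 125.0}
print(detect_events(before, after_, {"boat1"}))
# [('rank_change', ['boat1', 'boat2', 'boat3']), ('user_pick_leading', 'boat1')]
```

Each emitted record could then trigger the corresponding effect display on the terminal.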
The selection information acquisition unit 197 acquires one or more pieces of selection information indicating one or more contestants or the like selected by the user. The "selection information" is, for example, information indicating the contestant or the like that the user has predicted to be the winner of the predetermined race. When the predetermined race is a boat race or a horse race, the "selection information" may be information indicating a boat race ticket or a betting ticket purchased by the user.
The selection information acquisition unit 197 acquires the selection information based on, for example, the user's operation input on a screen for selecting contestants or the like. The acquired selection information is transmitted to, for example, the first server device 20 or the second server device 40. When the predetermined race is a boat race or a horse race, a purchase process for a boat race ticket or a betting ticket can be performed in response to the selection information being transmitted to the second server device 40.
(First server device)
FIG. 3 is a block diagram showing the functional configuration of the first server device 20. The detailed configuration of the first server device 20 will be described with reference to FIG. 3. By operating according to a program, the first server device 20 functions as a communication unit 220, a storage unit 250, and a control unit 290.
The communication unit 220 functions as an interface through which the first server device 20 communicates with external communication devices such as the user terminal 10 and the second server device 40 via the network 30.
The storage unit 250 stores various programs and data for realizing the system 1. In one aspect, the storage unit 250 stores a program 251, race information 252, and user information 253.
The program 251 is a program by which the first server device 20 communicates with the user terminal 10 and the second server device 40 to realize the system 1. When executed by the control unit 290, the program 251 causes the first server device 20 to perform processes such as transmitting and receiving data to and from the user terminal 10 and the second server device 40, processing according to operations performed by the user of the user terminal 10, and updating the race information 252 and the user information 253.
The race information 252 includes various data related to the predetermined race, for example first information 252A and event information 252B. The first information 252A is the source of the first information 152A, and the first information 152A can be a part of the first information 252A. The first information 252A is, for example, information acquired from the second server device 40. Similarly, the event information 252B is the source of the event information 152B, and the event information 152B can be a part of the event information 252B. The event information 252B is, for example, information acquired from the second server device 40.
The user information 253 is information about the users of the user terminals 10 and includes a user management table 253A. The user management table 253A stores, for each user, information such as information identifying the user, the position information of the user terminal 10, and the user's race purchase history and hit rate.
The control unit 290 is realized by the processor 29 and, by executing the program 251, functions as a transmission / reception unit 291, a first information acquisition unit 292A, an event information acquisition unit 292B, a selection information acquisition unit 293, a data management unit 294, and a timekeeping unit 295.
The transmission / reception unit 291 transmits and receives various information to and from external communication devices such as the user terminal 10 and the second server device 40 via the communication unit 220 and the network 30. For example, the transmission / reception unit 291 transmits at least a part of the first information 252A and the event information 252B to the user terminal 10, and receives the first information 252A and the event information 252B from the second server device 40.
The first information acquisition unit 292A acquires the first information 252A from the second server device 40 via the transmission / reception unit 291. The event information acquisition unit 292B acquires the event information 252B from the second server device 40 via the transmission / reception unit 291. The selection information acquisition unit 293 acquires the user's selection information from the user terminal 10 or the second server device 40 via the transmission / reception unit 291. The data management unit 294 updates the various data stored in the storage unit 250 according to the processing results of the first information acquisition unit 292A, the event information acquisition unit 292B, the selection information acquisition unit 293, and so on. The timekeeping unit 295 performs a process of measuring time; based on the time it measures, various times displayed on the user terminal 10 (for example, the time remaining until the start of a race) can be controlled.
(First operation example)
Next, a first operation example in the system 1 will be described with reference to FIGS. 4 to 14. In the following, the case where the predetermined race is a boat race is described as an example, but the description is also applicable when the predetermined race is another kind of race. The following description also assumes that data is exchanged between the user terminal 10 and the first server device 20 and between the first server device 20 and the second server device 40; in one aspect, however, the user terminal 10 and the second server device 40 may be configured to exchange data directly.
FIG. 4 is a schematic diagram showing an example of a boat race course in the real world. Two turn marks 403 are installed at the boat race course 401, and a race is carried out by boats 402a to 402f, each ridden by a boat racer. When the race ends, race time information indicating the race time of each of the boats 402a to 402f is transmitted from the second server device 40 to the first server device 20, and from the first server device 20 to the user terminal 10.
Image pickup devices (cameras) 404a and 404b are provided at the boat race course 401. The image pickup device 404a captures the boats 402a to 402f in its field of view from above the boat race course 401, and the image pickup device 404b captures them from the side of the boat race course 401. The images of the boats 402a to 402f captured by the image pickup devices 404a and 404b are transmitted to the second server device 40. The second server device 40, for example, analyzes each image and calculates position information indicating the position of each of the boats 402a to 402f at the time the image was captured. The calculated position information and the time information relating to the corresponding capture time are transmitted to the first server device 20, and from the first server device 20 to the user terminal 10. Alternatively, the position information may be calculated by the first server device 20.
Instead of, or in addition to, the image pickup devices 404a and 404b, a position sensor such as a GPS sensor may be installed on each of the boats 402a to 402f. The position information of the boats 402a to 402f acquired by the position sensors, and the time information indicating the time at which that position information was acquired, are ultimately transmitted to the user terminal 10.
FIG. 5 is a schematic diagram showing an example of virtual objects displayed on the user terminal 10. In the example of FIG. 5, a race track object 501, moving objects 502a to 502f, two turn mark objects 503, and a virtual display board 505 are shown as the virtual objects.
The race track object 501 is an object that virtually represents the boat race course 401. The race track object 501 and the turn mark objects 503 are preferably created based on race course data such as course information of the boat race course 401, and preferably have shapes corresponding to the boat race course 401.
The moving objects 502a to 502f are objects that virtually represent the boats 402a to 402f, respectively, and are shaped like boats. The moving objects 502a to 502f move on the race track object 501 based on the race time information, the position information of the boats 402a to 402f, and the time information corresponding to that position information. In other words, the race track object 501 and the moving objects 502a to 502f display the real-world race on the user terminal 10 as a virtual race.
Even without the position information and time information of the boats 402a to 402f, it is possible to display the virtual race using the race time information alone. In that case, however, although the final order of finish is the same as in the real-world race, it is difficult to reproduce the course of the race, such as the ranking during the race.
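The limitation noted here can be seen with a constant-speed assumption: given only each boat's final race time, the simplest reconstruction assumes uniform speed, which reproduces the finishing order but can never show a mid-race overtake. An illustrative sketch (function and data names are assumed):

```python
def progress_from_race_times(race_times, course_len, t):
    """Assume constant speed: a boat that covers course_len metres in
    race_time seconds has covered course_len * t / race_time metres at
    time t (capped at the full course length)."""
    return {boat: min(course_len, course_len * t / rt)
            for boat, rt in race_times.items()}

# With only final times, the eventual winner leads at every instant,
# so mid-race lead changes cannot be reproduced.
times = {"boat1": 110.0, "boat2": 100.0}   # boat2 wins
print(progress_from_race_times(times, 1800.0, 50.0))
```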
The virtual display board 505 is an object that displays text information. The virtual display board 505 is, for example, an object that has no counterpart at the boat race course 401. The text information displayed on the virtual display board 505 is not particularly limited and may be, for example, ranking information, odds information, or the like. The text information displayed on the virtual display board 505 may also be changeable based on the user's operation input.
FIGS. 6 and 7 are flowcharts showing an example of the display control process. The order of the steps constituting the flowcharts described below may be changed as long as no contradiction or inconsistency arises in the processing content. A process executed by one device may also be executed by another device to the extent that no contradiction arises.
The processes shown in FIGS. 6 and 7 are realized by the control unit 190 executing the application program 151 and by the control unit 290 executing the program 251.
First, in step S610, the control unit 190 generates a base avatar on which an avatar object is based. In step S620, the control unit 190 acquires editing information for the avatar object based on the user's operation input. In step S630, the control unit 190 updates the avatar information 152C stored in the storage unit 150 based on the editing information acquired in step S620.
Here, the processing of steps S610 to S630 will be described specifically with reference to FIGS. 8 and 9. FIG. 8 is a diagram showing an example of an avatar editing screen 800 on which the user edits an avatar object based on the generated base avatar. The avatar editing screen 800 shown in FIG. 8 includes a base avatar 801, a gender button 802 for changing the gender of the base avatar 801, and an attribute change field 803 for changing various attributes of the base avatar 801 other than gender. In the example shown in FIG. 8, the base avatar 801 is generated in the form of a male, but can be changed to a female by tapping the gender button 802 or the like. In the attribute change field 803, a plurality of options are displayed for each attribute of the avatar object, for example "type (occupation)", "skin color", "winning action", and "losing action".
FIG. 9 is a diagram showing a selection table 900 containing the options for each attribute. The selection table 900 shown in FIG. 9 can be included, for example, in the avatar information 152C stored in the storage unit 150. In the example of FIG. 9, the selection table 900 includes attributes such as "type", "skin color", "winning action", "losing action", and "vehicle", and options (example 1 to example 4) are registered for each attribute. The control unit 190 displays the attribute change field 803 of the avatar editing screen 800 based on this selection table 900, so that the user can choose a preferred option from the plurality of options for each attribute.
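The selection table and the editing flow of steps S620 to S630 can be modelled as a mapping from attributes to allowed options plus a validated update, as in this hypothetical sketch (the attribute names and options merely echo the examples in the figure and are not the actual table contents):

```python
# Hypothetical counterpart of selection table 900: options per attribute.
SELECTION_TABLE = {
    "type": ["hero", "wizard", "pirate", "robot"],
    "skin_color": ["light", "tan", "brown", "dark"],
    "winning_action": ["guts_pose", "wave", "backflip", "bow"],
    "losing_action": ["boat_blast", "cry", "shrug", "kneel"],
    "vehicle": ["boat", "horse", "car", "bike"],
}

def apply_edit(avatar, attribute, choice):
    """Validate a user's pick against the selection table, then return
    an updated copy of the avatar information."""
    if choice not in SELECTION_TABLE.get(attribute, []):
        raise ValueError(f"{choice!r} is not an option for {attribute!r}")
    return {**avatar, attribute: choice}

avatar = {"gender": "male"}
avatar = apply_edit(avatar, "type", "hero")
avatar = apply_edit(avatar, "winning_action", "guts_pose")
print(avatar)  # {'gender': 'male', 'type': 'hero', 'winning_action': 'guts_pose'}
```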
Based on the user's operation inputs on the gender button 802 and the attribute change field 803, the control unit 190 acquires information such as the gender, type, skin color, and actions of the avatar object as editing information for the avatar object. For example, suppose the user taps the gender button 802 to select "male" as the gender of the base avatar 801, and taps the attribute change field 803 to select the type "hero", the winning action "guts pose", and the losing action "boat blast". In step S620, the control unit 190 acquires the information for changing the avatar object as editing information based on these operation inputs. Then, in step S630, based on the editing information acquired in step S620, the control unit 190 updates the avatar information 152C so that the avatar object takes the form of a male "hero", performs the "guts pose" action when it wins the race, and performs the "boat blast" action when it loses the race.
In the above example, steps S620 and S630 describe editing and registering a single avatar object, but the processing is not limited to this example; a plurality of avatar objects can be edited and registered in these steps. The options for editing an avatar object are also not limited to those displayed on the avatar editing screen 800 shown in FIG. 8 or in the selection table 900 shown in FIG. 9; for example, the user may import a preferred image (such as a photograph of the user) and edit the avatar object based on the imported image.
In step S640, the control unit 190 acquires, from the first server device 20, information about the contestants or moving bodies of the predetermined race (hereinafter referred to as contestant information). The processing of step S640 may be executed at a predetermined frequency or when a specific event occurs; that is, the contestant information can be updated each time step S640 is executed. The specific event is not particularly limited, and examples include a change in the contestant information (for example, a change in the odds), the start or end of the race, and the finalization of the rankings.
In step S650, the control unit 190 registers the acquired contestant information in association with the avatar information. That is, the control unit 190 registers the contestant information included in the first information 152A, which is acquired from the first server device 20 and stored in the storage unit 150, in association with the avatar information 152C. The association between the contestant information and the avatar information 152C may be made, for example, based on selection information acquired through the user's operation input on a screen for selecting contestants or the like. In other words, the control unit 190 may link the avatar information 152C to the contestant information of a contestant for whom the user has purchased a boat race ticket or betting ticket.
In step S660, the control unit 190 activates the image pickup unit 170, which is a camera. The image pickup unit 170 captures a real image of the surroundings of the user terminal 10.
In step S670, the control unit 190 detects a flat surface in the image captured by the image pickup unit 170. In step S680, the control unit 190 arranges the virtual objects on the detected flat surface.
Here, the processing of steps S660, S670, and S680 will be described specifically with reference to FIGS. 10 to 12. FIG. 10 is a schematic diagram showing an example of a real image captured by the image pickup unit 170. In the example of FIG. 10, a keyboard 1002 and a monitor device 1003 are placed on a flat desk 1001.
When the image pickup unit 170 is activated in step S660, the real image being captured by the image pickup unit 170 is displayed on the display 132. Next, in step S670, the control unit 190 detects a flat surface in the image captured by the image pickup unit 170, that is, in the image displayed on the display 132.
In FIG. 10, a region 1004 is detected as a flat surface. Although the keyboard 1002 lies within the region 1004, the keyboard 1002 is small enough to be hidden by the race track object 501, so the control unit 190 detects the region 1004 as a flat surface.
If the position captured by the image pickup unit 170 changes from the state shown in FIG. 10, the position of the region 1004 can change accordingly. The region 1004 is displayed on the display 132 so as to be distinguishable from other parts, for example by being given a predetermined color. When the user performs a tap operation or the like on the region 1004, in step S680 the control unit 190 arranges virtual objects such as the race track object 501 and the moving objects 502a to 502f on the region 1004.
FIG. 11 is a schematic diagram showing an example of a screen on which virtual objects are superimposed on a real image. In FIG. 11, the dotted region including the monitor device 1003 is the real image, and the other regions are regions where virtual objects are displayed. An advertisement image, for example, may be displayed in regions where no virtual object is displayed.
In FIG. 11, the displayed virtual objects include the race track object 501, a plurality of moving objects 502, two turn mark objects 503, a large monitor object 506, building objects 507a and 507b, and many other unlabeled objects (tree objects, a clock object, and so on). These objects are created, for example, based on the first information 152A received from the first server device 20.
FIG. 12 is a schematic diagram showing another example of a screen on which virtual objects are superimposed on a real image, illustrating another form of the race track object 501 shown in FIG. 11. Specifically, FIG. 12 shows an example in which the predetermined race is a horse race.
In FIG. 12 as well, the dotted region including the monitor device 1003 is the real image, and the other regions are regions where virtual objects are displayed. In FIG. 12, a race track object 511, a plurality of moving objects 512, a large monitor object 513, a pond object 514, and a plurality of tree objects 515 are displayed on the display 132 as virtual objects. These objects are also created, for example, based on the first information 152A received from the first server device 20.
The race track object 511, the large monitor object 513, the pond object 514, and the plurality of tree objects 515 are preferably created based on race course data such as course information of a predetermined racecourse in the real world. The plurality of moving objects 512 are, for example, objects that virtually represent the horses and jockeys running in the horse race.
In step S690, the control unit 190 associates the contestant information with each of the plurality of moving objects 502. At this time, for contestant information to which avatar information 152C has been linked, the avatar information 152C is registered in the storage unit 150 in association with the moving object 502 together with the contestant information.
Next, in step S710 of FIG. 7, the control unit 190 acquires the position information of the boats 402a to 402f at the real-world boat race course 401. That is, when the race by the boats 402a to 402f starts in the real world, the control unit 190 acquires the position information and time information of the boats 402a to 402f from the first server device 20. The method of acquiring the position information and time information is as described with reference to FIG. 4.
In step S720, the control unit 190 controls the moving objects so that they are linked with the position information acquired in step S710. Specifically, using the time information and the position information, the control unit 190 controls the movements of the moving objects 502a to 502f on the race track object 501 so that they match the movements of the boats 402a to 402f at the boat race course 401.
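Linking the real positions to the moving objects in step S720 implies a coordinate mapping from the real course to the race track object's local space. A minimal normalized-scaling sketch (assumed shapes; a real implementation would also handle orientation and the AR anchor transform):

```python
def to_track_space(real_pos, course_size, track_size):
    """Map a position on the real course (metres from one corner) to the
    corresponding point on the virtual race track object, preserving the
    relative position along each axis."""
    rx, ry = real_pos
    cw, cd = course_size   # real course width / depth in metres
    tw, td = track_size    # track object width / depth in scene units
    return (rx / cw * tw, ry / cd * td)

# A boat halfway across a 300 m x 120 m course maps to the middle of
# a 1.0 x 0.4 track object placed on the detected plane.
print(to_track_space((150.0, 60.0), (300.0, 120.0), (1.0, 0.4)))  # (0.5, 0.2)
```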
In step S730, when the control unit 190 accepts a user's selection operation on a specific moving object among the plurality of moving objects 502a to 502f, the processing of step S740 is executed.
In step S740, the control unit 190 displays detailed contestant information corresponding to the moving object 502 selected by the user in step S730, together with the avatar information. Next, when the control unit 190 accepts an operation input for ending the display of the detailed contestant information and the avatar information, it ends the detailed display of this information in step S750. The control unit 190 ends the series of display control processes in response to, for example, accepting an operation input for terminating the application program 151.
Here, the processes of steps S740 and S750 will be described in detail with reference to FIGS. 13 and 14. FIG. 13 is a schematic diagram showing an example of a method of displaying detailed contestant information together with avatar information: it shows, for example, the screen displayed on the display 132 when the user performs an operation (for example, a tap operation) selecting the moving object 502a in the state of FIG. 11, that is, an example of the screen with the detailed information displayed.
In FIG. 13, moving objects 502a and 502b are displayed on the race field object 501. In addition, a detailed information object 1310 is displayed around the moving object 502a. When the moving object 502a moves, the detailed information object 1310 may move so as to follow it.
The detailed information object 1310 includes a name field 1320, an avatar display field 1330, a chart display field 1340, and an end button 1350. The name field 1320 shows the name of the racer aboard the moving object 502a. The avatar display field 1330 shows, for example, an image of the avatar object registered in association with the racer (contestant information) aboard the moving object 502a. The image of the avatar object displayed in the avatar display field 1330 may be a two-dimensional image or a three-dimensional image. When the contestant information changes, the display mode of the avatar object shown in the avatar display field 1330 may also be changed, in particular according to the current rank. For example, a smiling avatar object may be displayed for a high rank, and a crying avatar object for a low rank.
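The rank-dependent display mode just described could be implemented along these lines. This is a minimal sketch; the half-field threshold and the expression labels are assumptions, not from the specification.

```python
# Minimal sketch of switching the avatar expression shown in the avatar
# display field 1330 by current rank: the upper half of the field gets a
# smiling avatar, the lower half a crying one. Threshold and labels are
# illustrative assumptions only.

def avatar_expression(rank, field_size=6):
    """Pick an avatar expression from the current rank (1 = leader)."""
    if rank <= field_size // 2:
        return "smiling"
    return "crying"
```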
The chart display field 1340 shows a graph (for example, a radar chart) presenting the characteristics of the racer aboard the moving object 502a. For example, the chart display field 1340 shows the racer's characteristic data for a plurality of items such as "win rate", "top-two rate", "starts", "speed", and "turn". The information displayed in the chart display field 1340 may be a graph that the control unit 190 has generated from contestant information obtained as text information, or contestant information that was already in graph form when acquired in step S640.
The end button 1350 is a virtual button for hiding the detailed information object 1310 in step S750. The detailed information object 1310 may also be hidden by, for example, a tap operation outside the detailed information object 1310.
In FIG. 13, the detailed information object 1310 is provided with a chart display field 1340 showing a radar chart, but the present invention is not limited to this example. FIG. 14 is a schematic diagram showing an example of a screen on which another example of the detailed information object is displayed.
In the example shown in FIG. 14, the detailed information object 1410 includes the name field 1320, the avatar display field 1330, and the end button 1350, but a detailed information display field 1440 is displayed in place of the chart display field 1340. The detailed information display field 1440 shows detailed contestant information as text information. The examples of FIGS. 13 and 14 may also be combined so that the detailed information object shows, in addition to the avatar display field 1330, both the chart display field 1340 and the detailed information display field 1440.
The processes of steps S710 to S750 are repeated at least from the start to the end of the race in the real world, but may also be repeated before the start and after the end of the race in the real world.
When the race displayed as a virtual race is a past race, the position information and the like of the boats 402a to 402f from the start to the end of the race may be acquired collectively before the virtual race starts. Alternatively, the virtual race may be displayed by acquiring only race time information, without acquiring the position information and the like. The race displayed as a virtual race may also be an exhibition race.
After the race in the real world ends, in step S760 the control unit 190 executes a result display process, and ends the series of display control processes in response to, for example, accepting an operation input for terminating the application program 151.
(Second operation example)
Next, a second operation example in the system 1 will be described with reference to FIGS. 15 to 17. FIG. 15 is a flowchart showing an example of the display control process in the second operation example. Steps S710 and S720 in FIG. 15 are the same as steps S710 and S720 shown in FIG. 7, so their detailed description is omitted.
In step S1510, the control unit 190 acquires selection information. Specifically, the control unit 190 acquires, as selection information, information indicating the boat tickets or betting tickets purchased by the user.
In step S1520, the control unit 190 registers avatar information on the selected moving object based on the acquired selection information. In step S1530, the control unit 190 changes the display of the moving object based on the registered avatar information.
The display process of step S1530 will now be described in detail with reference to FIG. 16. FIG. 16 is a schematic diagram showing an example of the display screen in step S1530. In the example shown in FIG. 16, assume that the user has purchased a boat ticket for the real-world boat 402a corresponding to the moving object 502a. The control unit 190 acquires, as selection information, the fact that the user has purchased the ticket for the boat 402a, and registers the user's avatar information in association with the moving object 502a corresponding to the boat 402a. The registered avatar information is, for example, the avatar information updated in step S630 based on the editing information acquired in step S620. Next, the control unit 190 changes the image of the racer displayed on the moving object 502a to an avatar object based on the avatar information registered in association with the moving object 502a. In the example of FIG. 16, the racer on the moving object 502a is displayed changed into a monster-shaped avatar object 1620 (an example of a first avatar object) based on the editing information updated by the user.
On the other hand, an avatar object 1610 in the form of an ordinary racer (an example of a second avatar object) is displayed on the moving object 502b corresponding to the boat 402b, for which the user has not purchased a ticket. That is, based on the acquired selection information, the moving object 502a representing a contestant or the like selected by the user is displayed in a display mode different from that of the moving object 502b representing a contestant or the like not selected by the user. When the moving objects 502a and 502b move, the avatar object 1620 on the moving object 502a and the avatar object 1610 on the moving object 502b move together with them.
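The selection-based registration and display change of steps S1510 to S1530 can be sketched as follows. This is an illustration only; the representation of tickets as a set of boat identifiers and of avatars as labels is an assumption, not part of the specification.

```python
# Illustrative sketch of steps S1510-S1530 (not from the specification):
# tickets the user purchased serve as selection information, and the user's
# edited avatar is registered on the corresponding moving objects, while
# the other objects keep a default racer avatar.

def register_avatars(purchased_boats, user_avatar, all_boats,
                     default_avatar="racer"):
    """Map each moving object to the avatar object it should display."""
    return {boat: (user_avatar if boat in purchased_boats else default_avatar)
            for boat in all_boats}
```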
Returning to FIG. 15, in step S1540 the control unit 190 detects the occurrence of a predetermined event in a predetermined race. When a predetermined event occurs in a predetermined race in the real world, the control unit 290, for example, detects the occurrence of the event based on image analysis of the race or information received from the second server device 40, together with the information stored in the event information 252B, and transmits event information indicating that the occurrence of the event has been detected to the user terminal 10. In step S1540, the control unit 190 detects the occurrence of the event based on, for example, receiving this event information during the race.
Also, in step S1540, the control unit 190 may detect the occurrence of a predetermined event based on, for example, the first information 152A or the user information 153, without relying on the transmission of event information from the first server device 20 during the race. For example, the control unit 190 may detect the occurrence of a predetermined event based on the position information of the contestants or the like and the time information corresponding to that position information.
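Local event detection of this kind might look like the following. This is a sketch under assumptions: representing each boat's course progress as a value normalized to 0.0-1.0, and the 0.9 final-straight threshold, are both illustrative and not from the specification.

```python
# Illustrative sketch of detecting an event locally in step S1540 from
# position/time information rather than from server-sent event data. A
# "final straight" event fires for boats whose normalized course progress
# passes a threshold; the representation and threshold are assumptions.

def detect_final_straight(progress_by_boat, threshold=0.9):
    """Return the boats that have reached the final-straight threshold."""
    return {boat for boat, p in progress_by_boat.items() if p >= threshold}
```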
In step S1550, the control unit 190 adds a specific action to the avatar information registered on the moving object 502. When event information has been received from the first server device 20 in step S1540 and avatar information corresponding to that event information is registered, an action object is displayed in step S1550 based on, for example, that avatar information. Alternatively, only information identifying the event, such as an event ID, may be received from the first server device 20; in that case, for example, the information corresponding to the acquired event ID or the like is acquired from the event information 152B, and the action object may be displayed based on that event information 152B.
The display process for the action object in step S1550 will now be described in detail with reference to FIG. 17. FIG. 17 is a schematic diagram showing an example of the display screen in step S1550.
In the example shown in FIG. 17, an action object (action image) is displayed for the moving object 502a corresponding to the boat 402a for which the user has purchased a boat ticket (voting right). Specifically, the display 132 shows an action object 1710 imitating flames for the monster object 1620 displayed on the moving object 502a. This action object 1710 is added to the contestant who is the subject of the voting right purchased by the user, for example, upon the occurrence of a final-straight event.
The addition of the action object 1710 is not limited to the above example; it may be added in response to, for example, detecting the occurrence of an event other than the final-straight event (for example, detecting that the contestant who is the subject of the voting right has won or lost, that the contestant's acceleration has increased, or that the contestant is faster than the other contestants). The action added to the avatar object may also be changed according to, for example, the current rank.
The action object added when an event occurs can be selected by the user on the avatar editing screen 800 of FIG. 8. For example, when the contestant who is the subject of the voting right wins, an action object representing the winning action that the user selected on the avatar editing screen 800 of FIG. 8, from among the plurality of options displayed in the "winning action" column of the selection table 900 shown in FIG. 9, is displayed. Similarly, when the contestant who is the subject of the voting right loses, an action object representing the losing action that the user selected on the avatar editing screen 800 of FIG. 8, from among the plurality of options displayed in the "losing action" column of the selection table 900, is displayed. The action object may also be added not to the moving object 502 but to another virtual object such as the race field object 501.
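The lookup from a detected event to the action object the user selected on the editing screen could be sketched as follows. All event keys and action names here are hypothetical; the specification does not define this data shape.

```python
# Hypothetical sketch of resolving which action object to show when an
# event is detected: the user's picks from the avatar editing screen
# (e.g. the "winning action"/"losing action" columns of selection table
# 900) are stored per event type, and the detected event selects one.

def resolve_action(event, user_choices, default=None):
    """Return the action object the user selected for this event type,
    or a default when no choice was registered."""
    return user_choices.get(event, default)
```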
In the example of FIG. 17, the action object 1710 is displayed only when an event occurs, but depending on the type of action, the action object may be displayed at all times near the moving object of the contestant who is the subject of the voting right purchased by the user. That is, in step S1540, the occurrence of an event may be detected based on the user having purchased a voting right for a predetermined contestant.
The association between the moving object 502 and the action object is not limited to the above example; the action object may be added, for example, to a contestant who is not the subject of a voting right purchased by the user. In this case, the association between the moving object 502 and the action object may be set in advance by the administrator or the like of the first server device 20, or may be set by the user of the user terminal 10. For example, the user of the user terminal 10 may have an action object displayed for a moving object that the user has set as a "favorite".
The action object displayed for the moving object 502 may also be configured so that the user can select it from among a plurality of objects set as selection candidates. The types of candidate objects may be increased according to, for example, the user's payments.
As in FIG. 7, the processes of steps S710 to S1550 in FIG. 15 are repeated at least from the start to the end of the race in the real world, but may also be repeated before the start and after the end of the race in the real world. The process of step S760 in FIG. 15 is the same as the process of step S760 in FIG. 7, so its detailed description is omitted.
The above embodiments are merely examples for facilitating the understanding of the present invention and are not intended to limit its interpretation. The present invention can be modified and improved without departing from its spirit, and it goes without saying that the present invention includes equivalents thereof.
[Additional notes]
The contents of the present disclosure are listed below.
(Item 1)
A program executed on a first computer including a processor and an imaging unit, the program causing the processor to execute:
a step of receiving, from a second computer, first information on a predetermined race in the real world;
a step of generating, based on the first information, a virtual object for presenting second information on the predetermined race to a user of the first computer; and
a step of displaying the virtual object superimposed on a real image of the surroundings of the first computer captured by the imaging unit,
wherein the first information includes at least contestant information on at least one of a contestant and a moving body in the predetermined race,
the virtual object includes at least a moving object corresponding to the contestant or the moving body, and
the displaying step includes displaying at least part of the contestant information in association with a predetermined avatar corresponding to the moving object, in response to the user's first operation input on the corresponding moving object.
With this program, the virtual race displayed on the computer can raise the recognition of the contestants in the real-world race corresponding to the virtual race. It also increases the user's sense of familiarity with the contestants, which can in particular improve the satisfaction of users who are new to race watching and broaden the fan base for race watching to younger generations.
(Item 2)
The program according to item 1, wherein the first information further includes position information of the contestant or the moving body during the predetermined race and time information corresponding to the position information, and
the displaying step includes displaying a virtual race that virtually represents the predetermined race by moving the moving object based on the position information and the time information, and displaying at least part of the moving object so as to include the predetermined avatar in the virtual race.
This improves the visibility and appeal of the virtual race and further increases the user's satisfaction.
(Item 3)
The program according to item 1 or 2, further causing the processor to execute a step of acquiring one or more pieces of selection information indicating one or more of the contestants or moving bodies selected by the user,
wherein the displaying step includes displaying, based on the selection information, a first avatar among the predetermined avatars representing the one or more contestants or moving bodies selected by the user in a display mode different from that of a second avatar representing one or more contestants or moving bodies not selected by the user.
By differentiating the display mode of avatars selected by the user from that of avatars not selected by the user, the visibility and appeal of the virtual race can be further improved.
(Item 4)
The program according to item 2 or 3, wherein the displaying step includes changing the display mode of the predetermined avatar according to the current rank in the predetermined race.
This further improves the visibility and appeal of the virtual race.
(Item 5)
The program according to any one of items 1 to 4, wherein the first computer further includes a storage unit that stores a database relating to the predetermined avatar, and
the program further causes the processor to execute a step of editing the predetermined avatar based on a plurality of items preset in the database, in response to the user's second operation input.
By letting the user edit the avatar according to his or her preferences, the user's sense of familiarity with the contestants and the like can be increased.
(Item 6)
The program according to any one of items 1 to 5, wherein the contestant information displayed in the displaying step includes at least a characteristic chart of the contestant or the moving body.
This enables the user to grasp the characteristics of each moving object. Since characteristic information is usually among the information of greatest interest to the user, displaying it together with the moving object can improve the user's satisfaction.
(Item 7)
The program according to any one of items 1 to 6, wherein the receiving step is executed at a predetermined frequency or when a specific event occurs, and the contestant information can be updated by execution of the receiving step.
Even when the contestant information changes over time, the updated contestant information can thus be delivered to the user.
(Item 8)
An information processing method executed on a first computer including a processor and an imaging unit, the method including causing the processor to execute:
a step of receiving, from a second computer, first information on a predetermined race in the real world;
a step of generating, based on the first information, a virtual object for presenting second information on the predetermined race to a user of the first computer; and
a step of displaying the virtual object superimposed on a real image of the surroundings of the first computer captured by the imaging unit,
wherein the first information includes at least contestant information on at least one of a contestant and a moving body in the predetermined race,
the virtual object includes at least a moving object corresponding to the contestant or the moving body, and
the displaying step includes displaying at least part of the contestant information in association with a predetermined avatar corresponding to the moving object, in response to the user's first operation input on the corresponding moving object.
With this method, the virtual race displayed on the computer can raise the recognition of the contestants in the real-world race corresponding to the virtual race. It also increases the user's sense of familiarity with the contestants, which can in particular improve the satisfaction of users who are new to race watching and broaden the fan base for race watching to younger generations.
(Item 9)
An information processing device including a processor and an imaging unit, wherein the processor:
receives, from a second computer, first information on a predetermined race in the real world;
generates, based on the first information, a virtual object for presenting second information on the predetermined race to a user of the first computer; and
displays the virtual object superimposed on a real image of the surroundings of the first computer captured by the imaging unit,
wherein the first information includes at least contestant information on at least one of a contestant and a moving body in the predetermined race,
the virtual object includes at least a moving object corresponding to the contestant or the moving body, and
the displaying step includes displaying at least part of the contestant information in association with a predetermined avatar corresponding to the moving object, in response to the user's first operation input on the corresponding moving object.
With this device, the virtual race displayed on the computer can raise the recognition of the contestants in the real-world race corresponding to the virtual race. It also increases the user's sense of familiarity with the contestants, which can in particular improve the satisfaction of users who are new to race watching and broaden the fan base for race watching to younger generations.
(Item 10)
A system realized by a first computer including a first processor and an imaging device, and a second computer including a second processor and communicably connectable to the first computer, wherein
the second processor acquires first information on a predetermined race in the real world and transmits the first information to the first computer, and
the first processor:
receives the first information from the second computer;
generates, based on the first information, a virtual object for presenting second information on the predetermined race to a user of the first computer; and
displays the virtual object superimposed on a real image of the surroundings of the first computer captured by the imaging unit,
wherein the first information includes at least contestant information on at least one of a contestant and a moving body in the predetermined race,
the virtual object includes at least a moving object corresponding to the contestant or the moving body, and
the superimposed display includes displaying at least part of the contestant information in association with a predetermined avatar corresponding to the moving object, in response to the user's first operation input on the corresponding moving object.
With this system, the virtual race displayed on the computer can raise the recognition of the contestants in the real-world race corresponding to the virtual race. It also increases the user's sense of familiarity with the contestants, which can in particular improve the satisfaction of users who are new to race watching and broaden the fan base for race watching to younger generations.
(Item 11)
The system according to Item 10, wherein
the first processor further acquires one or more pieces of selection information indicating one or more contestants or moving bodies selected by the user;
the first information includes position information of the contestant or the moving body during the predetermined race and time information corresponding to the position information;
the displaying includes displaying a virtual race that virtually represents the predetermined race by moving the moving objects, and displaying at least some of the moving objects in the virtual race so as to include the predetermined avatar; and
the position information is acquired based on a position sensor provided on the contestant or the moving body, or on images of the predetermined race captured by an imaging device during the race.
This makes it easy to obtain the position information of the contestants and the like. As a result, the standings and the jockeying for position during the real-world race can be reproduced in the virtual race, improving user satisfaction.
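Reproducing the real-world race from the position information and its corresponding time information amounts to replaying timestamped position samples. A minimal sketch, assuming 2D positions and linear interpolation between samples (the function name and sample layout are illustrative, not from the specification):

```python
from bisect import bisect_right

def position_at(samples: list[tuple[float, tuple[float, float]]],
                t: float) -> tuple[float, float]:
    """Interpolate a moving object's 2D position at time t from
    (time, (x, y)) samples sorted by ascending time."""
    times = [s[0] for s in samples]
    if t <= times[0]:
        return samples[0][1]   # clamp before the first sample
    if t >= times[-1]:
        return samples[-1][1]  # clamp after the last sample
    i = bisect_right(times, t)
    t0, (x0, y0) = samples[i - 1]
    t1, (x1, y1) = samples[i]
    a = (t - t0) / (t1 - t0)
    return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))
```

Evaluating `position_at` for every moving object at each rendered frame yields the virtual race animation; the same interpolated positions also give the current ranking used to vary the avatar display mode.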
1: system, 10: user terminal, 20: first server device, 30: network, 40: second server device, 130: touch screen, 150: storage unit (of user terminal), 190: control unit (of user terminal), 191: operation input reception unit, 192: transmission/reception unit, 193: object generation unit, 194: display control unit, 195: flat surface detection unit, 196: event detection unit, 197: selection information acquisition unit, 250: storage unit (of first server device), 290: control unit (of first server device), 401: boat race stadium, 402a to 402f: boats, 501: race stadium object, 502 (502a to 502f): moving objects, 1310: detailed information object, 1330: avatar display field, 1340: chart display field, 1610, 1620: avatar objects, 1710: action object

Claims (11)

  1. A program executed on a first computer including a processor and an imaging unit, the program causing the processor to execute:
    a step of receiving, from a second computer, first information about a predetermined race in the real world;
    a step of generating, based on the first information, a virtual object for presenting second information about the predetermined race to a user of the first computer; and
    a step of displaying the virtual object superimposed on a real image of the surroundings of the first computer captured by the imaging unit, wherein
    the first information includes at least contestant information about at least one of a contestant and a moving body in the predetermined race,
    the virtual object includes at least a moving object corresponding to the contestant or the moving body, and
    the displaying step includes displaying, in response to the user's first operation input on the corresponding moving object, at least part of the contestant information in association with a predetermined avatar corresponding to that moving object.
  2. The program according to claim 1, wherein
    the first information further includes position information of the contestant or the moving body during the predetermined race and time information corresponding to the position information, and
    the displaying step includes displaying a virtual race that virtually represents the predetermined race by moving the moving object based on the position information and the time information, and displaying at least some of the moving objects in the virtual race so as to include the predetermined avatar.
  3. The program according to claim 1 or 2, further causing the processor to execute a step of acquiring one or more pieces of selection information indicating one or more contestants or moving bodies selected by the user, wherein
    the displaying step includes displaying, based on the selection information, a first avatar representing the one or more contestants or moving bodies selected by the user in a display mode different from that of a second avatar representing one or more contestants or moving bodies not selected by the user.
  4. The program according to claim 2 or 3, wherein the displaying step includes changing the display mode of the predetermined avatar according to the current ranking in the predetermined race.
  5. The program according to any one of claims 1 to 4, wherein
    the first computer further includes a storage unit storing a database relating to the predetermined avatar, and
    the program further causes the processor to execute a step of editing, in response to the user's second operation input, the predetermined avatar based on a plurality of items preset in the database.
  6. The program according to any one of claims 1 to 5, wherein the contestant information displayed in the displaying step includes at least a characteristic chart of the contestant or the moving body.
  7. The program according to any one of claims 1 to 6, wherein the receiving step is executed at a predetermined frequency or when a specific event occurs, and the contestant information can be updated each time the receiving step is executed.
  8. An information processing method executed on a first computer including a processor and an imaging unit, the method comprising causing the processor to execute:
    a step of receiving, from a second computer, first information about a predetermined race in the real world;
    a step of generating, based on the first information, a virtual object for presenting second information about the predetermined race to a user of the first computer; and
    a step of displaying the virtual object superimposed on a real image of the surroundings of the first computer captured by the imaging unit, wherein
    the first information includes at least contestant information about at least one of a contestant and a moving body in the predetermined race,
    the virtual object includes at least a moving object corresponding to the contestant or the moving body, and
    the displaying step includes displaying, in response to the user's first operation input on the corresponding moving object, at least part of the contestant information in association with a predetermined avatar corresponding to that moving object.
  9. An information processing device including a processor and an imaging unit, wherein the processor:
    receives, from a second computer, first information about a predetermined race in the real world;
    generates, based on the first information, a virtual object for presenting second information about the predetermined race to a user of the information processing device; and
    displays the virtual object superimposed on a real image of the surroundings of the information processing device captured by the imaging unit, wherein
    the first information includes at least contestant information about at least one of a contestant and a moving body in the predetermined race,
    the virtual object includes at least a moving object corresponding to the contestant or the moving body, and
    the displaying includes displaying, in response to the user's first operation input on the corresponding moving object, at least part of the contestant information in association with a predetermined avatar corresponding to that moving object.
  10. A system realized by a first computer including a first processor and an imaging unit, and a second computer including a second processor and communicably connectable to the first computer, wherein
    the second processor acquires first information about a predetermined race in the real world and transmits the first information to the first computer;
    the first processor receives the first information from the second computer, generates, based on the first information, a virtual object for presenting second information about the predetermined race to a user of the first computer, and displays the virtual object superimposed on a real image of the surroundings of the first computer captured by the imaging unit;
    the first information includes at least contestant information about at least one of a contestant and a moving body in the predetermined race;
    the virtual object includes at least a moving object corresponding to the contestant or the moving body; and
    the superimposed display includes displaying, in response to the user's first operation input on the corresponding moving object, at least part of the contestant information in association with a predetermined avatar corresponding to that moving object.
  11. The system according to claim 10, wherein
    the first processor further acquires one or more pieces of selection information indicating one or more contestants or moving bodies selected by the user;
    the first information includes position information of the contestant or the moving body during the predetermined race and time information corresponding to the position information;
    the displaying includes displaying a virtual race that virtually represents the predetermined race by moving the moving object, and displaying at least some of the moving objects in the virtual race so as to include the predetermined avatar; and
    the position information is acquired based on a position sensor provided on the contestant or the moving body, or on images of the predetermined race captured by an imaging device during the race.
PCT/JP2021/042180 2020-12-04 2021-11-17 Program, information processing method, information processing device, and system WO2022118652A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020201882A JP6857774B1 (en) 2020-12-04 2020-12-04 Programs, information processing methods, information processing devices, and systems
JP2020-201882 2020-12-04

Publications (1)

Publication Number Publication Date
WO2022118652A1 true WO2022118652A1 (en) 2022-06-09

Family

ID=75378005

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/042180 WO2022118652A1 (en) 2020-12-04 2021-11-17 Program, information processing method, information processing device, and system

Country Status (2)

Country Link
JP (2) JP6857774B1 (en)
WO (1) WO2022118652A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022174992A (en) * 2021-05-12 2022-11-25 株式会社コロプラ Program, information processing method, information processing device, and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001251671A (en) * 2000-03-03 2001-09-14 Matsushita Electric Ind Co Ltd Mutual position information notice system and mutual position information notice method
JP2003204481A (en) * 2001-08-16 2003-07-18 Space Tag Inc Image-information distribution system
JP2012157510A (en) * 2011-01-31 2012-08-23 Fujitsu Frontech Ltd Passage position display system, terminal device for the same, and program
JP2019003346A (en) * 2017-06-13 2019-01-10 株式会社Mgrシステム企画 Activity support method and activity support apparatus

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4059417B2 (en) 1999-04-27 2008-03-12 株式会社バンダイナムコゲームス GAME DEVICE, HOST DEVICE, AND INFORMATION STORAGE MEDIUM
JP2002210060A (en) 2001-01-16 2002-07-30 Nippon Hoso Kyokai <Nhk> Contest participation type exercising apparatus and center system therefor
JP2007089648A (en) 2005-09-27 2007-04-12 Nec Corp Distribution system of race information, portable device, distribution method, and program
WO2019021375A1 (en) 2017-07-25 2019-01-31 富士通株式会社 Video generation program, video generation method, and video generation device
US10872493B2 (en) 2018-04-30 2020-12-22 Igt Augmented reality systems and methods for sports racing


Also Published As

Publication number Publication date
JP2022089466A (en) 2022-06-16
JP2022089736A (en) 2022-06-16
JP6857774B1 (en) 2021-04-14
JP7434202B2 (en) 2024-02-20

Similar Documents

Publication Publication Date Title
JP6903805B1 (en) Programs, information processing methods, information processing devices, and systems
WO2022118652A1 (en) Program, information processing method, information processing device, and system
JP7212721B2 (en) Program, information processing method, information processing apparatus, and system
JP6857785B1 (en) Programs, information processing methods, information processing devices, and systems
JP6857776B1 (en) Programs, information processing methods, information processing devices, and systems
CN117769823A (en) In-game asset tracking using NFT tracking impressions across multiple platforms
JP6896932B1 (en) Programs, information processing methods, information processing devices, and systems
JP7303845B2 (en) Program, information processing method, information processing apparatus, and system
JP6934547B1 (en) Programs, information processing methods, information processing devices, and systems
WO2022014303A1 (en) Program, information processing method, information processing device, and system
JP7324801B2 (en) Program, information processing method, information processing apparatus, and system
JP6934552B1 (en) Programs, information processing methods, information processing devices, and systems
JP6866540B1 (en) Programs, information processing methods, information processing devices, and systems
JP6903806B1 (en) Programs, information processing methods, information processing devices, and systems
JP7303846B2 (en) Program, information processing method, information processing apparatus, and system
WO2022239403A1 (en) Program, information processing method, information processing device, and system
WO2022137376A1 (en) Method, computer-readable medium, and information processing device
JP7219792B2 (en) Program, information processing method, information processing apparatus, and system
JP2022138681A (en) Program, information processing method, information processing device and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21900407

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21900407

Country of ref document: EP

Kind code of ref document: A1