WO2020044749A1 - Video distribution system, video distribution method, and video distribution program for live-distributing a video including an animation of a character object generated based on the movement of a distribution user

Video distribution system, video distribution method, and video distribution program for live-distributing a video including an animation of a character object generated based on the movement of a distribution user

Info

Publication number
WO2020044749A1
WO2020044749A1 (PCT/JP2019/024876, JP2019024876W)
Authority
WO
WIPO (PCT)
Prior art keywords
gift
display
moving image
distribution
displayed
Prior art date
Application number
PCT/JP2019/024876
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
彩 倉淵
Original Assignee
グリー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2018159802A (JP6491388B1)
Priority claimed from JP2019035044A (JP6523586B1)
Priority claimed from JP2019083729A (JP6550549B1)
Application filed by グリー株式会社
Priority to KR1020217003200A (KR102490402B1)
Publication of WO2020044749A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00 Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/80 2D [Two Dimensional] animation, e.g. using sprites
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235 Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/458 Scheduling content for creating a personalised stream, e.g. by combining a locally stored advertisement with an incoming stream; Updating operations, e.g. for OS modules; time-related management operations
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/4722 End-user interface for requesting content, additional data or services, for requesting additional data associated with the content
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/8146 Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics

Definitions

  • the disclosure in this specification relates to a moving image distribution system, a moving image distribution method, and a moving image distribution program for live distribution of a moving image including animation of a character object generated based on a movement of a distribution user.
  • A content distribution system is known that displays, on a display screen, a gift object corresponding to a gift purchased by a viewing user in response to a request from the viewing user viewing the content.
  • In such a content distribution system, a viewing user can purchase a gift item and provide the purchased gift item to the distribution user as a gift.
  • the gift is displayed on the moving image being distributed in response to a display request from the viewing user.
  • the gift display request may be automatically generated in response to the gift being purchased by the viewing user.
  • the gift is automatically displayed on the moving image being distributed in response to the display request of the gift from the viewing user. Therefore, the gift may be displayed on the moving image at a timing not desired by the distribution user. As a result, there is a problem that the performance through the character object is hindered for the distribution user. For example, if a gift is displayed prominently in a moving image during a performance, the performance cannot be sufficiently expressed in the moving image.
  • Further, because the gift may be displayed in the moving image at a timing that does not match the wishes of the distribution user, it is difficult to use the main part of the moving image (for example, the area around the character object) as a display area for the gift.
  • the purpose of the present disclosure is to provide a technical improvement that solves or alleviates at least some of the above-mentioned problems of the prior art.
  • One of the more specific objects of the present disclosure is to provide a moving image distribution system, a moving image distribution method, and a moving image distribution program that can display a gift at a timing desired by the distribution user during distribution of a moving image.
  • One embodiment of the present invention relates to a moving image distribution system that performs live distribution of a moving image including animation of a character object generated based on a movement of a distribution user.
  • The video distribution system includes one or more computer processors, and the one or more computer processors, by executing computer-readable instructions, display a display instruction object on the distribution user device used by the distribution user in response to receiving a first display request related to a first gift from a viewing user, and display the first gift in the moving image in response to an operation on the display instruction object.
  • The first gift is a wearing gift associated with a wearing part of the character object, and the wearing gift is displayed in the moving image at a position corresponding to the wearing part of the character object in response to an operation on the display instruction object.
  • The wearing gift includes a first wearing gift associated with a first wearing part among the wearing parts, and a first display time is set for the first wearing gift displayed in the moving image.
  • When the one or more computer processors display the first wearing gift in the moving image, display in the moving image of a second wearing gift associated with the first wearing part is prohibited until the first display time elapses.
  • Until the first display time elapses, the one or more computer processors may deactivate the display instruction object for displaying the second wearing gift associated with the first wearing part.
  • In response to receiving, from the viewing user, a message gift associated with a message, the message is displayed on the distribution user device, and the message gift is displayed in the moving image in response to a display instruction operation on the distribution user device.
  • The voice of the distribution user is synthesized with the moving image, and in response to receiving, from the viewing user, a voice change gift for changing the voice of the distribution user, a voice change instruction object is displayed on the distribution user device; the voice of the distribution user is changed to the voice specified by the voice change gift in accordance with an operation on the voice change instruction object.
  • In response to receiving, from a viewing user viewing the moving image, a second display request related to a second gift that is displayed in the moving image without being associated with a specific part of the character object, the second gift is displayed in the moving image.
  • a gift display prohibition period is set within the distribution period of the moving image, and the second gift is displayed on the moving image at a timing other than the gift display prohibition period in the distribution period of the moving image.
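  • As a minimal illustration of such a prohibition period check (a hypothetical sketch; the class and function names are assumptions, not part of this disclosure), a server could decide whether a second gift may currently be displayed as follows:

    from dataclasses import dataclass

    @dataclass
    class ProhibitionPeriod:
        start_sec: float  # offset from the start of distribution of the moving image
        end_sec: float

    def can_display_second_gift(now_sec: float, periods: list[ProhibitionPeriod]) -> bool:
        # The second gift may be displayed only at timings outside every gift display prohibition period.
        return all(not (p.start_sec <= now_sec < p.end_sec) for p in periods)

    # Example: a single prohibition period from 60 s to 120 s of the distribution period.
    periods = [ProhibitionPeriod(60.0, 120.0)]
    print(can_display_second_gift(30.0, periods))   # True  - outside the prohibition period
    print(can_display_second_gift(90.0, periods))   # False - display is deferred or rejected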
  • One embodiment of the present invention relates to a moving image distribution method for performing live distribution of a moving image including animation of a character object generated based on a movement of a distribution user by executing computer-readable instructions by one or a plurality of computer processors.
  • The moving image distribution method includes: displaying a display instruction object on a distribution user device used by the distribution user in response to receiving a first display request related to a first gift from the viewing user; and displaying the first gift in the moving image in response to an operation on the display instruction object.
  • One embodiment of the present invention relates to a moving image distribution program that performs live distribution of a moving image including animation of a character object generated based on a movement of a distribution user.
  • The moving image distribution program causes one or more computer processors to execute: a step of displaying a display instruction object on a distribution user device used by the distribution user in response to receiving a first display request related to a first gift from the viewing user; and a step of displaying the first gift in the moving image in response to an operation on the display instruction object.
  • According to the embodiments described above, a gift can be displayed at a timing desired by the distribution user during distribution of a moving image.
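  • To make the flow above concrete, the following is a minimal, hypothetical sketch (the class and method names are assumptions and do not appear in this disclosure): a first display request produces a display instruction object on the distribution user device, and the first gift is displayed in the moving image only when the distribution user operates that object.

    class GiftRequestProcessor:
        def __init__(self):
            self.pending = {}  # display instruction object ID -> first gift (e.g. a wearing gift)

        def on_first_display_request(self, gift_id: str) -> str:
            # A first display request from a viewing user does not display the gift immediately;
            # it only adds a display instruction object to the distribution user device.
            object_id = f"instr-{len(self.pending) + 1}"
            self.pending[object_id] = gift_id
            print(f"distribution user device: show display instruction object {object_id} for {gift_id}")
            return object_id

        def on_display_instruction_operated(self, object_id: str) -> None:
            # Only when the distribution user operates the display instruction object
            # is the first gift displayed in the moving image.
            gift_id = self.pending.pop(object_id)
            print(f"moving image: display first gift {gift_id}")

    processor = GiftRequestProcessor()
    obj = processor.on_first_display_request("headband")
    processor.on_display_instruction_operated(obj)  # the gift appears at the distributor's chosen timing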
  • FIG. 1 is a block diagram illustrating a moving image distribution system according to an embodiment.
  • FIG. 2 is a schematic diagram schematically illustrating a distribution user who distributes a moving image distributed by the moving image distribution system of FIG. 1 and a distribution user device used by the distribution user.
  • FIG. 3A is a diagram showing an example of a display screen displayed on the viewing user device 10 in one embodiment.
  • FIG. 3B is a diagram showing an example of a display screen displayed on the distribution user device 20 in one embodiment.
  • FIG. 4 is a diagram showing an example of a display screen displayed on the viewing user device 10 in one embodiment; an example of a normal object is displayed on the display screen of FIG. 4.
  • FIG. 5A is a diagram showing an example of a display screen displayed on the viewing user device 10 in one embodiment; an example of the mounted object is displayed on the display screen of FIG. 5A.
  • FIG. 5B is a diagram showing an example of a display screen displayed on the distribution user device 20 in one embodiment; an example of the mounted object is displayed on the display screen of FIG. 5B.
  • FIG. 6 is a diagram showing an example of a display screen displayed on the distribution user device 20 in one embodiment; an example of the message confirmation window is displayed on the display screen of FIG. 6.
  • FIG. 7A is a diagram showing an example of a display screen displayed on the viewing user device 10 in one embodiment; an example of a message window is displayed on the display screen of FIG. 7A.
  • FIG. 7B is a diagram showing an example of a display screen displayed on the distribution user device 20 in one embodiment; an example of a message window is displayed on the display screen of FIG. 7B.
  • FIG. 8 is a flowchart showing the flow of a moving image distribution process in one embodiment.
  • FIG. 9 is a flowchart showing the flow of a process of displaying a normal gift in one embodiment.
  • FIG. 10 is a flow diagram showing the flow of a process of displaying a wearing gift in one embodiment.
  • FIG. 11 is a flowchart showing the flow of a process of displaying a message gift in one embodiment.
  • FIG. 12 is a diagram for explaining a gift display prohibition period set for a moving image distributed in the moving image distribution system of FIG. 1.
  • FIG. 1 is a block diagram illustrating the video distribution system 1 according to an embodiment.
  • FIG. 2 schematically illustrates the distribution user U1 who distributes a video distributed by the video distribution system 1 and the distribution user device used by the distribution user.
  • the moving image distribution system 1 includes the viewing user device 10, the distribution user device 20, the server device 60, and the storage 70.
  • the viewing user device 10, the distribution user device 20, the server device 60, and the storage 70 are communicably connected via a network 50.
  • the server device 60 is configured to distribute a moving image including an animation of the character of the distribution user U1, as described later.
  • This moving image is distributed from the server device 60 to the viewing user device 10 and the distribution user device 20.
  • the distributed moving image is displayed on the display of the viewing user device 10.
  • a viewing user who is a user of the viewing user device 10 can view the distributed moving image with the viewing user device 10.
  • The moving image distribution system 1 may include a plurality of viewing user devices. By viewing the distributed moving image on the distribution user device 20, the distribution user U1 can perform while checking the moving image being distributed.
  • the distribution user device 20 includes a computer processor 21, a communication I / F 22, a display 23, a camera 24, and a microphone 25.
  • the computer processor 21 is an arithmetic unit that loads an operating system and various programs for realizing various functions from a storage into a memory, and executes instructions included in the loaded programs.
  • the computer processor 21 is, for example, a CPU, an MPU, a DSP, a GPU, various arithmetic devices other than these, or a combination thereof.
  • the computer processor 21 may be realized by an integrated circuit such as an ASIC, a PLD, an FPGA, and an MCU. Although the computer processor 21 is illustrated as a single component in FIG. 1, the computer processor 21 may be a set of a plurality of physically separate computer processors.
  • the communication I / F 22 is implemented as hardware, firmware, communication software such as a TCP / IP driver or a PPP driver, or a combination thereof.
  • the distribution user device 20 can transmit and receive data to and from another device via the communication I / F 22.
  • the display 23 has a display panel and a touch panel.
  • the touch panel is configured to detect a player's touch operation (contact operation).
  • the touch panel can detect various touch operations such as tap, double tap, and drag of the player.
  • the touch panel may include a proximity sensor of a capacitance type, and may be configured to be able to detect a non-contact operation of the player.
  • the camera 24 continuously captures the image of the face of the distribution user U1, and acquires the image data of the face of the distribution user U1.
  • the image data of the face of the distribution user U1 imaged by the camera 24 is transmitted to the server device 60 via the communication I / F 22.
  • the camera 24 may be a 3D camera that can detect the depth of a person's face.
  • the microphone 25 is a sound collection device configured to convert input sound into sound data.
  • the microphone 25 is configured to be able to acquire the voice input of the distribution user U1.
  • the voice input of distribution user U1 acquired by microphone 25 is converted into voice data, and this voice data is transmitted to server device 60 via communication I / F22.
  • the viewing user device 10 may include the same components as the distribution user device 20.
  • the viewing user device 10 may include a computer processor, a communication I / F, a display, and a camera.
  • the viewing user device 10 and the distribution user device 20 are information processing devices such as smartphones.
  • Besides a smartphone, the viewing user device 10 and the distribution user device 20 may each be a mobile phone, a tablet terminal, a personal computer, an electronic book reader, a wearable computer, a game console, or any other information processing device capable of reproducing moving images.
  • Each of the viewing user device 10 and the distribution user device 20 may include a sensor unit including various sensors such as a gyro sensor and a storage for storing various information, in addition to the above-described components.
  • the server device 60 includes a computer processor 61, a communication I / F 62, and a storage 63.
  • the computer processor 61 is an arithmetic device that loads an operating system and various programs for realizing various functions from the storage 63 or another storage into a memory, and executes instructions included in the loaded programs.
  • the computer processor 61 is, for example, a CPU, an MPU, a DSP, a GPU, various arithmetic devices other than these, or a combination thereof.
  • the computer processor 61 may be realized by an integrated circuit such as an ASIC, a PLD, an FPGA, and an MCU.
  • Although the computer processor 61 is illustrated as a single component in FIG. 1, the computer processor 61 may be a set of a plurality of physically separate computer processors.
  • the communication I / F 62 is implemented as hardware, firmware, communication software such as a TCP / IP driver or a PPP driver, or a combination thereof.
  • the server device 60 can transmit and receive data to and from another device via the communication I / F 62.
  • the storage 63 is a storage device accessed by the computer processor 61.
  • the storage 63 is, for example, a magnetic disk, an optical disk, a semiconductor memory, or any other storage device capable of storing data.
  • Various programs can be stored in the storage 63. At least a part of the programs and various data that can be stored in the storage 63 may be stored in a storage physically separate from the server device 60 (for example, the storage 70).
  • In this specification, a program described as being executed by the computer processor 21 or the computer processor 61, or instructions included in such a program, may be executed by a single computer processor or may be executed in a distributed manner by a plurality of computer processors. Further, a program executed by the computer processor 21 or 61, or instructions included in the program, may be executed by a plurality of virtual computer processors.
  • the storage 63 stores model data 63a, object data 63b, and various other data necessary for generating and distributing other distribution moving images.
  • the model data 63a is model data for generating a character animation.
  • The model data 63a may be three-dimensional model data for generating a three-dimensional animation, or may be two-dimensional model data for generating a two-dimensional animation.
  • the model data 63a includes, for example, rig data (sometimes called “skeleton data”) indicating the skeleton of the character's face and parts other than the face, and surface data indicating the shape and texture of the character's surface.
  • the model data 63a can include a plurality of different model data.
  • the plurality of model data may have different rig data, or may have the same rig data.
  • the plurality of model data may have different surface data from each other, or may have the same surface data.
  • the object data 63b includes asset data for constructing a virtual space constituting a moving image.
  • The object data 63b includes data for drawing the background of the virtual space forming the moving image, data for drawing various objects displayed in the moving image, and data for drawing various other objects displayed in the moving image.
  • the object data 63b may include object position information indicating the position of the object in the virtual space.
  • the object data 63b may include gift objects other than the above.
  • the gift object is displayed on the moving image based on a gift display request from a viewing user who is watching the moving image.
  • the gift object may include an effect object corresponding to the effect gift, a normal object corresponding to the normal gift, a mounting object corresponding to the mounting gift, and a message object corresponding to the message gift.
  • the viewing user can purchase a desired gift.
  • the effect object indicating the effect gift is an object that affects the impression of the entire viewing screen of the distribution moving image, for example, an object imitating confetti.
  • An object imitating a confetti may be displayed on the entire viewing screen, thereby changing the impression of the entire viewing screen before and after the display.
  • the effect object may be displayed so as to overlap with the character object, but is different from the mounted object in that it is displayed without being associated with a specific part of the character object.
  • the normal object indicating the normal gift is an object indicating a gift from the viewing user to the distribution user (for example, the distribution user U1), and simulates, for example, a stuffed animal, a bouquet, an accessory, or an article suitable for a gift or present other than the above.
  • the normal object is displayed on the moving image display screen so as not to contact the character object.
  • the normal object is displayed on the display screen of the moving image so as not to overlap with the character object.
  • the normal object may be displayed so as to overlap with an object other than the character object in the virtual space.
  • the normal object may be displayed so as to overlap with the character object, but is different from the mounted object in that a display associated with a specific part of the character object is not performed.
  • In one aspect, when the normal object is displayed so as to overlap with the character object, the normal object is displayed so as to overlap with a part of the character object other than its head (the part including the face) and so as not to overlap with the head of the character object. In another aspect, when the normal object is displayed so as to overlap with the character object, the normal object is displayed so as to overlap with a part of the character object other than its upper body (the part including the face) and so as not to overlap with the upper body of the character object.
  • the wearing object indicating the wearing gift is an object displayed on the display screen in association with a specific part (wearing part) of the character object.
  • the mounted object displayed on the display screen in association with the specific part of the character object is displayed on the display screen so as to be in contact with the specific part of the character object.
  • the mounting object displayed on the display screen in association with the specific part of the character object is displayed on the display screen so as to cover part or all of the specific part of the character object.
  • the specific part may be specified by three-dimensional position information indicating a position in the three-dimensional coordinate space, or may be associated with position information in the three-dimensional coordinate space.
  • The specific part may be defined in units such as the front left side, front right side, rear left side, rear right side, center front side, center rear side, left eye, right eye, left ear, right ear, and the entire hair.
  • the mounting object is, for example, an accessory (such as a headband, a necklace, or an earring) mounted on the character object, clothing (such as a T-shirt), a costume, or any other object that can be mounted on a character object.
  • the object data 63b corresponding to the mounted object may include mounted part information indicating which part of the character object is associated with the mounted object.
  • the mounting part information of a certain mounting object can indicate which part of the character object the mounting object is mounted on. For example, when the mounted object is a headband, the mounted part information of the mounted object may indicate that the mounted object is mounted on the “head” of the character object.
  • the mounting part information on which the mounting object is mounted may be associated with a plurality of positions in the three-dimensional coordinate space.
  • the mounting part information indicating the position where the mounting object indicating “headband” is mounted may be associated with two parts of the character object, “head rear left” and “head rear right”. That is, the mounting object indicating “headband” may be mounted on both “head rear left” and “head rear right”.
  • the mounted part information of the mounted object may indicate that the mounted object is mounted on the “body” of the character object.
  • Two types of mounting objects having a common mounting part are displayed in the moving image at different times. That is, two types of mounting objects having a common mounting part are mounted on the character object at different times; in other words, they are not mounted on the character object simultaneously. For example, if the head is set as the mounting part of both the mounting object indicating a headband and the mounting object indicating a hat, the mounting object indicating the headband and the mounting object indicating the hat are not displayed at the same time (see the sketch below).
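  • The exclusion rule above can be sketched as follows (hypothetical names; the wearing part labels reuse examples from this description): a new mounting object may be displayed only if none of its mounting parts overlaps with those of mounting objects already worn.

    HEADBAND_PARTS = {"head rear left", "head rear right"}
    HAT_PARTS = {"head rear left", "head rear right"}

    def can_mount(new_parts: set, worn_parts_list: list) -> bool:
        # Two mounting objects sharing a mounting part must not be mounted simultaneously.
        return all(new_parts.isdisjoint(worn) for worn in worn_parts_list)

    currently_worn = [HEADBAND_PARTS]            # the headband is currently displayed
    print(can_mount(HAT_PARTS, currently_worn))  # False: the hat shares head parts with the headband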
  • the message object indicating the message gift includes a message from the viewing user.
  • the message object may be displayed on the moving image in a manner more prominent than a comment displayed in a comment display area 35 described later.
  • the message object may be displayed on the moving image for a longer time than the comment displayed in the comment display area 35.
  • a display time according to the type of each gift object may be set.
  • the display time of the mounted object may be set longer than the display time of the effect object and the display time of the normal object.
  • the display time of the mounted object may be set to 60 seconds
  • the display time of the effect object may be set to 5 seconds
  • the display time of the normal object may be set to 10 seconds.
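  • A small configuration sketch of these per-type display times (a hypothetical structure; the values simply mirror the examples above):

    DISPLAY_TIME_SEC = {
        "wearing": 60,  # mounted/wearing objects remain displayed the longest
        "normal": 10,
        "effect": 5,
    }

    def display_time_for(gift_type: str) -> int:
        # Look up how long a gift object of the given type remains displayed in the moving image.
        return DISPLAY_TIME_SEC[gift_type]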
  • the computer processor 21 functions as a face motion data generation unit 21a by executing computer-readable instructions included in the distribution program. At least a part of the functions realized by the computer processor 21 may be realized by a computer processor other than the computer processor 21 of the moving image distribution system 1. At least a part of the functions realized by the computer processor 21 may be realized by, for example, the computer processor 61 mounted on the server device 60.
  • the face motion data generation unit 21a generates face motion data, which is a digital representation of the motion of the face of the distribution user U1, based on the imaging data of the camera 24.
  • the face motion data is generated as needed with the passage of time.
  • the face motion data may be generated at predetermined sampling time intervals. In this way, the face motion data can digitally express the movement of the face (change in facial expression) of the distribution user U1 in a time-series manner.
  • the face motion data generated by the face motion data generation unit 21a is transmitted to the server device 60 via the communication I / F 22.
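  • A rough sketch of how face motion data might be sampled and transmitted (the capture and transmission functions are hypothetical placeholders; the actual devices and APIs are not specified here):

    import time

    def generate_face_motion_data(capture_frame, send_to_server, is_distributing, interval_sec=1 / 30):
        # Sample the camera at a fixed interval, wrap each sample with a timestamp so that the
        # face movement of the distribution user is expressed in a time-series manner, and
        # transmit it to the server while the distribution continues.
        while is_distributing():
            face_motion = {"timestamp": time.time(), "face": capture_frame()}
            send_to_server(face_motion)
            time.sleep(interval_sec)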
  • In one embodiment, body motion data, which is a digital representation of the position and orientation of each part of the body of the distribution user U1 other than the face, may also be generated.
  • the distribution user device 20 may transmit the body motion data to the server device 60 in addition to the face motion data.
  • the distribution user U1 may wear a motion sensor.
  • the distribution user device 20 may be configured to be able to generate body motion data based on detection information of a motion sensor mounted on the distribution user U1.
  • the body motion data may be generated at predetermined sampling time intervals. As described above, the body motion data represents the movement of the body of the distribution user U1 as digital data in time series.
  • the generation of the body motion data based on the detection information of the motion sensor mounted on the distribution user U1 may be performed, for example, in a shooting studio.
  • the shooting studio may be provided with a base station, a tracking sensor, and a display.
  • the base station may be a multi-axis laser emitter.
  • the motion sensor attached to the distribution user U1 may be, for example, Vive Tracker provided by HTC CORPORATION.
  • the base station provided in the shooting studio may be, for example, a base station provided by HTC CORPORATION.
  • a supporter computer may be installed in a room separate from the shooting studio.
  • The display of the shooting studio may be configured to display information received from the supporter computer.
  • the server device 60 may be installed in the same room where the supporter computer is installed.
  • the room in which the supporter computer is installed may be separated from the photography studio by a glass window.
  • the operator of the supporter computer (hereinafter, sometimes referred to as “supporter”) can visually recognize the distribution user U1.
  • the supporter computer may be configured to be able to change settings of various devices provided in the shooting studio according to the operation of the supporter.
  • the supporter computer can, for example, set a scan interval by the base station, set a tracking sensor, and change various settings of various other devices.
  • the supporter can input a message to the supporter computer and display the input message on the display of the photography studio.
  • The computer processor 61 functions as an animation generation unit 61a, a moving image generation unit 61b, a moving image distribution unit 61c, a gift request processing unit 61d, and a gift purchase processing unit 61e by executing computer-readable instructions included in the distribution program. At least a part of the functions realized by the computer processor 61 may be realized by a computer processor of the moving image distribution system 1 other than the computer processor 61. At least a part of the functions realized by the computer processor 61 may be realized by, for example, the computer processor 21 of the distribution user device 20 or by the computer processor of the viewing user device 10.
  • part or all of the functions of the animation generation unit 61a and the moving image generation unit 61b may be executed in the distribution user device 20.
  • the moving image generated in the distribution user device 20 may be transmitted to the server device 60 and distributed from the server device 60 to the viewing user device 10.
  • The animation generation unit 61a is configured to generate an animation of the character object by applying the face motion data generated by the face motion data generation unit 21a of the distribution user device 20 to predetermined model data included in the model data 63a. The animation generation unit 61a can generate an animation of the character object so that the expression of the character object changes based on the face motion data. Specifically, the animation generation unit 61a can generate an animation of a character object that moves in synchronization with the movement of the expression of the distribution user U1, based on the face motion data relating to the distribution user U1.
  • When body motion data relating to the distribution user U1 is available, the animation generation unit 61a can generate an animation of a character object that moves in synchronization with the movement of the body and the facial expression of the distribution user U1, based on the body motion data and the face motion data relating to the distribution user U1.
  • the moving image generator 61b generates a background image indicating the background using the object data 63b, and can generate a moving image including the background image and an animation of the character object corresponding to the distribution user U1.
  • the character object corresponding to the distribution user U1 is displayed so as to be superimposed on the background image.
  • The moving image generation unit 61b can synthesize, with the generated moving image, the voice of the distribution user U1 generated based on the audio data received from the distribution user device 20. As described above, the moving image generation unit 61b can generate an animation of the character object that moves in synchronization with the movement of the expression of the distribution user U1, and can generate a distribution moving image in which the voice of the distribution user U1 is synthesized with the animation.
  • the moving image distribution unit 61c distributes the moving image generated by the moving image generation unit 61b. This moving image is distributed to the viewing user device 10 and other viewing user devices via the network 50. The generated moving image is also distributed to the distribution user device 20. The received moving image is reproduced in the viewing user device 10 and the distribution user device 20.
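  • A simplified, hypothetical sketch of this server-side pipeline (all function names are assumptions used only for illustration): face motion data drives the character animation, the animation is composited over a background drawn from the object data, the distribution user's voice is mixed in, and the resulting frame is sent to each connected device.

    def apply_face_motion(model_data, face_motion):
        # placeholder for the animation generation unit 61a: deform the model using the motion data
        return {"model": model_data, "pose": face_motion}

    def render_background(object_data):
        return {"background": object_data}

    def composite(background, character):
        return {"frame": (background, character)}

    def mix_audio(video_frame, audio_chunk):
        return {"video": video_frame, "audio": audio_chunk}

    def distribute_frame(face_motion, audio_chunk, model_data, object_data, clients):
        character_animation = apply_face_motion(model_data, face_motion)              # animation generation
        video_frame = composite(render_background(object_data), character_animation)  # moving image generation
        frame_with_audio = mix_audio(video_frame, audio_chunk)                        # voice synthesis
        for client in clients:                                                        # distribution to devices
            client.send(frame_with_audio)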
  • FIG. 3A shows a display example of a moving image distributed from the moving image distribution unit 61c and reproduced on the viewing user device 10, and FIG. 3B shows a display example of a moving image distributed from the moving image distribution unit 61c and reproduced on the distribution user device 20.
  • As shown in FIG. 3A, a display image 30 of the moving image distributed from the server device 60 is displayed on the display of the viewing user device 10.
  • The display image 30 displayed on the viewing user device 10 includes a character object 31 generated by the animation generation unit 61a, a gift button 32, an evaluation button 33, a comment button 34, and a comment display area 35.
  • Since the character object 31 is generated by applying the face motion data of the distribution user U1 to the model data included in the model data 63a, the expression of the character object 31 changes in synchronization with the movement of the facial expression of the distribution user U1.
  • the character object 31 may be controlled so that parts other than its face change in synchronization with the movement of the body of the distribution user U1.
  • the gift button 32 is displayed on the display screen 30 so as to be selectable by operating the viewing user device 10.
  • the gift button 32 can be selected by, for example, a tap operation on an area on the touch panel of the viewing user device 10 where the gift button 32 is displayed.
  • When the gift button 32 is selected, a window for selecting a gift to be gifted to the distribution user who distributes the moving image being viewed is displayed on the display screen 30.
  • the viewing user can purchase a gift to be gifted from the gifts displayed in the window.
  • a window including a list of purchased gifts is displayed on the display screen 30 in response to the selection of the gift button 32. In this case, the viewing user can select a gift to be gifted from the gifts displayed in the window.
  • Gifts that can be gifted or purchased can include effect gifts, regular gifts, wearing gifts, message gifts, and other gifts.
  • the evaluation button 33 is displayed on the display screen 30 so as to be selectable by a viewing user who uses the viewing user device 10.
  • the evaluation button 33 can be selected by, for example, a tap operation on a region of the touch panel of the viewing user device 10 where the evaluation button 33 is displayed.
  • When the evaluation button 33 is selected, evaluation information indicating that the moving image has been positively evaluated may be transmitted to the server device 60.
  • the server device 60 can aggregate evaluation information from the viewing user device 10 and other viewing user devices.
  • the comment button 34 is displayed on the display screen 30 so as to be selectable by the viewing user.
  • When the comment button 34 is selected, a comment input window for inputting a comment is displayed on the display screen 30.
  • the viewing user can input a comment via the input mechanism of the viewing user device 10.
  • the input comment is transmitted from the viewing user device 10 to the server device 60.
  • the server device 60 accepts comments from the viewing user device 10 and other viewing user devices, and displays the comment in the comment display area 35 in the display image 30.
  • comments posted from the viewing user device 10 and other viewing user devices are displayed, for example, in chronological order.
  • the comment display area 35 occupies a part of the display screen 30.
  • There is an upper limit on the number of comments that can be displayed in the comment display area 35. In the illustrated example, up to three comments can be displayed in the comment display area 35. When comments exceeding the upper limit set for the comment display area 35 are posted, comments are deleted from the comment display area 35 in order of posting time, oldest first. For this reason, the display time of each comment in the comment display area 35 becomes shorter as the frequency of comments received from viewing users increases.
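  • The behaviour of the comment display area can be sketched with a bounded queue (a hypothetical client-side model, not the actual implementation): once the upper limit of three comments is exceeded, the oldest comment is removed.

    from collections import deque

    class CommentDisplayArea:
        def __init__(self, max_comments=3):
            # a deque with maxlen automatically evicts the oldest entry when full
            self.comments = deque(maxlen=max_comments)

        def post(self, comment):
            self.comments.append(comment)

    area = CommentDisplayArea()
    for c in ["nice!", "hello", "great show", "a gift is coming"]:
        area.post(c)
    print(list(area.comments))  # ['hello', 'great show', 'a gift is coming'] - the oldest comment was removed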
  • As shown in FIG. 3B, a display image 40 of the moving image distributed from the server device 60 is displayed on the display of the distribution user device 20.
  • The display image 40 displayed on the distribution user device 20 includes the character object 31 corresponding to the distribution user U1, display instruction buttons 42a to 42c for displaying wearing gifts requested to be displayed by viewing users, and the comment display area 35.
  • the display image 40 displayed on the distribution user device 20 includes the same background image, character object image, and comment as the display image 30 displayed on the viewing user device 10.
  • the display image 40 is different from the display image 30 in that it does not include the gift button 32, the evaluation button 33, and the comment button 34, but includes the display instruction buttons 42a to 42c.
  • the display instruction buttons 42a to 42c are displayed on the display screen 40 in response to receiving a display request for displaying a wearing gift described later from the viewing user.
  • three display instruction buttons 42a to 42c are displayed on the display image 40.
  • Each of the display instruction buttons 42a to 42c is displayed on the display screen 40 so as to be selectable by the distribution user.
  • When any one of the display instruction buttons 42a to 42c is selected by, for example, a tap operation, processing for displaying the wearing gift corresponding to the selected display instruction button is performed.
  • the display instruction buttons 42a to 42c are display instruction objects for giving an instruction to display the wearing gift on the moving image being distributed.
  • the display instruction buttons 42a to 42c may be referred to as display instruction objects 42a to 42c.
  • When it is not necessary to distinguish the display instruction objects 42a to 42c from each other, they may be simply referred to as the display instruction objects 42.
  • the display screen 40 may be displayed on the supporter computer.
  • the display instruction objects 42a to 42c may be selected according to the operation of the supporter computer by the supporter.
  • Each time a display request for a wearing gift is received from a viewing user, a display instruction object 42 corresponding to the display request is added to the display screen 40.
  • the maximum number of display instruction objects 42 that can be displayed on the display screen 40 is three.
  • the display screen 40 has a display area in which three display instruction objects can be displayed.
  • the display instruction objects 42 corresponding to the fourth and subsequent display requests are not displayed on the display screen 40.
  • The display instruction object 42 corresponding to the fourth accepted display request for a wearing gift is displayed on the display screen 40 when any one of the three display instruction objects 42 already displayed is selected and a slot in the display area becomes empty.
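  • The three-slot behaviour described above can be modelled as follows (a hypothetical sketch; only the three-slot limit and the queuing of later requests come from this description):

    from collections import deque

    class DisplayInstructionArea:
        MAX_SLOTS = 3

        def __init__(self):
            self.visible = []        # display instruction objects currently shown on the display screen 40
            self.waiting = deque()   # the fourth and subsequent display requests wait here

        def add_request(self, wearing_gift):
            if len(self.visible) < self.MAX_SLOTS:
                self.visible.append(wearing_gift)
            else:
                self.waiting.append(wearing_gift)

        def select(self, wearing_gift):
            # The distribution user selects an object: the corresponding gift is displayed and the slot is freed.
            self.visible.remove(wearing_gift)
            if self.waiting:
                self.visible.append(self.waiting.popleft())

    area = DisplayInstructionArea()
    for gift in ["headband", "hat", "necklace", "earring"]:
        area.add_request(gift)       # "earring" waits because only three slots exist
    area.select("hat")               # frees a slot, so "earring" becomes visible
    print(area.visible)              # ['headband', 'necklace', 'earring']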
  • the gift request processing unit 61d accepts a gift display request from the viewing user and performs processing for displaying a gift object corresponding to the display request.
  • Each viewing user can transmit a gift display request to the server device 60 by operating his or her viewing user device.
  • The gift display request includes the user ID of the viewing user, gift identification information (gift ID) for specifying the gift whose display is requested, and/or gift object identification information (gift object ID) for specifying the gift object corresponding to the gift whose display is requested.
  • the gift object indicating the gift may include an effect object corresponding to the effect gift, a normal object corresponding to the normal gift, and a wearing object corresponding to the wearing gift.
  • the wearing gift is an example of the first gift.
  • the wearing object may be called a first gift object.
  • the display request requesting the display of the attached gift (or the attached object) is an example of the first display request.
  • the effect gift and the normal gift are examples of the second gift.
  • the effect object and the normal object may be collectively referred to as a second gift object.
  • the display request for requesting the display of the effect gift (or the effect object) or the normal gift (or the normal object) is an example of the second display request.
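  • A hypothetical sketch of the display request payload and of its classification into a first or second display request (the field names are assumptions consistent with the description above):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class GiftDisplayRequest:
        viewing_user_id: str
        gift_id: Optional[str] = None         # identifies the gift whose display is requested
        gift_object_id: Optional[str] = None  # identifies the corresponding gift object

    WEARING_GIFT_IDS = {"headband", "hat"}    # example first gifts (associated with a wearing part)

    def is_first_display_request(request: GiftDisplayRequest) -> bool:
        # Requests for wearing gifts are first display requests; requests for effect
        # gifts and normal gifts are second display requests.
        return request.gift_id in WEARING_GIFT_IDS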
  • When the gift request processing unit 61d receives a display request for a specific normal gift from a viewing user, it performs processing for displaying a normal object indicating the requested normal gift in the moving image based on the display request. For example, when a display request for a normal gift indicating a bag is made, the gift request processing unit 61d displays a normal object 36 indicating a bag on the display image 30 based on the display request, as shown in FIG. 4. Similarly, when a display request for a normal gift indicating a stuffed bear is made, the gift request processing unit 61d displays a normal object 37 indicating a stuffed bear on the display image 30 based on the display request, as shown in FIG. 4. Although not shown, the normal object 36 and the normal object 37 are also displayed on the display image 40 of the distribution user device 20, similarly to the display image 30.
  • When the gift request processing unit 61d receives a display request for a specific effect gift from a viewing user, it performs processing for displaying an effect object corresponding to the requested effect gift on the display image of the moving image based on the display request. For example, when a display request for an effect gift indicating confetti or fireworks is made, the gift request processing unit 61d displays an effect object (not shown) corresponding to the effect gift indicating confetti or fireworks on the display image 30 and the display image 40 based on the display request.
  • the display request of the normal gift may include a display position specifying parameter for specifying a display position of the normal object indicating the normal gift.
  • the gift request processing unit 61d can display the normal object at the position specified by the display position specification parameter.
  • Because the display position and the display range of the character object 31 are determined, a position relative to the character object 31 can be specified as the display position of the normal object by the display position specifying parameter.
  • When the gift request processing unit 61d receives a display request for a specific wearing object from a viewing user, a display instruction object corresponding to the request (for example, one of the display instruction objects 42a to 42c shown in FIG. 3B) is displayed on the display screen 40 based on the display request.
  • Each of the display instruction objects 42a to 42c is associated with a wearing gift for which a display request has been made.
  • When one of the display instruction objects is selected, the wearing gift associated with the selected display instruction object is displayed on the moving image being distributed.
  • For example, when the display instruction object 42b associated with a wearing gift indicating a headband is selected, the gift request processing unit 61d displays the wearing object 38 indicating the headband corresponding to the selected display instruction object 42b on the moving image being distributed.
  • FIGS. 5A and 5B show display examples of a moving image including the wearing object 38 indicating the headband.
  • the selected display instruction object 42b is deleted from the display screen 40.
  • the mounting object is displayed in the moving image in association with a specific part (wearing part) of the character object.
  • the mounted object may be displayed in the moving image so as to be in contact with the mounting site of the character object.
  • the mounting object 38 may be displayed in the moving image so as to be mounted on the mounting site of the character object.
  • the wearing object 38 indicating the headband is associated with the head of the character object. For this reason, in the display examples shown in FIGS. 5A and 5B, the mounting object 38 is mounted on the head of the character object 31.
  • the mounting object may be displayed in the moving image display screen so as to move in association with the movement of the mounting part of the character object.
  • For example, when the head of the character object 31 moves, the wearing object 38 indicating the headband also moves together with the head of the character object 31, as if the headband were worn on the head of the character object 31.
  • the object data 63b may include mounting part information indicating to which part of the character object the mounting object is associated.
  • When displaying a wearing object in the moving image, the gift request processing unit 61d prohibits, until the display time of that wearing object elapses, the display of other wearing objects whose wearing part is the same as, or overlaps with, the part indicated by the wearing part information of the displayed wearing object.
  • For example, the headband associated with both “head rear left” and “head rear right” and the hair accessory associated with “head rear left” overlap at “head rear left”. Therefore, while the headband is displayed, display of the hair accessory for which “head rear left” is set as the wearing part information is prohibited.
  • the display instruction object 42 for displaying the mounting object whose display is prohibited may be deactivated.
  • For example, assume that the display instruction object 42a is associated with a wearing object indicating a hair accessory for which “head rear left” is set as the wearing part information.
  • While the wearing object 38 indicating the headband is displayed, the display instruction object 42a associated with the wearing object indicating the hair accessory is deactivated in order to prohibit display of the wearing object indicating the hair accessory in the moving image.
  • the display instruction object 42a cannot be selected even if operated while the mounting object 38 is mounted on the character object 31.
  • Alternatively, the display instruction object 42a may be deleted from the display screen 40 while the wearing object 38 is worn on the character object 31.
  • the deactivated display instruction object 42a is activated again when the display time of the headband has elapsed.
  • Specifically, the display instruction object 42a that had been made unselectable may be made selectable again, or the display instruction object 42a that had been deleted from the display screen 40 may be displayed again on the display screen 40.
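  • The deactivation and reactivation of a display instruction object tied to the display time of a wearing object can be sketched as follows (a hypothetical model with assumed names):

    import time

    class WearingGiftController:
        def __init__(self):
            self.blocked_until = {}  # wearing part -> time until which the part remains occupied

        def wear(self, parts, display_time_sec):
            until = time.time() + display_time_sec
            for part in parts:
                self.blocked_until[part] = until

        def is_instruction_object_active(self, parts):
            # A display instruction object stays deactivated while any wearing part of its
            # gift is still occupied; it becomes selectable again after the display time elapses.
            now = time.time()
            return all(self.blocked_until.get(part, 0) <= now for part in parts)

    controller = WearingGiftController()
    controller.wear({"head rear left", "head rear right"}, display_time_sec=60)
    # While the headband occupies the head, the hair accessory's instruction object is inactive:
    print(controller.is_instruction_object_active({"head rear left"}))  # False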
  • a viewing user who is viewing a moving image can transmit a display request requesting that a message gift including a specific message be displayed on the moving image to the server device 60.
  • Upon receiving a display request for a message gift associated with a specific message from a viewing user, the gift request processing unit 61d displays a message confirmation screen 43 on the display screen 40 of the distribution user device 20 based on the display request, as shown in FIG. 6.
  • The message confirmation screen 43 includes the message input by the viewing user who made the display request, a button for permitting display of the message in the moving image, and a button for rejecting display of the message in the moving image.
  • The distribution user U1 confirms the message from the viewing user on the message confirmation screen 43 and, when permitting display of the message gift including the message on the moving image, performs a display instruction operation for the message gift by selecting the button labeled “display”. Conversely, when rejecting display of the message gift on the moving image, the distribution user U1 performs a display rejection operation by selecting the button labeled “not display”. When the display instruction operation is performed, a message object 39 indicating the message gift is displayed in the moving image, as shown in FIGS. 7A and 7B.
  • the message object 39 includes text data indicating a message associated with the message gift.
  • the message object 39 may include a display indicating the viewing user who made the display request for the message gift (for example, the user name, nickname, or the like of the viewing user).
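  • A minimal, hypothetical sketch of the message gift flow (the function and parameter names are assumptions): the message is first confirmed on the distribution user device, and a message object is produced for the moving image only if the distribution user permits display.

    from typing import Optional

    def handle_message_gift(message: str, viewing_user: str, distributor_permits) -> Optional[dict]:
        # Show the message on the distribution user device for confirmation; display it in the
        # moving image only after an explicit display instruction operation by the distributor.
        print(f"distribution user device: confirm message from {viewing_user}: {message}")
        if distributor_permits(message):
            return {"text": message, "from": viewing_user}  # contents of a message object such as 39
        return None                                         # the "not display" button was selected

    result = handle_message_gift("Great performance!", "viewer42", lambda m: True)
    print(result)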
  • the viewing user who is watching the moving image can transmit a voice change gift for changing the voice of the distribution user U1 combined with the moving image being distributed to the server device 60.
  • Upon receiving the voice change gift, the gift request processing unit 61d displays a confirmation screen (not shown) on the display screen 40 of the distribution user device 20.
  • the confirmation screen for confirming whether or not the voice can be changed may include information for specifying the content of the voice change.
  • For example, this confirmation screen may contain information specifying the details of the voice change, such as changing a male voice to a female voice or changing a human voice to an electronic, robot-like sound.
  • the confirmation screen includes a button for permitting a voice change requested in the voice change gift and a button for rejecting the voice change.
  • The distribution user U1 confirms the content of the voice change and performs an instruction operation for permitting or rejecting the change.
  • a viewing user who is watching a moving image can transmit to the server device 60 a motion gift designating a movement other than the head of the character object 31 included in the moving image being distributed.
  • the gift request processing unit 61d controls the movement of the character object 31 so that the character object 31 takes the movement specified by the motion gift.
  • A confirmation screen (not shown) for confirming whether or not the movement specified by the motion gift should be reflected on the character object 31 may be displayed on the display screen 40 of the distribution user device 20, and the movement specified by the motion gift may be reflected on the character object 31 only when the distribution user U1 performs an instruction operation for permitting the reflection.
  • In response to a request from a viewing user viewing a moving image, the gift purchase processing unit 61e transmits purchase information for each of a plurality of gift objects that can be purchased in association with the moving image to the viewing user device of that user (for example, the viewing user device 10).
  • The purchase information for each gift object may include the type of the gift object (effect object, normal object, or wearing object), an image of the gift object, the price of the gift object, and any other information necessary for purchasing the gift object.
  • the viewing user can select a gift object to be purchased based on the gift object purchase information displayed on the viewing user device 10. The selection of the gift object to be purchased may be performed by operating the viewing user device 10.
  • When a gift object to be purchased is selected, a purchase request for that gift object is transmitted to the server device 60.
  • the gift purchase processing unit 61e performs a payment process based on the purchase request.
  • the purchased gift object is held by the viewing user.
  • the storage 23 may store the gift ID of the purchased gift (or the gift object ID of the gift object indicating the gift) in association with the user ID of the viewing user who purchased the gift.
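As one way to picture the purchase flow just described, here is a minimal Python sketch under assumed names (GiftPurchaseProcessor, a payment_service exposing charge(), and a storage exposing add_owned_gift()); none of these identifiers come from the disclosure itself.

    class GiftPurchaseProcessor:
        """Hypothetical sketch of gift purchase processing (unit 61e)."""

        def __init__(self, gift_catalog, payment_service, storage):
            # gift_catalog: gift_id -> {"type": ..., "image": ..., "price": ..., ...}
            self.catalog = gift_catalog
            self.payment = payment_service   # assumed to expose charge(user_id, amount)
            self.storage = storage           # assumed to expose add_owned_gift(user_id, gift_id)

        def purchase_info(self, video_id):
            # Purchase information sent to the viewing user device: the type, image
            # and price of each gift object purchasable for this video, including
            # common gifts purchasable across multiple videos.
            return [
                {"gift_id": gift_id, **info}
                for gift_id, info in self.catalog.items()
                if info.get("common", False) or video_id in info.get("videos", ())
            ]

        def handle_purchase_request(self, user_id, gift_id):
            info = self.catalog[gift_id]
            # Settlement processing based on the purchase request.
            self.payment.charge(user_id, info["price"])
            # The purchased gift is then held by the viewer: record the gift ID
            # in association with the purchasing user's user ID.
            self.storage.add_owned_gift(user_id, gift_id)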
  • The purchasable gift objects may be purchasable in a plurality of moving images. That is, the purchasable gift objects may include unique gift objects peculiar to each moving image and common gift objects purchasable in a plurality of moving images.
  • the effect object indicating confetti may be a common gift object that can be purchased in a plurality of moving images.
  • When an effect object is purchased while viewing a predetermined moving image, the purchased effect object may be displayed automatically on the moving image being viewed in response to completion of the payment processing for the effect object.
  • Similarly, when a normal object is purchased, the purchased normal object may be displayed automatically on the moving image being viewed in response to completion of the payment processing for the normal object.
  • Alternatively, a payment completion notification may be transmitted to the viewing user device 10, and the viewing user device 10 may display a confirmation screen for confirming with the viewing user whether to make a display request for the purchased effect object.
  • When the viewing user selects to make a display request for the purchased effect object, a display request for the purchased effect object is sent from the viewing user's client device to the gift request processing unit 61d, and the gift request processing unit 61d may perform a process of displaying the purchased effect object on the moving image 70.
  • A confirmation screen for confirming with the viewing user whether or not to make a display request for the purchased normal object may likewise be displayed on the viewing user device 10.
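To illustrate the post-purchase behaviour described in the last few items, the sketch below shows a hypothetical client-side handler on the viewing user device; ask_yes_no and send_display_request are assumed helpers, and the auto_display flag is an assumption used only to distinguish the automatic-display variant from the confirmation-screen variant.

    def on_payment_completed(gift, ask_yes_no, send_display_request):
        """Hypothetical handler run on the viewing user device after settlement."""
        if gift["type"] not in ("effect", "normal"):
            return  # wearing gifts follow the separate display-instruction flow

        if gift.get("auto_display", False):
            # Variant 1: display the purchased gift automatically once payment completes.
            send_display_request(gift["gift_id"])
        elif ask_yes_no(f"Display the purchased {gift['type']} gift in the video now?"):
            # Variant 2: first confirm with the viewing user whether to request display.
            send_display_request(gift["gift_id"])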
  • FIG. 8 is a flowchart showing the flow of a moving image distribution process in one embodiment,
  • FIG. 9 is a flowchart showing the flow of a process of displaying a normal object in one embodiment,
  • FIG. 10 is a flowchart showing the flow of a process of displaying a wearing object in one embodiment,
  • FIG. 11 is a flowchart showing the flow of a process of displaying a message object in one embodiment.
  • the distribution user U1 performs the moving image distribution based on the face motion data acquired using the distribution user device 20.
  • In step S11, face motion data, which is a digital representation of the movement (expression) of the face of the distribution user U1, is generated.
  • the generation of the face motion data is performed, for example, by the face motion data generation unit 21a of the distribution user device 20.
  • audio data may be generated based on an audio input from the distribution user U1.
  • the generated face motion data and voice data are transmitted to the server device 60.
  • In step S12, the face motion data from the distribution user device 20 is applied to the model data for the distribution user U1, thereby generating an animation of the character object that moves in synchronization with the facial expression of the distribution user U1.
  • the generation of the animation is performed, for example, by the above-described animation generation unit 61a.
  • In step S13, a moving image including the animation of the character object corresponding to the distribution user U1 is generated.
  • the voice of the distribution user U1 may be synthesized with this moving image.
  • the animation of the character object is displayed so as to be superimposed on the background image.
  • the generation of the moving image is performed by, for example, the moving image generation unit 61b.
  • In step S14, the moving image generated in step S13 is distributed.
  • the moving image is distributed to the viewing user device 10 and other viewing user devices and the distribution user device via the network 50.
  • the moving image is continuously distributed over a predetermined distribution period.
  • The distribution period of the moving image may be set to, for example, 30 seconds, 1 minute, 5 minutes, 10 minutes, 30 minutes, 60 minutes, 120 minutes, or any other length of time.
  • In step S15, it is determined whether or not an end condition for ending the distribution of the moving image has been satisfied.
  • the end condition is, for example, that the end time of the distribution has been reached, that the distribution user U1 has performed an operation to end the distribution in the distribution user device 20, or other conditions. If the end condition is not satisfied, the processing of steps S11 to S14 is repeatedly executed, and the distribution of the moving image including the animation synchronized with the movement of the distribution user U1 and the distribution user A2 is continued. If it is determined that the end condition is satisfied for the moving image, the distribution process of the moving image ends.
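The loop over steps S11 to S15 can be summarized with the following Python sketch; capture_face_motion, capture_audio, apply_to_model, render_video_frame, broadcast and end_condition_met are hypothetical helper names standing in for the units described above, not APIs defined in this disclosure.

    import time

    def run_distribution(distributor_device, server, viewer_devices):
        """Hypothetical sketch of the distribution loop (steps S11 to S15)."""
        while True:
            # S11: generate face motion data (and, optionally, voice data) on device 20.
            face_motion = distributor_device.capture_face_motion()
            audio = distributor_device.capture_audio()

            # S12: apply the face motion data to the distributor's model data so the
            # character object moves in sync with the distributor's facial expression.
            animation = server.apply_to_model(face_motion)

            # S13: compose the moving image with the character animation over the
            # background image, synthesizing the distributor's voice if present.
            frame = server.render_video_frame(animation, audio)

            # S14: distribute the generated moving image to the viewing user devices.
            server.broadcast(frame, viewer_devices)

            # S15: end when an end condition holds (end time reached, or the
            # distributor performed an end-distribution operation on device 20).
            if server.end_condition_met():
                break

            time.sleep(1 / 30)  # illustrative pacing only; the real rate is unspecified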
  • The normal gift display process is performed in parallel with the moving image distribution process shown in FIG. 8.
  • First, in step S21, it is determined whether or not a display request for a normal gift has been made. For example, the viewing user can select one or more specific normal gifts from the normal gifts held by that user and transmit a display request for displaying the selected normal gifts from the viewing user device 10 to the server device 60. As described above, a display request for a predetermined normal gift may also be generated in response to purchase processing or settlement processing for that normal gift being performed.
  • Next, in step S22, based on the display request, a process for displaying the requested normal gift on the moving image being distributed is performed. For example, when a display request for a normal gift is made during distribution of a predetermined moving image, the normal object 36 corresponding to the requested normal gift is displayed on the display screen 30 of the viewing user device 10, as shown in FIG. 4. Although not shown, the normal object 36 may also be displayed on the display screen 40 of the distribution user device 20.
  • the display processing for the normal gift ends.
  • the display processing of the normal gift shown in FIG. 9 is repeatedly performed during the distribution period of the moving image.
  • the display processing of the effect gift can be performed in the same procedure as the display processing of the normal gift described above.
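A minimal sketch of the normal gift flow in steps S21 and S22 might look as follows, assuming a pending-request queue and an overlay.show helper; both are hypothetical and stand in for the display processing described above.

    def process_normal_gift_requests(pending_requests, overlay):
        """Hypothetical sketch of steps S21 and S22 for normal (and effect) gifts."""
        # S21: has a display request for a normal gift been made?
        while pending_requests:
            request = pending_requests.pop(0)

            # S22: display the requested normal object (e.g. object 36) on the moving
            # image being distributed; no permission from the distributor is required.
            overlay.show(
                gift_object_id=request["gift_object_id"],
                duration=request.get("display_time", 5),  # assumed default display time
            )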
  • The display processing of the wearing gift is performed in parallel with the moving image distribution processing shown in FIG. 8.
  • The display processing of the wearing gift may also be performed in parallel with the display processing of the normal gift shown in FIG. 9.
  • First, in step S31, it is determined whether or not a display request for a wearing gift has been made.
  • the first viewing user can transmit a display request for displaying the wearing gift held by the first viewing user from the viewing user device 10 to the server device 60.
  • In step S32, based on the display request, a display instruction object associated with the wearing gift requested to be displayed is displayed on the display screen 40 of the distribution user device 20. For example, when a display request for a wearing gift representing a headband is made, the display instruction object 42b associated with that wearing gift is displayed on the display screen 40 of the distribution user device 20.
  • In step S33, it is determined whether or not a specific display instruction object has been selected from among the display instruction objects included in the display screen 40 of the distribution user device 20.
  • When a specific display instruction object has been selected, in step S34 a process of displaying the wearing gift corresponding to the selected display instruction object on the display screen of the moving image being distributed is performed. For example, when the display instruction object 42b included in the display screen 40 is selected, the wearing object 38 associated with the selected display instruction object 42b is displayed on the display screens 30 and 40, as shown in FIGS. 5A and 5B. Further, the selected display instruction object 42b is removed from the display screen 40.
  • If no display instruction object is selected in step S33, or when the display processing in step S34 is completed, the display processing of the wearing gift ends.
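The wearing gift flow in steps S31 to S34 could be sketched as below; distributor_ui and character are hypothetical objects representing, respectively, the display instruction objects 42 on the distribution user device 20 and the rendered character object.

    def process_wearing_gift_request(gift, distributor_ui, character):
        """Hypothetical sketch of steps S31 to S34 for wearing gifts."""
        # S31/S32: on a display request, do not show the gift yet; instead add a
        # display instruction object for it to the distributor's screen 40.
        button = distributor_ui.add_display_instruction(gift["gift_id"])

        # S33: only proceed if the distributor selects that display instruction object.
        if distributor_ui.was_selected(button):
            # S34: display the wearing object (e.g. a headband) at the wearing site
            # set for the gift, then remove the instruction object from screen 40.
            character.attach(gift["gift_object_id"], site=gift["wearing_site"])
            distributor_ui.remove(button)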
  • The message gift display process is performed in parallel with the moving image distribution process shown in FIG. 8.
  • The message gift display processing may also be performed in parallel with the normal gift display processing shown in FIG. 9 and the wearing gift display processing shown in FIG. 10.
  • First, in step S41, it is determined whether a display request for a message gift has been made.
  • the first viewing user can transmit a display request for requesting display of a message gift including a message input by the user himself / herself from the viewing user device 10 to the server device 60.
  • In step S42, based on the display request, the message confirmation screen 43 associated with the message gift requested to be displayed is displayed on the display screen 40 of the distribution user device 20.
  • In step S43, it is determined whether display of the message gift in the moving image is permitted. For example, it is determined whether or not the button, included in the message confirmation screen 43 on the display screen 40 of the distribution user device 20, that is associated with permitting display of the message gift on the moving image has been selected.
  • When display is permitted, in step S44 a process of displaying the message gift on the display screen of the moving image being distributed is performed. For example, when the button associated with permitting display of the message gift on the moving image, included in the message confirmation screen 43, is selected, the message object 39 for which the display request was made is displayed on the display screens 30 and 40, as shown in FIGS. 7A and 7B.
  • When the button associated with rejecting display of the message gift on the moving image, included in the message confirmation screen 43, is selected in step S43, or when the display processing of the message gift in step S44 is completed (for example, when the display time set for the message gift has elapsed), the message gift display process ends.
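Steps S41 to S44 for message gifts can likewise be pictured with the short sketch below; confirm_message and show_message are assumed helpers standing in for the message confirmation screen 43 and the message object 39.

    def process_message_gift_request(gift, distributor_ui, overlay):
        """Hypothetical sketch of steps S41 to S44 for message gifts."""
        # S42: show the message confirmation screen 43 on device 20, containing the
        # viewer's message and "display" / "not display" buttons.
        permitted = distributor_ui.confirm_message(
            sender=gift["viewer_name"], text=gift["message"])

        # S43/S44: only if display is permitted, show the message object 39 (the
        # message text plus the sender's name) on the moving image for its display time.
        if permitted:
            overlay.show_message(
                text=gift["message"],
                sender=gift["viewer_name"],
                duration=gift.get("display_time", 8),  # assumed default display time
            )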
  • an object display prohibition section in which display of a gift object is prohibited may be provided in the moving image distribution.
  • FIG. 12 is a diagram schematically illustrating an object display prohibited section.
  • FIG. 12 shows that a moving image is distributed between time t1 and time t2. That is, the time t1 is the distribution start time of the moving image, and the time t2 is the distribution end time of the moving image.
  • a period between time t3 and time t4 is a gift display prohibition period 81. Even if a gift display request r1 is made during the gift display prohibition period 81, the gift object is not displayed on the display image of the moving image during the gift display prohibition period 81.
  • An effect gift or a normal gift for which a display request is made during the gift display prohibition period 81 is not displayed on the moving image during that period; it is displayed on the moving image after the gift display prohibition period 81 has elapsed (that is, after time t4).
  • Similarly, for a wearing gift for which a display request is made during the gift display prohibition period 81, the display instruction button for instructing display of that wearing gift is not displayed on the display screen 40 of the distribution user device 20 during that period; it is displayed on the display screen 40 after the gift display prohibition period 81 has elapsed.
  • Likewise, for a message gift for which a display request is made during the gift display prohibition period 81, the message confirmation screen 43 for confirming whether the message gift may be displayed is not displayed on the display screen 40 of the distribution user device 20 during that period; it is displayed on the display screen 40 after the gift display prohibition period 81 has elapsed. Accordingly, the distribution user U1 can distribute the moving image during the gift display prohibition period 81 without being interrupted by the display of effect gifts or normal gifts. In addition, during the gift display prohibition period 81 the distribution user U1 can concentrate on the performance without having to pay attention to newly added display instruction buttons for wearing gifts or to confirmations of whether message gifts may be displayed.
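One way to realize the deferral behaviour of the gift display prohibition period 81, offered purely as an assumption-laden sketch, is to queue requests made between t3 and t4 and release them once the period elapses. A caller would invoke submit() for every incoming request and flush() on each tick; both names are, again, assumptions.

    class GiftDisplayScheduler:
        """Hypothetical sketch: hold back gift display requests made during the
        gift display prohibition period 81 (times t3 to t4) and release them at t4."""

        def __init__(self, t3, t4):
            self.t3 = t3
            self.t4 = t4
            self._deferred = []   # requests received during the prohibition period

        def in_prohibition_period(self, now):
            return self.t3 <= now < self.t4

        def submit(self, request, now, display_fn):
            if self.in_prohibition_period(now):
                # During the period, neither the gift itself nor its display
                # instruction button / confirmation screen is surfaced; defer it.
                self._deferred.append(request)
            else:
                display_fn(request)

        def flush(self, now, display_fn):
            # At or after time t4, surface everything that was held back.
            if now >= self.t4:
                for request in self._deferred:
                    display_fn(request)
                self._deferred.clear()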
  • the distribution user U1 can distribute the performance by the character object wearing the desired wearing gift.
  • the animation of the character object is considered to be an element that attracts the attention of the viewing user.
  • In the above embodiment, when a display request for a wearing gift is received, a display instruction button for displaying the wearing gift is displayed on the distribution user device 20 of the distribution user U1, and the wearing gift is not displayed in the moving image until that display instruction button is selected. This makes it possible to prevent wearing gifts from being displayed around the character object or overlapping the character object, which in turn prevents the viewing experience of the viewing user from deteriorating.
  • If gift objects were displayed on the moving image in response to display requests regardless of their type, and gifts were allowed to overlap the moving image, a large number of gifts could be displayed on the moving image and the viewing experience of the user watching the moving image would be degraded.
  • In the above embodiment, by contrast, the display timing of the wearing gift, which is displayed in association with the character object that is a main part of the moving image, can be controlled by the distribution user U1.
  • The display time of the normal objects 36 and 37 can be set shorter than that of the wearing object 38, and the normal objects 36 and 37 may be displayed so as not to be in contact with the character object 31, or behind the character object 31 rather than in front of it. In this case, the effect of the normal objects 36 and 37 on the visibility of the character object 31 in the moving image being distributed is small. Therefore, even if a normal gift is displayed automatically on the moving image (without the permission of the distribution user U1) in response to a display request from a viewing user, it is unlikely to degrade the user's viewing experience through reduced visibility of the character object 31.
  • In the above embodiment, the viewing user can give a wearing object to the character. This makes it possible to provide a system, and a service through that system, with greater uniqueness than a system in which such wearing objects cannot be given. As a result, more users can be attracted to the moving image distribution system 1, so the number of views of moving images in the moving image distribution system 1 can be increased.
  • In the above embodiment, the distribution user U1 can distribute a moving image including a character object that moves according to his or her own facial expression by using the distribution user device 20, which includes a camera, such as a smartphone.
  • In this way, the equipment needed for the distribution user U1 to distribute the moving image is simplified, so a platform in which the distribution user U1 can easily participate is realized.
  • In the above embodiment, in response to receiving a first display request related to a first gift from a viewing user, the display instruction object 42 is displayed on the distribution user device 20 used by the distribution user U1, and in response to an operation on the display instruction object 42, the first gift is displayed on the moving image being distributed.
  • the timing at which the first gift for which the display request has been made by the viewing user is displayed in the moving image is determined by operating the display instruction object 42 displayed on the distribution user device 20. This can prevent the first gift from being displayed in the moving image at a timing that the distribution user U1 does not want.
  • the first gift may be a wearing gift associated with the wearing site of the character object.
  • The wearing gift is displayed in the moving image being distributed at a position corresponding to the wearing site set for that wearing gift, in response to an operation on the display instruction object 42 displayed on the distribution user device 20.
  • One example of the wearing gift is a gift representing a headband associated with the head of the character object.
  • the wearing gift representing the headband is displayed on the moving image as if it were worn on the head of the character object 31 in response to an operation on the display instruction object 42 corresponding to the wearing gift.
  • the attached gift displayed on or around the character object is likely to obstruct the performance of the distribution user U1 through the character object 31, and tends to deteriorate the viewing experience of the viewing user.
  • In the above embodiment, the timing at which the wearing gift associated with a wearing site of the character object 31 is displayed on the moving image is determined according to an operation on the display instruction object 42 displayed on the distribution user device 20.
  • This makes it possible to prevent a wearing gift, which is likely to interfere with the performance of the distribution user U1 through the character object 31 and to degrade the viewing experience of the viewing user, from being displayed at a timing that the distribution user U1 does not want.
  • A display time during which the gift is displayed in the moving image may be set for each wearing gift.
  • The display time may differ depending on the type of wearing gift, or a predetermined display time may be set uniformly.
  • While a first wearing gift is being displayed on the moving image, display of another wearing gift for which the same wearing site as the first wearing gift is set is prohibited until the display time set for the first wearing gift elapses.
  • For example, until the display time set for the first wearing gift elapses, the display instruction object 42 for displaying another wearing gift for which the same wearing site is set is deactivated.
  • According to this aspect, it is possible to prevent a plurality of wearing gifts from being displayed overlapping one another at the wearing site of the character object.
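The per-site exclusivity described in the preceding items could be tracked as in the following sketch, where WearingSiteManager and its methods are hypothetical names; a caller would use can_display to decide whether to activate or deactivate the corresponding display instruction object 42.

    class WearingSiteManager:
        """Hypothetical sketch: one wearing gift per wearing site at a time,
        enforced for the display time set on the currently worn gift."""

        def __init__(self):
            self._busy_until = {}   # wearing site -> time at which the site frees up

        def can_display(self, site, now):
            # True if no other wearing gift currently occupies this wearing site.
            return self._busy_until.get(site, float("-inf")) <= now

        def start_display(self, site, display_time, now):
            if not self.can_display(site, now):
                # Another wearing gift is still shown here; keep the display
                # instruction object for this site deactivated for now.
                return False
            self._busy_until[site] = now + display_time
            return True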
  • In the above embodiment, when a message gift including a message is received from a viewing user, a message confirmation screen 43 for confirming the message is displayed on the distribution user device 20.
  • the distribution user U1 performs a display instruction operation of the message gift via the distribution user device 20 when permitting the display of the message gift including the message on the moving image.
  • a message posted from a viewing user through a message gift may include a message that is not desired to be displayed in a moving image.
  • the distribution user U1 can determine whether to display the message gift. This can prevent the display of a message gift that is not desirable to be displayed in the moving image.
  • In the above embodiment, the voice of the distribution user U1 is synthesized with the moving image, and in response to receiving from a viewing user a voice change gift for changing the voice of the distribution user U1, a voice change instruction object is displayed on the distribution user device 20.
  • According to an operation on the voice change instruction object, the voice of the distribution user is changed to the voice specified by the voice change gift. According to this aspect, it is possible to prevent a change to a voice that the distribution user U1 does not want.
  • In the above embodiment, in response to receiving from a viewing user a second display request related to a second gift, which is displayed in the moving image without being associated with a specific part of the character object 31, the second gift is displayed in the moving image.
  • the second gift includes an effect gift and a normal gift.
  • The second gift, which is displayed in the moving image without being associated with a specific part of the character object, can be displayed in the moving image in response to a display request from the viewing user, without requiring an operation or instruction from the distribution user. Since the second gift is not displayed in association with a specific part of the character object, it is unlikely to interfere with the performance of the distribution user U1 through the character object 31 or to degrade the viewing experience of the viewing user. For this reason, the second gift can be displayed in the moving image without requiring an operation by the distribution user U1, which helps to stimulate interaction with the viewing users.
  • In the above embodiment, the gift display prohibition period 81 is set within the distribution period of the moving image, and the second gift is displayed on the moving image only at timings other than the gift display prohibition period 81. According to this aspect, during the gift display prohibition period 81, a moving image that does not include the second gift can be distributed to the viewing users. For example, by setting the time zone during which the distribution user U1 gives a performance through the character object 31 as the gift display prohibition period 81, the attention of the viewing users can be kept from being diverted away from the character object 31.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Information Transfer Between Computers (AREA)
  • User Interface Of Digital Computer (AREA)
PCT/JP2019/024876 2018-08-28 2019-06-24 配信ユーザの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画をライブ配信する動画配信システム、動画配信方法及び動画配信プログラム WO2020044749A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020217003200A KR102490402B1 (ko) 2018-08-28 2019-06-24 배신 유저의 움직임에 기초하여 생성되는 캐릭터 오브젝트의 애니메이션을 포함하는 동화상을 라이브 배신하는 동화상 배신 시스템, 동화상 배신 방법 및 동화상 배신 프로그램

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2018-159802 2018-08-28
JP2018159802A JP6491388B1 (ja) 2018-08-28 2018-08-28 配信ユーザの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画をライブ配信する動画配信システム、動画配信方法及び動画配信プログラム
JP2019-035044 2019-02-28
JP2019035044A JP6523586B1 (ja) 2019-02-28 2019-02-28 配信ユーザの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画をライブ配信する動画配信システム、動画配信方法及び動画配信プログラム
JP2019083729A JP6550549B1 (ja) 2019-04-25 2019-04-25 配信ユーザの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画をライブ配信する動画配信システム、動画配信方法及び動画配信プログラム
JP2019-083729 2019-04-25

Publications (1)

Publication Number Publication Date
WO2020044749A1 true WO2020044749A1 (ja) 2020-03-05

Family

ID=69644075

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/024876 WO2020044749A1 (ja) 2018-08-28 2019-06-24 配信ユーザの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画をライブ配信する動画配信システム、動画配信方法及び動画配信プログラム

Country Status (3)

Country Link
KR (1) KR102490402B1 (zh)
CN (1) CN110866963B (zh)
WO (1) WO2020044749A1 (zh)

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040105999A (ko) * 2003-06-10 2004-12-17 온오프코리아 주식회사 네트워크 기반 소리 아바타 생성 방법 및 시스템
US20050206751A1 (en) * 2004-03-19 2005-09-22 Eastman Kodak Company Digital video system for assembling video sequences
JP2012120098A (ja) 2010-12-03 2012-06-21 Linkt Co Ltd 情報提供システム
KR20130012228A (ko) * 2011-07-15 2013-02-01 (주)코아텍 가상 공간과 실공간을 통합적으로 활용하는 이벤트 서비스 시스템 및 그 이벤트 서비스 방법
KR20130053466A (ko) * 2011-11-14 2013-05-24 한국전자통신연구원 인터랙티브 증강공간 제공을 위한 콘텐츠 재생 장치 및 방법
CN106709762A (zh) * 2016-12-26 2017-05-24 乐蜜科技有限公司 直播间中虚拟礼物的推荐方法、装置及移动终端
CN106993195A (zh) * 2017-03-24 2017-07-28 广州创幻数码科技有限公司 虚拟人物角色直播方法及系统
CN108076392A (zh) * 2017-03-31 2018-05-25 北京市商汤科技开发有限公司 直播互动方法、装置和电子设备
CN107423809B (zh) * 2017-07-07 2021-02-26 北京光年无限科技有限公司 应用于视频直播平台的虚拟机器人多模态交互方法和系统
CN107484031A (zh) * 2017-07-28 2017-12-15 王飞飞 一种基于直播的礼物展示互动方法

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015184689A (ja) * 2014-03-20 2015-10-22 株式会社Mugenup 動画生成装置及びプログラム
WO2018142494A1 (ja) * 2017-01-31 2018-08-09 株式会社 ニコン 表示制御システム、及び、表示制御方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MOGURA INC.: "What's the secret behind the cuteness of "Shinonome Megu"", THE POPULAR VIRTUAL BISHOJO, 16 March 2018 (2018-03-16), pages 1 - 7 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6883140B1 (ja) * 2020-12-18 2021-06-09 グリー株式会社 情報処理システム、情報処理方法およびコンピュータプログラム
JP2022097350A (ja) * 2020-12-18 2022-06-30 グリー株式会社 情報処理システム、情報処理方法およびコンピュータプログラム
JP2022097047A (ja) * 2020-12-18 2022-06-30 グリー株式会社 情報処理システム、情報処理方法およびコンピュータプログラム
JP7199791B2 (ja) 2020-12-18 2023-01-06 グリー株式会社 情報処理システム、情報処理方法およびコンピュータプログラム

Also Published As

Publication number Publication date
KR20210025102A (ko) 2021-03-08
CN110866963A (zh) 2020-03-06
KR102490402B1 (ko) 2023-01-18
CN110866963B (zh) 2024-02-02

Similar Documents

Publication Publication Date Title
JP6491388B1 (ja) 配信ユーザの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画をライブ配信する動画配信システム、動画配信方法及び動画配信プログラム
JP6543403B1 (ja) 動画配信システム、動画配信方法及び動画配信プログラム
JP6382468B1 (ja) アクターの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画を配信する動画配信システム、動画配信方法及び動画配信プログラム
JP7389855B2 (ja) 配信ユーザの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画をライブ配信する動画配信システム、動画配信方法及び動画配信プログラム
US11044535B2 (en) Video distribution system for live distributing video containing animation of character object generated based on motion of distributor user, distribution method, and storage medium storing video distribution program
JP6550549B1 (ja) 配信ユーザの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画をライブ配信する動画配信システム、動画配信方法及び動画配信プログラム
WO2019216146A1 (ja) アクターの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画を配信する動画配信システム、動画配信方法及び動画配信プログラム
JP6550546B1 (ja) 動画配信システム、動画配信方法及び動画配信プログラム
JP7191883B2 (ja) 動画配信システム、動画配信方法及び動画配信プログラム
JP6523586B1 (ja) 配信ユーザの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画をライブ配信する動画配信システム、動画配信方法及び動画配信プログラム
WO2020044749A1 (ja) 配信ユーザの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画をライブ配信する動画配信システム、動画配信方法及び動画配信プログラム
JP6671528B1 (ja) 動画配信システム、動画配信方法及び動画配信プログラム
JP2024041749A (ja) 動画配信システム、動画配信方法及び動画配信プログラム
WO2020121909A1 (ja) 動画配信システム、動画配信方法及び動画配信プログラム
JP6713080B2 (ja) 配信ユーザの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画をライブ配信する動画配信システム、動画配信方法及び動画配信プログラム
JP7284329B2 (ja) 配信ユーザの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画をライブ配信する動画配信システム、動画配信方法及び動画配信プログラム
JP2020043578A (ja) アクターの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画を配信する動画配信システム、動画配信方法及び動画配信プログラム
JP7104097B2 (ja) 配信ユーザの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画をライブ配信する動画配信システム、動画配信方法及び動画配信プログラム
JP6937803B2 (ja) 配信ユーザの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画をライブ配信する動画配信システム、動画配信方法及び動画配信プログラム
JP2023103424A (ja) 配信ユーザの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画をライブ配信する動画配信システム、動画配信方法及び動画配信プログラム
JP6828106B1 (ja) 動画配信システム、動画配信方法及び動画配信プログラム
JP2020005238A (ja) アクターの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画を配信する動画配信システム、動画配信方法及び動画配信プログラム
JP2019198054A (ja) アクターの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画を配信する動画配信システム、動画配信方法及び動画配信プログラム
JP2019198057A (ja) アクターの動きに基づいて生成されるキャラクタオブジェクトのアニメーションを含む動画を配信する動画配信システム、動画配信方法及び動画配信プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19856355

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20217003200

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19856355

Country of ref document: EP

Kind code of ref document: A1