CN110249631B - Display control system and display control method - Google Patents


Info

Publication number
CN110249631B
CN110249631B
Authority
CN
China
Prior art keywords
gift
user
performer
display
image
Prior art date
Legal status
Active
Application number
CN201780084966.XA
Other languages
Chinese (zh)
Other versions
CN110249631A (en)
Inventor
细见幸司
栗山孝司
胜俣祐辉
出口和明
志茂谕
Current Assignee
Nikon Corp
Original Assignee
Nikon Corp
Priority date
Filing date
Publication date
Application filed by Nikon Corp
Publication of CN110249631A
Application granted
Publication of CN110249631B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/239 Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Studio Devices (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The display control system live-distributes video of the actual space in which a performer is located, acquires three-dimensional position information of the actual space, detects a user action by which a user gives a gift to the performer, calculates the gift position at which the gift should be placed in the actual space based on the acquired three-dimensional position information and the detected user action information, and displays the calculated gift position in the actual space so that the performer can recognize it.

Description

Display control system and display control method
Technical Field
The present invention relates to a display control system and a display control method for live distribution.
Background
Patent document 1, among others, describes a server that distributes content.
Documents of the prior art
Patent document
Patent document 1: japanese patent No. 5530557
Disclosure of Invention
According to one aspect of the present invention, there is provided a display control system including: a display device control unit that causes a display device to display video of an actual space in which a performer is located as a target of live distribution; an acquisition unit that acquires three-dimensional position information of the actual space; a detection unit that detects a user action by which a user gives a gift to the performer; and a gift display control unit that calculates a gift position at which the gift should be placed in the actual space based on the three-dimensional position information acquired by the acquisition unit and the user action information of the user action detected by the detection unit, and displays the calculated gift position in the actual space so that the performer can recognize it.
According to another aspect of the present invention, there is provided a display control method that live-distributes video of an actual space in which a performer is located, acquires three-dimensional position information of the actual space, detects a user action by which a user gives a gift to the performer, calculates a gift position at which the gift should be placed in the actual space based on the acquired three-dimensional position information and the user action information of the detected user action, and displays the calculated gift position in the actual space so that the performer can recognize it.
Drawings
Fig. 1 is a diagram showing an overall configuration of a live distribution system.
Fig. 2 is a block diagram of a server and a user terminal.
Fig. 3 (a) is a flowchart showing live broadcast distribution processing, and (b) is a diagram showing a display screen of a user terminal at the time of live broadcast distribution.
Fig. 4 is a flow chart of a gift and performer selection process by a user.
Fig. 5 (a) is a diagram showing the display screen of the user terminal when selecting a gift, (b) is a diagram showing the display screen of the user terminal when selecting a performer, (c) is a diagram showing the display screen of the user terminal when a gift and a performer have been selected, (d) is a diagram showing the display screen of the studio display when a gift and a performer have been selected, and (e) is a diagram showing the display screens of the user terminal and the studio display when the user makes an action of throwing a gift.
Fig. 6 is a flowchart showing the process by which the performer collects the gift.
Fig. 7 (a) is a diagram showing the display screens of the user terminal and the studio display when the performer picks up the cat-ear hair band gift, (b) is a diagram showing the display screens of the user terminal and the studio display when the performer wears the hair band gift, and (c) is a diagram showing the display screens of the user terminal and the studio display when the performer faces sideways.
Fig. 8 is a flowchart of a gift-return process from the performer to the user.
Fig. 9 (a) is a diagram showing the display screens of the user terminal and the studio display when the performer holds a signature ball as a return gift, (b) is a diagram showing the display screens of the user terminal and the studio display when the performer throws the signature ball, and (c) is a diagram showing the display screens of the user terminal and the studio display when the user receives the signature ball.
Fig. 10 (a) is a diagram showing the display screens of the user terminal and the studio display when a special effect is added to the performer, (b) is a diagram showing the display screens of the user terminal and the studio display when the user makes an action of throwing a gift, (c) is a diagram showing the display screens of the user terminal and the studio display when the performer receives a gift, and (d) is a diagram showing the display screens of the user terminal and the studio display when a tower is displayed in the background image.
Detailed Description
Hereinafter, a live distribution system to which the present invention is applied will be described with reference to figs. 1 to 10.
[ overview of the live distribution system ]
As shown in fig. 1, the live distribution system 1 includes: a studio 10 in which a performance such as live music is given, a server 20 that live-distributes content data acquired in the studio 10, and user terminals 40, 60, 70 for viewing the content data distributed by the server 20. The server 20 and the user terminals 40, 60, 70 are connected via the network 2. The number of user terminals is not limited to the 3 shown here, and may be 1, or several tens or several hundreds.
As an example, in the actual space of the studio 10, 3 performers A, B, C as subjects play and sing a music piece on a stage. Of course, the number of performers is not limited to 3, and may be 1 or 2, or 4 or more. Further, performers A, B, C may form a single group such as a band, or may be independently active performers appearing together. The studio 10 includes a playback device 11, a speaker 12, a microphone 13, an RGB camera 14, a depth camera 15, a projector 16, and a studio display 17.
The playback device 11 reproduces the music data of the performance, and the music based on the music data is played from a speaker 12 connected to the playback device 11. Each performer A, B, C holds a microphone 13, and the microphone 13 picks up the voice of the performer A, B, C. As an example, the RGB camera 14 is the 1st camera in the live distribution system 1. The RGB camera 14 is a digital camera having a moving image capturing function, for example a video camera, and serves, for example, as a display data generation camera that photographs the actual space in which the performers A, B, C perform. The RGB camera 14 includes an image pickup device such as a CCD and/or a CMOS, detects light such as visible light, and outputs display data including color signals of 3 colors (red, green, and blue). For example, the RGB camera 14 photographs subjects such as the performers A, B, C and outputs viewing data as display data that enables the photographed subjects to be displayed on the display units of the user terminals 40, 60, 70. In addition, as an example, the RGB camera 14 outputs shot data as display data to be displayed on the studio display 17. Further, as an example, the RGB camera 14 outputs video data as display data to be displayed on a large-screen display device installed at a public place, live-viewing venue, or concert hall where the users A, B, C are located. The RGB camera 14 need not be a dedicated camera, and may be, for example, a smart terminal having a moving image capturing function. In this case, the smart terminal can be fixed to a tripod or the like to serve the same function as a camera.
For example, the depth camera 15 is the 2nd camera in the live distribution system 1, and is, for example, an infrared camera serving as a three-dimensional position information acquisition camera. The depth camera 15 acquires depth information, which is the distance from the camera to the subject. The depth camera 15 is, for example, an acquisition unit that acquires depth information or the like for the performers A, B, C and other subjects. As an example, the depth camera 15 acquires the distance (depth information) to performer A, to performer B, and to performer C, each of whom is a part of the subject. For example, the depth camera 15 also acquires depth information for each point of the studio, which is likewise a part of the subject. As an example, the depth camera 15 thereby acquires three-dimensional position information of the actual space including the performers A, B, C and the studio. The depth camera 15 includes a light projecting unit that emits infrared rays and an infrared detecting unit that detects them, and acquires three-dimensional position information such as depth information of the actual space based on, for example, the time it takes for an infrared pulse emitted from the light projecting unit to return after reflection. The RGB camera 14 and the depth camera 15 may be an integrated device or separate devices.
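As a rough illustration of what this three-dimensional position information can look like in practice, the sketch below back-projects a depth-camera pixel into a 3D point in the camera's coordinate system using a standard pinhole model. The intrinsic parameters and the function name are assumptions for illustration; the patent does not specify how the depth camera's output is parameterized.

```python
# Illustrative sketch: recovering a 3D point in the depth camera's
# coordinate system from a pixel (u, v) and its measured depth Z.
# The intrinsics below are assumed values, not taken from the patent.

def pixel_to_3d(u: float, v: float, depth_m: float,
                fx: float = 525.0, fy: float = 525.0,
                cx: float = 319.5, cy: float = 239.5) -> tuple:
    """Back-project a depth pixel to (X, Y, Z) in metres."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Example: a point imaged at pixel (400, 250) at 3.2 m depth.
print(pixel_to_3d(400, 250, 3.2))  # -> (~0.49, ~0.06, 3.2)
```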
For example, the projector 16 displays an image of a gift given to the performer A, B, C on the stage by a method such as projection mapping. The studio display 17 is a display device that is disposed in the studio 10, which is the actual space, and displays video; for example, it is disposed in front of the stage so that the performers A, B, C can see it. The studio display 17 is a flat panel display such as an LCD or organic EL display device, and displays the video of the performance of the performers A, B, C captured by the RGB camera 14.
The server 20 generates live data as content data performed by the performer A, B, C. For example, the server 20 generates live broadcast data for a performance performed by the performer A, B, C to be distributed to the user terminals 40, 60, and 70 based on various data such as music data from the playback device 11, audio data from the microphone 13, and video data from the RGB camera 14, and distributes the live broadcast data to the user terminals 40, 60, and 70 live broadcast. That is, the server 20 live-broadcasts the performance by the performer A, B, C to the user terminals 40, 60, 70.
Note that instead of reproducing the accompaniment with the playback device 11, the performers A, B, C may perform by actually playing musical instruments such as guitar and/or drums, with the sound picked up by microphone. Live data may also be generated in the studio 10 by a data generating device or the like and transmitted to the server 20.
The users A, B, C participating in the live distribution system 1 are, for example, fans of the performers A, B, C, and can view live data using the user terminals 40, 60, 70. The user terminal 40 includes, for example, a desktop or notebook personal computer 40a and a smart watch 50 as a wearable terminal and/or smart device terminal connected to the personal computer 40a. When the personal computer 40a is a desktop type, the user terminal 40 includes the desktop personal computer 40a, a display connected to the personal computer 40a, and the smart watch 50 connected to the personal computer 40a. When the personal computer 40a is a notebook type, the user terminal 40 includes the notebook personal computer 40a having a display unit and the smart watch 50 connected to it. For example, user A of the user terminal 40 wears the smart watch 50 on the dominant hand or the like, and the smart watch 50 and the personal computer 40a are connected by wire or wirelessly. The smart watch 50 includes a detection unit such as an acceleration sensor and/or a gyro sensor, and detects, as user motion information, the acceleration, angle (posture), and/or angular velocity of user A when, for example, user A makes a motion of throwing an object.
The personal computer 40a may be connected to a head mounted display (HMD) by wire or wirelessly, and the HMD may itself serve as the personal computer 40a. The HMD may be an optical see-through head mounted display, a video see-through head mounted display, an opaque head mounted display, or the like. An optical see-through head mounted display enables display based on AR (augmented reality); a video see-through or opaque head mounted display enables display based on VR (virtual reality). The HMD can display the images of gifts and/or return gifts described later.
In addition, the user terminal 60 is, for example, a smart device terminal such as a smartphone or tablet, i.e., a portable small information processing terminal. A smartphone, for example, includes a touch panel on its display surface. The user terminal 60 includes a detection unit such as an acceleration sensor and/or a gyro sensor, and detects, as user motion information, the acceleration, angle, and/or angular velocity of user B when user B makes a motion of throwing an object. Since the user terminal 60 is a portable small information processing terminal, its user can view live data wherever he or she is.
The user terminal 70 of user C includes a smart device terminal 60a and a smart watch 50. In this case, the smart device terminal 60a takes on the function of user A's notebook personal computer 40a. Thus, even when user C makes a motion of throwing an object with the dominant hand wearing the smart watch 50, user C can view the video displayed on the display unit of the smart device terminal 60a held in the other hand. The throwing motion can also be made with the smart device terminal 60a placed on a desk or the like or fixed to a tripod, which likewise allows the user to make a throwing motion with the dominant hand wearing the smart watch 50 while viewing the video displayed on the display unit of the smart device terminal 60a.
Using the user terminals 40, 60, 70, the users can watch live data and virtually give a gift to the performers A, B, C who are actually performing at that moment. For example, a gift selection object image is displayed on the display surface of the user terminal 40, 60, 70 together with the live data, and images that are 1st images, representing gifts that can be given to the performers A, B, C, are displayed as a list in the gift selection object image. Examples of gifts include ornaments such as bouquets and hair bands, special effects added to the performer's movements as seen on the user terminals 40, 60, 70, and background images for the place where the performers actually perform. The user A, B, C selects a gift from the list of images in the gift selection object image, and further selects the performer to be given the gift from among the performers A, B, C. Fig. 1 shows an example in which performer A is selected as the recipient and a cat-ear hair band is selected as the gift.
Then, user A of the user terminal 40 swings the arm wearing the smart watch 50 to make a motion of throwing an object. User B of the user terminal 60 swings an arm while holding the user terminal 60 to make a motion of throwing an object. User C of the user terminal 70 swings the arm wearing the smart watch 50 to make a motion of throwing an object. At this time, the user terminals 40, 60, 70 transmit to the server 20 operation data such as acceleration data, angle (posture) data, and angular velocity data, i.e., the detection results constituting the user motion information. Further, on the user terminals 60 and 70, the user may instead perform a swipe operation with a finger or stylus toward the performer A, B, C displayed on the display surface showing the live data, and operation data such as the coordinate data may be transmitted to the server 20.
At this time, the server 20 uses the projector 16 to display, on the floor of the studio 10, the image of the gift 18 indicated by the gift ID transmitted from the user terminal 40, 60, 70. As an example, the gift 18 is displayed in front of the performer indicated by the performer ID transmitted from the user terminal 40, 60, 70. Fig. 1 shows an example in which a hair band gift 18 is displayed as an image in front of performer A. As an example, the projector 16 displays the gift 18 so that it appears to be thrown from the user A, B, C side, i.e., from a position in front of the performers A, B, C, toward the performers A, B, C. In the actual space, the floor position where the gift lands, that is, the gift position where the gift 18 is finally displayed, is a specific position in the studio 10 determined based on operation data such as the acceleration data, angle data, and angular velocity data constituting the user motion information transmitted from the user terminals 40, 60, 70. The gift position is expressed by three-dimensional position information such as depth information, and can be specified, for example, in a three-dimensional coordinate system whose origin is the detection unit of the depth camera 15.
For example, when the user A, B, C makes a gift-throwing motion with little force, the gift 18 is displayed at a position far in front of the performer A, B, C. When the user A, B, C makes a more forceful throwing motion, the gift position at which the gift 18 is displayed is closer in front of the performer A, B, C. And when the user A, B, C throws a gift with excessive force, the gift bounces off the wall located behind the performers A, B, C, and the gift position at which the gift 18 is displayed ends up in front of or behind the performer A, B, C. The performer A, B, C can thus see the gift as if it had been thrown toward him or her by a user A, B, C who is not actually in the studio 10. On the display surface of the user terminal 40, 60, 70, the gift being thrown toward the performer A, B, C is likewise displayed, with the gift object image placed at the corresponding position within the video captured by the RGB camera 14. Thus, the user A, B, C can also see that the gift he or she threw has arrived near the performer A, B, C. Further, as an example, when the user A, B, C makes a motion of throwing a gift to the right or left of a specific performer A, the gift position at which the gift 18 is displayed in the actual space is to the right or left of performer A, according to the direction in which the gift was thrown. As an example, when a gift is thrown to the right of performer A, the gift position at which the gift 18 is displayed may be in front of performer B or performer C. Such gift positions are determined as three-dimensional position information decided based on operation data such as the acceleration data, angle data, and angular velocity data detected by the user terminal 40, 60, 70.
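The mapping from throwing force to landing position described above can be pictured with a simple projectile model. The following is a minimal sketch under assumed constants (acceleration-to-speed scaling, wall distance, bounce factor); the patent does not disclose its actual calculation at this level of detail.

```python
import math

# Hedged sketch: swing strength (from accelerometer data) and swing
# direction are turned into a landing point in studio coordinates.
# All constants below are assumptions for illustration.

G = 9.8              # gravitational acceleration [m/s^2]
WALL_Z = 6.0         # assumed distance from thrower origin to back wall [m]
RESTITUTION = 0.3    # assumed fraction of overshoot kept after bouncing

def landing_point(peak_accel: float, yaw_rad: float,
                  launch_angle_rad: float = math.pi / 4) -> tuple:
    """Estimate where a virtually thrown gift lands (x: lateral, z: depth)."""
    v0 = 0.1 * peak_accel          # assumed accel -> launch speed scaling
    rng = v0 ** 2 * math.sin(2 * launch_angle_rad) / G  # projectile range
    if rng > WALL_Z:               # bounced back off the rear wall
        rng = WALL_Z - (rng - WALL_Z) * RESTITUTION
    return (rng * math.sin(yaw_rad), rng * math.cos(yaw_rad))

# Weak throw: lands far in front of the performer; stronger throw: closer.
print(landing_point(peak_accel=40.0, yaw_rad=0.0))   # gentle throw
print(landing_point(peak_accel=75.0, yaw_rad=0.15))  # stronger, slightly right
```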
The studio display 17 may also display video similar to that on the display surfaces of the user terminals 40, 60, 70.
The depth camera 15 of the studio 10, for example, continuously calculates three-dimensional position information such as depth information for each location in the studio 10. The depth camera 15, for example, extracts the person regions of the performers A, B, C, segmenting the image into person regions and non-person regions. The depth camera 15 acquires 25 bone positions of each performer A, B, C as skeleton data, and calculates depth information for each bone position. Examples of the bone positions include those of the left and right hands, the head, the neck, the left and right shoulders, the left and right elbows, the left and right knees, and the left and right feet. The number of acquired bone positions is not limited to 25. In addition, the depth camera 15 calculates the distances to the walls and/or floor of the studio 10. Here, the depth information is, for example, the distance from the objective lens or sensor surface at the front of the depth camera 15 to the position of the subject (each position on the walls and/or floor of the studio 10), or to a bone position of a performer as the subject.
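A minimal sketch of how this skeleton data might be represented is shown below; the joint names and container layout are assumptions, with only the idea of 25 named bone positions with per-joint depth taken from the text.

```python
from dataclasses import dataclass

# Sketch of per-performer skeleton data: named joints, each with a 3D
# position whose z component is the depth measured by the depth camera.

@dataclass
class Joint:
    name: str        # e.g. "right_hand", "head", "left_knee"
    x: float         # lateral position [m]
    y: float         # height [m]
    z: float         # depth from the camera [m]

@dataclass
class Skeleton:
    performer_id: str
    joints: dict     # joint name -> Joint (the text mentions 25 positions)

skeleton_a = Skeleton(
    performer_id="A",
    joints={
        "head": Joint("head", 0.1, 1.6, 3.4),
        "right_hand": Joint("right_hand", 0.4, 1.1, 3.2),
    },
)
print(skeleton_a.joints["right_hand"].z)  # depth of performer A's right hand
```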
When the right or left hand of performer A overlaps the gift position where the gift 18 is displayed, that is, as an example, when the position given by the depth information of performer A's right or left hand overlaps the gift position, performer A is considered to have picked up the gift. The projector 16 then stops displaying the gift 18. The gift here is a hair band, so performer A makes the motion of picking up the hair band and putting it on his or her head. Meanwhile, on the display screens of the user terminals 40, 60, 70, the server 20 displays the gift as if it were picked up and held in performer A's right or left hand, and then displays the hair band as if it were worn on performer A's head. Thus, the user A, B, C can see on the display surface of the user terminal 40, 60, 70 that the hair band he or she gave has been noticed by performer A, and can see performer A wearing it.
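The pick-up determination can be illustrated as a proximity test between a hand's 3D position and the gift position. The sketch below assumes a 0.15 m pickup radius, which is an invented threshold; the patent only states that the positions overlap.

```python
import math

# Sketch of the pick-up test: the gift counts as picked up when either
# hand joint comes within a small radius of the gift position.

PICKUP_RADIUS_M = 0.15   # assumed threshold

def is_picked_up(gift_pos: tuple, right_hand: tuple, left_hand: tuple) -> bool:
    return (math.dist(gift_pos, right_hand) <= PICKUP_RADIUS_M or
            math.dist(gift_pos, left_hand) <= PICKUP_RADIUS_M)

gift = (0.5, 0.0, 3.0)
print(is_picked_up(gift,
                   right_hand=(0.55, 0.05, 3.05),
                   left_hand=(-0.4, 1.0, 3.2)))
# True -> the projector stops displaying the gift and the worn image is shown
```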
After that, the users A, B, C can see on the display surfaces of the user terminals 40, 60, 70 performer A performing while wearing the hair band. For example, when performer A faces sideways, the object image of the hair band is displayed in the direction matching performer A's orientation. The orientation of each performer A, B, C is determined by performing face detection on the performer A, B, C from the display data from the RGB camera 14 and by calculating the skeleton positions of the performer A, B, C using the depth camera 15. The data of the object image displaying the gift is also three-dimensional data. For example, when it is detected from these data that performer A is facing sideways, the direction of the object image of the hair band is changed to match performer A's orientation. The state in which performer A wears the hair band is also displayed on the studio display 17, enabling the performers A, B, C to see the state displayed on the user terminals 40, 60, 70.
As an example, the server 20 can identify, at the time a user gives a gift to a performer A, B, C (the time the performer A, B, C collects the gift), all gifts given to the performer A, B, C and the user IDs corresponding to them. Thus, the server 20 can determine the user ID of the given gift at the moment the performer A, B, C picks it up.
When an interlude begins, as an example, an object image of the user ID of the user who gave the hair band to performer A is displayed on the display surface of the studio display 17 of the studio 10. In addition, as an example, an object image of the user ID is displayed on the floor of the studio 10 using the projector 16. Thus, performer A can identify the user A, B, C who gave the hair band.
For example, when performer A speaks the user ID into the microphone 13, the server 20 performs voice recognition processing on the voice data from the microphone 13, thereby identifying the user ID of the user who is to receive a return gift, and a return gift can be made to that user. The voice recognition processing can also be performed by devices other than the microphone 13 (the server 20, a device installed in the studio 10, or the like). Further, as an example, in the return gift processing, when a performer A, B, C touches a gift he or she is wearing (e.g., a hair band) and makes a motion of throwing a return gift, the user ID of the user who gave the touched gift is determined and the return gift can be made to that user. Further, for example, the server 20 can identify all live viewers by user ID and have the performer A, B, C make a return gift to users who purchased gifts of a predetermined amount or more. Further, as an example, the user terminal may include a line-of-sight detection unit that detects the user's line of sight, and the time during which the user gazes at a specific performer may be calculated. In this case, when performer A performs the return gift processing, the return gift can be directed to a user whose user ID has continuously gazed at performer A for a certain time or longer. For user IDs that have not gazed at performer A for the certain time, the return gift is directed to a user ID extracted at random.
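The gaze-based selection of a return-gift recipient described above can be sketched as follows; the 60-second gaze threshold, function name, and data layout are assumptions, while the fallback to a random user ID follows the text.

```python
import random

# Sketch of the recipient-selection rule: prefer a user who has gazed
# at the performer for at least a fixed time; otherwise pick at random.

GAZE_THRESHOLD_S = 60.0   # assumed "certain time"

def pick_return_gift_recipient(gaze_seconds: dict) -> str:
    """gaze_seconds maps user ID -> cumulative gaze time on this performer."""
    watchers = [uid for uid, t in gaze_seconds.items() if t >= GAZE_THRESHOLD_S]
    if watchers:
        return random.choice(watchers)
    return random.choice(list(gaze_seconds))   # fall back to a random user

print(pick_return_gift_recipient({"userA": 75.0, "userB": 12.0, "userC": 64.0}))
```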
As described above, in the case of a return gift, a 2nd object image, which is the return present given by performer A to the user A, B, C, is displayed on the display surfaces of the studio display 17 and the user terminals 40, 60, 70. Specifically, when the return gift is performer A's signature ball, the object image of the signature ball is displayed on the display surfaces of the studio display 17 and the user terminals 40, 60, 70 at the position of performer A's right or left hand, as if performer A were holding the signature ball in that hand. This enables performer A to grasp that he or she is currently holding a signature ball, and the user A, B, C to grasp that performer A is about to give a signature ball as a return gift.
When performer A makes the motion of throwing the signature ball, the depth camera 15 detects that performer A has thrown it based on, for example, the change in depth information of the hand holding the signature ball. At this time, the display surface of the user terminal 40, 60, 70 shows performer A throwing the signature ball toward the user A, B, C. When an operation of catching the signature ball displayed on the display surface is performed on the user terminal 40, 60, 70, the user A, B, C receives the signature ball. The return gift is not limited to a signature ball. The user A, B, C can throw a gift received as a return gift, such as a signature ball, back to the performer A, B, C at the next live distribution. The return gift may also be mailed to the user A, B, C at a later date. In the mailing, instead of the actual signature ball, a signed card and/or merchandise related to the performer, an album such as a CD or DVD, a concert coupon, or the like may be sent.
[ depth camera 15 ]
The depth camera 15 includes, for example, a light projecting unit such as a projector that emits pulsed infrared light and an infrared detecting unit such as an infrared camera, and calculates depth information from the time it takes for the emitted infrared pulse to return after reflection (Time of Flight (TOF) method). The depth camera 15, for example, continuously calculates three-dimensional position information such as depth information for each location in the studio 10.
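The TOF relation itself is simple arithmetic: the measured round-trip time of the infrared pulse gives depth as c·t/2. A worked example:

```python
# Worked example of the TOF relation: the round-trip time of an
# infrared pulse gives the distance as depth = c * t / 2.

C = 299_792_458.0  # speed of light [m/s]

def tof_depth(round_trip_s: float) -> float:
    return C * round_trip_s / 2.0

# A pulse returning after 20 ns corresponds to roughly 3 m of depth.
print(tof_depth(20e-9))  # ~2.998 m
```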
Further, the depth camera 15 extracts the person regions of the performers A, B, C, segmenting the image into person regions and non-person regions. For example, the person region is calculated from the difference between images of the same place (for example, the studio 10) with and without the person. In addition, as an example, a region in which the detected infrared amount exceeds a threshold is determined to be a person region.
The depth camera 15 also detects bone positions. The depth camera 15 acquires depth information for each position in the person region, identifies the body parts in the actual space of the person imaged in the person region (left and right hands, head, neck, left and right shoulders, left and right elbows, left and right knees, left and right feet, and the like) based on feature quantities of depth and shape, and calculates the center position of each part as a bone position. Using a feature quantity dictionary stored in the storage unit, the depth camera 15 identifies each part in the person region by comparing feature quantities determined from the person region with the feature quantities of each part registered in the dictionary.
Alternatively, the depth camera 15 may output the detection results obtained by the infrared detecting unit to another device (the server 20, the user terminals 40, 60, 70, a computing device installed in the studio 10, or the like), and that other device may perform processing such as calculating depth information, extracting person regions, segmenting person and non-person regions, detecting bone positions, and identifying the parts within the person region.
Note that the motion capture processing described above is performed without placing a marker on the performer A, B, C, but a marker may be placed on the performer A, B, C.
In addition, when calculating the depth information, a method of reading the projected infrared pattern and obtaining depth from the deformation of the pattern (Light Coding method) may be employed.
Further, depth information may be calculated from parallax information obtained by a stereo camera or a plurality of cameras. The depth information may also be calculated by performing image recognition on the video acquired by the RGB camera 14 and analyzing it using a photogrammetric technique or the like. In this case, the RGB camera 14 functions as the detection unit, so the depth camera 15 is unnecessary.
[ server 20 ]
As shown in fig. 2, the server 20 includes interfaces (hereinafter simply "IF") with the respective parts of the studio 10, connected by wire or wirelessly. The IFs connected to the respective parts of the studio 10 include an audio IF21, an RGB camera IF22, a depth camera IF23, a projector IF24, and a display IF25. The server 20 further includes a database 26, a data storage unit 27, a network IF28, a main memory 29, and a control unit 30. The server 20 distributes live data to the user terminals 40, 60, 70, and also functions as a display control device that controls the projector 16, the studio display 17, and/or the displays of the user terminals 40, 60, 70.
The audio IF21 is connected to the playback device 11, the microphone 13, and the like of the studio 10, and receives music data from the playback device 11 and the voice data of the performers A, B, C from the microphone 13.
The RGB camera IF22 receives the video data of the studio 10 captured by the RGB camera 14. The depth camera IF23 receives depth information of the studio 10 and/or the performers A, B, C, person region data, depth information of bone positions, and the like. The projector IF24 performs control for displaying a gift on the floor of the stage of the studio 10 or the like via the projector 16. The display IF25 controls the studio display 17 provided in the studio 10. As an example, the display IF25 displays on the display surface of the studio display 17 the gift image given to a performer A, B, C and/or the user ID of the giver. Thus, the performers A, B, C can know who gave the gift.
The database 26, as a management unit, manages users' gifts on a per-live-broadcast basis in association with the user IDs of the users registered in the system. Specifically, the database 26 manages, in association with each user ID and per live broadcast, the gift ID, the gift send-out destination ID, whether the gift was accepted, whether a return gift was made, and whether the return gift was successfully received.
The gift ID is an ID that uniquely identifies a gift purchased by the user, and in each live broadcast, is an ID that uniquely identifies a gift given by the user.
The gift sending-out destination ID is an ID that uniquely determines the performer who has given the gift.
Gift acceptance indicates whether the performer selected by the user collected the gift given by the user.
Return gift indicates whether the performer who collected the user's gift made a return gift to the user who sent it.
Return gift receipt indicates whether the user successfully received the return gift.
In addition, the database 26 manages all users who can participate in live distribution in association with their user IDs. The users participating in each live broadcast are selected from all registered users. The database 26 also manages, for example, the price of each gift in association with its gift ID, and, as an example, the total purchase amount of each user for each performer.
Fig. 2 shows a case where user A gave the gift with gift ID "A" (bouquet) to performer C, but performer C did not collect it. It also shows a case where user B gave the gift with gift ID "C" (hair band) to performer A; performer A collected the gift and made a return gift, which user B collected. It further shows a case where user C gave the gift with gift ID "B" (special effect) to performer B; performer B collected the gift and made a return gift, but user C failed to collect the return gift.
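A minimal sketch of the per-broadcast gift record implied by this description is given below; the field names are assumptions, and the three example records mirror the states of users A, B, and C described above.

```python
from dataclasses import dataclass
from typing import Optional

# Sketch of the per-live-broadcast gift record the database manages.

@dataclass
class GiftRecord:
    user_id: str
    gift_id: str                 # e.g. "A" = bouquet, "C" = hair band
    recipient_performer_id: str  # gift send-out destination
    accepted: bool               # did the performer pick the gift up?
    return_gift_sent: bool       # did the performer throw a return gift?
    return_gift_received: Optional[bool]  # None until a return gift exists

records = [
    GiftRecord("userA", "A", "C", accepted=False, return_gift_sent=False,
               return_gift_received=None),
    GiftRecord("userB", "C", "A", accepted=True, return_gift_sent=True,
               return_gift_received=True),
    GiftRecord("userC", "B", "B", accepted=True, return_gift_sent=True,
               return_gift_received=False),
]
```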
The data storage unit 27 is a storage device such as a hard disk. It stores the control programs related to the live distribution system 1, display data for displaying gift images, and the like. As an example of display data for displaying a gift image, when the gift is a tangible object such as an ornament, the display data is three-dimensional data configured so that the ornament is displayed according to the performer's orientation: the ornament is shown from the front when the performer faces front, and shown sideways when the performer faces sideways. The control programs include, for example, a distribution program for distributing live data to the user terminals 40, 60, 70; a gift display control program for displaying, via the projector 16, the image of a gift given by a user on the floor of the studio 10; a display device control program for displaying the gift image given by a user on the display surfaces of the studio display 17 and/or the user terminals 40, 60, 70 in association with the performer A, B, C; a display device control program for displaying the object image of the giver's user ID on those display surfaces; and a display device control program for displaying, when the performer who received the gift returns a gift to the user, the return gift image on the display surfaces of the studio display 17 and/or the user terminals 40, 60, 70.
The network IF28 is connected to the server 20 and the user terminals 40, 60, and 70 via a network 2 such as the internet. The main memory 29 is, for example, a RAM, and temporarily stores live broadcast data and/or control programs during distribution.
The control unit 30 is, for example, a CPU, and controls the overall operation of the server 20. The control unit 30 is, for example, a distribution unit that distributes live data to the user terminals 40, 60, 70 according to the distribution control program. The control unit 30 is, for example, a gift display control unit that displays, via the projector 16 and according to the gift display control program, a gift sent by a user on the floor of the studio 10. The control unit 30 is also, for example, a display device control unit that controls the displays of the user terminals 40, 60, 70 and the studio display 17.
The control unit 30 generates display data for displaying the gift image sent by a user in association with the performer A, B, C, and displays it on the display surfaces of the studio display 17 and/or the user terminals 40, 60, 70. As an example, it also generates and displays display data showing the giver's user ID. Further, for example, when the performer who was given the gift returns a gift to the user, the control unit 30 generates display data for the return gift and displays it on the display surfaces of the studio display 17 and/or the user terminals 40, 60, 70.
When the gift image is displayed on the display surfaces of the studio display 17 and/or the user terminals 40, 60, 70, it is displayed at the position the gift would occupy in the actual space of the studio 10. That is, the gift position in the actual space of the studio 10 is determined by the three-dimensional position information. The object image of the gift displayed on the display surfaces of the studio display 17 and/or the user terminals 40, 60, 70 is therefore displayed at the appropriate position within the image acquired under the current orientation of the RGB camera 14, even if that orientation changes. Further, when the gift position is outside the shooting range of the RGB camera 14, the gift image is no longer displayed on the display surfaces of the studio display 17 and/or the user terminals 40, 60, 70. In addition, even when a performer A, B, C squats or jumps, the gift image is displayed on the display surfaces of the studio display 17 and/or the user terminals 40, 60, 70 following the movement of the performer A, B, C.
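Displaying the gift at its real-space position regardless of camera orientation amounts to projecting a fixed 3D gift position through the camera's current pose and hiding the image when the projection falls outside the frame. The sketch below assumes pinhole intrinsics and a gift position already expressed in the camera's coordinates; none of these specifics come from the patent.

```python
# Sketch of placing the gift's object image on the 2D display surface:
# the fixed gift position is projected through the RGB camera, and the
# image is hidden when the gift falls outside the frame.

def project_to_image(point_cam: tuple, fx: float = 1000.0, fy: float = 1000.0,
                     cx: float = 960.0, cy: float = 540.0,
                     width: int = 1920, height: int = 1080):
    """point_cam is the gift position in the RGB camera's coordinates."""
    x, y, z = point_cam
    if z <= 0:                        # behind the camera
        return None
    u = fx * x / z + cx
    v = fy * y / z + cy
    if 0 <= u < width and 0 <= v < height:
        return (u, v)                 # draw the gift image here
    return None                       # off screen: do not draw

print(project_to_image((0.5, 0.2, 3.0)))   # -> (~1126.7, ~606.7)
print(project_to_image((4.0, 0.2, 3.0)))   # -> None (outside the frame)
```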
The control unit 30 need not perform all of the above-described processing itself, and may perform part of it in cooperation with other devices. As an example, a control device such as a personal computer may be installed in the studio 10, and the control device and the server 20 may perform the above-described processing cooperatively. In this case, for example, the server 20 includes the database 26, the main memory 29, and the control unit 30, while the control device includes the audio IF21, the RGB camera IF22, the depth camera IF23, the projector IF24, the display IF25, the data storage unit 27, and the network IF28. Further, for example, the control device may be made to perform all processing other than updating the database 26, such as displaying the image of a gift given by a user via the projector 16 and displaying the gift image on the display surfaces of the studio display 17 and/or the user terminals 40, 60, 70.
Further, part of the above-described processing may be performed in cooperation with the user terminals 40, 60, 70. For example, the video data of the actual space acquired by the RGB camera 14, the three-dimensional position information acquired by the depth camera 15, and the like are transmitted to the user terminals 40, 60, 70. The user terminals 40, 60, 70 then detect the motion of the user A, B, C and, based on the video data of the actual space, the three-dimensional position information, and the detected motion, render on their display screens the trajectory of the gift to its final arrival position, the display of the gift image, and so on.
[ user terminal 40 ]
The user terminal 40 is a device managed by the user a, and includes a desktop or notebook personal computer 40a and a smart watch 50. For example, the notebook personal computer 40a includes an audio IF41, a display IF42, a network IF43, a communication IF44, a data storage unit 45, an operation IF46, a main memory 47, and a control unit 48. Audio IF41 is coupled to a sound output device such as a speaker, earphone, headset, etc., and/or a sound input device such as a microphone. The display IF42 is connected to a display unit 49 formed of a display device such as a liquid crystal display device, for example.
The network IF43 communicates with the server 20 via the network 2, for example. The communication IF44 communicates with the smart watch 50, for example. The communication IF44 and the smart watch 50 are connected by a wireless LAN and/or a wired LAN, and acceleration data, angle data, angular velocity data, and the like are input as user motion information from the smart watch 50. The data storage unit 45 is a nonvolatile memory, for example a hard disk and/or flash memory, and stores a playback program for live data, a communication control program for the smart watch 50, and the like. The operation IF46 is connected to an operation device such as a keyboard and/or mouse. When a touch panel is provided on the display surface of the display unit 49 connected to the display IF42, the operation IF46 is connected to the touch panel. The main memory 47 is, for example, a RAM, and temporarily stores live data and/or control programs during distribution. The control unit 48 is, for example, a CPU, and controls the overall operation of the user terminal 40. For example, during playback of live data, the control unit 48 transmits to the server 20 performer selection data for 1 or more of the performers A, B, C, and gift selection data for 1 or more gifts in the gift image list. The control unit 48 also transmits to the server 20, for example, operation data such as the acceleration data, angle data, and angular velocity data detected by the smart watch 50 as user motion information.
The smart watch 50 is, for example, a wristwatch-type information processing terminal that is worn on the wrist of the dominant hand of the user a. The smart watch 50 includes a sensor 51, a communication IF52, a data storage unit 53, a main memory 54, and a control unit 55. The sensor 51 is, for example, an acceleration sensor and/or a gyro sensor. As an example, the communication IF52 transmits acceleration data detected by the sensor 51, angle data of the smart watch 50, and/or angular velocity data to the personal computer 40 a. As an example, when the user a performs a motion of projecting an object, the sensor 51 transmits operation data such as acceleration data, angle data, and angular velocity data, which are user motion information related to arm swing, to the personal computer 40a from the communication IF 52. The data storage unit 53 is a nonvolatile memory, for example, a hard disk and/or a flash memory. The data storage 53 stores a driver for driving the sensor 51, a communication control program for the personal computer 40a, and the like. The control unit 55 is, for example, a CPU, and controls the operation of the entire smart watch 50.
In addition, the terminal connected to the user terminal 40 may be a small-sized portable information processing terminal such as a smartphone equipped with an acceleration sensor and/or a gyro sensor, instead of the smart watch 50.
[ user terminal 60 ]
The user terminal 60 is, for example, a device managed by user B, and is a smart device terminal such as a smartphone or tablet. The user terminal 60 includes, for example, an audio IF61, a display IF62, an operation IF63, a sensor 64, a network IF65, a data storage unit 66, a main memory 67, and a control unit 68. The audio IF61 is connected to an audio output device such as a built-in speaker and/or headphones and/or an audio input device such as a built-in microphone, and plays, as an example, live data through the audio output device. The display IF62 is connected to a small display 69 such as a built-in liquid crystal panel or organic EL panel. The display 69 is provided with a touch panel, and the operation IF63 is connected to the touch panel. The sensor 64 is, for example, an acceleration sensor and/or a gyro sensor. The network IF65 communicates with the server 20 via the network 2, for example. As an example, when the user makes a motion of throwing an object, the network IF65 transmits to the server 20, as user motion information, operation data including the acceleration data, angle data, and angular velocity data of the arm swing detected by the sensor 64. Further, when a swipe operation toward a displayed performer A, B, C is performed with a finger and/or stylus on the display surface showing the live data, the network IF65 transmits operation data such as the coordinate data to the server 20. The data storage unit 66 is a nonvolatile memory, for example a flash memory, and stores a playback program for live data and the like. The main memory 67 is, for example, a RAM, and temporarily stores live data and/or control programs during distribution.
The control unit 68 is, for example, a CPU, and controls the overall operation of the user terminal 60. For example, during playback of live data, the control unit 68 transmits to the server 20 performer selection data for 1 or more of the performers A, B, C and gift selection data for 1 or more gifts in the gift image list. When the user holds the user terminal 60 and makes a motion of throwing an object, the control unit 68 transmits, for example, operation data such as the acceleration data, angle data, angular velocity data, and coordinate data of the arm swing to the server 20.
The operation of the live distribution system 1 will be described below.
[ live broadcast processing ]
Before the live broadcast, in the studio 10, the depth camera 15 first acquires depth information for each location of the studio 10 and computes the person regions; the bone positions within each person region and the depth information at each bone position can then be calculated. The depth camera 15 thus carries out the motion capture processing. The user terminals 40, 60, 70 are also registered with the server 20 so that they can view the live distribution.
As shown in fig. 3 (a), when the performers A, B, C start the performance, the server 20 generates live data as content data in step S1. Specifically, the server 20 receives the video data, captured by the RGB camera 14, of the actual space in which the performers A, B, C perform in the studio 10. In addition, music data is input to the server 20 from the playback device 11, and the voice data of the performers A, B, C is input from the microphone 13. The server 20 then generates, from these various data, the live data of the performance to be distributed to the user terminals 40, 60, 70. Further, in step S2, the server 20 receives the depth information of the studio 10 and/or the bone positions of the performers A, B, C. In step S3, the server 20 live-distributes the live data to the user terminals 40, 60, 70. That is, the server 20 distributes the performance of the performers A, B, C to the user terminals 40, 60, 70 in real time. As a result, as shown in fig. 3 (b), the user terminals 40, 60, 70 display the live video 71 based on the live data on their display surfaces and output the live sound.
[ Gift/performer selection processing ]
Next, the gift/performer selection processing will be explained using fig. 4. Here, the case where user B views and operates live data through the user terminal 60 is described as an example. In step S11, the server 20 displays the gift selection object image 72 on the display surface of the user terminal 60, superimposed on the live video 71 (see fig. 5 (a)); the gift selection object image 72 shows the selectable gift images as a list. As an example, in fig. 5 (a), the gift selection object image 72 contains, arranged in a row from the left, an object image 72a representing a bouquet gift, an object image 72b representing a special-effect gift added to the performer's movements, an object image 72c representing a cat-ear hair band gift, and an object image 72d representing a background image gift for the live distribution.
The gifts listed in the gift selection object image 72 are prepared by the operator. The prepared gifts may differ for each live broadcast or be common to all live broadcasts, and some gifts may recur across multiple live broadcasts. In the database of the server 20, the price of each gift is managed in association with its gift ID. The server 20 stores, for example, moving image data, sound data, and/or music data in association with the gift ID as gift data for displaying the gift. The gift data is, as an example, three-dimensional data.
Each gift is paid for, with the amount determined by the gift and the price associated with the gift ID. As an example, the bouquet gift with gift ID "A" costs 200 yen, the special-effect gift with gift ID "B" costs 300 yen, the cat-ear hair band gift with gift ID "C" costs 500 yen, and the background image gift with gift ID "D" costs 1000 yen. As an example, user A shown in the database of fig. 2 purchased the bouquet with gift ID "A" for 200 yen, user B purchased the hair band with gift ID "C" for 500 yen, and user C purchased the special effect with gift ID "B" for 300 yen. The users A, B, C can thus purchase gifts via the user terminals 40, 60, 70 and give them to the performers A, B, C, so the performers A, B, C and the operator obtain sales corresponding to the gifts given by the users A, B, C. All gifts given by the users A, B, C become sales for the performers A, B, C and the operator regardless of whether the performer collects (e.g., picks up) them. There may also be free gifts. In addition, in 1 live broadcast, 1 user can purchase 1 gift or multiple gifts. The database manages each user's total purchase amount per performer, so that, for example, the server 20 managed by the operator can preferentially direct return gifts to users who have purchased many gifts.
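The pricing described here can be summarized as a small catalogue keyed by gift ID. The sketch below encodes the four example prices and totals purchases per user; the data layout is an assumption.

```python
# Sketch of the gift catalogue: each gift ID maps to a display name and
# a price in yen, and purchases are totalled per user.

GIFT_CATALOG = {
    "A": ("bouquet", 200),
    "B": ("special effect", 300),
    "C": ("cat-ear hair band", 500),
    "D": ("background image", 1000),
}

purchases = {"userA": ["A"], "userB": ["C"], "userC": ["B"]}

for user, gift_ids in purchases.items():
    total = sum(GIFT_CATALOG[g][1] for g in gift_ids)
    print(user, total)   # userA 200, userB 500, userC 300
```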
When user B selects 1 gift from the list in the gift selection object image 72 on the user terminal 60, the user terminal 60 transmits gift selection data including the user ID and the gift ID of the selected gift to the server 20. In step S12, the server 20 performs gift selection processing based on the gift selection data. The server 20 transmits to the user terminal 60 selection data for displaying only the object image 72c of the selected gift, and the object image 72c is displayed on the display surface of the user terminal 60 superimposed on the live video 71. Fig. 5 (b) shows, as an example, the state in which the object image 72c representing the hair band gift has been selected and is displayed at a lower corner of the display surface of the user terminal 60. The server 20 also shows the same display on the studio display 17 to notify the performers A, B, C that gift selection processing is currently in progress.
Further, as an example, the server 20 displays on the display surface of the user terminal 60 a performer selection image 73 for each of the performers A, B, C. Here, as an example, a 1st presentation image 73a notifying user B that the performer selection operation is to be performed next is also displayed on the display surface of the user terminal 60. When 1 performer selection image 73 is selected on the user terminal 60, the user terminal 60 transmits performer selection data including the user ID and the performer ID of the selected performer to the server 20. In step S13, the server 20 performs performer selection processing based on the performer selection data. The server 20 displays a performer determination image 74 for the selected performer A on the display surface of the user terminal 60, superimposed on the live video 71. Fig. 5 (c) shows, as an example, the state in which performer A has been selected and is so displayed on the display screen of the user terminal 60. The performer selection image 73 and/or the performer determination image 74 may be quadrilateral, but are not limited to this shape and may be, for example, circular or triangular. As shown in fig. 5 (d), the server 20 also shows the same display on the studio display 17 to notify the performers A, B, C that performer selection processing is currently in progress. On the studio display 17, a 2nd presentation image 74a indicating that user B is about to send the selected gift is displayed.
In addition, after the gift and the performer are selected, the server 20 registers them in the database 26.
The user of the user terminal 60 is then in a state in which he or she can operate the user terminal 60 and give a gift to the performers A, B, C in the studio 10. Specifically, the user B of the user terminal 60 can obtain a simulated experience of throwing the selected gift to the performer he or she selected, by holding the user terminal 60 in hand and making a throwing motion. Specifically, the server 20 and the user terminal 60 start synchronization processing, and the user terminal 60 transmits, every unit time, operation data such as acceleration data, angle data, and angular velocity data, which are the user motion information detected by the sensor 64, to the server 20. In step S14, the server 20 stores threshold values for determining whether the user has performed a throwing action, and determines that a throwing action was performed at the user terminal 60 when a threshold value is exceeded. For example, the server 20 stores threshold values for the acceleration data, angle data, and angular velocity data in order to identify the throwing action, and determines that a throwing action was performed when the acceleration data, angle data, angular velocity data, or the like exceeds its threshold value. Likewise, when the distance between the start point and the end point of a sliding operation on the touch panel exceeds a threshold value, it is determined that a throwing action was performed. In the case of the user terminal 40, a throwing action is determined in the same way when the acceleration data, angle data, angular velocity data, or the like exceeds a threshold value.
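As an illustrative sketch only, the threshold comparison in step S14 could look like the following; the threshold values and field names are assumptions, not values from the disclosure.

```python
# Hypothetical thresholds for identifying a throwing action from the
# operation data (sensor 64) sent every unit time.
ACCEL_THRESHOLD = 15.0            # m/s^2, assumed
ANGULAR_VELOCITY_THRESHOLD = 8.0  # rad/s, assumed
SWIPE_DISTANCE_THRESHOLD = 200.0  # pixels, assumed

def is_throw(sample):
    """sample: dict with 'acceleration' and 'angular_velocity' readings."""
    return (sample["acceleration"] > ACCEL_THRESHOLD
            or sample["angular_velocity"] > ANGULAR_VELOCITY_THRESHOLD)

def is_swipe_throw(start_xy, end_xy):
    # Touch-panel variant: compare the swipe's start-to-end distance.
    dx = end_xy[0] - start_xy[0]
    dy = end_xy[1] - start_xy[1]
    return (dx * dx + dy * dy) ** 0.5 > SWIPE_DISTANCE_THRESHOLD
```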
In step S15, the server 20 analyzes the swing direction, speed, and the like of the user's arm based on the operation data, such as the acceleration data, angle data, and angular velocity data relating to the arm swing, transmitted from the user terminal 60. From this, the server 20 calculates the gift position, i.e., the trajectory and/or the landing position of the thrown gift. As an example, the gift position can be specified in a three-dimensional coordinate system or the like whose origin is the detection portion of the depth camera 15.
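As a sketch of how the landing point might be computed under simple projectile-motion assumptions (the mapping from the analyzed swing to a launch velocity, and the release height, are assumptions for illustration):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def gift_landing_position(speed, elevation_deg, azimuth_deg, release_height=1.5):
    """Return an (x, y) landing point on the studio floor in a coordinate
    system whose origin is the detection portion of the depth camera 15."""
    v_up = speed * math.sin(math.radians(elevation_deg))
    v_fwd = speed * math.cos(math.radians(elevation_deg))
    # Time of flight until the gift falls from release_height to the floor.
    t = (v_up + math.sqrt(v_up ** 2 + 2 * G * release_height)) / G
    r = v_fwd * t
    return (r * math.cos(math.radians(azimuth_deg)),
            r * math.sin(math.radians(azimuth_deg)))
```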
In step S16, the server 20 generates, based on the analysis result, display data for the falling object image 75 representing the falling hair band gift on the display surface of the user terminal 60, and transmits the display data to the user terminal 60. As a result, as shown in fig. 5 (e), the falling object image 75 is displayed in real time on the display surface of the user terminal 60 as if it were flying toward the performer A. The same display data of the falling object image 75 is transmitted to the studio display 17 of the studio 10, and the falling object image 75 is also displayed in real time on the studio display 17. The falling object image 75 is displayed on the display surface of the studio display 17 and/or the user terminal 60 at the position where the gift image should exist in the real space of the studio 10. That is, the gift position in the real space of the studio 10 is determined from the three-dimensional position information. Thus, even if the direction of the RGB camera 14 is changed, the falling object image 75 is displayed at the appropriate gift image position in the image captured in that direction. Further, when the gift position is outside the shooting range of the RGB camera 14, the falling object image 75 is not displayed.
In addition, the processing of displaying the falling object image 75 on the studio display 17 can be omitted, because the falling object image 75 is displayed on the floor of the studio 10 by the projector 16 in the following step S17.
In step S17, the server 20 transmits the display data of the falling object image 75 to the projector 16, and the projector 16 displays, in real time on the floor of the studio 10, the object image of the gift flying toward the performer A and/or the object image of the gift dropped at the gift position. Thereby, the performers A, B, C can grasp the landing position of the falling object image 75.
The falling object image 75 need only be displayed at least at the gift position; the display of the object flying toward the gift position may be omitted.
Further, the falling object image 75 may be displayed in such a manner that the gift position does not fall within the performer activity range in which the performers A, B, C are active. The object image may pass through the performer activity range on its way to the gift position, but the falling object image 75 may be displayed in such a manner that the object image of the gift given by the user does not ultimately come to rest within the performer activity range. Further, even if a user's gift-giving action (e.g., a throw) is detected and the calculated gift position lies within the performer activity range, the object image need not be displayed within that range. In this case, for example, taking the calculated gift position into account, the object image may be displayed just outside the performer activity range (at the position outside the performer activity range closest to the calculated gift position). With such a display mode, the performers A, B, C can be prevented from accidentally stepping on a given gift.
The performer activity range is, for example, the stage in the studio 10 or the like. The performer activity range may be set differently between the prelude, interlude, and outro of one song and the other periods. In addition, different ranges may be set between periods when music is playing and periods when it is not (for example, a range in which the performer moves along with the music while it is playing, and a range in which the performer does not move while it is not). Further, different ranges may be set according to the music being played, or the same range may be used throughout the live distribution. Nor is the setting limited to musical performances; the range may be set differently between the performance itself and the periods before and after it.
Also, if the gift hits the performer, the gift may drop on the spot, or may change its flight direction and continue moving.
[ Gift acquisition processing ]
Next, the operation by which the performer A acquires the hair band gift given by the user B will be described with reference to fig. 6.
The skeletal positions of the performers A, B, C and the depth information of those skeletal positions are continuously input to the server 20 from the depth camera 15. Further, the server 20 performs face detection of the performers A, B, C based on the images from the RGB camera 14. Thus, the server 20 tracks each skeletal position of the performer A, the depth information of those skeletal positions, and the face position of the performer A. In step S21, the server 20 determines that the performer A has picked up the hair band gift. For example, in order to identify the action of picking up a gift dropped on the floor of the studio 10, the server 20 stores threshold values relating to the distance between the skeletal position of one of the left and right hands and the skeletal position of one of the left and right feet, the distance between the skeletal position of one of the left and right hands and the floor, and the like. The server 20 determines that the performer A has picked up the hair band gift when the calculated data, such as the distance between the hand and foot skeletal positions and/or the distance between the hand skeletal position and the gift position on the floor, crosses the threshold value. For example, the server 20 determines that the performer A has picked up the hair band gift when the position of either hand overlaps the gift position on the floor. In other words, it is determined whether the performer is located within the range of the image of the gift given by the user. The range of the gift's image is determined based on, for example, the three-dimensional position information, the user motion information, and the kind of gift; in the case of a hair band, it is the range of the hair band's outer shape. The overlap between a hand position and the gift position on the floor can be judged in several ways, as examples: whether the three-dimensional information of the performer's hand position (one point) overlaps the three-dimensional information of the gift position (one point); whether multiple points of the hand position overlap one point of the gift position; whether one point of the hand position overlaps multiple points of the gift position; whether multiple points of the hand position overlap multiple points of the gift position; or whether a region of the hand position (such as the fingertips) overlaps a region of the gift position (the area where the falling object image 75 is displayed). Judging overlap by multiple points and/or regions, rather than requiring the three-dimensional information of the hand position and of the gift position to coincide at a single point, makes the determination easier.
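As a sketch of the multipoint/region variant of this overlap judgment (the tolerance value is an assumption for illustration):

```python
def points_overlap(hand_points, gift_points, tolerance=0.05):
    """hand_points, gift_points: iterables of (x, y, z) in metres, e.g. the
    fingertip region and the area where the falling object image 75 is
    displayed. True if any pair of points lies within `tolerance`."""
    for hx, hy, hz in hand_points:
        for gx, gy, gz in gift_points:
            d2 = (hx - gx) ** 2 + (hy - gy) ** 2 + (hz - gz) ** 2
            if d2 <= tolerance ** 2:
                return True
    return False
```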
In step S22, the server 20 performs control to hide the falling object image 75 displayed on the floor of the studio 10, because once the performer A picks up the gift, it should disappear from the floor.
From the time it is picked up until it is worn on the head of the performer A, the gift is held in the performer A's hand. As an example, fig. 7 (a) shows a state in which the performer A picks up the hair band gift. Then, in step S23, the server 20 analyzes the acquisition action of the performer A. That is, the server 20 analyzes the motion of picking up the hair band gift based on each skeletal position of the performer A, the depth information of those skeletal positions, and the face position of the performer A. In step S24, based on the analysis result, the server 20 generates display data for showing, on the display surfaces of the studio display 17 and the user terminal 60, the acquired object image 76 of the picked-up hair band gift, and transmits the display data to the studio display 17 and the user terminal 60. Thus, as an example, the acquired object image 76 is displayed on the display surfaces of the studio display 17 and the user terminal 60 in association with the hand that picked it up, moving from the gift position on the floor toward the head.
In step S25, the server 20 analyzes the wearing action by which the performer A wears the hair band gift on the head. That is, the server 20 analyzes the wearing motion based on each skeletal position of the performer A, the depth information of those skeletal positions, and the face position of the performer A. For example, the server 20 detects the wearing action when the position of either hand overlaps the position of the head. In step S26, the server 20 generates display data for displaying the acquired object image 76 at the head-worn position of the performer on the display surfaces of the studio display 17 and the user terminal 60, and transmits the display data to the studio display 17 and the user terminal 60. As an example, the server 20 generates display data that places the acquired object image 76 along the boundary between the performer and the background. Thus, as an example, a state in which the acquired object image 76 is worn on the head of the performer A is displayed on the display surfaces of the studio display 17 and the user terminal 60 (see fig. 7 (b)). The server 20 tracks the head of the performer A and performs the display such that the performer A always appears to wear the hair band gift even as the performer A moves.
The performer A will sometimes face sideways, depending on the dance. Even in such a case, the server 20 displays the acquired object image 76 in accordance with the orientation of the performer A (see fig. 7 (c)). The orientation of the performer A can be determined by detecting the face of the performer A from the image data of the RGB camera 14 and calculating the skeletal positions of the performer A from the depth camera 15. The data of the object image representing the gift can also be three-dimensional data, enabling display from any direction. Based on these data, when it is detected that the performer A faces sideways, the orientation of the hair band object image is changed to match the orientation of the performer A. The acquired object image 76 is likewise displayed to follow the movement of the performer A when the performer A squats or jumps.
In addition, when the gift is acquired by the selected performer, the server 20 registers, in the database 26, the fact that the acquisition succeeded.
When the hair band gift is worn by the performer A, in step S27 the server 20 displays, on the display surfaces of the studio display 17 and the user terminal 60, an ID object image 76a indicating the user ID of the user B who presented the hair band gift to the performer A. Thereby, the performer A can visually recognize the user ID of the user who presented the hair band gift, and the user B can likewise confirm, through the displayed user ID, that the hair band gift he or she presented is being worn by the performer A. The server 20 may also display the ID object image 76a on the floor of the studio 10 using the projector 16.
The period during which the image of the gift is displayed in association with the performer may be the entire period from its acquisition to the end of the live distribution, or the image may be displayed one song at a time. Further, it may be hidden during interludes.
The performer can remove a gift that has been worn and store or place it in a box (which may be an actual object installed in the studio or the like, or a virtual image like the gift itself). Thus, when many gifts are given to the performer, the performer can wear several of them. For example, a performer may wear multiple hair bands at once, or may remove a previously worn hair band and pick up and wear a new one. In such cases, using the box as a table or storage box allows the hair band to be swapped without the change appearing unnatural.
[ Gift-returning processing ]
During an interlude, the performers A, B, C do not sing, and therefore a gift-returning action can be performed for the user B who presented a gift. In step S31, the server 20 determines whether the piece of music in the live performance has entered an interlude. The server 20 can determine that an interlude has been entered when, for example, no sound is input from the microphone 13 for a predetermined period. The server 20 can also determine that an interlude has been entered when, for example, a detection signal indicating the start of an interlude is input from the playback device 11. Further, as an example, entry into an interlude can be determined by detecting an action indicating it. The server 20, for example, starts synchronization processing with the user terminal 60, whereby the display on the user terminal 60 and/or an acceptance operation performed on the user terminal 60 can be detected.
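As an illustrative sketch of the silence-based interlude determination (the amplitude floor and the predetermined period are assumptions):

```python
import time

SILENCE_LEVEL = 0.02    # normalized amplitude floor, assumed
SILENCE_PERIOD_S = 3.0  # predetermined period, assumed

class InterludeDetector:
    """Reports an interlude when no sound above SILENCE_LEVEL has
    arrived from the microphone 13 for SILENCE_PERIOD_S seconds."""

    def __init__(self):
        self._last_voiced = time.monotonic()

    def feed(self, amplitude):
        now = time.monotonic()
        if amplitude > SILENCE_LEVEL:
            self._last_voiced = now
        return (now - self._last_voiced) >= SILENCE_PERIOD_S
```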
In addition, the gift-returning processing may be performed not during an interlude but between songs. Further, when a play or musical is performed on a stage or the like, it may be performed between the Nth scene and the (N+1)th scene.
As for the end of the interlude, it can be determined when there is a voice input from the microphone 13 and/or when a detection signal indicating the end of the interlude is input from the playback device 11. Further, the end of the interlude can be determined by detecting an action indicating it.
In step S27, the server 20 displays the user ID of the user B who presented the hair band gift to the performer A on the display surfaces of the studio display 17 and the user terminal 60. The performer A can therefore call out into the microphone 13 the user ID of the user B who presented the hair band gift. At this time, in step S32, the server 20 performs voice recognition on the voice data recorded by the microphone 13 to identify the user ID of the user B. Further, the server 20 registers, in the database 26, the fact that a return gift is to be presented to the user identified as the gift-return target.
In step S33, the server 20 detects the gift-returning action of the performer A. As an example, the server 20 detects a specific action signaling that the performer is about to transition to the gift-returning action. For example, the server 20 stores a threshold value for each skeletal position for identifying the specific action, and determines that the performer A has performed it when the data of each skeletal position exceeds the threshold value. Here, the return gift from the performer A to the user B is, as an example, a signature ball of the performer A; after the specific action, the performer A makes a motion of throwing the signature ball from the studio 10 toward the user B, who is not actually in the studio 10. In step S34, the server 20 analyzes the gift-returning action from each skeletal position of the performer A, the depth information of those skeletal positions, and the face position of the performer A. In step S35, as the performer A begins the throwing motion, the server 20 generates display data that displays the return gift image 77 of the signature ball at the position of one of the performer A's hands on the display surfaces of the studio display 17 and the user terminal 60, and transmits the display data to the studio display 17 and the user terminal 60. As a result, as shown in fig. 9 (a), the return gift image 77 is displayed in real time on the display surfaces of the studio display 17 and the user terminal 60, as an example. The server 20 also generates display data for displaying an acceptance object image 78 simulating the hand of the user B on the display surfaces of the studio display 17 and the user terminal 60, and transmits it to them. The acceptance object image 78 is the virtual target at which the signature ball is thrown.
In step S36, the server 20 analyzes the throwing motion performed by the performer A. Specifically, the server 20 detects the swing of the performer A's arm and the like from each skeletal position of the performer A, the depth information of those skeletal positions, and the face position of the performer A. In step S37, the server 20 generates display data that shows the return gift image 77 at the position of one of the performer A's hands during the throwing motion on the display surfaces of the studio display 17 and the user terminal 60, and further generates display data showing the return gift image 77 in flight, separated from the hand. These are transmitted to the studio display 17 and the user terminal 60. As a result, as shown in fig. 9 (b), the signature ball is displayed in real time on the display surfaces of the studio display 17 and the user terminal 60, flying toward the acceptance object image 78, as an example.
In addition, a special effect may be added to the arm of the performer A throwing the signature ball. As an example, the special effect matches the detected movement of the performer: a plurality of flickering star patterns is displayed along the edge on the downstream side in the movement direction of the performer A's moving arm. For example, the special effect is displayed in association with the arm of the performer A from the moment the gift-returning action is detected; when the wind-up switches to the throw, the edge on the downstream side in the arm's movement direction is displayed following the motion of the swinging arm. Further, at the moment the gift-returning action is detected, the background image may be changed to a specific image displayed during gift-returning processing.
When the user terminal 60 performs an acceptance operation at the moment the return gift image 77 reaches the acceptance object image 78, the user terminal 60 transmits acceptance data including the user ID to the server 20. The acceptance operation corresponds to catching the ball, and is, for example, an operation of clicking an arbitrary position on the screen and/or the acceptance object image 78 with a mouse, or an operation of touching the touch panel. In step S38, upon receiving the acceptance data, the server 20 registers, in the database 26, the fact that the return gift was received. At this time, the server 20 generates display data showing the return gift image 77 being caught by the acceptance object image 78 on the display surfaces of the studio display 17 and the user terminal 60, and transmits the display data to them. As a result, as shown in fig. 9 (c), a state in which the signature ball is caught by the hand is displayed on the display surfaces of the studio display 17 and the user terminal 60, as an example.
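As a sketch of the timing judgment between acceptance success and failure (the catch window is an assumption for illustration):

```python
CATCH_WINDOW_S = 0.5  # assumed tolerance around the arrival time

def judge_acceptance(arrival_time, tap_time):
    """arrival_time: when the return gift image 77 reaches the acceptance
    object image 78; tap_time: when the user clicked or touched.
    True -> register success in the database 26; False -> register failure."""
    return abs(tap_time - arrival_time) <= CATCH_WINDOW_S
```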
When the acceptance operation is not performed by the moment the return gift image 77 reaches the acceptance object image 78, acceptance failure data including the user ID is transmitted to the server 20, and upon receiving it, the server 20 registers, in the database 26, the fact that acceptance of the return gift failed.
[ Other gifts ]
When the object image 72b shown in fig. 5 (a), i.e., the gift that adds an effect to the performer's movements, is selected, the following effect is added to the performer A, B, C. In fig. 10 (a), a special effect object image 81 is added to the selected performer A. As a special effect matched to the detected movement of the performer, the video of the special effect object image 81 displays a plurality of flickering star patterns along the edge on the downstream side in the movement direction of the performer A's moving arm. Such an effect is not a tangible object like the hair band described above. Therefore, as shown in fig. 10 (b), a gift box object image 82 decorated with a ribbon or the like is used when the gift is thrown toward the selected performer A. As shown in fig. 10 (c), the display of the special effect object image 81 is controlled to begin when the selected performer A acquires the gift, that is, when the position of either hand overlaps the gift position, at which point the gift box object image 82 is no longer displayed.
For example, even when the performer A squats or jumps, the special effect is displayed following the movement of the performer A. As an example, the effect may change before and after the performer A, B, C jumps: before the jump a flickering star pattern is displayed, and after the jump a different pattern flickers. In addition, for example, a plurality of specific motions may be defined in advance, and when one of them is detected, the specific special effect associated with that motion is displayed. As further examples, the display of the added special effect may be stopped when a specific motion is detected, or the special effect may not be displayed until the specific motion is detected.
Fig. 10 (d) shows a state in which the object image 72d, representing the gift of a background image for the live distribution, is selected. The background image of the object image 72d is not a tangible object like the hair band described above. Therefore, the gift box object image 82 is preferably used when the gift is thrown toward the selected performer A.
In addition, even when the hair band gift is selected, the gift box object image 82 may be displayed when the gift is thrown toward the selected performer A.
Further, the return gift image 77 may be displayed only on the user terminal 40, 60, 70 of the user who is to receive the return gift. Thereby, one-to-one communication between the performer and the user can be achieved.
[ Other gift/performer selection processing ]
In the above example, the case where the user B selects the performer A has been described, but the performer B or the performer C can be selected instead, and one user may select a plurality of performers, for example the performer B and/or the performer C in addition to the performer A, using one user terminal 40, 60, 70. When a plurality of performers is selected, different gifts may be chosen for each, or the same gift may be chosen, as long as the gift is not the live-distribution background image represented by the object image 72d.
According to the live broadcast distribution system 1, the following effects can be obtained.
(1) In the live distribution system 1, a user who wants the performer to accept his or her gift will, for example, keep purchasing and giving gifts until the performer accepts one. Further, the user wants to raise the likelihood of acceptance as much as possible, and therefore throws the gift as close to the performer as possible. There is also a sense of competition among users, for example over whose gift the performer accepts and who receives a return gift. All of this can encourage gift purchases by users and thereby improve the profit of the operator and the performers.
(2) The users A, B, C can give gifts to the performers A, B, C performing in the studio 10 by making a throwing motion toward the studio 10. The given gift is displayed on the floor of the studio 10 as the falling object image 75, so the performers A, B, C can visually recognize that a gift was given by a user A, B, C such as a fan. The gift is also displayed in front of the performers A, B, C on the user terminals 40, 60, 70, so the users A, B, C can see that the gift they threw has arrived in front of the performers A, B, C. In this way, even though the users A, B, C are not actually present in the studio 10, they can give gifts to the performers A, B, C via the user terminals 40, 60, 70 as if they were. Further, the performers A, B, C can perform actions such as receiving the gifts as if they were actual objects.
(3) When a performer A, B, C who received a gift makes a motion of picking up the gift's item displayed on the floor of the studio 10, the gift is displayed on the user terminals 40, 60, 70 as if the performer A, B, C were actually wearing it. Thus, the user who gave the gift can see that his or her gift was accepted.
(4) When a user gives a gift to a performer A, B, C performing in the studio 10 by making a throwing motion toward the studio 10, operation data such as acceleration data, angle data, and angular velocity data relating to the arm swing is transmitted from the user terminal 40, 60, 70. The gift position where the gift lands therefore changes according to the speed of the arm swing and the like. The users A, B, C can thus adjust the speed of their arm swing and the like so that the gift lands as close as possible in front of the selected performer A, B, C, which enhances the entertainment value.
(5) The performers A, B, C can visually recognize the user IDs of the users A, B, C who gave gifts.
(6) The performers A, B, C can give return gifts to the users A, B, C. Thus, bidirectional communication can be realized between the performers A, B, C and the users.
(7) Return gifts can be displayed on the user terminals 40, 60, 70 in accordance with the actions of the performers A, B, C. Further, by letting the user perform a well-timed acceptance operation on the user terminal 40, 60, 70 to catch the return gift, the entertainment value can be enhanced further.
The live distribution system 1 can be modified as appropriate, as described below.
When a performer A, B, C gives a return gift to a user A, B, C, the performer may do so without using the return gift image 77. In this case, a physical return gift may be mailed to the gift-giving user A, B, C. This can simplify the processing of the live distribution system 1.
As for return gifts, a tangible gift may also be mailed to the user A, B, C as a physical object at a later date. The mailed item need not be an actual signature ball; it may be a signed paper label, merchandise related to the performer, an album such as a CD or DVD, a coupon for a concert, or the like. The return gift is sent to users for whom receipt of the return gift has been registered in the database 26. The sender in this case may be either the performer A, B, C or the operator of the system. In addition, in the case of an acceptance failure, it may be arranged that the user cannot receive the tangible gift (it is not mailed).
The performers A, B, C need not give return gifts to the gift-giving users A, B, C at all. That is, the server 20 may omit the gift-returning processing, and no return gift need be mailed even when a gift is received.
For example, when no return gift is to be given to the gift-giving user A, B, C, the user ID may be managed without being associated with the given gift. This can simplify the processing of the live distribution system 1.
When the user terminal 60 is provided with a touch panel, the user can give a gift to the selected performer A, B, C by performing a touch operation, with a finger or a stylus, toward the displayed performer A, B, C on the display surface showing the live data. In this case, the user terminal does not need an acceleration sensor and/or a gyro sensor. Further, when the user uses a user terminal without a touch panel, an operation of moving a cursor with a mouse toward the displayed performer A, B, C may be used to give a gift to the selected performer A, B, C.
As long as the falling object image of the gift thrown to the performer A, B, C is displayed at least at the gift position, the trajectory until the falling object image reaches the gift position may be omitted.
The actual space in which the performer performs may be a space other than the studio 10, such as a live venue and/or a concert hall. In this case, the projector 16 displays the image of the gift on the stage, and the user can perform the gift-giving operation toward the performer from the auditorium using a small portable information processing terminal such as the user terminal 60.
The method of displaying the object image of the gift in the studio 10 is not limited to the projector 16. For example, a plurality of flat display panels such as liquid crystal panels may be arranged with their display surfaces facing upward and covered with a transparent synthetic resin to form the floor of the studio 10, and the object image of the gift may be displayed on that floor. The gift position may also simply be indicated with a laser pointer. The gift may also be displayed using an aerial display or aerial imaging technology, and may be rendered as a two-dimensional or three-dimensional computer graphics (CG) image. Further, the floor may be made to undulate by spreading a large number of rods across it and moving them up and down perpendicular to the floor, thereby expressing the image of the gift. The means for displaying the image of the gift in the studio 10 may also be a combination of these devices.
A gift can also be registered in the database 26 and thrown to the performer in the next broadcast. Such gifts are non-sale gifts that the user cannot purchase. When such a gift competes with other users' gifts, control can be performed so that it is preferentially worn by the performer. A non-sale gift can be an accessory, can add a special effect, and can also be a background image.
The specific motion performed by the performer and/or the user (e.g., the motion of throwing an object, or in which direction and with what intensity) is not limited to being determined (detected) from the detection results of the detection unit of the smart watch 50 and/or the detection unit of the smart device terminal 60a. The determination may also be made by calculating inter-frame differences and/or motion vectors from video captured by a camera, for example.
For example, instead of detecting the movement of the user A, B, C giving a gift to the performer A, B, C with an acceleration sensor and/or a gyro sensor, the following method may be used. A camera capable of capturing moving images, such as a webcam or a video camera, is placed in front of the user: for example, a webcam built into a notebook personal computer, a webcam and/or video camera connected to a desktop personal computer, or a camera built into a smart device terminal. The user terminal 40, 60, 70, the server 20, or another device calculates movement data of the user's throwing motion from the inter-frame differences between the frames of the video data and detects the throwing action from that movement data, or detects the user's throwing motion by computing motion vectors of the object image relative to a reference frame. Then, at the gift position on the floor of the studio 10 and/or on the display surface of the user terminal 40, 60, 70, the trajectory and/or the gift is displayed in such a way that the performer A, B, C can recognize it.
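As an illustrative sketch of the inter-frame-difference variant using OpenCV (the motion-area threshold and the decision rule are assumptions; a real motion-vector analysis would be more involved):

```python
import cv2
import numpy as np

MOTION_AREA_THRESHOLD = 0.15  # fraction of changed pixels, assumed

def detect_throw(camera_index=0):
    """Watch the webcam and return True when a large, fast motion
    (treated here as a throwing action) is observed."""
    cap = cv2.VideoCapture(camera_index)
    ok, prev = cap.read()
    if not ok:
        raise RuntimeError("camera unavailable")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                return False
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            diff = cv2.absdiff(gray, prev_gray)
            _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
            if np.count_nonzero(mask) / mask.size > MOTION_AREA_THRESHOLD:
                return True
            prev_gray = gray
    finally:
        cap.release()
```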
Further, the action of the performer A, B, C acquiring a gift given by the user A, B, C can also be detected by image analysis using the above-described inter-frame differences and/or motion vectors. For example, such image analysis detects the performer A, B, C squatting or bending down to pick up a gift, and/or touching a gift or the gift position where it is displayed. Processing can then be performed in which the performer A, B, C wears the gift or a special effect is added to the performer A, B, C.
Further, the motion of the performer A, B, C wearing the acquired gift can be detected using the above-described image analysis or the like. As an example, the motion of the performer A, B, C moving the gift to his or her head can be detected in this way.
Further, the above-described image analysis can be used to detect the action of the performer A, B, C giving a return gift to the user A, B, C. As an example, the throwing motion of the performer A, B, C in the studio 10 can be detected in this way.
That is, the motion of the performer A, B, C can be detected by the image analysis described above without using depth information, and the motion of the user A, B, C can likewise be detected by that image analysis.
The operation data need not include all of the acceleration data, angle data, and angular velocity data; acceleration data alone may suffice, because the flying distance of the thrown gift and the like can be calculated from the acceleration data.
In the studio 10, the studio display 17 may be omitted. In this case, the projector 16 may be used to display the ID object image 76a indicating the user ID of the user who threw the object image of the gift.
When too many gifts are given to the performers A, B, C, too many gift images are displayed on the display surfaces of the studio display 17 and/or the user terminals 40, 60, 70, and likewise the gift images displayed on the floor of the studio 10 by the projector 16 become excessive. In this case, when the number of participating user terminals exceeds a threshold, the server 20 randomly extracts user terminals and displays, on the display surfaces of the studio display 17 and/or the user terminals 40, 60, 70, only the gift object images from the extracted terminals. Likewise, the server 20 displays on the floor of the studio 10, through the projector 16, only the gift object images from the extracted terminals.
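As a sketch of the random extraction described above (the terminal cap is an assumption):

```python
import random

MAX_DISPLAYED_TERMINALS = 50  # assumed threshold

def terminals_to_display(participating_terminals):
    """Return the subset of terminals whose gift images are displayed."""
    if len(participating_terminals) <= MAX_DISPLAYED_TERMINALS:
        return list(participating_terminals)
    return random.sample(participating_terminals, MAX_DISPLAYED_TERMINALS)
```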
The gifts given from the users A, B, C to the performers A, B, C and/or the return gifts from the performers A, B, C to the users A, B, C can take several meanings. As an example, there is something merely "sent" from the user A, B, C to the performer A, B, C. As another example, there is something "presented" from the user A, B, C to the performer A, B, C that conveys feelings of gratitude, congratulation, or support. Further examples include "handing over" possession of a gift purchased by the user A, B, C to the performer A, B, C, and "giving" ownership of a gift purchased by the user A, B, C to the performer A, B, C.
The gift given from the user A, B, C to the performer A, B, C may be an image displayed only on the display surfaces of the user terminals 40, 60, 70 and/or the display surface of the studio display 17. Such a gift may be selected from the gift selection object images 72 shown in fig. 5 (a), or, independently of the gift selection object images 72, may be a gift such as image data created by the user himself or herself to liven up the performer's stage. Such gifts displayed only on the display surfaces of the user terminals 40, 60, 70 and/or the studio display 17 are charged for, for example, when the user purchases them; a gift made by the user may be free.
For a wearable gift such as a hair band and/or a special-effect gift worn by the performer A, B, C, a fee may be charged again when the performer A, B, C actually acquires the gift. That is, charges may be made twice: when the user A, B, C purchases the gift and when the performer A, B, C acquires it. Alternatively, the fee may be charged only when the performer A, B, C acquires the gift.
As a return gift from the performer A, B, C to the user A, B, C, there may be a display and/or performance that only the user receiving the return gift can view. In this case, as an example, the throwing motion of the return gift from the performer A, B, C to the user A, B, C need not be performed, and it is not necessary to send an actual object such as a signature ball to the user A, B, C.
The gift may be a simple program including image data and/or moving image data created by the user A, B, C using software. An example of such a simple program is a special effect program containing a performance in which an object image moves and makes the performer's stage more gorgeous.
Description of the reference numerals
1... live broadcast distribution system, 2... network, 10... studio, 11... playback device, 12... speaker, 13... microphone, 14... RGB camera, 15... depth camera, 16... projector, 17... studio display, 18... gift, 20... server, 21... audio IF, 22... RGB camera IF, 23... depth camera IF, 24... projector IF, 25... display IF, 26... database, 27... data storage unit, 28... network IF, 29... main memory, 30... control unit, 40... user terminal, 41... audio IF, 42... display IF, 43... network IF, 44... communication IF, 45... data storage unit, 46... main memory, 48... control unit, 49... display unit, 50... smart watch, 51... sensor, 52... communication IF, 53... data storage unit, 54... main memory, 55... control unit, 60... user terminal, 60a... smart device terminal, 61... audio IF, 62... display IF, 63... operation IF, 64... sensor, 65... network IF, 66... data storage unit, 67... main memory, 68... control unit, 69... display unit, 70... user terminal, 71... live image, 72... gift selection object image, 72a... object image, 72b... object image, 72c... object image, 72d... object image, 73... performer selection image, 73a... 1st presentation image, 74... performer determination image, 74a... 2nd presentation image, 75... falling object image, 76... acquired object image, 76a... ID object image, 77... return gift object image, 78... acceptance object image, 81... special effect object image, 82... gift box object image.

Claims (13)

1. A display control system is characterized by comprising:
a display device control unit that causes a display device to display a video of a real space in which a performer is located as a target of live distribution;
an acquisition unit that acquires three-dimensional position information of the actual space;
a detection unit that detects a user action of a user for giving a gift to the performer; and
and a gift display control unit that calculates a gift position at which the gift should be placed in the actual space based on the three-dimensional position information acquired by the acquisition unit and the user action information of the user action detected by the detection unit, and displays an image of the gift in the actual space.
2. The display control system of claim 1,
the user action is detected by a smart device terminal of the user.
3. The display control system of claim 2,
the user action is an action of the user throwing an object.
4. The display control system of claim 3,
the user action information is action data of the user.
5. The display control system according to any one of claims 1 to 4,
the display device control unit calculates a gift image position at which a gift image corresponding to the gift is to be placed in the image, and causes the image and the gift image to be displayed on a display device in such a manner that the gift image is displayed at the gift image position.
6. The display control system of claim 5,
in a case where the detection unit determines that the performer is located within the range of the gift, the display device control unit causes the gift image to be displayed on the display device in association with the performer.
7. The display control system of claim 6,
the range of the gift is determined based on the three-dimensional position information acquired by the acquisition unit, the user action information detected by the detection unit, and the kind of the gift.
8. The display control system according to any one of claims 1 to 4,
the display device control unit delivers a return gift to the user when it is determined that the performer has performed an action for presenting the return gift to the user.
9. The display control system of claim 8,
the display device control unit delivers the return gift to the user when it is determined that the user has performed an acceptance operation, and does not deliver the return gift to the user when it is determined that the user has not performed the acceptance operation.
10. The display control system of claim 9,
the display device control unit causes the display related to the return gift to be displayed only on the display device of the user terminal of the user to whom the return gift is to be given, and not on the display devices of the user terminals of other users.
11. The display control system according to any one of claims 1 to 4,
the gift is a gift purchased by the user.
12. The display control system according to any one of claims 1 to 4,
the gift display control unit displays a gift image as the image of the gift.
13. A display control method, characterized by comprising:
the display device control unit causes the display device to display the video of the real space in which the performer is located as an object of live distribution,
three-dimensional position information of the actual space is acquired by an acquisition section,
detecting, by a detecting section, a user action of a user for giving a gift to the performer,
calculating, by a gift display control unit, a gift position at which the gift should be placed in the actual space based on the acquired three-dimensional position information and the detected user action information of the user action, and displaying an image of the gift in the actual space.
CN201780084966.XA 2017-01-31 2017-01-31 Display control system and display control method Active CN110249631B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/003496 WO2018142494A1 (en) 2017-01-31 2017-01-31 Display control system and display control method

Publications (2)

Publication Number Publication Date
CN110249631A CN110249631A (en) 2019-09-17
CN110249631B true CN110249631B (en) 2022-02-11

Family

ID=63040375

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780084966.XA Active CN110249631B (en) 2017-01-31 2017-01-31 Display control system and display control method

Country Status (4)

Country Link
JP (1) JP6965896B2 (en)
CN (1) CN110249631B (en)
TW (1) TWI701628B (en)
WO (1) WO2018142494A1 (en)

Families Citing this family (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019216146A1 (en) 2018-05-08 2019-11-14 グリー株式会社 Moving picture delivery system for delivering moving picture including animation of character object generated based on motions of actor, moving picture delivery method, and moving picture delivery program
CN110460893B (en) 2018-05-08 2022-06-03 日本聚逸株式会社 Moving image distribution system, method thereof, and recording medium
US11128932B2 (en) 2018-05-09 2021-09-21 Gree, Inc. Video distribution system for live distributing video containing animation of character object generated based on motion of actors
JP6491388B1 (en) * 2018-08-28 2019-03-27 グリー株式会社 Video distribution system, video distribution method, and video distribution program for live distribution of a video including animation of a character object generated based on the movement of a distribution user
JP6713080B2 (en) * 2019-07-01 2020-06-24 グリー株式会社 Video distribution system, video distribution method, and video distribution program for live distribution of videos including animation of character objects generated based on movements of distribution users
JP6550549B1 (en) * 2019-04-25 2019-07-24 グリー株式会社 Video distribution system, video distribution method, and video distribution program for live distribution of a video including animation of a character object generated based on the movement of a distribution user
JP6523586B1 (en) * 2019-02-28 2019-06-05 グリー株式会社 Video distribution system, video distribution method, and video distribution program for live distribution of a video including animation of a character object generated based on the movement of a distribution user
US11044535B2 (en) 2018-08-28 2021-06-22 Gree, Inc. Video distribution system for live distributing video containing animation of character object generated based on motion of distributor user, distribution method, and storage medium storing video distribution program
WO2020044749A1 (en) * 2018-08-28 2020-03-05 グリー株式会社 Moving-image delivery system for delivering moving-image live that includes animation of character object generated on the basis of motion of delivering user, moving-image delivery method, and moving-image delivery program
WO2020105568A1 (en) * 2018-11-20 2020-05-28 グリー株式会社 System, method, and program for delivering moving-image
US11997368B2 (en) * 2018-12-12 2024-05-28 GREE Inc. Video distribution system, video distribution method, and storage medium storing video distribution program
JP6550546B1 (en) * 2019-03-26 2019-07-24 グリー株式会社 Video distribution system, video distribution method and video distribution program
JP6543403B1 (en) * 2018-12-12 2019-07-10 グリー株式会社 Video distribution system, video distribution method and video distribution program
JP6671528B1 (en) * 2019-07-01 2020-03-25 グリー株式会社 Video distribution system, video distribution method, and video distribution program
JP7277145B2 (en) * 2019-01-10 2023-05-18 株式会社Iriam Live communication system with characters
JP6809719B2 (en) * 2019-02-15 2021-01-06 ステルスバリュー合同会社 Information processing equipment and programs
JP7236632B2 (en) * 2019-03-26 2023-03-10 株式会社Mixi Server device, server device program and terminal device program
JP6748753B1 (en) 2019-04-02 2020-09-02 株式会社 ディー・エヌ・エー System, method and program for delivering live video
US11559740B2 (en) 2019-09-13 2023-01-24 Gree, Inc. Video modification and transmission using tokens
JP7360112B2 (en) * 2019-09-27 2023-10-12 グリー株式会社 Computer program, server device, terminal device, and method
JP7133590B2 (en) * 2020-08-13 2022-09-08 グリー株式会社 Video processing method, server device and computer program
JP6751193B1 (en) * 2019-10-31 2020-09-02 グリー株式会社 Video processing method, server device, and computer program
US11682154B2 (en) 2019-10-31 2023-06-20 Gree, Inc. Moving image processing method of a moving image viewed by a viewing user, a server device controlling the moving image, and a computer program thereof
JP7046044B6 (en) * 2019-11-08 2022-05-06 グリー株式会社 Computer programs, server devices and methods
JP7261727B2 (en) 2019-11-20 2023-04-20 グリー株式会社 Video distribution system, video distribution method and server
US11595739B2 (en) 2019-11-29 2023-02-28 Gree, Inc. Video distribution system, information processing method, and computer program
JP7336798B2 (en) * 2019-11-29 2023-09-01 グリー株式会社 Information processing system, information processing method and computer program
JP7134197B2 (en) * 2020-05-01 2022-09-09 グリー株式会社 Video distribution system, information processing method and computer program
JP6766246B1 (en) * 2019-12-19 2020-10-07 株式会社ドワンゴ Management server, user terminal, gift system and information processing method
WO2021145023A1 (en) * 2020-01-16 2021-07-22 ソニーグループ株式会社 Information processing device and information processing terminal
JP6798733B1 (en) * 2020-01-20 2020-12-09 合同会社Mdk Consideration-linked motion induction method and consideration-linked motion induction program
JP6803485B1 (en) * 2020-01-27 2020-12-23 グリー株式会社 Computer programs, methods and server equipment
JP6788756B1 (en) * 2020-01-27 2020-11-25 グリー株式会社 Information processing system, information processing method and computer program
JP7034191B2 (en) * 2020-01-30 2022-03-11 株式会社ドワンゴ Management server, gift system and information processing method
CN115039410A (en) * 2020-02-12 2022-09-09 索尼集团公司 Information processing system, information processing method, and program
CN111523545B (en) * 2020-05-06 2023-06-30 青岛联合创智科技有限公司 Article searching method combined with depth information
JP7284329B2 (en) * 2020-06-02 2023-05-30 グリー株式会社 Video distribution system, video distribution method, and video distribution program for live distribution of video containing animation of character object generated based on movement of distribution user
JP7104097B2 (en) * 2020-06-02 2022-07-20 グリー株式会社 Distribution A video distribution system, video distribution method, and video distribution program that delivers live videos including animations of character objects generated based on user movements.
JP7145266B2 (en) * 2020-06-11 2022-09-30 グリー株式会社 Information processing system, information processing method and computer program
JP7521779B2 (en) * 2020-06-12 2024-07-24 株式会社コナミデジタルエンタテインメント Video distribution system, computer program used therein, and control method
JP7093383B2 (en) * 2020-08-07 2022-06-29 株式会社 ディー・エヌ・エー Systems, methods, and programs for delivering live video
JP7175299B2 (en) * 2020-08-21 2022-11-18 株式会社コロプラ Program, method and computer
JP7255918B2 (en) * 2020-09-16 2023-04-11 日本紙工株式会社 Video evaluation system, video evaluation program, video evaluation method
JP6841465B1 (en) * 2020-10-02 2021-03-10 合同会社Mdk Consideration-linked motion induction method and consideration-linked motion induction program
JP2022096096A (en) * 2020-12-17 2022-06-29 株式会社ティーアンドエス Video distribution method and program for the same
WO2022149517A1 (en) * 2021-01-05 2022-07-14 株式会社コルク Livestreaming distribution system and method therefor
CN112929685B (en) * 2021-02-02 2023-10-17 广州虎牙科技有限公司 Interaction method and device for VR live broadcast room, electronic device and storage medium
JP7156735B1 (en) 2021-10-26 2022-10-20 合同会社Mdk Program, management server device, content distribution management method, content distribution method
CN116320508A (en) * 2022-09-07 2023-06-23 广州方硅信息技术有限公司 Live interaction method, computer equipment and storage medium
JP7349689B1 (en) 2022-09-07 2023-09-25 義博 矢野 Information processing method and information processing system
US20240153350A1 (en) 2022-11-04 2024-05-09 17Live Japan Inc. Gift box event for live streamer and viewers

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007288295A (en) * 2006-04-13 2007-11-01 Nippon Telegr & Teleph Corp <Ntt> Observation position tracking type video image providing apparatus and observation position tracking type video image providing program, and video image providing apparatus and the video image providing program
CN104363519A (en) * 2014-11-21 2015-02-18 广州华多网络科技有限公司 Online-live-broadcast-based information display method, device and system
CN104516492A (en) * 2013-09-28 2015-04-15 南京专创知识产权服务有限公司 Man-machine interaction technology based on 3D (three dimensional) holographic projection
CN105373306A (en) * 2015-10-13 2016-03-02 广州酷狗计算机科技有限公司 Virtual goods presenting method and device
CN106131536A (en) * 2016-08-15 2016-11-16 万象三维视觉科技(北京)有限公司 A kind of bore hole 3D augmented reality interactive exhibition system and methods of exhibiting thereof

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010512693A (en) * 2006-12-07 2010-04-22 アダックス,インク. System and method for data addition, recording and communication
JP2012120098A (en) * 2010-12-03 2012-06-21 Linkt Co Ltd Information provision system
JP5726987B2 (en) * 2013-11-05 2015-06-03 株式会社 ディー・エヌ・エー Content distribution system, distribution program, and distribution method
JP5530557B1 (en) * 2013-12-13 2014-06-25 株式会社 ディー・エヌ・エー Server, program and method for distributing content
JP5902768B2 (en) * 2014-07-22 2016-04-13 トモヤ 高柳 Content distribution system
US9846968B2 (en) * 2015-01-20 2017-12-19 Microsoft Technology Licensing, Llc Holographic bird's eye view camera
US20160330522A1 (en) * 2015-05-06 2016-11-10 Echostar Technologies L.L.C. Apparatus, systems and methods for a content commentary community
CN106231368B (en) * 2015-12-30 2019-03-26 深圳超多维科技有限公司 Main broadcaster's class interaction platform stage property rendering method and its device, client
CN106231435B (en) * 2016-07-26 2019-08-02 广州华多网络科技有限公司 The method, apparatus and terminal device of electronics present are given in network direct broadcasting
CN106331735B (en) * 2016-08-18 2020-04-21 北京奇虎科技有限公司 Special effect processing method, electronic equipment and server
CN106355440A (en) * 2016-08-29 2017-01-25 广州华多网络科技有限公司 Control method and device for giving away electronic gifts in group

Also Published As

Publication number Publication date
TW201832161A (en) 2018-09-01
JP6965896B2 (en) 2021-11-10
JPWO2018142494A1 (en) 2019-11-21
WO2018142494A1 (en) 2018-08-09
CN110249631A (en) 2019-09-17
TWI701628B (en) 2020-08-11

Similar Documents

Publication Publication Date Title
CN110249631B (en) Display control system and display control method
JP6382468B1 (en) Movie distribution system, movie distribution method, and movie distribution program for distributing movie including animation of character object generated based on movement of actor
US20220148386A1 (en) Composition production with audience participation
JP6431233B1 (en) Video distribution system that distributes video including messages from viewing users
JP6420930B1 (en) Movie distribution system, movie distribution method, and movie distribution program for distributing movie including animation of character object generated based on movement of actor
WO2020027226A1 (en) Display control system, display control method, and display control program
WO2021246498A1 (en) Live broadcasting system
CN111148554A (en) Virtual reality presentation of real world space
WO2019216146A1 (en) Moving picture delivery system for delivering moving picture including animation of character object generated based on motions of actor, moving picture delivery method, and moving picture delivery program
US11074737B2 (en) Information processing apparatus and method
JP2018094326A (en) Event control system, and event notification system and program
JPWO2019234879A1 (en) Information processing system, information processing method and computer program
JP2019188059A (en) Program, information processing device and method
JP2019192174A (en) Program, information processing device, and method
JP2020162084A (en) Content distribution system, content distribution method, and content distribution program
JP2022075669A (en) Program, information processing device, and method
JP6498832B1 (en) Video distribution system that distributes video including messages from viewing users
JP2020091884A (en) Moving image distribution system, moving image distribution method, and moving image distribution program distributing moving image including animation of character object generated based on actor movement
JP6431242B1 (en) Video distribution system that distributes video including messages from viewing users
US11843809B2 (en) Movie distribution system
JP6592214B1 (en) Video distribution system that distributes video including messages from viewing users
WO2021095576A1 (en) Information processing device, information processing method, and program
JP2023124782A (en) Performance video display program
JP2020167661A (en) Content distribution system, content distribution method, and content distribution program
JP2019198054A (en) Video distribution system for distributing video including animation of character object generated on the basis of actor movement and video distribution program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant