WO2017077751A1 - Information processing apparatus, information processing method, and program - Google Patents
Information processing apparatus, information processing method, and program
- Publication number
- WO2017077751A1 (PCT/JP2016/073014)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- moving image
- information
- sticker
- image
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44008—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F13/00—Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/47205—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4788—Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
Definitions
- This disclosure relates to an information processing apparatus, an information processing method, and a program.
- Patent Document 1 discloses a technique for appropriately controlling the supply of video from an imaging device that captures live video.
- In such a technique, however, video is distributed unilaterally from the distribution user to the viewers, and no means by which a viewer can communicate with the distribution user is disclosed.
- Communication means such as a chat function or a comment function exist, but they are separate from the distributed moving image itself.
- the present disclosure proposes a new and improved information processing apparatus, information processing method, and program capable of facilitating communication between users via distributed moving images.
- According to the present disclosure, there is provided an information processing apparatus including a control unit that acquires a moving image being distributed live and operation information including an operation position related to the placement of a sticker on the moving image displayed on a display unit of a device and information related to the sticker to be placed, analyzes a region corresponding to the operation position on the moving image, and performs placement processing of the sticker on the moving image based on the analysis result and the operation information.
- According to the present disclosure, there is also provided an information processing method including: acquiring, by a processor, a moving image being distributed live and operation information including an operation position related to the placement of a sticker on the moving image displayed on a display unit of a device and information related to the sticker to be placed; analyzing a region on the moving image corresponding to the operation position; and performing placement processing of the sticker on the moving image based on the analysis result and the operation information.
- According to the present disclosure, there is also provided a program causing a computer to: generate operation information including an operation position related to the placement of a sticker on the moving image and information related to the sticker to be placed; transmit the operation information to a server by controlling a communication unit; and generate display information for causing a display unit to display the moving image on which the sticker has been placed by the server, based on the operation information and an analysis result obtained by the server analyzing a region on the moving image corresponding to the operation position acquired from the device including the computer.
- FIG. 3 is a block diagram illustrating a hardware configuration example of an information processing apparatus according to an embodiment of the present disclosure.
- FIG. 1 is a diagram illustrating an outline of a configuration of an information processing system 1 according to an embodiment of the present disclosure.
- the information processing system 1 includes a server 10, a user terminal 20, and user terminals 30 (30a, 30b, ...).
- Server 10, user terminal 20, and user terminal 30 are connected by various wired or wireless networks NW.
- the server 10 is an example of an information processing apparatus.
- the server 10 is configured by one or a plurality of information processing apparatuses on the network.
- the information processing device constituting the server 10 can include a processing circuit and a communication device.
- the server 10 processes a moving image (for example, live video) received from the user terminal 20 using a communication device, using a processing circuit based on operation information acquired from the user terminal 20 or the user terminal 30.
- the processed moving image may be transmitted to each user terminal by the communication device.
- the user terminal 20 (30) is an example of an information processing apparatus.
- a smartphone is shown as an example of the user terminal 20 (30).
- the user terminal 20 (30) may also be another device such as a tablet, a personal computer, a recorder, a game machine, or a wearable device.
- the user terminal 20 (30) may include a processing circuit, a communication device, and an input / output device.
- the user terminal 20 (30) receives a moving image from the server 10 using a communication device, and outputs the moving image using an input / output device.
- the user terminal 20 (30) may acquire an operation on the moving image displayed on the input / output device, and transmit information regarding the operation to the server 10 using the communication device.
- the user terminal 20 (30) may have an imaging function for imaging a real space and generating a moving image. The moving image generated by the imaging function of the user terminal 20 (30) is transmitted to the server 10 and can be distributed to each user terminal by the server 10.
- the user terminal 20 captures live video through the operation of the distribution user SS, and the captured live video is distributed to the user terminals 30 after predetermined processing is performed by the server 10.
- the viewing users AA1 and AA2 can view the live video distributed via the user terminal 30.
- the user terminal 30 that has acquired the live video can instruct to perform a predetermined process on the live video by a predetermined operation by the user on the touch panel on which the live video is displayed.
- the instruction is transmitted to the server 10.
- the server 10 performs a predetermined process corresponding to the acquired instruction on the live video, and distributes the processed live video to each user terminal. Thereby, live video based on an operation by an arbitrary viewing user is displayed on each user terminal.
- a display screen 1000 is displayed on the display unit of the user terminal 20, the user terminal 30a, and the user terminal 30b.
- Display screen 1000 includes live video display screen 1010 and sticker selection screen 1012.
- a live video including the object Obj of the distributor SS is displayed on the live video display screen 1010.
- the user AA2 selects the heart sticker Stk from the sticker selection screen 1012 using the operating tool H, and operates to place the sticker Stk at an arbitrary position on the live video display screen 1010.
- the user terminal 30b that has acquired the operation transmits an instruction related to the process corresponding to the operation to the server 10.
- the server 10 acquires the instruction, performs processing corresponding to the instruction, and distributes the processed live video to each user terminal.
- a live video in a state where the sticker Stk is arranged is displayed on the display unit of the user terminal 20.
- a live image in which the sticker Stk is arranged is also displayed on the display unit of the user terminal 30a of the viewing user AA1.
- By performing an operation such as placing a sticker on the live video being distributed, a viewing user (or the distribution user) can easily communicate with a plurality of users in real time while viewing the live video. Communication between users via live video therefore becomes smoother.
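The flow above — a viewer's sticker operation is sent to the server, applied to the live video, and the result is distributed to every terminal — can be sketched minimally as follows. All class and field names here are illustrative assumptions, not taken from the patent; the server is reduced to an in-memory fan-out.

```python
from dataclasses import dataclass, field

@dataclass
class StickerOperation:
    user_id: str
    sticker_type: str            # e.g. "heart"
    position: tuple              # (u, v) pixel position on the moving image

@dataclass
class Frame:
    stickers: list = field(default_factory=list)

class DistributionServer:
    """Collects sticker operations from any viewing terminal, applies them
    to the current frame, and fans the processed frame out to every
    connected terminal (distributor and viewers alike)."""

    def __init__(self):
        self.terminals = []      # each terminal modeled as a plain list
        self.pending_ops = []

    def receive_operation(self, op):
        self.pending_ops.append(op)

    def distribute(self, frame):
        # apply every pending sticker operation, then send to all terminals
        for op in self.pending_ops:
            frame.stickers.append((op.sticker_type, op.position))
        self.pending_ops.clear()
        for terminal in self.terminals:
            terminal.append(frame)
        return frame
```

In this sketch the same processed frame reaches both the distributing terminal and every viewing terminal, which is why the sticker placed by user AA2 also appears on the screens of SS and AA1.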
- Terminal UI example: The configuration and operation processing of the UI (User Interface) displayed on the display operation unit of the user terminal 30 (20) will be described in detail with reference to FIGS. 2 to 4.
- FIG. 2 is a diagram illustrating an example of a UI configuration displayed on the display operation unit 210 of the user terminal 30 (20).
- a display screen 1000 is displayed on the display operation unit 210 of the user terminal 30 (20).
- the display screen 1000 displays a header 1001, a live video display screen 1010, a mode selection screen 1011 and a sticker selection screen 1012.
- the header 1001 includes an ID 1002 and information 1003 related to the live video.
- the ID of the viewing user may be displayed in the ID 1002, or the ID of the distribution user indicating the distribution source of the live video may be displayed.
- the information 1003 related to the live video may be, for example, an index indicating the usage frequency of the information processing system 1 by the distribution user or the viewing user (for example, a "user level"), or an index related to the live video (for example, "excitement points").
- the index related to the live video may be, for example, a value (corresponding to the excitement points) calculated according to the number of times processing (for example, placement of stickers) was performed on the live video by viewing users or the like during its distribution.
- Displaying such an index can promote the use of the information processing system 1 by each user. For example, by raising the index, a user can increase his or her own level of attention, such as by being positioned at the top of a ranking of many users. For this reason, it is desirable to display the index in the information 1003 related to the live video in the header 1001. In addition, when a user uses a specific sticker, the index may be increased further, thereby further promoting the use of the information processing system 1.
- the live video display screen 1010 is a screen for displaying the live video M1 distributed from the server 10.
- the live video M1 includes a distribution user object Obj1 corresponding to the distribution user.
- live video captured by the user terminal 20 of the distribution user is distributed to the user terminal 30 of each viewing user via the server 10.
- the mode selection screen 1011 is a screen for allowing the user to select an operation mode for live video.
- the mode selection screen 1011 includes, for example, objects for selecting a comment entry mode, a user ranking display mode, a notification confirmation mode, a sharing mode, and the like in addition to the sticker selection mode.
- the sticker selection mode is highlighted. This indicates that the sticker selection mode is selected.
- the sticker selection screen 1012 is a screen for allowing the user to select an arbitrary sticker when the sticker selection mode is selected.
- the sticker selection screen 1012 includes at least one sticker.
- When a sticker is selected, an instruction to perform processing related to the sticker on the live video M1 is transmitted to the server 10. This operation processing will be described later.
- When another mode is selected, an operation screen corresponding to the selected mode is displayed in the region of the sticker selection screen 1012.
- For example, when the comment entry mode is selected, a screen for entering a comment (for example, a text field or an on-screen keyboard) is displayed there.
- When the user ranking display mode is selected, information related to the user ranking is displayed there.
- FIG. 3 is a diagram for describing an operation of selecting a sticker using a UI displayed on the display operation unit 210 of the user terminal 30 (20).
- the user operating the user terminal 30 selects the button Btn1 indicating the heart sticker included in the sticker selection screen 1012 with the operating tool H (for example, the user's finger).
- the area defining the button Btn1 may be highlighted to display that the button Btn1 has been selected.
- FIG. 4 is a diagram for explaining an operation of placing a sticker on a live video using a UI displayed on the display operation unit 210 of the user terminal 30 (20).
- the user selects an arbitrary position (arrangement position) on the live video display screen 1010 with the operating tool H.
- the user terminal 30 transmits operation information (for example, the type or arrangement position of the sticker Stk1) by the operating tool H to the server 10.
- the server 10 that has acquired the operation information performs image processing based on the operation information on the live video.
- the server 10 distributes the live video after the image processing to a plurality of user terminals including the user terminal 30.
- live video M1 after image processing (for example, live video on which the sticker Stk1 is superimposed) is displayed on the live video display screen 1010.
- FIG. 5 is a block diagram illustrating a functional configuration example of the server 10 according to an embodiment of the present disclosure.
- the server 10 includes an image acquisition unit 101, an operation information acquisition unit 102, an image analysis unit 103, and an image processing unit 104. Operations of these functional units are controlled by a processing circuit such as a CPU (Central Processing Unit) provided in the server 10.
- the image acquisition unit 101 acquires moving image data generated by the user terminal 20 that is a live video distribution source via a communication unit (not shown). For example, the image acquisition unit 101 acquires the moving image data generated by the user terminal 20 in time series.
- the moving image data includes, for example, information such as acoustic information and moving image capturing time in addition to a moving image that is a live video.
- the image acquisition unit 101 sequentially outputs the acquired moving image data to the image analysis unit 103.
- the frame rate of the moving image data sequentially output to the image analysis unit 103 may be determined according to at least one of the imaging frame rate of the imaging unit 220 of the user terminal 20, the throughput of the analysis processing by the image analysis unit 103 (described later), and the throughput of the image processing by the image processing unit 104. For example, the frame rate may be determined according to the processing time of whichever of the analysis processing and the image processing has the lower throughput. As a result, the latency that can occur between the distributing-side and viewing-side user terminals for the processed moving image can be reduced.
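The frame-rate rule above can be expressed as a short calculation: cap the pipeline's frame rate by the slower of the two per-frame processing steps, and never exceed the camera's own rate. The function name and parameters are illustrative assumptions.

```python
def output_frame_rate(capture_fps, analysis_ms, processing_ms):
    """Frame rate to feed into the analysis stage (a sketch of the
    latency-reduction idea: frames must not arrive faster than the
    slowest downstream step can consume them)."""
    # The step with the higher per-frame time bounds the pipeline.
    slowest_ms = max(analysis_ms, processing_ms)
    pipeline_fps = 1000.0 / slowest_ms
    # Never exceed the imaging unit's own frame rate.
    return min(capture_fps, pipeline_fps)
```

For example, with a 30 fps camera, 20 ms analysis, and 50 ms image processing, the pipeline is bounded by the 50 ms step, so frames would be output at 20 fps.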
- the image acquisition unit 101 may output moving image data to the image processing unit 104.
- the operation information acquisition unit 102 acquires operation information transmitted from at least one of the user terminal 20 and the user terminal 30 via a communication unit (not shown). For example, the operation information acquisition unit 102 acquires operation information transmitted from at least one of the user terminals in time series.
- the operation information includes, for example, the operation position related to the placement of a sticker on the moving image displayed on the display operation unit 210 of the user terminal 20, information related to the sticker, the type of operation performed on the display screen on which the moving image is displayed, the operation mode, and the like.
- the operation position means a position on the moving image designated by the operation body or the like on the moving image. More specifically, the operation position may be a position on the moving image corresponding to the position touched by the operating body when the operating body contacts the display operation unit 210. This operation position can be specified by the pixel position of the moving image.
- the operation information acquisition unit 102 acquires the position (u, v) on the moving image corresponding to the position (x, y) on the display operation unit 210 of the user terminal 20 as the operation position.
- the position (u, v) on the moving image corresponds to the pixel position of the moving image.
- the information related to the sticker includes, for example, information related to the image processing for the moving image such as the type of the sticker, the mode of the sticker, and the effect of the sticker arranged on the moving image.
- the effect includes, for example, an effect that changes the type or mode of the sticker itself, an effect that changes the type or mode of another sticker arranged on the moving image, and the like.
- the effect that changes the aspect of the sticker may be an effect that changes the size, color, position, tilt, movement, display time, or the like of the sticker.
- the effects also include effects for processing or editing moving images, or effects for outputting sound such as sound effects.
- the type of operation indicates, for example, whether or not a sticker is superimposed on the moving image display screen as described above.
- the operation mode includes, for example, the operation method used by the operating tool on the moving image display screen. More specifically, the operation mode includes known gesture operations on a touch panel or the like having a function as a display operation unit, such as a tap operation, a pinch operation, a swipe operation, or a slide operation by the operating body. Further, when the display operation unit can detect the pressure applied by the operating body, the operation mode includes information on the magnitude of the pressure applied to the display operation unit. The operation position, the information related to the sticker, the type of operation, and the operation mode are transmitted as operation information from the user terminal 20 (30) to the server 10 and acquired by the operation information acquisition unit 102.
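The fields of the operation information enumerated above (operation position, sticker information, operation type, operation mode, and optional pressure) can be grouped into one record. This encoding and its field names are illustrative assumptions, not a format defined by the patent.

```python
from dataclasses import dataclass, field
from typing import Optional, Tuple

@dataclass
class OperationInfo:
    """One possible shape for the operation information sent from the
    user terminal 20 (30) to the server 10."""
    position: Tuple[int, int]         # (u, v) pixel position on the moving image
    sticker_type: str                 # e.g. "heart"
    sticker_mode: dict = field(default_factory=dict)  # size, color, effect, ...
    operation_type: str = "place_sticker"
    gesture: str = "tap"              # "tap", "pinch", "swipe", "slide", ...
    pressure: Optional[float] = None  # set only if the display senses pressure
```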
- the operation information acquisition unit 102 outputs the acquired operation information to the image analysis unit 103. Note that, when the moving image analysis processing by the image analysis unit 103 is not performed, the operation information acquisition unit 102 may output the operation information to the image processing unit 104.
- the image analysis unit 103 performs image analysis based on the moving image data acquired by the image acquisition unit 101 and the operation information acquired by the operation information acquisition unit 102. More specifically, the image analysis unit 103 analyzes a region corresponding to the operation position on the moving image.
- the image analysis unit 103 analyzes the region on the moving image corresponding to the operation position related to the sticker placement included in a piece of operation information. Specifically, the image analysis unit 103 recognizes a specific object from the region on the moving image corresponding to the operation position. The object may be, for example, a person or a thing included in the moving image.
- a moving image generally includes a background and an object.
- Because the image analysis unit 103 identifies the object targeted by the image processing, the effect of the image processing can be applied only to that object, which prevents the viewer from feeling uncomfortable. Furthermore, because the image analysis unit 103 analyzes the object, the image processing unit 104 at the subsequent stage can perform image processing suited to the object.
- the object analyzed by the image analysis unit 103 may be a living body such as a person or an object.
- the image analysis unit 103 may recognize a specific object using a known image recognition technique such as feature amount calculation or an image analysis technique. Further, when the object included in the moving image changes in time series, the image analysis unit 103 may analyze the movement of the object.
- the target of analysis by the image analysis unit 103 may be all or part of the moving image before processing by the image processing unit 104, or all or part of the moving image after processing by the image processing unit 104.
- the processed moving image means the moving image after processing by the image processing unit 104; that is, a sticker or the like superimposed on the moving image is also a target of analysis by the image analysis unit 103. Whether the target of analysis is the moving image before or after processing can be determined according to the type or target of the processing by the image processing unit 104.
- the image analysis unit 103 outputs the analysis result to the image processing unit 104.
- the image processing unit 104 performs image processing on the moving image based on the analysis result acquired from the image analysis unit 103 and the operation information acquired by the operation information acquisition unit 102.
- the image processing referred to here is, for example, sticker placement processing.
- the image processing by the image processing unit 104 may include a processing process or an editing process for a moving image.
- the image processing unit 104 performs a process of superimposing the sticker on the operation position on the moving image based on the tap operation on the moving image by the operating body.
- the superimposed stickers may be combined in the same layer as the moving image, or may be superimposed on a different layer from the moving image.
- the image processing unit 104 may further perform processing on the sticker previously superimposed on the moving image based on the operation information.
- the image processing unit 104 may perform image processing that causes another effect by superimposing another sticker on the sticker previously superimposed on the moving image.
- the image processing unit 104 may change the type or mode of the sticker according to the operation mode of the operating tool.
- the image processing unit 104 may perform a process of sequentially changing the position of the sticker on the moving image based on the slide operation.
- the image processing unit 104 may perform processing for enlarging or reducing the sticker based on the pinch operation.
- the image processing unit 104 may perform processing for changing the type or manner of the sticker according to the value of the applied pressure.
- the image processing unit 104 may perform a process of changing the type or aspect of the sticker according to the operation mode by the operating tool.
- the aspect of the sticker means the strength or type of effect caused by the size, color or arrangement of the sticker.
- These image processes are continuously executed for a predetermined time after the operation information acquisition unit 102 acquires the operation information, for example.
- the predetermined time is determined according to the type of operation information.
- the image processing unit 104 performs a process of superimposing a sticker on a moving image for n seconds after obtaining operation information including an instruction to arrange a sticker.
- the image processing unit 104 may change the image processing method according to the execution time of the image processing. For example, when the image processing unit 104 performs a process of arranging a sticker, it may gradually fade the sticker superimposed on the moving image out of the moving image immediately before the time when the image processing ends. This prevents the viewing user from feeling uncomfortable with the moving image that has undergone image processing.
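The fade-out behavior described above can be sketched as a small opacity function. This is an illustrative Python sketch only; the function name, the n-second duration, and the one-second fade window are assumptions, not part of the source.

```python
# Hypothetical sketch: compute a sticker's opacity over its display lifetime.
# The sticker stays fully opaque, then fades linearly during the final
# `fade` seconds so it does not vanish abruptly at the end of processing.

def sticker_alpha(elapsed: float, duration: float, fade: float = 1.0) -> float:
    """Return opacity in [0, 1] for a sticker `elapsed` seconds into a
    display period of `duration` seconds."""
    if elapsed >= duration:
        return 0.0            # image processing has ended
    remaining = duration - elapsed
    if remaining >= fade:
        return 1.0            # fully opaque until the fade window
    return remaining / fade   # linear fade-out just before the end
```

For example, with a 5-second duration and a 1-second fade, the sticker is opaque for 4 seconds and then fades during the last second.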
- the image processing unit 104 performs image processing on the object recognized by the image analysis unit 103, for example. Accordingly, as described above, the effect of image processing can be presented to the user without causing the user who views the moving image to feel uncomfortable about the moving image.
- an object feature means a feature included in one object.
- the feature of the object may be a part of the person's body.
- the part of the body includes, for example, not only the face of a person but also the head, eyes, nose, mouth, trunk, or limbs (arms, legs).
- the image processing unit 104 may change the mode of the sticker included in the operation information in accordance with the characteristics of the analyzed object when performing image processing for superimposing a sticker on a moving image. More specifically, when the area corresponding to the operation position is included in the area corresponding to the characteristic of the object, the aspect of the sticker may be changed.
- the change in the sticker mode may be, for example, a change in the type of the sticker included in the operation information, or a change in the effect of the sticker included in the operation information. Thereby, the user can enjoy the change of the stickers arranged at various positions of the object included in the moving image.
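The feature-dependent sticker change described above can be sketched as a lookup against feature regions. This is an illustrative Python sketch; the region names, rectangle coordinates, and sticker identifiers are assumptions.

```python
# Hypothetical sketch: swap a sticker's type when the operation position
# falls inside a region corresponding to an object feature (e.g. head, face).

def point_in_rect(pos, rect):
    x, y = pos
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom

def resolve_sticker(pos, sticker, feature_regions):
    """feature_regions maps a feature name to (rect, replacement sticker).
    If the operation position lies in a feature region, the sticker's
    mode is changed; otherwise the original sticker is kept."""
    for name, (rect, replacement) in feature_regions.items():
        if point_in_rect(pos, rect):
            return replacement
    return sticker

# Toy feature regions for one person object
regions = {
    "head": ((40, 0, 80, 30), "cat_hand"),
    "face": ((45, 30, 75, 60), "cute_text"),
}
```

Placing the same sticker on the head, the face, or elsewhere then yields different results, matching the idea that a single sticker can produce varied displays.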
- the image processing unit 104 may perform image processing based on the movement of the object. For example, the image processing unit 104 may perform a process of invalidating the previously executed image process based on the movement of the object. As a result, it is possible to invalidate the viewing user's actions that would be disadvantageous to the distribution user. Further, the image processing unit 104 may change the application position of the previously executed image processing based on the movement of the object. For example, the image processing unit 104 may change the arrangement position of the sticker based on the movement of the object.
- when the object included in the moving image is the body of the distribution user, communication between the distribution user and the viewing user can be further diversified.
- the above-described invalidation of the image processing and change of the application position may be performed based not only on the movement of the object but also on operation information related to the operation on the moving image by the distribution user. As a result, the distribution user can easily invalidate the image processing or change the application position.
- the image processing unit 104 may perform not only image processing on a moving image displayed on the display operation unit 210 of the user terminal 20 (30) but also other processing based on operation information and the like.
- the image processing unit 104 may perform processing based on the acquisition amount of operation information. More specifically, the image processing unit 104 may perform processing based on a time-series distribution of operation information acquisition amounts.
- Other processing includes, for example, statistical processing or extraction processing for moving images based on operation information and the like.
- the image processing unit 104 may perform processing based on a time-series distribution of the acquisition amount of operation information detected at a predetermined time.
- the predetermined time may be a time between the start time and the end time of the live video, for example.
- the image processing unit 104 may divide the distributed moving image into unit-time sections and perform processing based on the operation information acquisition amount acquired by the operation information acquisition unit 102 in each divided section.
- the image processing unit 104 may set a section with a large acquisition amount (for example, a section where the acquisition amount is equal to or greater than a threshold) as the extraction section. Then, the image processing unit 104 may combine a plurality of extraction sections. Thereby, it is possible to easily create a highlight video related to a moving image that has already been distributed.
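The section-splitting and extraction steps above can be sketched as follows. This is an illustrative Python sketch; the unit length, threshold, and function name are assumptions.

```python
# Hypothetical sketch of highlight extraction: split the video timeline
# into unit-time sections, count operations per section, and keep the
# sections whose operation count meets a threshold.

def extract_highlights(op_timestamps, video_length, unit=10.0, threshold=3):
    """Return (start, end) sections whose operation count >= threshold."""
    n_sections = int(video_length // unit) + (video_length % unit > 0)
    counts = [0] * n_sections
    for t in op_timestamps:
        if 0 <= t < video_length:
            counts[int(t // unit)] += 1
    # sections with a large acquisition amount become extraction sections
    return [(i * unit, min((i + 1) * unit, video_length))
            for i, c in enumerate(counts) if c >= threshold]

ops = [1, 2, 3, 12, 31, 32, 33, 34]   # toy operation timestamps (seconds)
print(extract_highlights(ops, 40.0))  # sections 0-10s and 30-40s qualify
```

Combining the returned sections in order would produce the highlight video described above.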
- extraction processing and highlight video generation processing by the image processing unit 104 as described above may be executed sequentially during the distribution of the moving image by the user terminal 20 that is the distribution source, or may be executed as post-processing after the distribution of the moving image by the user terminal 20.
- the predetermined time may be an arbitrary section specified during the distribution of live video.
- the image processing unit 104 may perform processing based on the operation information acquired in the section.
- the image processing unit 104 may perform processing based on the distribution of operation positions. More specifically, the image processing unit 104 may perform statistical processing on a moving image based on the distribution of operation positions acquired in a certain time interval. For example, the image processing unit 104 may acquire information such as which object is most frequently operated among a plurality of objects included in the moving image from the distribution of operation positions by statistical processing.
- the statistical process may be a totaling process such as voting, for example. Thereby, the distribution user can obtain more quantitative data regarding the communication with the viewing user.
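The voting-style aggregation described above can be sketched by tallying operation positions against object regions. This is an illustrative Python sketch; the object names and rectangles are assumptions.

```python
# Hypothetical sketch: tally operation positions against object regions to
# determine which object was operated on most (a simple voting aggregation).

def tally_votes(positions, object_regions):
    """positions: list of (x, y) operation positions.
    object_regions maps an object name to its bounding rect (l, t, r, b)."""
    votes = {name: 0 for name in object_regions}
    for x, y in positions:
        for name, (l, t, r, b) in object_regions.items():
            if l <= x <= r and t <= y <= b:
                votes[name] += 1
                break  # count each operation toward at most one object
    return votes

objects = {"person_A": (0, 0, 50, 100), "person_B": (60, 0, 110, 100)}
taps = [(10, 20), (70, 30), (80, 40), (90, 50)]
print(tally_votes(taps, objects))  # person_B received the most operations
```

The distribution user could read such a tally as quantitative data about which object viewers reacted to most.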
- the image processing unit 104 may perform processing based on acoustic information accompanying the moving image, attribute information accompanying the operation information, or the like. Specific examples of these processes will be described later.
- the moving image data including the moving image subjected to the image processing by the image processing unit 104 is distributed to each user terminal via a communication unit (not shown).
- FIG. 6 is a block diagram illustrating a functional configuration example of the user terminal 20 (30) according to an embodiment of the present disclosure. Since the user terminal 20 and the user terminal 30 have the same functional configuration, the functional configuration of the user terminal 20 will be described here.
- the user terminal 20 includes a control unit 200 and a display operation unit 210. Note that when the user terminal 20 performs moving image imaging processing, the user terminal 20 may further include an imaging unit 220.
- the control unit 200 controls the overall operation of each functional unit included in the user terminal 20.
- the control unit 200 is realized by a processing circuit such as a CPU, for example.
- the control unit 200 has functions as an image acquisition unit 201, a display control unit 202, an operation information generation unit 203, and an operation information transmission unit 204.
- the image acquisition unit 201 acquires moving image data including a moving image from the server 10 via a communication unit (not shown).
- the moving image acquired from the server 10 is a moving image once transmitted to the server 10 from the user terminal that is the distribution source of the moving image.
- the image acquisition unit 201 acquires moving image data from the server 10 in time series.
- the image acquisition unit 201 outputs the acquired moving image data to the display control unit 202.
- the display control unit 202 performs control for causing the display operation unit 210 to display a moving image included in the moving image data acquired from the image acquisition unit 201. For example, the display control unit 202 displays a moving image on a predetermined screen in the display operation unit 210.
- the display control unit 202 may cause the display operation unit 210 to display the moving image generated by the imaging unit 220.
- the moving image displayed on the display operation unit 210 is preferably a moving image acquired from the server 10.
- the operation information generation unit 203 generates operation information based on an operation on the moving image displayed on the display operation unit 210. For example, the operation information generation unit 203 generates operation information based on an operation related to the arrangement of the stickers with respect to the moving image displayed on the display operation unit 210. More specifically, the operation information generation unit 203 acquires from the display operation unit 210 the operation position related to the placement of the sticker on the moving image, information related to the sticker, the type of operation for the display operation unit 210, and the operation mode. These are generated as operation information.
- the operation position included in the operation information means an operation position on the moving image.
- the operation information generation unit 203 acquires, as the operation position, the position on the moving image corresponding to the position (x, y) on the display operation unit 210.
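The mapping from a display position to a position on the moving image might be computed as below. This is an illustrative Python sketch; the assumption that the video is uniformly scaled and letterboxed inside the display is mine, not stated in the source.

```python
# Hypothetical sketch: convert a touch position on the display into a
# position on the moving image, assuming uniform scaling with letterboxing.

def display_to_video(pos, display_size, video_size):
    """Map display coordinates to video coordinates; return None when the
    touch falls on the letterbox outside the video area."""
    dx, dy = pos
    dw, dh = display_size
    vw, vh = video_size
    scale = min(dw / vw, dh / vh)
    # offsets centring the scaled video inside the display
    ox = (dw - vw * scale) / 2
    oy = (dh - vh * scale) / 2
    x = (dx - ox) / scale
    y = (dy - oy) / scale
    if 0 <= x <= vw and 0 <= y <= vh:
        return (x, y)
    return None
```

For example, on a 200×100 display showing a 100×100 video, a touch at (100, 50) maps to video position (50, 50), while a touch in the side letterbox maps to nothing.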
- the operation information generation unit 203 outputs the generated operation information to the operation information transmission unit 204.
- the operation information transmission unit 204 transmits the operation information generated by the operation information generation unit 203 to the server 10 via a communication unit (not shown).
- processing by the image acquisition unit 201 and the display control unit 202 is performed sequentially, whereas processing by the operation information generation unit 203 and the operation information transmission unit 204 is executed only when the operating body performs an operation on the display operation unit 210 on which the moving image is displayed.
- the display operation unit 210 functions as a display unit and an operation unit.
- the display operation unit 210 displays a moving image under the control of the display control unit 202.
- the display operation unit 210 acquires an operation by the operation body.
- Information regarding the operation acquired by the display operation unit 210 is output to the operation information generation unit 203.
- the display operation unit 210 may be realized by a member that serves as both a display member and an input member, for example, a touch panel.
- the display operation unit 210 integrally has functions as a display unit and an operation unit, but the present technology is not limited to such an example.
- the display operation unit 210 may have a function as a display unit and a function as an operation unit separately.
- the display unit may be realized by a display device such as an LCD (Liquid Crystal Display) or an OELD (Organic Electro-Luminescence Display).
- the display device that realizes the display unit may include a touch sensor capable of recognizing known gesture operations on a touch panel, such as a tap operation, a pinch operation, a swipe operation, or a slide operation by the operating body. This allows touch operations performed on the display unit by the operating body to be detected.
- the display device that realizes the display unit may further include a pressure sensor. This allows the pressure applied to the display unit by the operating body to be detected. The operation unit may also be realized by various input members such as a keyboard, a touch panel, a mouse, or a wheel.
- the imaging unit 220 images a real space and generates a captured image (moving image).
- the generated moving image is transmitted to the server 10 via a communication unit (not shown).
- the generated moving image may be output to the display control unit 202.
- the imaging unit 220 is realized by, for example, an imaging device such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor and various members such as lenses for controlling the formation of a subject image on the imaging device.
- the user terminal 20 may provide the function of the imaging unit 220 outside the user terminal 20. In this case, the user terminal 20 may acquire a moving image generated by an imaging device such as a digital camera and transmit the moving image to the server 10.
- FIG. 7 is a flowchart illustrating an operation example of an operation process performed by the user terminal 20 (30) according to an embodiment of the present disclosure.
- the flow of processing until the user terminal 20 generates operation information related to the operation by the operating tool for the user terminal 20 and acquires a moving image that has been subjected to image processing based on the operation information will be described.
- the display operation unit 210 detects an operation performed on the moving image displayed on the display operation unit 210 (S101). Subsequently, the operation information generation unit 203 acquires the operation detected by the display operation unit 210 from the display operation unit 210, and generates operation information based on an operation related to the arrangement of the stickers (S103). Next, the operation information transmission unit 204 transmits the operation information to the server 10 (S105). The server 10 that has acquired the operation information performs image processing based on the operation information on the distributed moving image.
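The operation information assembled in step S103 might take a shape like the following. This is an illustrative Python sketch; the field names and values are assumptions, not taken from the actual system.

```python
# Hypothetical sketch of the operation information generated in S103 and
# transmitted to the server in S105; field names are illustrative.

def make_operation_info(user_id, op_type, position, sticker_id=None, mode=None):
    info = {
        "user_id": user_id,
        "type": op_type,          # e.g. "tap", "slide", "pinch"
        "position": position,     # operation position on the moving image
    }
    if sticker_id is not None:
        info["sticker_id"] = sticker_id   # information related to the sticker
    if mode is not None:
        info["mode"] = mode               # operation mode (e.g. pressure value)
    return info

msg = make_operation_info("viewer42", "tap", (120, 80), sticker_id="heart")
print(msg["type"], msg["position"])
```

The server would then use such a payload in its image analysis and sticker arrangement processing.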
- the image acquisition unit 201 acquires, from the server 10, a moving image that has been subjected to image processing based on operation information including at least the operation information transmitted by the user terminal 20 (S107). Thereby, the operation performed on the user terminal 20 is reflected in the moving image being distributed.
- FIG. 8 is a flowchart illustrating an operation example of image processing by the server 10 according to an embodiment of the present disclosure. Here, the flow of processing will be described from when the server 10 acquires the moving image from the user terminal 20 that is the moving image distribution source, through image analysis and image processing based on the operation information, to distribution of the moving image after the image processing.
- the image acquisition unit 101 acquires a moving image from the user terminal 20 that is a moving image distribution source (S201). Further, the operation information acquisition unit 102 acquires operation information from a plurality of user terminals including the user terminal 20 that is a distribution source (S203).
- the image analysis unit 103 performs image analysis on a region corresponding to the operation position included in the operation information (S205).
- the image processing unit 104 executes a sticker arrangement process on the moving image based on the operation information and the analysis result by the image analysis unit 103 (S207). Then, the moving image on which the sticker arrangement processing has been performed by the image processing unit 104 is distributed to each user terminal (S209).
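The server-side flow of FIG. 8 (S201 to S209) can be sketched as a single pipeline. This is an illustrative Python sketch; every function body below is a stand-in for the corresponding step, not an actual implementation.

```python
# Hypothetical sketch of the server-side flow in FIG. 8 as one pipeline.

def acquire_frame():                 # S201: moving image from the source
    return {"frame": "raw", "stickers": []}

def acquire_operations():            # S203: operation info from terminals
    return [{"type": "tap", "position": (10, 10), "sticker_id": "star"}]

def analyze(frame, ops):             # S205: analyze regions at op positions
    return {"objects_at": {op["position"]: "person" for op in ops}}

def process(frame, ops, analysis):   # S207: sticker arrangement processing
    for op in ops:
        frame["stickers"].append((op["sticker_id"], op["position"]))
    return frame

def distribute(frame):               # S209: deliver to each user terminal
    return frame

ops = acquire_operations()
frame = acquire_frame()
out = distribute(process(frame, ops, analyze(frame, ops)))
print(out["stickers"])
```

The point of the sketch is the ordering: operation information is analyzed against the frame before the sticker arrangement processing is applied and the result is distributed.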
- the server 10 according to the present embodiment performs image processing on a moving image acquired from the user terminal 20 based on operation information acquired from each user terminal. At that time, the server 10 may perform image processing after performing image analysis on a region corresponding to the operation position included in the operation information. Thereby, the process using information, such as an object contained in a moving image, can be performed. Therefore, communication between the distribution user and the viewing user can be made more diverse.
- the server 10 according to the present embodiment can also perform image processing without performing image analysis.
- an example of image processing will be described.
- FIG. 9 is a diagram showing a first example of moving image deformation processing by the information processing system 1 according to the present embodiment.
- live video M2 is displayed on the live video display screen 1010 in the display operation unit 210 of the user terminal 30.
- This live video M2 includes a moving object Obj21 and a background object Obj22.
- the moving object Obj21 is a person object
- the background object Obj22 is a moon object.
- the operation information generation unit 203 generates operation information including the arrangement position, type, and mode of the sticker Stk2, and the operation information transmission unit 204 transmits the operation information to the server 10.
- when the operation information acquisition unit 102 acquires the operation information, the image analysis unit 103 performs image analysis on the region 1021 corresponding to the arrangement position in the live video M2 acquired by the image acquisition unit 101, and recognizes the objects included in the region.
- This area 1021 may be a predetermined area set for each operation information.
- the region 1021 is the range over which the deformation process applied to the live video M2 by the sticker Stk2 extends.
- the area 1021 includes a moving object Obj21 and a background object Obj22.
- the image analysis unit 103 uses a known image analysis technique including optical flow estimation.
- the moving object Obj21 is a person object and an object with a large movement.
- the background object Obj22 is a moon object, which is an object with little or no movement.
- if the deformation process were performed on the moving image regardless of the type of object included in it, the entire region to be deformed would be deformed. In that case, the deformation process would extend, for example, to the background included in the moving image, making a user who views the deformed moving image feel uncomfortable.
- the image analysis unit 103 identifies an object with large movement as a target for image processing. For example, in the example illustrated in FIG. 9, the image analysis unit 103 identifies the moving object Obj21 as an object to be subjected to image processing.
- the image processing unit 104 performs image processing only on the moving object Obj21 based on the analysis result by the image analysis unit 103. For example, in the example illustrated in FIG. 9, the image processing unit 104 performs the deformation process only on the moving object Obj21.
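The motion-based target selection above can be sketched as a threshold over per-object motion magnitude (for example, an average optical-flow magnitude). This is an illustrative Python sketch; the flow values and threshold are toy assumptions.

```python
# Hypothetical sketch: pick the deformation target by motion magnitude.
# object_flows maps object name -> average optical-flow magnitude; objects
# above the threshold (e.g. a person) become deformation targets, while
# near-static background objects (e.g. the moon) are excluded.

def select_moving_objects(object_flows, motion_threshold=2.0):
    return [name for name, flow in object_flows.items()
            if flow >= motion_threshold]

flows = {"Obj21_person": 6.5, "Obj22_moon": 0.1}
print(select_moving_objects(flows))  # only the person object is deformed
```

Applying the deformation only to the selected objects avoids the uncomfortable whole-region deformation described above.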
- the deformation process is performed by arranging the sticker Stk2 indicating the bomb on the live video M2, but the present technology is not limited to such an example.
- the image processing associated with the sticker is executed for an area corresponding to the position on the moving image.
- the deformation process is performed on the moving object Obj21 having a large movement, but the present technology is not limited to such an example.
- deformation processing or the like may be performed on a background object (for example, a background object Obj22 indicating the moon).
- the image analysis unit 103 may perform a process of changing the color of the background of the moving image (for example, a process of darkening a bright sky). Thereby, communication via the moving image can be enjoyed more.
- processing for transforming a moving image using a sticker has been described, but the present technology is not limited to such an example.
- processing based on the operation mode of the display operation unit 210 by the operating tool H can be performed without using a sticker.
- FIG. 10 is a diagram showing a second example of moving image deformation processing by the information processing system 1 according to the present embodiment.
- the live video M2 is displayed on the live video display screen 1010 in the display operation unit 210 of the user terminal 30.
- the viewing user performs a pinch-out operation with the operating tool H.
- the image analysis unit 103 performs image analysis on a region corresponding to the operation position, and recognizes an object to be subjected to image processing.
- the image processing unit 104 performs processing corresponding to the pinch-out operation on the object recognized by the image analysis.
- the object Obj2 is deformed according to the direction corresponding to the pinch-out operation by the operating body H. In this way, it is possible to perform image processing on a moving image by directly executing a specific operation on the live video M2 without using a sticker. Therefore, the viewing user can more easily enjoy communication via the distributed moving image.
- FIG. 11 is a diagram illustrating an example of image processing according to the feature of the object included in the moving image by the information processing system 1 according to the present embodiment.
- a live video M3 is displayed on the live video display screen 1010 in the display operation unit 210 of the user terminal 30.
- This live video M3 includes a person object Obj3.
- the person object Obj3 includes an area 1031 corresponding to the person's head and an area 1032 corresponding to the person's face.
- the operation information generation unit 203 generates operation information including each arrangement position, type, and aspect of the sticker Stk3, and the operation information transmission unit 204 transmits the operation information to the server 10.
- when the operation information acquisition unit 102 acquires the operation information, the image analysis unit 103 performs image analysis on the region (position) corresponding to the arrangement position in the live video M3 acquired by the image acquisition unit 101, and recognizes the object included in the region. Further, the image analysis unit 103 may determine whether or not the region (position) corresponding to the arrangement position is included in a region corresponding to a feature of the object. For example, in the example illustrated in FIG. 11, after recognizing the person object Obj3, the image analysis unit 103 detects the region 1031 corresponding to the person's head and the region 1032 corresponding to the person's face, and determines whether or not the region (position) corresponding to the arrangement position is included in these regions.
- the image processing unit 104 performs image processing such as arranging the sticker while changing its mode, based on the analysis result including the determination result and the operation information. For example, in the example illustrated in FIG. 11, when the arrangement position is included in the region 1031, the image processing unit 104 may change the sticker Stk3 indicating affection to the sticker Stk31 indicating a cat's paw and superimpose it on the live video M3. Further, when the arrangement position is included in the region 1032, the image processing unit 104 may change the sticker to the sticker Stk32 indicating the characters "CUTE!" and superimpose it on the live video M3. In this way, various displays can be enjoyed from a single sticker according to the features of the object included in the moving image.
- gesture processing by an object included in a moving image by the information processing system 1 will be described. This gesture processing changes or cancels the image processing performed on the moving image in accordance with the movement of the object.
- FIG. 12 is a diagram illustrating an example of gesture processing by an object included in a moving image by the information processing system 1 according to the present embodiment.
- a live video M4 is displayed on the live video display screen 1010 in the display operation unit 210 of the user terminal 30.
- This live video M4 includes a person object Obj4.
- a plurality of stickers Stk4 are superimposed on the live video M4 so as to cover the person object Obj4.
- the image analysis unit 103 may recognize the movement of the arm of the person object Obj4. For example, the image analysis unit 103 recognizes the person object Obj4 in advance on the live video M4, and detects that the person object Obj4 moves the arm along the arrow 1041.
- when the image analysis unit 103 detects the movement of the arm, it analyzes which stickers are to be processed in response to that movement.
- a plurality of stickers Stk4 are superimposed on the live video M4 so as to cover the person object Obj4.
- the image analysis unit 103 recognizes the plurality of stickers Stk4 existing in the region corresponding to the person object Obj4 as the stickers to be processed in response to the arm movement on the person object Obj4. Then, the image analysis unit 103 outputs, to the image processing unit 104, information indicating that the recognized stickers Stk4 are the processing targets corresponding to the arm movement.
- the image processing unit 104 performs image processing corresponding to the movement of the arm on the live video M4.
- the image processing unit 104 performs a process of moving a plurality of stickers Stk4 recognized as processing targets by the image analysis unit 103 to positions that do not overlap the person object Obj4.
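The sticker-relocation process above can be sketched as moving every sticker that overlaps the person region to a non-overlapping position. This is an illustrative Python sketch; the simple "push past the right edge" rule and the rectangle representation are my assumptions.

```python
# Hypothetical sketch: when the arm-shake gesture is detected, relocate
# stickers overlapping the person region so they no longer cover the person.

def rects_overlap(a, b):
    al, at, ar, ab = a
    bl, bt, br, bb = b
    return al < br and bl < ar and at < bb and bt < ab

def shake_off(stickers, person_rect, margin=5):
    """stickers: list of (left, top, right, bottom) rects. Overlapping
    stickers are shifted horizontally to clear the person region."""
    moved = []
    for l, t, r, b in stickers:
        if rects_overlap((l, t, r, b), person_rect):
            shift = person_rect[2] + margin - l  # clear the right edge
            moved.append((l + shift, t, r + shift, b))
        else:
            moved.append((l, t, r, b))
    return moved

person = (40, 0, 80, 100)
print(shake_off([(50, 10, 60, 20), (0, 0, 10, 10)], person))
```

Only the sticker covering the person is moved; stickers elsewhere on the video are left in place.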
- the person imaged as the person object Obj4 in the live video M4 can, by his or her own movement and without directly operating the user terminal 20, eliminate stickers or the like that are a nuisance to the distribution user.
- the image processing unit 104 performs the process of moving the sticker Stk4 based on the movement of the person object Obj4.
- the present technology is not limited to this example.
- the image processing unit 104 may cause the sticker Stk4 to disappear based on such a motion, or may change the sticker Stk4 to another mode.
- the gesture by the person object Obj4 is not limited to the gesture of shaking off the arm. As long as the image analysis unit 103 can analyze the movement of the object, any gesture can be a target of image analysis.
- the image processing unit 104 can invalidate the deformation process or the like given to the person object Obj4 by using the gesture of the person object Obj4. As a result, image processing unintended by the distribution user can be canceled.
- the present technology is not limited to such an example.
- the above processing may be performed based on an operation on the user terminal 20.
- FIG. 13 is a diagram illustrating an example of sticker movement processing by the information processing system 1 according to the present embodiment.
- the live video M4 is displayed on the live video display screen 1010 in the display operation unit 210 of the user terminal 30.
- the live video M4 includes a person object Obj4, and a plurality of stickers Stk4 are superimposed on the live video M4 so as to cover the person object Obj4.
- the distribution user or viewing user performs a pinch-out operation with the operating tool H.
- the image analysis unit 103 performs image analysis on the region corresponding to the operation position, and specifies the sticker superimposed on the person object Obj4. Then, the image processing unit 104 performs processing corresponding to the pinch-out operation on the sticker identified by the image analysis.
- the plurality of stickers Stk4 superimposed on the person object Obj4 move in the direction corresponding to the pinch-out operation by the operating tool H. In this way, a process of moving the stickers can be performed by directly executing a specific operation on the live video M4. Thereby, the distribution user can cancel unintended image processing such as stickers placed by viewing users.
- Processing based on operations on stickers: next, processing based on an operation on a sticker by the information processing system 1 will be described.
- the processing based on the operation on the sticker is to change the mode of the sticker or the like based on the operation by the operating body performed on the sticker superimposed on the moving image.
- two examples of processing based on a sticker slide operation and processing based on a tap operation on a sticker will be described.
- FIG. 14 is a diagram illustrating a first example of processing based on a sticker slide operation by the information processing system 1 according to the present embodiment.
- a live video M5 is displayed on the live video display screen 1010 in the display operation unit 210 of the user terminal 30.
- This live video M5 includes a person object Obj5.
- the operation information generation unit 203 generates information related to the arrangement position of the sticker Stk5 and the slide operation, and the operation information transmission unit 204 transmits the operation information to the server 10.
- when the operation information acquisition unit 102 acquires the operation information, the image processing unit 104 performs image processing based on the arrangement position of the sticker Stk5, the slide operation, and so on. For example, the image processing unit 104 may move the sticker Stk5 at a predetermined speed in the direction corresponding to the slide operation of the operating tool H, with the arrangement position of the sticker Stk5 as the initial position. As a result, the sticker Stk5 appears to slide across the live video M5.
- the predetermined speed may be determined according to, for example, the slide speed of the operating tool H in the slide operation.
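The slide-driven sticker motion described above could be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the class, function names, the velocity gain, and the frame rate are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Sticker:
    x: float          # position on the moving image (video coordinates)
    y: float
    vx: float = 0.0   # current velocity in video units per second
    vy: float = 0.0

def start_slide(sticker, slide_dx, slide_dy, slide_duration, gain=0.5):
    """Derive the sticker's speed from the operating tool's slide gesture.

    The speed is proportional to the gesture speed, as the text suggests;
    the proportionality constant `gain` is an assumed tuning parameter.
    """
    sticker.vx = gain * slide_dx / slide_duration
    sticker.vy = gain * slide_dy / slide_duration

def step(sticker, dt=1 / 30):
    """Advance the sticker one frame; it keeps sliding in the gesture direction."""
    sticker.x += sticker.vx * dt
    sticker.y += sticker.vy * dt

s = Sticker(x=100.0, y=200.0)
start_slide(s, slide_dx=60.0, slide_dy=0.0, slide_duration=0.2)  # fast rightward flick
for _ in range(30):  # one second at an assumed 30 fps
    step(s)
print(round(s.x, 1))  # → 250.0 (the sticker has slid to the right)
```

A faster flick yields a larger `vx`, so the sticker travels farther per second, matching the behavior where the predetermined speed follows the slide speed of the operating tool H.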
- when performing the process of sliding the sticker, the image processing unit 104 may perform predetermined processing on an object included in the moving image when the sticker passes over that object.
- the image analysis unit 103 may recognize the person object Obj5, and the image processing unit 104 may apply the effect Efc only to the area corresponding to the person object Obj5.
- the viewing user can not only view the live video but also enjoy it like a game.
- FIG. 15 is a diagram illustrating a second example of sticker slide processing by the information processing system 1 according to the present embodiment.
- a live video M5 is displayed on the live video display screen 1010 in the display operation unit 210 of the user terminal 30.
- the live video M5 includes a person object Obj5 and a second sticker Stk52.
- the user places the first sticker Stk51 on the live video display screen 1010 using the operating tool H, and slides it toward the position where the second sticker Stk52 is placed.
- the image processing unit 104 may perform a predetermined process when it is determined that the first sticker Stk51 that has undergone the slide process has come into contact with the second sticker Stk52 on the live video M5.
- the image processing unit 104 may, for example, perform a process of displaying an effect in which the struck second sticker Stk52 is blown out of the live video M5.
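The sticker-to-sticker collision check could be sketched as below. This is an illustrative assumption: the patent does not specify how contact is determined, so a simple axis-aligned bounding-box test and a hypothetical "blown off screen" response are used.

```python
def rects_overlap(a, b):
    """Axis-aligned overlap test between two sticker bounding boxes (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def on_slide_step(sliding, others, frame_w, frame_h):
    """If the sliding sticker touches another sticker, mark the touched sticker
    to be 'blown off' the frame, as in the example in the text."""
    hit = [s for s in others if rects_overlap(sliding, s)]
    # Hypothetical effect: move each hit sticker just outside the visible frame.
    moved = [(frame_w + 10, frame_h + 10, s[2], s[3]) for s in hit]
    return moved, hit

stk51 = (100, 100, 40, 40)   # first sticker, being slid
stk52 = (120, 110, 40, 40)   # second sticker, already placed
moved, hit = on_slide_step(stk51, [stk52], frame_w=640, frame_h=360)
print(len(hit))  # → 1: the boxes overlap, so Stk52 is blown off
```

The same overlap test could trigger any of the other outcomes the text mentions, such as replacing the two collided stickers with a different sticker.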
- since the image processing unit 104 performs such a predetermined process, the viewing user can enjoy not only watching the live video but also a game-like experience.
- the processing performed when a plurality of stickers collide by slide processing is not particularly limited.
- for example, when stickers collide, the image processing unit 104 may perform a process of changing them into a different sticker.
- thereby, the viewing user can enjoy a more varied live video.
- the behavior of the sticker on the moving image caused by the slide process is not particularly limited.
- for example, the magnitude and direction of the change in the sticker's position may be determined appropriately according to the slide mode of the operating tool H in the slide process.
- the process based on a tap operation on a sticker changes the aspect of a sticker superimposed on the moving image when a tap operation is performed on that sticker.
- FIG. 16 is a diagram illustrating an example of processing based on a tap operation on a sticker by the information processing system 1 according to the present embodiment.
- a live video M5 is displayed on the live video display screen 1010 in the display operation unit 210 of the user terminal 30.
- This live video M5 includes a person object Obj5 and a third sticker Stk53.
- the predetermined tap operation is, for example, a tap operation in which the operating tool H taps a predetermined number of times or more within a predetermined time, or a press operation in which the operating tool H applies a predetermined pressure or more to the display operation unit 210.
- when the operation information acquisition unit 102 acquires operation information related to the predetermined tap operation, the image processing unit 104 may perform a process of changing the type or aspect of the third sticker Stk53 according to the operation information.
- the third sticker Stk53 changes to the fourth sticker Stk54 in response to the predetermined tap operation by the operating tool H.
- the fourth sticker Stk54 is a sticker obtained by enlarging the third sticker Stk53 through the predetermined tap operation by the operating tool H.
- the type or aspect of the sticker changes according to the tap operation, so that a plurality of users can communicate with each other through one sticker.
- the image processing unit 104 changes the type or aspect of the sticker based on operation information related to a predetermined tap operation.
- the present technology is not limited to such an example.
- the image processing unit 104 may change the type or mode of the sticker according to a predetermined slide operation or a predetermined pinch operation on the sticker as well as the tap operation.
- the image processing unit 104 may change the type or mode of the sticker in accordance with an operation of tapping a sticker arranged on the moving image and sliding it in a predetermined direction.
- the image processing unit 104 may change the type or aspect of the sticker according to various operations on the sticker by the operating tool H that can be acquired by the display operation unit 210. This makes it possible to form communication that includes game elements via stickers arranged on the moving image.
- FIG. 17 is a diagram illustrating an example of processing for collecting the stickers included in the moving image by the information processing system 1 according to the present embodiment.
- a live video M6 is displayed on the live video display screen 1010.
- This live video M6 includes a person object Obj6.
- a plurality of stickers Stk60 are densely superimposed over the person object Obj6 on the live video M6.
- the image processing unit 104 may perform a process of collecting the stickers Stk60 when the density of the stickers Stk60 exceeds a predetermined threshold. For example, as shown in the upper right of FIG. 17, the image processing unit 104 may perform a process of combining the plurality of stickers Stk60 into one sticker Stk60. At this time, the image processing unit 104 may display the number of collected stickers Stk60 near the sticker Stk60, for example as “×10”. In addition, after the aggregation process, the image processing unit 104 may display a sticker Stk61 having a different form from the sticker Stk60 in place of the sticker Stk60.
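One hypothetical way to realize the density-based aggregation above is to bucket sticker positions into a coarse grid and collapse any overcrowded cell into a single labeled sticker. The grid size, the threshold, and the ASCII "x10" label stand in for details the patent leaves open.

```python
from collections import defaultdict

def aggregate_stickers(positions, cell=50, threshold=5):
    """Group sticker positions into coarse grid cells; when a cell holds more
    stickers than `threshold`, collapse them into one sticker carrying a
    count label such as 'x10', as in the FIG. 17 example."""
    cells = defaultdict(list)
    for (x, y) in positions:
        cells[(x // cell, y // cell)].append((x, y))
    result = []
    for pts in cells.values():
        if len(pts) > threshold:
            # Merge the dense cluster into one sticker at its centroid.
            cx = sum(p[0] for p in pts) / len(pts)
            cy = sum(p[1] for p in pts) / len(pts)
            result.append({"pos": (cx, cy), "label": f"x{len(pts)}"})
        else:
            result.extend({"pos": p, "label": None} for p in pts)
    return result

# Ten stickers packed onto the same spot collapse into one 'x10' sticker.
dense = [(100 + i, 100 + i) for i in range(10)]
print(aggregate_stickers(dense)[0]["label"])  # → x10
```

The merged entry could equally be rendered as the different sticker Stk61 instead of the original sticker with a count.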
- FIG. 18 is a diagram illustrating an example of image processing according to acoustic information by the information processing system 1 according to the present embodiment.
- the live video M7 is displayed on the live video display screen 1010 in the display operation unit 210 of the user terminal 30.
- the live video M7 includes a plurality of person objects playing music and a sticker Stk70.
- the live video M7 is accompanied by acoustic information, and the acoustic information includes an acoustic waveform Snd1.
- the amplitude of the acoustic waveform Snd1 indicates the volume level.
- the image processing unit 104 may perform image processing according to the acoustic information.
- for example, the image processing unit 104 may perform a process of changing the size of the sticker Stk70 in accordance with the amplitude of the acoustic waveform.
- since the amplitude of the acoustic waveform Snd2 is larger than that of the acoustic waveform Snd1, the image processing unit 104 performs a process of making the sticker Stk71 larger than the sticker Stk70.
- the size of the sticker Stk70 changes according to the audio level of the live video M7. In this way, by performing image processing according to the sound, it is possible to further increase the realism of the live video given to the viewing user.
- the image processing unit 104 may perform image processing according to changes in not only the volume but also the sound quality or the sound field. Thereby, image processing matched with acoustic information can be performed.
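The amplitude-to-size mapping could be sketched as follows. The text only says the sticker grows with the waveform's amplitude, so the loudness measure (peak amplitude of the current audio window) and the linear scaling with a cap are illustrative assumptions.

```python
def sticker_scale(samples, base_size=64, gain=1.0, max_scale=3.0):
    """Scale a sticker's drawn size with the loudness of the accompanying audio.

    `samples` is one window of audio samples normalized to [-1, 1]; the peak
    amplitude serves as a simple volume estimate, capped at `max_scale` so a
    loud passage cannot grow the sticker without bound.
    """
    amplitude = max(abs(s) for s in samples)
    scale = min(1.0 + gain * amplitude, max_scale)
    return base_size * scale

quiet = [0.05, -0.03, 0.04]   # small waveform, like Snd1
loud = [0.9, -0.8, 0.7]       # large waveform, like Snd2
print(sticker_scale(quiet) < sticker_scale(loud))  # → True: Stk71 drawn larger than Stk70
```

Replacing the peak with, say, an RMS over the window, or scaling color instead of size, would be equally consistent with the sound-quality and sound-field variants the text mentions.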
- highlight moving image generation processing by the information processing system 1 will be described. This highlight moving image generation process generates a highlight moving image by extracting and combining only the most exciting parts of the distributed moving image.
- FIG. 19 is a diagram for explaining an example of highlight moving image generation processing by the information processing system 1 according to the present embodiment.
- FIG. 19 shows a graph 500 indicating the amount of operation information acquired per unit time.
- the unit time is the size of the section when the time from the start of live video distribution to the end of distribution is divided into a plurality of sections.
- the size of the unit time is not particularly limited. In this application example, for example, the unit time may be about 5 seconds.
- the operation information acquisition amount is the amount of operation information acquired by the operation information acquisition unit 102 in each section.
- in some section bands, the acquisition amount is larger than in others. That is, it can be considered that many user operations were performed on the live video in such section bands, and that these are time zones in which viewing users were particularly engaged. Therefore, the image processing unit 104 may extract and combine the live video of such time zones. Thereby, a highlight video containing only the exciting parts of the live video can easily be created.
- the highlight moving image generation processing may be sequentially performed during the distribution of the live video.
- the image processing unit 104 acquires the acquisition amount of operation information for each unit time.
- the image processing unit 104 determines whether the acquisition amount in each section exceeds a predetermined threshold (corresponding to the threshold line 510 in the graph 500 shown in FIG. 19), and sets the section bands to be extracted from the live video.
- the image processing unit 104 extracts each of the live videos in the section band to be extracted, and generates a highlight moving image by combining them so as to be temporally continuous.
- for the live video of two consecutive section bands, the image processing unit 104 may appropriately perform image processing before and after the end time of the live video in the earlier section and the start time of the live video in the later section. Specifically, the image processing unit 104 may apply a fade-out to the moving image around the end time of the live video in the earlier section, and a fade-in to the moving image around the start time of the live video in the later section. As a result, discontinuities in the highlight video caused by joining live video segments are smoothed over, so that viewers do not feel a sense of incongruity.
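The selection of section bands to extract could be sketched as below: pick sections whose operation-information count exceeds the threshold (the threshold line 510 in FIG. 19) and merge adjacent picks so the resulting clips are continuous. The 5-second section size follows the example in the text; everything else is an assumption.

```python
def highlight_sections(counts_per_section, threshold, section_sec=5):
    """Return (start_sec, end_sec) intervals of the live video whose
    operation-information acquisition amount exceeds `threshold`,
    merging adjacent qualifying sections into one continuous clip."""
    intervals = []
    for i, count in enumerate(counts_per_section):
        if count > threshold:
            start, end = i * section_sec, (i + 1) * section_sec
            if intervals and intervals[-1][1] == start:
                intervals[-1] = (intervals[-1][0], end)  # extend the previous clip
            else:
                intervals.append((start, end))
    return intervals

# Hypothetical operation counts per 5-second section of a live video.
counts = [2, 3, 12, 15, 4, 1, 9, 2]
print(highlight_sections(counts, threshold=8))  # → [(10, 20), (30, 35)]
```

A video editor would then cut these intervals out of the stream, concatenate them in time order, and apply the fade-out/fade-in described above at each join.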
- voting processing after image analysis by the information processing system 1 will be described.
- in the voting process, objects to be voted on are recognized by image analysis, and votes are tallied for the recognized objects using operation information.
- FIG. 20 is a diagram for explaining an example of voting processing after image analysis by the information processing system 1 according to the present embodiment.
- the live video M8 is displayed on the live video display screen 1010 in the display operation unit 210 of the user terminal 30.
- the live video M8 includes a first person object Obj80 and a second person object Obj81. These person objects are objects to be voted for in the voting process by the information processing system 1.
- when the viewing user taps on the moving image, the operation information generation unit 203 generates operation information including the tap position on the moving image, and the operation information transmission unit 204 transmits the operation information to the server 10.
- the image analysis unit 103 recognizes the first person object Obj80 and the second person object Obj81 included in the live video M8, and the image processing unit 104 determines whether the tap position included in the operation information is included in either the area 1080 corresponding to the first person object Obj80 or the area 1081 corresponding to the second person object Obj81. The image processing unit 104 then identifies the corresponding object (or neither) and counts the number of taps for that object. The above process is repeated for a predetermined time.
- the predetermined time may be determined by the distribution user who distributes the live video using the user terminal 20, or may be set in advance.
- after the predetermined time has elapsed, the image processing unit 104 totals the number of taps corresponding to each object and performs processing according to the tally. For example, as illustrated in FIG. 20, the image processing unit 104 may display the tally results as heat maps 1082 and 1083. The heat maps are superimposed on the first person object Obj80 and the second person object Obj81, respectively. A gauge 1084 indicating the frequency corresponding to each heat map color may also be displayed on the live video M8. By displaying the tally results as heat maps, users can grasp them intuitively.
- the image processing unit 104 is not limited to the heat map, and may present the aggregation results to the user in various ways.
- for example, the image processing unit 104 may display the numerical tally on the live video M8, or may perform a process of displaying a predetermined effect on the object with the larger (or smaller) tally.
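The per-object tap tallying could be sketched as follows, with each recognized object reduced to a rectangular region for the point-in-region test. The region shapes and names are illustrative; the patent itself works with areas produced by image analysis.

```python
def tally_votes(taps, regions):
    """Count taps per voting target by testing which region each tap falls in.

    `regions` maps a target name to its bounding box (x, y, w, h) on the
    moving image; taps that land in no region are ignored, matching the
    'corresponding object (or neither)' step in the text.
    """
    counts = {name: 0 for name in regions}
    for (tx, ty) in taps:
        for name, (x, y, w, h) in regions.items():
            if x <= tx < x + w and y <= ty < y + h:
                counts[name] += 1
                break
    return counts

# Two hypothetical person-object regions splitting a 320x360 frame.
regions = {"Obj80": (0, 0, 160, 360), "Obj81": (160, 0, 160, 360)}
taps = [(50, 100), (40, 90), (200, 120), (300, 400)]  # the last tap hits nothing
print(tally_votes(taps, regions))  # → {'Obj80': 2, 'Obj81': 1}
```

The resulting counts per region are exactly what the heat maps 1082 and 1083, or the numerical overlay, would visualize.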
- Voting after image analysis based on the operation position: Next, the voting process after image analysis based on the operation position by the information processing system 1 will be described.
- unlike the voting process after image analysis described above, in this process the objects to be voted on are recognized based on the operation information, and the voting is then tallied for the recognized objects using the operation information.
- FIG. 21 is a diagram for explaining an example of voting processing after image analysis based on an operation position by the information processing system 1 according to the present embodiment.
- a live video M9 is displayed on the live video display screen 1010.
- This live video M9 includes a person object Obj90.
- when the viewing user taps the user terminal 30 near either the eyes or the mouth of the person object Obj90, the operation information generation unit 203 generates operation information including the tap position on the moving image, and the operation information transmission unit 204 transmits the operation information to the server 10.
- the image analysis unit 103 analyzes the distribution of operation positions included in the operation information, and specifies an area to be voted based on the distribution.
- the image analysis unit 103 extracts, from the position information acquired from the plurality of user terminals 30, a position group 1090A corresponding to the eyes of the person object Obj90 and a position group 1090B corresponding to the mouth of the person object Obj90.
- the image analysis unit 103 recognizes the eye object Obj91 and the mouth object Obj92 of the person object Obj90 by analyzing the vicinity of these position groups, and specifies the areas 1091 and 1092 corresponding to these objects.
- in this way, the image analysis by the image analysis unit 103 can be limited to the range specified by the operation information rather than the entire moving image. Since this reduces the amount of computation in image analysis, processing delays due to image analysis are less likely to occur, and user convenience can be ensured.
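Deriving the candidate analysis regions from the tap-position groups could be sketched as a naive clustering step. The single-link clustering, Manhattan radius, and minimum-tap threshold are all illustrative assumptions; the point is only that recognition then needs to run inside the returned boxes, not over the whole frame.

```python
def candidate_regions(tap_positions, radius=30, min_taps=3):
    """Cluster tap positions and return the bounding box (min_x, min_y,
    max_x, max_y) of each cluster that gathered at least `min_taps` taps.
    Object recognition can then be restricted to these small boxes."""
    clusters = []
    for p in tap_positions:
        for c in clusters:
            if any(abs(p[0] - q[0]) + abs(p[1] - q[1]) <= radius for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    boxes = []
    for c in clusters:
        if len(c) >= min_taps:
            xs, ys = [p[0] for p in c], [p[1] for p in c]
            boxes.append((min(xs), min(ys), max(xs), max(ys)))
    return boxes

eye_taps = [(100, 80), (104, 82), (98, 79)]       # like position group 1090A
mouth_taps = [(110, 140), (112, 143), (108, 141)]  # like position group 1090B
stray = [(300, 20)]                                # an isolated tap, discarded
print(len(candidate_regions(eye_taps + mouth_taps + stray)))  # → 2 regions
```

Discarding sparse clusters keeps a single stray tap from forcing an analysis pass, which is consistent with the computation-saving motivation above.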
- processing after specifying the region corresponding to the object is the same as the voting processing by the image processing unit 104 described above, and thus the description thereof is omitted.
- Next, moving image attribute estimation processing by the information processing system 1 will be described. This moving image attribute estimation process estimates the attributes (such as the category) of a moving image based on operation information for the moving image.
- FIG. 22 is a diagram for explaining the moving image attribute estimation processing by the information processing system 1 according to the present embodiment.
- FIG. 22 shows a table 1100 containing stickers and the number of taps (the operation information acquisition amount) on the live video for each sticker.
- This table 1100 shows the number of taps of the sticker within a predetermined time of one live video.
- the predetermined time is an arbitrary time between the start of distribution of live video and the end of distribution.
- the predetermined time may be a time from the start of distribution to the present if live video is being distributed, or may be a time during which a specific video is being distributed. Whether it is a specific video or not may be determined based on image analysis by the image analysis unit 103, or may be determined by a distribution user who distributes a live video.
- the image processing unit 104 may estimate the attribute of a moving image based on operation information within a predetermined time.
- the number of taps of the sticker “cat” is 450 times
- the number of taps of the sticker “heart” is 210 times
- the number of taps of the sticker “Like” is 124 times.
- the sticker “food” has 23 taps.
- the image processing unit 104 may estimate that the live video is a live video having attributes related to “cat”, “heart”, and “Like”.
- the image processing unit 104 may estimate that the attribute of the live video is “animal-cat”. Note that this moving image estimation processing may be performed using a known technique related to machine learning, for example.
- the image processing unit 104 can automatically set the attribute of the moving image by estimating the attribute of the moving image based on the operation information. Therefore, it is possible to easily and accurately classify moving images.
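A minimal rule-based sketch of the attribute estimation is given below, using the tap counts of table 1100. The top-N cutoff and minimum share are invented thresholds; as the text notes, a machine-learning technique could be used instead.

```python
def estimate_attributes(tap_counts, top_n=3, min_share=0.1):
    """Estimate a video's attributes from per-sticker tap counts, keeping the
    most-tapped stickers that account for at least `min_share` of all taps."""
    total = sum(tap_counts.values())
    ranked = sorted(tap_counts.items(), key=lambda kv: kv[1], reverse=True)
    return [name for name, n in ranked[:top_n] if n / total >= min_share]

# Tap counts from table 1100 in FIG. 22.
taps = {"cat": 450, "heart": 210, "Like": 124, "food": 23}
print(estimate_attributes(taps))  # → ['cat', 'heart', 'Like']
```

The dominant "cat" count is what would justify collapsing the result further to a single category such as "animal-cat".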
- FIG. 23 is a block diagram illustrating a hardware configuration example of the information processing apparatus according to the embodiment of the present disclosure.
- the illustrated information processing apparatus 900 can realize, for example, the server 10 and the user terminal 20 (30) in the above-described embodiment.
- the information processing apparatus 900 includes a CPU 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
- the information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 925, and a communication device 929.
- the information processing apparatus 900 may include an imaging apparatus (not shown) as necessary.
- the information processing apparatus 900 may include a processing circuit called DSP (Digital Signal Processor) or ASIC (Application Specific Integrated Circuit) instead of or in addition to the CPU 901.
- the CPU 901 functions as an arithmetic processing unit and a control unit, and controls all or a part of the operation in the information processing apparatus 900 according to various programs recorded in the ROM 903, the RAM 905, the storage apparatus 919, or the removable recording medium 923.
- the CPU 901 controls the overall operation of each functional unit included in the server 10 and the user terminal 20 (30) in the above embodiment.
- the ROM 903 stores programs and calculation parameters used by the CPU 901.
- the RAM 905 primarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
- the CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. Further, the host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 909.
- the input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
- the input device 915 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device 927 such as a mobile phone that supports the operation of the information processing device 900.
- the input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. The user operates the input device 915 to input various data and instruct processing operations to the information processing device 900.
- the output device 917 is a device that can notify the user of the acquired information visually or audibly.
- the output device 917 can be, for example, a display device such as an LCD, PDP, and OELD, an acoustic output device such as a speaker and headphones, and a printer device.
- the output device 917 outputs the result obtained by the processing of the information processing device 900 as a video such as text or an image, or outputs it as a sound such as sound.
- the storage device 919 is a data storage device configured as an example of a storage unit of the information processing device 900.
- the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- the storage device 919 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
- the drive 921 is a reader / writer for a removable recording medium 923 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing apparatus 900.
- the drive 921 reads information recorded on the attached removable recording medium 923 and outputs the information to the RAM 905.
- the drive 921 also writes records to the mounted removable recording medium 923.
- the connection port 925 is a port for directly connecting a device to the information processing apparatus 900.
- the connection port 925 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like. Further, the connection port 925 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
- the communication device 929 is a communication interface configured with a communication device for connecting to the communication network NW, for example.
- the communication device 929 may be, for example, a communication card for wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB).
- the communication device 929 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
- the communication device 929 transmits and receives signals and the like using a predetermined protocol such as TCP / IP with the Internet and other communication devices, for example.
- the communication network NW connected to the communication device 929 is a network connected by wire or wireless, and is, for example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like. Note that the communication device 929 implements a function as a communication unit.
- the imaging apparatus is an apparatus that captures real space and generates a captured image, using various members such as an imaging element, for example a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and a lens for controlling the formation of a subject image on the imaging element.
- the imaging device may capture a still image or may capture a moving image.
- the server 10 performs image analysis on the moving image acquired from the user terminal 20, which is the distribution source, on the basis of operation information acquired from the user terminal 30, including information about the operation position related to the placement of a sticker and information about the sticker to be placed.
- the server 10 performs image processing on the moving image based on the analysis result and the operation information.
- the server 10 distributes the image-processed moving image to the user terminal 20 and the user terminal 30.
- a viewing user who views the moving image using the user terminal 30 can easily communicate in real time with the other users viewing the moving image by performing a predetermined operation, such as placing a sticker on the moving image. Therefore, communication between users via moving images becomes smoother.
- the server 10 performs processing other than image processing using the operation information acquired from the user terminal 30.
- the server 10 performs statistical processing or extraction processing based on operation information.
- the image analysis by the image analysis unit 103 and the image processing by the image processing unit 104 are performed by the server 10, but the present technology is not limited to such an example.
- the image analysis and image processing described above may instead be performed by the user terminal 20 (30). More specifically, the user terminal 20 (30) may perform image analysis and image processing based on the moving image acquired from the server 10 and the operation information acquired from other user terminals via the server 10.
- the server 10 transmits the moving image acquired from the user terminal 20 that is the moving image distribution source and the operation information acquired from the user terminal 20 (30) to the user terminal 20 (30). With this configuration, the burden on image analysis and image processing can be distributed to the user terminals 20 (30), so that the processing burden on the server 10 is reduced. Therefore, it is possible to make it difficult to cause delays due to various processes.
- the functional units of the server 10 according to the above embodiment may be divided between the server 10 and the user terminal 20 (30) according to the specifications or system configuration of the server 10 and the user terminal 20 (30).
- although the information processing system 1 described above includes the server 10 and the plurality of user terminals 20 (30), the present technology is not limited to such an example.
- the information processing system may be configured by a plurality of user terminals 20 (30) connected via a network.
- image analysis and image processing of the distributed moving image may be performed in each user terminal 20 (30) that acquired the moving image and the operation information.
- each step in the processing of the server and the user terminal in this specification does not necessarily have to be processed in time series in the order described as a flowchart.
- each step in the processing of the server and the user terminal may be processed in an order different from the order described as the flowchart, or may be processed in parallel.
- (1) An information processing apparatus comprising: a communication unit that acquires a moving image that is live-distributed, and operation information including an operation position related to the placement of a sticker on the moving image displayed on a display unit of a device and information about the sticker to be placed; and
- a control unit that analyzes a region corresponding to the operation position on the moving image, and performs placement processing of the sticker on the moving image based on the analysis result and the operation information.
- (2) The information processing apparatus according to (1), wherein the control unit recognizes an object included in the area corresponding to the operation position and performs image processing on the object.
- the control unit changes an aspect of the sticker arranged on the moving image based on a feature of the object.
- the control unit recognizes the movement of the object in the moving image and performs further image processing on the moving image based on the recognized movement of the object.
- (5) The information processing apparatus according to (4), wherein the control unit invalidates image processing applied to the moving image as the further image processing.
- the control unit changes a position of a sticker arranged in the moving image as the further image processing.
- (11) The information processing apparatus according to any one of (7) to (10), wherein the control unit sets an extraction interval of the moving image based on a time-series distribution of the acquisition amount.
- (12) The information processing apparatus according to any one of (1) to (11), wherein the control unit performs processing based on acoustic information corresponding to the moving image.
- (13) The information processing apparatus according to any one of (1) to (12), wherein the control unit estimates an attribute of the moving image based on the operation information.
- (14) The information processing apparatus according to any one of (1) to (13), wherein the control unit changes an aspect of at least one sticker based on operation information regarding an operation on at least one sticker on the moving image.
- the information processing apparatus changes at least one aspect of the other stickers included in an area corresponding to at least one sticker arranged on the moving image.
- the operation information includes information related to an operation mode for a display unit of the device.
- the operation mode includes at least one of a tap operation, a pinch operation, a swipe operation, a slide operation, and a pressure applied to the display unit of the device by at least one operation body.
- the control unit performs modification processing on the moving image.
Description
1. Overview of the information processing system
2. Configuration of each device
3. Operation of the information processing system
4. Examples of image processing by the information processing system
5. Application examples of operation information by the information processing system
6. Hardware configuration example
7. Conclusion
<1.1. Configuration of the information processing system>
FIG. 1 is a diagram showing an overview of the configuration of an information processing system 1 according to an embodiment of the present disclosure. As shown in FIG. 1, the information processing system 1 includes a server 10, a user terminal 20, and user terminals 30 (30a, 30b, ...). The server 10, the user terminal 20, and the user terminals 30 are connected by various wired or wireless networks NW.
The configuration and operation processing of the UI (User Interface) displayed on the display operation unit of the user terminal 30 (20) will be described in detail with reference to FIGS. 2 to 4.
<2.1.サーバ>
図5は、本開示の一実施形態に係るサーバ10の機能構成例を示すブロック図である。図5を参照すると、サーバ10は、画像取得部101、操作情報取得部102、画像解析部103および画像処理部104を備える。これらの各機能部の動作は、サーバ10が備えるCPU(Central Processing Unit)等の処理回路によって制御される。
画像取得部101は、ライブ映像の配信元であるユーザ端末20により生成された動画像データを不図示の通信部を介して取得する。例えば、画像取得部101は、ユーザ端末20により生成された動画像データを時系列に取得する。当該動画像データには、例えば、ライブ映像である動画像のほかに、音響情報および動画像の撮像時刻等の情報が含まれる。
操作情報取得部102は、ユーザ端末20またはユーザ端末30の少なくともいずれかから送信された操作情報を不図示の通信部を介して取得する。例えば、操作情報取得部102は、ユーザ端末の少なくともいずれかから送信された操作情報を時系列に取得する。当該操作情報には、例えば、ユーザ端末20の表示操作部210において表示されている動画像上へのステッカの配置に関する操作位置、当該ステッカに関する情報、並びに動画像が表示されている表示画面上に対して行われた操作の種類および操作態様等が含まれる。
The image analysis unit 103 performs image analysis based on the moving image data acquired by the image acquisition unit 101 and the operation information acquired by the operation information acquisition unit 102. More specifically, the image analysis unit 103 analyzes the region of the moving image corresponding to the operation position.
The image processing unit 104 performs image processing on the moving image based on the analysis result obtained from the image analysis unit 103 and the operation information acquired by the operation information acquisition unit 102. The image processing here is, for example, sticker placement processing. The image processing by the image processing unit 104 may further include modification or editing processing of the moving image.
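As a rough illustration of the sticker placement processing, the compositing step can be sketched as a per-pixel alpha blend at the operation position. This is a simplified sketch using plain nested lists of gray values; a real implementation would operate on decoded video frames:

```python
def place_sticker(frame, sticker, alpha_mask, top_left):
    """Blend `sticker` onto `frame` at `top_left` using `alpha_mask` (0..1).

    frame: H x W list of pixel values; sticker/alpha_mask: h x w lists.
    """
    y0, x0 = top_left
    h, w = len(sticker), len(sticker[0])
    for dy in range(h):
        for dx in range(w):
            y, x = y0 + dy, x0 + dx
            if 0 <= y < len(frame) and 0 <= x < len(frame[0]):
                a = alpha_mask[dy][dx]
                frame[y][x] = round(a * sticker[dy][dx] + (1 - a) * frame[y][x])
    return frame

frame = [[0] * 4 for _ in range(4)]
sticker = [[100, 100], [100, 100]]
mask = [[1.0, 0.5], [0.5, 0.0]]
result = place_sticker(frame, sticker, mask, (1, 1))
# result[1][1] == 100 (fully opaque), result[1][2] == 50, result[2][2] == 0
```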
FIG. 6 is a block diagram showing an example of the functional configuration of the user terminal 20 (30) according to an embodiment of the present disclosure. Since the user terminal 20 and the user terminals 30 have the same functional configuration, the functional configuration of the user terminal 20 is described here. Referring to FIG. 6, the user terminal 20 includes a control unit 200 and a display/operation unit 210. When the user terminal 20 performs moving image capture processing, the user terminal 20 may further include an imaging unit 220.
The control unit 200 controls the overall operation of each functional unit included in the user terminal 20. The control unit 200 is realized by, for example, a processing circuit such as a CPU. The control unit 200 also functions as an image acquisition unit 201, a display control unit 202, an operation information generation unit 203, and an operation information transmission unit 204.
The image acquisition unit 201 acquires moving image data including a moving image from the server 10 via a communication unit (not shown). The moving image acquired from the server 10 is a moving image that was once transmitted to the server 10 from the user terminal that is the distribution source. For example, the image acquisition unit 201 acquires the moving image data from the server 10 in time series. The image acquisition unit 201 outputs the acquired moving image data to the display control unit 202.
The display control unit 202 performs control for displaying, on the display/operation unit 210, the moving image included in the moving image data acquired from the image acquisition unit 201. For example, the display control unit 202 displays the moving image on a predetermined screen of the display/operation unit 210.
The operation information generation unit 203 generates operation information based on an operation performed on the moving image displayed on the display/operation unit 210. For example, the operation information generation unit 203 generates operation information based on an operation related to the placement of a sticker on the moving image displayed on the display/operation unit 210. More specifically, the operation information generation unit 203 acquires from the display/operation unit 210 the operation position related to the placement of the sticker on the moving image, information on the sticker, and the type and mode of the operation performed on the display/operation unit 210, and generates these as the operation information. The operation position included in the operation information here means the operation position on the moving image. Specifically, when an operation body touches the display/operation unit 210 at a position (x, y) on the display/operation unit 210, the operation information generation unit 203 acquires, as the operation position, the position (u, v) on the moving image corresponding to the position (x, y) on the display/operation unit 210. The operation information generation unit 203 outputs the generated operation information to the operation information transmission unit 204.
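The mapping from a touch position (x, y) on the display/operation unit to a position (u, v) on the moving image can be sketched as follows, assuming the video is shown in a known rectangle of the screen. The rectangle and video-size parameters are illustrative assumptions:

```python
def display_to_video(x, y, video_rect, video_size):
    """Map a touch point on the screen to coordinates on the moving image.

    video_rect: (left, top, width, height) of the screen area showing the video.
    video_size: (video_width, video_height) in pixels.
    Returns (u, v) on the video, or None if the touch is outside the video.
    """
    left, top, disp_w, disp_h = video_rect
    vid_w, vid_h = video_size
    if not (left <= x < left + disp_w and top <= y < top + disp_h):
        return None
    u = (x - left) * vid_w / disp_w
    v = (y - top) * vid_h / disp_h
    return (u, v)

# A touch at the center of a 360x640 display area maps to the center
# of a 720x1280 video:
print(display_to_video(180, 320, (0, 0, 360, 640), (720, 1280)))  # (360.0, 640.0)
```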
The operation information transmission unit 204 transmits the operation information generated by the operation information generation unit 203 to the server 10 via a communication unit (not shown).
The display/operation unit 210 functions as both a display unit and an operation unit. The display/operation unit 210 displays the moving image under the control of the display control unit 202. The display/operation unit 210 also acquires operations performed by an operation body. Information on the operations acquired by the display/operation unit 210 is output to the operation information generation unit 203.
The imaging unit 220 captures real space and generates a captured image (moving image). The generated moving image is transmitted to the server 10 via a communication unit (not shown). The generated moving image may also be output to the display control unit 202. The imaging unit 220 according to the present embodiment is realized by, for example, an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and various members such as a lens for controlling the formation of a subject image on the image sensor. The user terminal 20 according to the present embodiment may also provide the function of the imaging unit 220 outside the user terminal 20. In that case, the user terminal 20 may acquire a moving image generated by an imaging device such as a digital camera and transmit the moving image to the server 10.
Next, operation examples of each device in the information processing system are described. Here, operation examples of operation processing by a user terminal and of image processing by the server are described.
FIG. 7 is a flowchart showing an example of operation processing by the user terminal 20 (30) according to an embodiment of the present disclosure. Here, the flow of processing is described from the point at which the user terminal 20 generates operation information on an operation performed by an operation body on the user terminal 20 to the point at which the user terminal 20 acquires a moving image that has undergone image processing based on that operation information.
FIG. 8 is a flowchart showing an example of image processing by the server 10 according to an embodiment of the present disclosure. Here, the flow of processing is described from the point at which the server 10 acquires a moving image from the user terminal 20 that is the distribution source, performs image analysis and image processing based on the operation information, and distributes the processed moving image.
Next, examples of image processing by the information processing system 1 according to the present embodiment are described. The server 10 according to the present embodiment performs image processing on the moving image acquired from the user terminal 20 based on the operation information acquired from each user terminal. In doing so, the server 10 may perform image analysis on the region corresponding to the operation position included in the operation information before performing the image processing. This makes it possible to perform processing that uses information such as objects included in the moving image. Communication between the distributing user and the viewing users can therefore be made more diverse. Note that the server 10 according to the present embodiment can also perform image processing without performing image analysis. Examples of image processing are described below.
First, deformation processing of a moving image by the information processing system 1 is described. This deformation processing deforms an object included in the moving image based on the operation information.
Next, image processing according to the characteristics of an object included in the moving image by the information processing system 1 is described. This image processing varies the image processing applied to the moving image according to the characteristics of the object corresponding to the operation position.
Next, gesture processing using an object included in the moving image by the information processing system 1 is described. This gesture processing changes or cancels image processing applied to the moving image according to the movement of the object.
Next, processing based on operations on stickers by the information processing system 1 is described. This processing changes the aspect of a sticker based on an operation performed by an operation body on a sticker superimposed on the moving image. Two examples are described below: processing based on a slide operation on a sticker, and processing based on a tap operation on a sticker.
Next, processing for grouping stickers included in the moving image by the information processing system 1 is described. This sticker grouping processing groups together stickers that are densely superimposed on the moving image.
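One way to picture this grouping of densely placed stickers is a simple distance-based clustering of their positions. The distance threshold and the union-find approach below are assumptions for illustration; the disclosure does not specify a particular clustering method:

```python
def group_stickers(positions, threshold):
    """Group sticker positions whose pairwise distance is <= threshold.

    Uses union-find; returns a list of groups (lists of indices into positions).
    """
    parent = list(range(len(positions)))

    def find(i):
        # Find the root representative, with path halving for speed.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i, (xi, yi) in enumerate(positions):
        for j in range(i + 1, len(positions)):
            xj, yj = positions[j]
            if (xi - xj) ** 2 + (yi - yj) ** 2 <= threshold ** 2:
                parent[find(i)] = find(j)  # merge the two clusters

    groups = {}
    for i in range(len(positions)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# Three stickers close together and one far away form two groups:
groups = group_stickers([(10, 10), (12, 11), (11, 13), (200, 200)], threshold=5)
print(sorted(len(g) for g in groups))  # [1, 3]
```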
Next, image processing according to audio information by the information processing system 1 is described. This image processing executes, changes, or cancels image processing according to the audio information accompanying the moving image. FIG. 18 is a diagram showing an example of image processing according to audio information by the information processing system 1 according to the present embodiment. Referring to FIG. 18, a live video M7 is displayed on the live video display screen 1010 of the display/operation unit 210 of the user terminal 30. The live video M7 includes a plurality of person objects playing music, and a sticker Stk70. The live video M7 is also accompanied by audio information, which includes an acoustic waveform Snd1. The amplitude of the acoustic waveform Snd1 indicates the loudness of the sound.
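This audio-driven processing can be sketched as scaling an effect parameter, for example a sticker's size, by the loudness of the accompanying waveform. A minimal sketch; the RMS loudness measure, window contents, and scaling rule are assumptions for illustration:

```python
def loudness(samples):
    """Root-mean-square amplitude of an audio window (samples in -1..1)."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def sticker_scale(samples, base=1.0, gain=2.0, max_amplitude=1.0):
    """Scale a sticker between `base` and `base + gain` by loudness."""
    level = min(loudness(samples) / max_amplitude, 1.0)
    return base + gain * level

quiet = [0.0] * 8
loud = [1.0, -1.0] * 4
print(sticker_scale(quiet))  # 1.0
print(sticker_scale(loud))   # 3.0
```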
Examples of image processing by the information processing system 1 according to the present embodiment have been described above. The information processing system 1 according to the present embodiment may not only perform image processing on the moving image based on image analysis results but may also perform other processing based on the operation information and the like. Application examples of operation information in the information processing system 1 according to the present embodiment are described below.
First, highlight video generation processing by the information processing system 1 is described. This highlight video generation processing generates a highlight video by extracting and concatenating only the exciting parts of the distributed moving image.
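This highlight generation can be pictured as bucketing operation events per unit time and keeping the segments whose event count exceeds a threshold, which corresponds to setting extraction sections from the time-series distribution of the operation-information acquisition amount. The bucket width and threshold below are illustrative assumptions:

```python
def highlight_segments(event_times, bucket=1.0, threshold=3):
    """Return merged [start, end) segments whose per-bucket event count
    meets the threshold, i.e. the 'exciting' parts of the stream."""
    counts = {}
    for t in event_times:
        b = int(t // bucket)
        counts[b] = counts.get(b, 0) + 1
    hot = sorted(b for b, c in counts.items() if c >= threshold)
    segments = []
    for b in hot:
        start, end = b * bucket, (b + 1) * bucket
        if segments and segments[-1][1] == start:
            segments[-1][1] = end        # merge adjacent hot buckets
        else:
            segments.append([start, end])
    return segments

# Many taps around t=5s and t=6s, few elsewhere:
times = [0.2, 5.1, 5.3, 5.5, 5.9, 6.0, 6.2, 6.8, 9.4]
print(highlight_segments(times))  # [[5.0, 7.0]]
```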
Next, voting processing after image analysis by the information processing system 1 is described. This voting processing recognizes the objects to be voted on by image analysis, and performs voting processing on the recognized objects using the operation information.
Next, voting processing after image analysis based on operation positions by the information processing system 1 is described. Unlike the voting processing after image analysis described above, this voting processing recognizes the objects to be voted on based on the operation information, and performs voting processing on the recognized objects using the operation information.
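The voting step itself can be sketched as tallying each operation position against the recognized object region that contains it. The rectangular regions and region names below are illustrative assumptions; in the system described above they would come from image analysis or from the distribution of operation positions:

```python
def tally_votes(operation_positions, object_regions):
    """Count operations falling inside each named rectangular region.

    object_regions: {name: (left, top, right, bottom)} on the moving image.
    """
    votes = {name: 0 for name in object_regions}
    for (u, v) in operation_positions:
        for name, (l, t, r, b) in object_regions.items():
            if l <= u < r and t <= v < b:
                votes[name] += 1
                break  # each operation counts for at most one object
    return votes

regions = {"candidate_a": (0, 0, 50, 100), "candidate_b": (50, 0, 100, 100)}
taps = [(10, 20), (30, 40), (70, 50), (45, 90)]
print(tally_votes(taps, regions))  # {'candidate_a': 3, 'candidate_b': 1}
```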
Next, moving image attribute estimation processing by the information processing system 1 is described. This attribute estimation processing estimates attributes (such as a category) of the moving image based on the operation information for the moving image.
Application examples of operation information in the information processing system 1 according to the present embodiment have been described above. Next, the hardware configuration of the information processing apparatus according to an embodiment of the present disclosure is described with reference to FIG. 23. FIG. 23 is a block diagram showing an example of the hardware configuration of the information processing apparatus according to an embodiment of the present disclosure. The illustrated information processing apparatus 900 can realize, for example, the server 10 and the user terminal 20 (30) in the above embodiment.
According to the information processing system 1 of the present embodiment, the server 10 performs image analysis on the moving image acquired from the user terminal 20, which is the distribution source, based on operation information acquired from the user terminal 30, the operation information including the operation position related to the placement of a sticker and information on the sticker to be placed. The server 10 then performs image processing on the moving image based on the analysis result and the operation information. The server 10 then distributes the processed moving image to the user terminal 20 and the user terminals 30.
(1)
An information processing apparatus including:
a communication unit that acquires a moving image being live-distributed, and operation information including an operation position related to placement of a sticker on the moving image displayed on a display unit of a device and information on the sticker to be placed; and
a control unit that analyzes a region of the moving image corresponding to the operation position, and performs placement processing of the sticker on the moving image based on an analysis result and the operation information.
(2)
The information processing apparatus according to (1), wherein the control unit recognizes an object included in the region corresponding to the operation position, and performs image processing on the object.
(3)
The information processing apparatus according to (2), wherein the control unit changes an aspect of the sticker placed on the moving image based on a characteristic of the object.
(4)
The information processing apparatus according to (2) or (3), wherein the control unit recognizes a movement of the object in the moving image, and performs further image processing on the moving image based on the recognized movement of the object.
(5)
The information processing apparatus according to (4), wherein the control unit, as the further image processing, invalidates image processing applied to the moving image.
(6)
The information processing apparatus according to (4) or (5), wherein the control unit, as the further image processing, changes a position of a sticker placed on the moving image.
(7)
The information processing apparatus according to any one of (1) to (6), wherein the control unit performs processing based on an acquisition amount of the operation information.
(8)
The information processing apparatus according to (7), wherein the control unit performs processing based on a time-series distribution of the acquisition amount of the operation information.
(9)
The information processing apparatus according to (7) or (8), wherein the control unit recognizes at least one object from the moving image, and performs processing based on the acquisition amount of the operation information for the at least one object.
(10)
The information processing apparatus according to any one of (7) to (9), wherein the control unit recognizes at least one object based on a distribution of the operation positions.
(11)
The information processing apparatus according to any one of (7) to (10), wherein the control unit sets an extraction section of the moving image based on the time-series distribution of the acquisition amount.
(12)
The information processing apparatus according to any one of (1) to (11), wherein the control unit performs processing based on audio information corresponding to the moving image.
(13)
The information processing apparatus according to any one of (1) to (12), wherein the control unit estimates an attribute of the moving image based on the operation information.
(14)
The information processing apparatus according to any one of (1) to (13), wherein the control unit changes an aspect of at least one of the stickers based on operation information related to an operation on at least one sticker on the moving image.
(15)
The information processing apparatus according to (14), wherein the control unit changes an aspect of at least one of the other stickers included in a region corresponding to at least one sticker placed on the moving image.
(16)
The information processing apparatus according to any one of (1) to (15), wherein the operation information includes information on an operation mode for the display unit of the device.
(17)
The information processing apparatus according to (16), wherein the operation mode includes at least one of a tap operation, a pinch operation, a swipe operation, or a slide operation by at least one operation body, or a magnitude of pressure applied to the display unit of the device.
(18)
The information processing apparatus according to any one of (1) to (17), wherein the control unit performs modification processing on the moving image.
(19)
An information processing method including, by a processor:
acquiring a moving image being live-distributed;
acquiring operation information including an operation position related to placement of a sticker on the moving image displayed on a display unit of a device and information on the sticker to be placed;
analyzing a region of the moving image corresponding to the operation position; and
performing placement processing of the sticker on the moving image based on a result of the analysis and the operation information.
(20)
A program causing a computer to execute:
generating display information for displaying, on a display unit of the computer, a received moving image being live-distributed;
generating operation information to be transmitted to a server by controlling a communication unit of the computer, the operation information including an operation position related to placement of a sticker on the moving image and information on the sticker to be placed; and
generating display information for displaying, on the display unit, a moving image received from the server by controlling the communication unit, on which the server has performed placement processing of the sticker based on the operation information and an analysis result obtained by analyzing a region of the moving image corresponding to the operation position acquired from a device including the computer.
10 Server
20, 30 User terminal
101 Image acquisition unit
102 Operation information acquisition unit
103 Image analysis unit
104 Image processing unit
200 Control unit
201 Image acquisition unit
202 Display control unit
203 Operation information generation unit
204 Operation information transmission unit
210 Display/operation unit
220 Imaging unit
Claims (20)
- An information processing apparatus including:
a communication unit that acquires a moving image being live-distributed, and operation information including an operation position related to placement of a sticker on the moving image displayed on a display unit of a device and information on the sticker to be placed; and
a control unit that analyzes a region of the moving image corresponding to the operation position, and performs placement processing of the sticker on the moving image based on an analysis result and the operation information. - The information processing apparatus according to claim 1, wherein the control unit recognizes an object included in the region corresponding to the operation position, and performs image processing on the object.
- The information processing apparatus according to claim 2, wherein the control unit changes an aspect of the sticker placed on the moving image based on a characteristic of the object.
- The information processing apparatus according to claim 2, wherein the control unit recognizes a movement of the object in the moving image, and performs further image processing on the moving image based on the recognized movement of the object.
- The information processing apparatus according to claim 4, wherein the control unit, as the further image processing, invalidates image processing applied to the moving image.
- The information processing apparatus according to claim 4, wherein the control unit, as the further image processing, changes a position of a sticker placed on the moving image.
- The information processing apparatus according to claim 1, wherein the control unit performs processing based on an acquisition amount of the operation information.
- The information processing apparatus according to claim 7, wherein the control unit performs processing based on a time-series distribution of the acquisition amount of the operation information.
- The information processing apparatus according to claim 7, wherein the control unit recognizes at least one object from the moving image, and performs processing based on the acquisition amount of the operation information for the at least one object.
- The information processing apparatus according to claim 7, wherein the control unit recognizes at least one object based on a distribution of the operation positions.
- The information processing apparatus according to claim 7, wherein the control unit sets an extraction section of the moving image based on the time-series distribution of the acquisition amount.
- The information processing apparatus according to claim 1, wherein the control unit performs processing based on audio information corresponding to the moving image.
- The information processing apparatus according to claim 1, wherein the control unit estimates an attribute of the moving image based on the operation information.
- The information processing apparatus according to claim 1, wherein the control unit changes an aspect of at least one of the stickers based on operation information related to an operation on at least one sticker on the moving image.
- The information processing apparatus according to claim 14, wherein the control unit changes an aspect of at least one of the other stickers included in a region corresponding to at least one sticker placed on the moving image.
- The information processing apparatus according to claim 1, wherein the operation information includes information on an operation mode for the display unit of the device.
- The information processing apparatus according to claim 16, wherein the operation mode includes at least one of a tap operation, a pinch operation, a swipe operation, or a slide operation by at least one operation body, or a magnitude of pressure applied to the display unit of the device.
- The information processing apparatus according to claim 1, wherein the control unit performs modification processing on the moving image.
- An information processing method including, by a processor:
acquiring a moving image being live-distributed;
acquiring operation information including an operation position related to placement of a sticker on the moving image displayed on a display unit of a device and information on the sticker to be placed;
analyzing a region of the moving image corresponding to the operation position; and
performing placement processing of the sticker on the moving image based on a result of the analysis and the operation information. - A program causing a computer to execute:
generating display information for displaying, on a display unit of the computer, a received moving image being live-distributed;
generating operation information to be transmitted to a server by controlling a communication unit of the computer, the operation information including an operation position related to placement of a sticker on the moving image and information on the sticker to be placed; and
generating display information for displaying, on the display unit, a moving image received from the server by controlling the communication unit, on which the server has performed placement processing of the sticker based on the operation information and an analysis result obtained by analyzing a region of the moving image corresponding to the operation position acquired from a device including the computer.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/768,950 US11237717B2 (en) | 2015-11-04 | 2016-08-04 | Information processing device and information processing method |
JP2017548656A JP6777089B2 (ja) | 2015-11-04 | 2016-08-04 | Information processing device, information processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015216617 | 2015-11-04 | ||
JP2015-216617 | 2015-11-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017077751A1 true WO2017077751A1 (ja) | 2017-05-11 |
Family
ID=58661814
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/073014 WO2017077751A1 (ja) | 2015-11-04 | 2016-08-04 | Information processing device, information processing method, and program |
Country Status (3)
Country | Link |
---|---|
US (1) | US11237717B2 (ja) |
JP (2) | JP6777089B2 (ja) |
WO (1) | WO2017077751A1 (ja) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6198983B1 (ja) * | 2017-04-26 | 2017-09-20 | DeNA Co., Ltd. | System, method, and program for distributing videos |
JP2017229085A (ja) * | 2017-08-22 | 2017-12-28 | DeNA Co., Ltd. | System, method, and program for distributing videos |
CN110062269A (zh) * | 2018-01-18 | 2019-07-26 | Tencent Technology (Shenzhen) Company Limited | Additional object display method and apparatus, and computer device |
JP2019537298A (ja) * | 2016-10-01 | 2019-12-19 | Facebook, Inc. | Architecture for augmenting video data obtained by a client device with one or more effects during rendering |
WO2020071545A1 (ja) * | 2018-10-04 | 2020-04-09 | Paronym Inc. | Information processing device |
JP2020109943A (ja) * | 2019-11-20 | 2020-07-16 | GREE, Inc. | Video distribution system, video distribution method, video distribution program, information processing terminal, and video viewing program |
JP2020188514A (ja) * | 2020-08-06 | 2020-11-19 | DeNA Co., Ltd. | System, method, and program for distributing videos |
JP2021022803A (ja) * | 2019-07-26 | 2021-02-18 | Link Corporate Communications, Inc. | Information processing device, terminal device, information processing method, and program |
JP2023518878A (ja) * | 2020-03-23 | 2023-05-08 | Beijing ByteDance Network Technology Co., Ltd. | Special effect processing method and apparatus |
JP2023175756A (ja) * | 2018-05-07 | 2023-12-12 | Apple Inc. | Creative camera |
US11962889B2 (en) | 2016-06-12 | 2024-04-16 | Apple Inc. | User interface for camera effects |
US12008230B2 (en) | 2020-05-11 | 2024-06-11 | Apple Inc. | User interfaces related to time with an editable background |
US12033296B2 (en) | 2018-05-07 | 2024-07-09 | Apple Inc. | Avatar creation user interface |
US12081862B2 (en) | 2020-06-01 | 2024-09-03 | Apple Inc. | User interfaces for managing media |
US12099713B2 (en) | 2023-07-11 | 2024-09-24 | Apple Inc. | User interfaces related to time |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12080421B2 (en) | 2013-12-04 | 2024-09-03 | Apple Inc. | Wellness aggregator |
US20160019360A1 (en) | 2013-12-04 | 2016-01-21 | Apple Inc. | Wellness aggregator |
WO2016144385A1 (en) | 2015-03-08 | 2016-09-15 | Apple Inc. | Sharing user-configurable graphical constructs |
CN113521710A (zh) | 2015-08-20 | 2021-10-22 | Apple Inc. | Motion-based watch faces and complications |
DK201770423A1 (en) | 2016-06-11 | 2018-01-15 | Apple Inc | Activity and workout updates |
US10736543B2 (en) | 2016-09-22 | 2020-08-11 | Apple Inc. | Workout monitor interface |
US10203855B2 (en) * | 2016-12-09 | 2019-02-12 | Snap Inc. | Customized user-controlled media overlays |
KR101944112B1 (ko) * | 2016-12-22 | 2019-04-17 | Seerslab Inc. | Method and apparatus for generating user-authored stickers, and user-authored sticker sharing system |
US10845955B2 (en) | 2017-05-15 | 2020-11-24 | Apple Inc. | Displaying a scrollable list of affordances associated with physical activities |
DK201970532A1 (en) | 2019-05-06 | 2021-05-03 | Apple Inc | Activity trends and workouts |
KR20220016503A (ko) | 2019-06-01 | 2022-02-09 | Apple Inc. | Multi-modal activity tracking user interface |
CN114514497B (zh) * | 2019-09-27 | 2024-07-19 | Apple Inc. | User interfaces for customizing graphical objects |
US10972682B1 (en) * | 2019-12-12 | 2021-04-06 | Facebook, Inc. | System and method for adding virtual audio stickers to videos |
DK202070616A1 (en) | 2020-02-14 | 2022-01-14 | Apple Inc | User interfaces for workout content |
KR20210135683A (ko) * | 2020-05-06 | 2021-11-16 | LINE Plus Corporation | Method, system, and computer program for displaying reactions during an internet-telephony-based call |
CN112363658B (zh) * | 2020-10-27 | 2022-08-12 | Vivo Mobile Communication Co., Ltd. | Interaction method and apparatus for video calls |
CN113516735A (zh) * | 2021-01-12 | 2021-10-19 | Tencent Technology (Shenzhen) Company Limited | Image processing method and apparatus, computer-readable medium, and electronic device |
WO2022245669A1 (en) | 2021-05-15 | 2022-11-24 | Apple Inc. | User interfaces for group workouts |
US11977729B2 (en) | 2022-06-05 | 2024-05-07 | Apple Inc. | Physical activity information user interfaces |
US20230390626A1 (en) | 2022-06-05 | 2023-12-07 | Apple Inc. | User interfaces for physical activity information |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015073175A (ja) * | 2013-10-02 | 2015-04-16 | Nissha Printing Co., Ltd. | Video stamp system, terminal device of video stamp system, stamp server of video stamp system, and program of video stamp system |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010030667A1 (en) * | 2000-04-10 | 2001-10-18 | Kelts Brett R. | Interactive display interface for information objects |
US20020018067A1 (en) * | 2000-08-08 | 2002-02-14 | Carcia Peter P. | System for reproducing images in an altered form in accordance with sound characteristics |
JP2001142980A (ja) * | 2000-10-04 | 2001-05-25 | Kazuhiro Ide | Sticker service system and method |
JP2002297798A (ja) | 2001-03-30 | 2002-10-11 | Sony Corp | Information processing apparatus and method, recording medium, and program |
US7697714B2 (en) * | 2005-09-19 | 2010-04-13 | Silverbrook Research Pty Ltd | Associating an object with a sticker and a surface |
US8564544B2 (en) * | 2006-09-06 | 2013-10-22 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
US20090132924A1 (en) * | 2007-11-15 | 2009-05-21 | Yojak Harshad Vasa | System and method to create highlight portions of media content |
CN101983396B (zh) * | 2008-03-31 | 2014-07-09 | Koninklijke Philips Electronics N.V. | Method of modifying a representation based on user instructions |
CA2698052C (en) * | 2009-03-30 | 2021-02-02 | Stickeryou, Inc. | Internet-based method and system for making user-customized stickers |
US8806331B2 (en) * | 2009-07-20 | 2014-08-12 | Interactive Memories, Inc. | System and methods for creating and editing photo-based projects on a digital network |
TWI439960B (zh) * | 2010-04-07 | 2014-06-01 | Apple Inc | Avatar editing environment |
US8989786B2 (en) * | 2011-04-21 | 2015-03-24 | Walking Thumbs, Llc | System and method for graphical expression during text messaging communications |
JP2013101528A (ja) * | 2011-11-09 | 2013-05-23 | Sony Corp | Information processing device, display control method, and program |
US8941707B2 (en) * | 2011-12-01 | 2015-01-27 | Tangome, Inc. | Video messaging |
JP5568610B2 (ja) | 2012-08-28 | 2014-08-06 | Premium Agency Inc. | Augmented reality system, video compositing device, video compositing method, and program |
US9007465B1 (en) * | 2012-08-31 | 2015-04-14 | Vce Company, Llc | Obtaining customer support for electronic system using first and second cameras |
US20140095335A1 (en) * | 2012-09-28 | 2014-04-03 | Interactive Memories, Inc. | Method for Dynamic Invoicing of Print Vendors at Real-Time Negotiated or Advertised Pricing for Online Printing Services |
US9465985B2 (en) * | 2013-06-09 | 2016-10-11 | Apple Inc. | Managing real-time handwriting recognition |
US9876828B1 (en) * | 2013-07-25 | 2018-01-23 | Overlay Studio, Inc. | Collaborative design |
US10271010B2 (en) * | 2013-10-31 | 2019-04-23 | Shindig, Inc. | Systems and methods for controlling the display of content |
US20150127753A1 (en) * | 2013-11-04 | 2015-05-07 | Meemo, Llc | Word Recognition and Ideograph or In-App Advertising System |
CN105684045B (zh) | 2013-11-13 | 2019-05-14 | Sony Corporation | Display control device, display control method, and program |
JP6122768B2 (ja) | 2013-11-19 | 2017-04-26 | Sony Interactive Entertainment Inc. | Information processing device, display method, and computer program |
US20150172246A1 (en) * | 2013-12-13 | 2015-06-18 | Piragash Velummylum | Stickers for electronic messaging cards |
KR102114617B1 (ko) * | 2014-01-08 | 2020-05-25 | LG Electronics Inc. | Mobile terminal and control method thereof |
WO2015175240A1 (en) * | 2014-05-15 | 2015-11-19 | Narvii Inc. | Systems and methods implementing user interface objects |
US9792716B2 (en) * | 2014-06-13 | 2017-10-17 | Arcsoft Inc. | Enhancing video chatting |
WO2016073992A1 (en) * | 2014-11-07 | 2016-05-12 | H4 Engineering, Inc. | Editing systems |
US20160334972A1 (en) * | 2015-05-13 | 2016-11-17 | Yahoo!, Inc. | Content overlay for social network posts |
US20180314409A1 (en) * | 2015-10-08 | 2018-11-01 | Magnificus Software Inc. | Method and system for creating and using emojis and other graphic content in instant messaging systems |
US10218938B2 (en) * | 2016-04-14 | 2019-02-26 | Popio Ip Holdings, Llc | Methods and systems for multi-pane video communications with photo-based signature verification |
-
2016
- 2016-08-04 US US15/768,950 patent/US11237717B2/en active Active
- 2016-08-04 WO PCT/JP2016/073014 patent/WO2017077751A1/ja active Application Filing
- 2016-08-04 JP JP2017548656A patent/JP6777089B2/ja active Active
-
2020
- 2020-10-06 JP JP2020169358A patent/JP7095722B2/ja active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015073175A (ja) * | 2013-10-02 | 2015-04-16 | Nissha Printing Co., Ltd. | Video stamp system, terminal device of video stamp system, stamp server of video stamp system, and program of video stamp system |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11962889B2 (en) | 2016-06-12 | 2024-04-16 | Apple Inc. | User interface for camera effects |
JP2019537298A (ja) * | 2016-10-01 | 2019-12-19 | Facebook, Inc. | Architecture for augmenting video data obtained by a client device with one or more effects during rendering |
US10977847B2 (en) | 2016-10-01 | 2021-04-13 | Facebook, Inc. | Architecture for augmenting video data obtained by a client device with one or more effects during rendering |
JP2017229062A (ja) * | 2017-04-26 | 2017-12-28 | DeNA Co., Ltd. | System, method, and program for distributing videos |
JP6198983B1 (ja) * | 2017-04-26 | 2017-09-20 | DeNA Co., Ltd. | System, method, and program for distributing videos |
JP2017229085A (ja) * | 2017-08-22 | 2017-12-28 | DeNA Co., Ltd. | System, method, and program for distributing videos |
CN110062269A (zh) * | 2018-01-18 | 2019-07-26 | Tencent Technology (Shenzhen) Company Limited | Additional object display method and apparatus, and computer device |
US11640235B2 (en) | 2018-01-18 | 2023-05-02 | Tencent Technology (Shenzhen) Company Limited | Additional object display method and apparatus, computer device, and storage medium |
US12033296B2 (en) | 2018-05-07 | 2024-07-09 | Apple Inc. | Avatar creation user interface |
JP7404587B2 (ja) | 2018-05-07 | 2023-12-25 | Apple Inc. | Creative camera |
JP2023175756A (ja) * | 2018-05-07 | 2023-12-12 | Apple Inc. | Creative camera |
WO2020071545A1 (ja) * | 2018-10-04 | 2020-04-09 | Paronym Inc. | Information processing device |
JP2021022803A (ja) * | 2019-07-26 | 2021-02-18 | Link Corporate Communications, Inc. | Information processing device, terminal device, information processing method, and program |
JP7001645B2 (ja) | 2019-07-26 | 2022-01-19 | Link Corporate Communications, Inc. | Information processing device, terminal device, information processing method, and program |
JP7261727B2 (ja) | 2019-11-20 | 2023-04-20 | GREE, Inc. | Video distribution system, video distribution method, and server |
JP2020109943A (ja) * | 2019-11-20 | 2020-07-16 | GREE, Inc. | Video distribution system, video distribution method, video distribution program, information processing terminal, and video viewing program |
JP2023518878A (ja) * | 2020-03-23 | 2023-05-08 | Beijing ByteDance Network Technology Co., Ltd. | Special effect processing method and apparatus |
JP7473674B2 (ja) | 2020-03-23 | 2024-04-23 | Beijing ByteDance Network Technology Co., Ltd. | Special effect processing method and apparatus |
US12008230B2 (en) | 2020-05-11 | 2024-06-11 | Apple Inc. | User interfaces related to time with an editable background |
US12081862B2 (en) | 2020-06-01 | 2024-09-03 | Apple Inc. | User interfaces for managing media |
JP7012792B2 (ja) | 2020-08-06 | 2022-01-28 | DeNA Co., Ltd. | System, method, and program for distributing videos |
JP2020188514A (ja) * | 2020-08-06 | 2020-11-19 | DeNA Co., Ltd. | System, method, and program for distributing videos |
US12099713B2 (en) | 2023-07-11 | 2024-09-24 | Apple Inc. | User interfaces related to time |
US12101567B2 (en) | 2023-07-31 | 2024-09-24 | Apple Inc. | User interfaces for altering visual media |
Also Published As
Publication number | Publication date |
---|---|
US11237717B2 (en) | 2022-02-01 |
JP7095722B2 (ja) | 2022-07-05 |
JP6777089B2 (ja) | 2020-10-28 |
US20180300037A1 (en) | 2018-10-18 |
JPWO2017077751A1 (ja) | 2018-08-30 |
JP2021007042A (ja) | 2021-01-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7095722B2 (ja) | Information processing device and program | |
US11003253B2 (en) | Gesture control of gaming applications | |
US10179407B2 (en) | Dynamic multi-sensor and multi-robot interface system | |
US20170061696A1 (en) | Virtual reality display apparatus and display method thereof | |
JP6760271B2 (ja) | Information processing device, information processing method, and program | |
KR20170026164A (ko) | Virtual reality display device and display method thereof | |
CN111045511B (zh) | Gesture-based control method and terminal device | |
CN106527929B (zh) | Picture information hiding method and device | |
US10572017B2 (en) | Systems and methods for providing dynamic haptic playback for an augmented or virtual reality environments | |
KR20120123330A (ko) | Camera navigation for presentations | |
KR20160016955A (ko) | Manipulation of virtual objects in augmented reality via intention | |
US20130265448A1 (en) | Analyzing Human Gestural Commands | |
CN103440033B (zh) | Method and device for realizing human-computer interaction based on bare hands and a monocular camera | |
JP2014048937A (ja) | Gesture recognition device, control method thereof, display apparatus, and control program | |
US10257436B1 (en) | Method for using deep learning for facilitating real-time view switching and video editing on computing devices | |
JP2014233035A (ja) | Information processing device, display control method, and program | |
JP7059934B2 (ja) | Information processing device, information processing method, and program | |
KR20150094680A (ko) | Targeting and press natural user input | |
CA2838878A1 (en) | Method and apparatus for controlling contents in electronic device | |
US20220398816A1 (en) | Systems And Methods For Providing Real-Time Composite Video From Multiple Source Devices Featuring Augmented Reality Elements | |
CN111818382B (zh) | Screen recording method and apparatus, and electronic device | |
US11328187B2 (en) | Information processing apparatus and information processing method | |
US20190339771A1 (en) | Method, System and Apparatus For Brainwave and View Based Recommendations and Story Telling | |
CN113711164A (zh) | Method and apparatus for user control of applications and corresponding devices | |
JP6842194B2 (ja) | Display control device, display control method, and display control program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16861820 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017548656 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15768950 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16861820 Country of ref document: EP Kind code of ref document: A1 |