CN116546162A - Video distribution system, video distribution server and video distribution method - Google Patents
Video distribution system, video distribution server and video distribution method
- Publication number
- CN116546162A (application CN202211565374.9A)
- Authority
- CN
- China
- Prior art keywords
- processor
- video
- real
- user
- distribution
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims abstract description 27
- 238000012790 confirmation Methods 0.000 claims description 19
- 230000007423 decrease Effects 0.000 claims 2
- 238000004891 communication Methods 0.000 description 79
- 238000012545 processing Methods 0.000 description 51
- 238000003384 imaging method Methods 0.000 description 30
- 230000005540 biological transmission Effects 0.000 description 25
- 230000008569 process Effects 0.000 description 19
- 230000008859 change Effects 0.000 description 10
- 230000006870 function Effects 0.000 description 9
- 238000010586 diagram Methods 0.000 description 4
- 238000005401 electroluminescence Methods 0.000 description 3
- 239000000470 constituent Substances 0.000 description 2
- 239000004973 liquid crystal related substance Substances 0.000 description 2
- 230000004044 response Effects 0.000 description 2
- 230000000694 effects Effects 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
- 238000012546 transfer Methods 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
An object of the invention is to provide a video distribution system, a video distribution server, and a video distribution method with which a user can easily find a real-time video being captured at a place the user wants to see. To solve this problem, the video distribution system according to an embodiment includes a transmitting unit and a distribution unit. The transmitting unit transmits, to a terminal device, a map showing the positions of cameras mounted on mobile bodies. The distribution unit distributes, in real time, to the terminal device the video captured by a camera selected from the cameras shown on the map.
Description
Technical Field
The invention relates to a video distribution system, a video distribution server, and a video distribution method.
Background
There are systems that display video captured and distributed in real time by a camera installed at a location different from the viewer's location.
[Prior art literature]
(Patent literature)
Patent document 1: japanese patent laid-open No. 2021-56191
Disclosure of Invention
[Problem to be solved by the invention]
However, it is difficult for a user to find a real-time video that is being captured at a place the user wants to see.
An object of the embodiments of the invention is to provide a video distribution system, a video distribution server, and a video distribution method with which a user can easily find a real-time video being captured at a place the user wants to see.
[Means for solving the problem]
The video distribution system according to an embodiment includes a transmitting unit and a distribution unit. The transmitting unit transmits, to a terminal device, a map showing the positions of cameras mounted on mobile bodies. The distribution unit distributes, in real time, to the terminal device the video captured by a camera selected from the cameras shown on the map.
(effects of the invention)
According to the invention, a user can easily find a real-time video being captured at a place the user wants to see.
Drawings
Fig. 1 is a block diagram showing an example of the configuration of a distribution system according to an embodiment and a main part of constituent elements included in the distribution system.
Fig. 2 is a flowchart showing an example of processing performed by the processor of the imaging device in fig. 1.
Fig. 3 is a flowchart showing an example of processing performed by the processor of the distribution server in fig. 1.
Fig. 4 is a flowchart showing an example of processing performed by the processor of the terminal apparatus in fig. 1.
Fig. 5 is a diagram showing an example of the distribution position map.
Detailed Description
Hereinafter, a distribution system according to an embodiment will be described with reference to the drawings. In the drawings used in the following description, the scale of each part may be changed as appropriate and, for convenience of explanation, some elements may be omitted. In the drawings and the present specification, the same reference numerals denote the same components.
Fig. 1 is a block diagram showing an example of the configuration of the distribution system 1 according to the embodiment and the main parts of the constituent elements included in the distribution system 1. The distribution system 1 is a system that provides a video distribution service. The video distribution service here is a service that distributes video captured by the imaging device 100 to viewing users in real time. A video distributed in real time is hereinafter referred to as a "real-time video". Distribution of real-time video may also be referred to as live distribution, live broadcasting, or the like. A viewing user is a user who views real-time video among the users of the video distribution service. A user who distributes real-time video among the users of the video distribution service is referred to as a "distribution user". As an example, the distribution system 1 includes an imaging device 100, a distribution server 200, and a terminal device 300. Although one of each device is shown in fig. 1, the number of devices is not limited. The distribution system 1 typically includes a plurality of imaging devices 100 and a plurality of terminal devices 300. In the following description, however, one imaging device 100 and one terminal device 300 are described as representative examples.
The distribution system 1 is an example of a video distribution system.
The imaging device 100, the distribution server 200, and the terminal device 300 are connected to a network NW. The network NW is typically a communication network comprising the internet.
The imaging device 100 captures video and uploads it in real time. The imaging device 100 is a device mounted on a mobile body. The mobile body is, for example, a vehicle such as an automobile, a motorcycle, or a bicycle, or another means of transportation. In this case, the imaging device 100 may double as a camera used for automated driving or as a drive recorder. Alternatively, the mobile body may be a person such as a pedestrian, who carries the imaging device 100 by holding or wearing it. The mobile body may also be an unmanned vehicle such as an automated guided vehicle or a drone. As an example, the imaging device 100 includes a processor 101, a read-only memory (ROM) 102, a random-access memory (RAM) 103, an auxiliary storage device 104, a communication interface 105, an input device 106, an output device 107, an imaging unit 108, and a positioning unit 109. A bus 110 or the like connects these sections.
The processor 101 is the central part of a computer that performs the computation and control necessary for the operation of the imaging device 100, and executes various kinds of computation and processing. The processor 101 is, for example, a central processing unit (CPU), a micro processing unit (MPU), a system on a chip (SoC), a digital signal processor (DSP), a graphics processing unit (GPU), an application specific integrated circuit (ASIC), a programmable logic device (PLD), or a field-programmable gate array (FPGA). Alternatively, the processor 101 may be a combination of a plurality of these, or a combination of these and a hardware accelerator. The processor 101 controls each section based on programs such as firmware, system software, and application software stored in the ROM 102, the auxiliary storage device 104, or the like, so as to realize the various functions of the imaging device 100. The processor 101 executes the processing described later based on these programs. Part or all of the programs may be incorporated in the circuitry of the processor 101.
The ROM 102 and the RAM 103 are main storage devices of a computer having the processor 101 as a center.
The ROM 102 is a nonvolatile memory dedicated to readout of data. The ROM 102 stores, for example, firmware or the like among the above programs. The ROM 102 also stores data and the like used when the processor 101 performs various processes.
The RAM 103 is a memory for reading and writing data. The RAM 103 is used as a work area or the like for storing data temporarily used when the processor 101 performs various processes. RAM 103 is typically a volatile memory.
The auxiliary storage 104 is an auxiliary storage of a computer having the processor 101 as a center. The auxiliary storage 104 is, for example, an electrically erasable programmable read-only memory (EEPROM), a Hard Disk Drive (HDD), a flash memory, or the like. The auxiliary storage 104 stores, for example, system software, application software, and the like among the programs. The auxiliary storage device 104 stores data used when the processor 101 performs various processes, data generated by the processes in the processor 101, various setting values, and the like.
In addition, the auxiliary storage device 104 stores a user database (user DB) and a video database (video DB).
The user DB is a database that stores and manages, in association with a user identification (ID), various information, various settings, and the like related to each user of the video distribution service. The user ID is identification information uniquely assigned to each user. The user DB stores, for example, each user's user name, held points, and the like.
The video DB is a database storing and managing information related to videos.
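As a concrete illustration of these two databases, the following is a minimal sketch of the records they could hold; the class and field names are assumptions made for illustration and do not appear in the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class UserRecord:
    """One user DB entry, keyed by the user ID uniquely assigned to each user."""
    user_id: str
    user_name: str
    held_points: int = 0                            # point balance used for viewing/distribution
    settings: dict = field(default_factory=dict)    # viewing permission, viewing points, etc.

@dataclass
class VideoRecord:
    """One video DB entry, keyed by the video ID assigned when distribution starts."""
    video_id: str
    user_id: str                          # distribution user who is capturing the video
    position: Optional[tuple] = None      # latest (latitude, longitude), updated in real time
```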
The communication interface 105 is an interface for the imaging device 100 to communicate via the network NW or the like.
The input device 106 receives an operation performed by an operator of the imaging apparatus 100. The input device 106 is, for example, a keyboard, a keypad, a touchpad, a mouse, a controller, or the like. The input device 106 may be a device for inputting voice.
The output device 107 displays screens for notifying the operator of the imaging device 100 of various information. The output device 107 is, for example, a display such as a liquid crystal display or an organic electroluminescence (EL) display. A touch panel may be used as both the input device 106 and the output device 107. That is, the display of the touch panel may be used as the output device 107, and the touch sensor of the touch panel may be used as the input device 106.
The imaging unit 108 is a camera or the like that captures video. The imaging unit 108 is an example of a camera equipped with a mobile body.
The positioning unit 109 measures the position of the imaging device 100 using a global navigation satellite system (GNSS) such as the Global Positioning System (GPS). That is, the positioning unit 109 measures the position of the imaging unit 108.
The bus 110 includes a control bus, an address bus, a data bus, and the like, and transmits signals transmitted and received in each section of the imaging device 100.
The distribution server 200 shows the position of each imaging device 100 on a map, which allows the viewing user to select from the map a real-time video to watch. In addition, the distribution server 200 receives the video transmitted from the imaging device 100 and transmits it to the terminal device 300. As an example, the distribution server 200 includes a processor 201, a ROM 202, a RAM 203, an auxiliary storage device 204, and a communication interface 205. A bus 206 or the like connects these sections.
The distribution server 200 is an example of a video distribution server.
The processor 201 is the central part of a computer that performs the computation and control necessary for the operation of the distribution server 200, and executes various kinds of computation and processing. The processor 201 is, for example, a CPU, MPU, SoC, DSP, GPU, ASIC, PLD, or FPGA. Alternatively, the processor 201 may be a combination of a plurality of these, or a combination of these and a hardware accelerator. The processor 201 controls each section based on programs such as firmware, system software, and application software stored in the ROM 202, the auxiliary storage device 204, or the like, so as to realize the various functions of the distribution server 200. The processor 201 executes the processing described later based on these programs. Part or all of the programs may be incorporated in the circuitry of the processor 201.
The ROM 202 and the RAM 203 are main storage devices of a computer having the processor 201 as a center.
The ROM 202 is a nonvolatile memory dedicated to readout of data. The ROM 202 stores, for example, firmware or the like among the above programs. The ROM 202 also stores data and the like used when the processor 201 performs various processes.
The RAM 203 is a memory for reading and writing data. The RAM 203 is used as a work area or the like for storing data temporarily used when the processor 201 performs various processes. RAM 203 is typically a volatile memory.
The auxiliary storage 204 is an auxiliary storage of a computer having the processor 201 as a center. The auxiliary storage 204 is, for example, EEPROM, HDD, flash memory, or the like. The auxiliary storage 204 stores, for example, system software, application software, and the like among the programs. The auxiliary storage 204 stores data used when the processor 201 performs various processes, data generated by the processes in the processor 201, various setting values, and the like.
The communication interface 205 is an interface for the distribution server 200 to communicate via the network NW or the like.
Bus 206 includes a control bus, an address bus, a data bus, and the like, and transmits signals transmitted and received in each section of distribution server 200.
The terminal device 300 is a device that displays real-time video and is operated by the viewing user. The terminal device 300 is, for example, a smartphone, a tablet terminal, a personal computer (PC), or a television receiver. As an example, the terminal device 300 includes a processor 301, a ROM 302, a RAM 303, an auxiliary storage device 304, a communication interface 305, an input device 306, and an output device 307. A bus 308 or the like connects these sections.
The processor 301 is the central part of a computer that performs the computation and control necessary for the operation of the terminal device 300, and executes various kinds of computation and processing. The processor 301 is, for example, a CPU, MPU, SoC, DSP, GPU, ASIC, PLD, or FPGA. Alternatively, the processor 301 may be a combination of a plurality of these, or a combination of these and a hardware accelerator. The processor 301 controls each section based on programs such as firmware, system software, and application software stored in the ROM 302, the auxiliary storage device 304, or the like, so as to realize the various functions of the terminal device 300. The processor 301 executes the processing described later based on these programs. Part or all of the programs may be incorporated in the circuitry of the processor 301.
The ROM 302 and the RAM 303 are main storage devices of a computer having the processor 301 as a center.
The ROM 302 is a nonvolatile memory dedicated to readout of data. The ROM 302 stores, for example, firmware or the like among the above programs. The ROM 302 also stores data and the like used when the processor 301 performs various processes.
The RAM 303 is a memory for reading and writing data. The RAM 303 is used as a work area or the like for storing data temporarily used when the processor 301 performs various processes. RAM 303 is typically a volatile memory.
The auxiliary storage 304 is an auxiliary storage of a computer having the processor 301 as a center. The auxiliary storage 304 is, for example, EEPROM, HDD, flash memory, or the like. The auxiliary storage 304 stores, for example, system software, application software, and the like among the programs. The auxiliary storage device 304 stores data used when the processor 301 performs various processes, data generated by the processes in the processor 301, various setting values, and the like.
The communication interface 305 is an interface for the terminal apparatus 300 to communicate via the network NW or the like.
The input device 306 receives an operation performed by an operator of the terminal apparatus 300. The input device 306 is, for example, a keyboard, a keypad, a touchpad, a mouse, a controller, or the like. The input device 306 may be a device for inputting voice.
The output device 307 displays screens for notifying the operator of the terminal device 300 of various information. The output device 307 is, for example, a display such as a liquid crystal display or an organic EL display. A touch panel may be used as both the input device 306 and the output device 307. That is, the display of the touch panel may be used as the output device 307, and the touch sensor of the touch panel may be used as the input device 306.
The bus 308 includes a control bus, an address bus, a data bus, and the like, and transmits signals transmitted and received in each section of the terminal apparatus 300.
The operation of the distribution system 1 according to the embodiment will be described below with reference to fig. 2 to 4. The following description of the operation is an example of the processing, and various kinds of processing that can obtain the same result can be appropriately used. Fig. 2 is a flowchart showing an example of processing performed by the processor 101 of the imaging apparatus 100. The processor 101 executes the processing of fig. 2 based on a program stored in the ROM 102 or the secondary storage device 104 or the like, for example. Fig. 3 is a flowchart showing an example of processing performed by the processor 201 of the distribution server 200. The processor 201 executes the processing of fig. 3 based on a program stored in the ROM 202 or the auxiliary storage 204 or the like, for example. Fig. 4 is a flowchart showing an example of processing performed by the processor 301 of the terminal apparatus 300. The processor 301 executes the processing of fig. 4 based on a program stored in the ROM 302 or the secondary storage device 304 or the like, for example.
In step ST11 of fig. 2, the processor 101 of the imaging device 100 determines whether or not to change the setting related to the distribution of the real-time video. If it is determined that the setting is not to be changed, the processor 101 determines No in step ST11, and proceeds to step ST12.
In step ST12, the processor 101 determines whether or not to start distribution of real-time video. If the distribution of the real-time video is not started, the processor 101 determines no in step ST12, and proceeds to step ST13.
In step ST13, the processor 101 determines whether confirmation information is received by the communication interface 105. If the confirmation information is not received, the processor 101 determines no in step ST13, and proceeds to step ST14. The confirmation information will be described later.
In step ST14, the processor 101 determines whether to end the distribution of the real-time video. If the distribution of the real-time video is not finished, the processor 101 determines no in step ST14, and proceeds to step ST15.
In step ST15, the processor 101 determines whether to display various information about the distribution user. If the various information is not displayed, the processor 101 determines no in step ST15, and returns to step ST11. In this way, the processor 101 is in a standby state in which steps ST11 to ST15 are repeated until it is determined that the setting is to be changed, the distribution of the real-time video is started, the confirmation information is received, the distribution of the real-time video is ended, or various information is displayed.
For example, if an operation instructing a setting change is performed using the input device 106, the processor 101 determines to change the setting. If it is determined that the setting is to be changed while in the standby state of steps ST11 to ST15, the processor 101 determines Yes in step ST11 and proceeds to step ST16.
In step ST16, the processor 101 performs processing for changing the settings. For example, the processor 101 displays a setting screen for changing the settings on the output device 107. The operator of the imaging device 100 inputs the setting changes by, for example, operating the input device 106 while viewing the setting screen. The settings include, for example, a viewing permission setting, a viewing point setting, a mobile body setting, and a distribution prohibition setting. The viewing permission setting indicates whether the permission of the distribution user is required for viewing the real-time video. The viewing point setting indicates whether points are required for viewing the real-time video and, if so, how many points are required. The number of points required for viewing is, for example, a number of points per set time, a number of points per viewing, or a number of points per predetermined period. Money may be used instead of points. The mobile body setting indicates what kind of mobile body the imaging device 100 is mounted on. The distribution prohibition setting specifies places where video is not to be distributed. While the position of the imaging device 100 is within such a place, distribution is automatically stopped. For example, if the area around the distribution user's home is set as such a place, distribution is automatically stopped while the imaging device 100 is in that area.
In the present specification and claims, the term "viewing" does not necessarily mean that the user is actually watching the video; the video is regarded as being viewed when the terminal device 300 receives it.
After the setting changes are input, the processor 101 generates setting information. The setting information includes the user ID of the distribution user and the setting changes, and indicates that the settings are to be changed accordingly. After generating the setting information, the processor 101 instructs the communication interface 105 to transmit the setting information to the distribution server 200. Upon receiving this instruction, the communication interface 105 transmits the setting information to the distribution server 200, where it is received by the communication interface 205. After the processing of step ST16, the processor 101 returns to step ST11.
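For illustration only, the setting information generated in step ST16 might be serialized along the following lines; every key name here is an assumption, since the patent does not specify a wire format.

```python
# Hypothetical JSON-style payload for the setting information (step ST16).
setting_information = {
    "user_id": "dist-0001",                        # distribution user
    "setting_changes": {
        "viewing_permission_required": True,        # viewing permission setting
        "viewing_points": {"points": 5, "per_minutes": 10},   # viewing point setting
        "moving_body_type": "car",                  # mobile body setting
        "no_distribution_areas": [                   # distribution prohibition setting
            {"lat": 35.6895, "lon": 139.6917, "radius_m": 300},  # e.g. around the user's home
        ],
    },
}
```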
On the other hand, in step ST31 of fig. 3, the processor 201 of the distribution server 200 determines whether or not the setting information is received by the communication interface 205. If the setting information is not received, the processor 201 determines no in step ST31, and proceeds to step ST32.
In step ST32, the processor 201 determines whether or not the start information is received by the communication interface 205. If the start information is not received, the processor 201 determines no in step ST32, and proceeds to step ST33.
In step ST33, the processor 201 determines whether a map request is received by the communication interface 205. If the map request is not received, the processor 201 determines no in step ST33, and proceeds to step ST34.
In step ST34, the processor 201 determines whether or not the viewing request is received by the communication interface 205. If the viewing request is not received, the processor 201 determines no in step ST34, and proceeds to step ST35.
In step ST35, the processor 201 determines whether to stop transmitting the real-time video to the terminal apparatus 300. If it is determined that the transmission of the real-time video to the terminal apparatus 300 is not stopped, the processor 201 determines no in step ST35, and proceeds to step ST36.
In step ST36, the processor 201 determines whether end information is received by the communication interface 205. If the end information is not received, the processor 201 determines no in step ST36, and proceeds to step ST37.
In step ST37, the processor 201 determines whether or not an information request is received by the communication interface 205. If the information request is not received, the processor 201 determines no in step ST37, and returns to step ST31. In this way, the processor 201 is in a standby state in which steps ST31 to ST37 are repeated until it is determined that the setting information, the start information, the map request, the viewing request, the end information, or the information request is received, or the transmission of the real-time video to the terminal apparatus 300 is stopped. Further, the start information, map request, viewing request, end information, and information request will be described below.
If the setting information is received while in the standby state of step ST31 to step ST37, the processor 201 determines yes in step ST31 and proceeds to step ST38.
In step ST38, the processor 201 stores the setting contents in the user DB based on the setting change contents in the received setting information. Further, the processor 201 stores the setting content in the user DB in association with the user ID in the setting information. After the processing of step ST38, the processor 201 returns to step ST31.
On the other hand, when the distribution user wants to start distributing real-time video, the distribution user performs, for example, an operation instructing the start of distribution using the input device 106 of the imaging device 100. When this operation is performed, the processor 101 determines to start the distribution of real-time video. If it is determined that the distribution of the real-time video is to start while in the standby state of steps ST11 to ST15 in fig. 2, the processor 101 determines Yes in step ST12 and proceeds to step ST17.
In step ST17, the processor 101 performs processing for starting the distribution of real-time video. That is, if video is not already being captured, the processor 101 controls the imaging unit 108 to start capturing video. The processor 101 then instructs the communication interface 105 to transmit start information to the distribution server 200. The start information includes the user ID of the distribution user and indicates that distribution of real-time video is starting. Upon receiving this instruction, the communication interface 105 transmits the start information to the distribution server 200, where it is received by the communication interface 205. Further, the processor 101 instructs the communication interface 105 to start transmitting, in real time, the video captured by the imaging unit 108 and position information indicating the current position of the imaging device 100 measured by the positioning unit 109 to the distribution server 200. Upon receiving this instruction, the communication interface 105 starts transmitting the video and the position information to the distribution server 200 in real time, where they are received by the communication interface 205. In addition, the processor 101 instructs the communication interface 105 to transmit navigation information to the distribution server 200 as needed. The navigation information is information obtained from a navigation system such as a car navigation system and includes, for example, the destination of the mobile body and the route to the destination. Upon receiving this instruction, the communication interface 105 transmits the navigation information to the distribution server 200, where it is received by the communication interface 205. After the processing of step ST17, the processor 101 returns to step ST11.
However, when the position of the imaging device 100 enters a place set in the distribution prohibition setting during distribution of real-time video, the processor 101 temporarily stops the distribution. That is, the processor 101 instructs the communication interface 105 to stop transmitting the real-time video and the position information to the distribution server 200. When the position of the imaging device 100 leaves the place set in the distribution prohibition setting, the processor 101 instructs the communication interface 105 to resume the real-time transmission of the video and the position information to the distribution server 200. The communication interface 105 stops and resumes transmission according to these instructions.
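The sketch below combines the start-of-distribution behavior of step ST17 with the distribution prohibition check just described. The `camera`, `gnss`, and `uplink` objects, the helper names, and the planar distance approximation are all assumptions made for illustration, not part of the patent.

```python
import math
import time

def distance_m(p, q):
    """Approximate distance in meters between two (lat, lon) points (adequate for small areas)."""
    dy = (p[0] - q[0]) * 111_320.0
    dx = (p[1] - q[1]) * 111_320.0 * math.cos(math.radians(p[0]))
    return math.hypot(dx, dy)

def in_prohibited_area(position, areas):
    """True while the imaging device is inside any place set in the distribution prohibition setting."""
    return any(distance_m(position, (a["lat"], a["lon"])) <= a["radius_m"] for a in areas)

def distribute(camera, gnss, uplink, no_distribution_areas):
    """Send captured frames and the current position in real time, pausing inside prohibited areas."""
    uplink.send_start_information()                    # step ST17: notify the distribution server
    while camera.is_distributing():
        position = gnss.current_position()             # measured by the positioning unit 109
        if in_prohibited_area(position, no_distribution_areas):
            time.sleep(1.0)                            # distribution automatically paused
            continue
        uplink.send(frame=camera.read_frame(), position=position)
```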
If the start information is received while in the standby state of step ST31 to step ST37, the processor 201 determines yes in step ST32 and proceeds to step ST39.
In step ST39, the processor 201 controls the communication interface 205 to start receiving the video and the position information that the imaging device 100 transmits following the start information. The processor 201 assigns a video ID to the video; the video ID is identification information unique to each video. The processor 201 stores the user ID included in the received start information, the video ID, and the received position information in association with one another in the video DB, and updates the position information in real time. After the processing of step ST39, the processor 201 returns to step ST31.
On the other hand, when the viewing user wants to view real-time video using the video distribution service, the viewing user starts application software for the video distribution service on the terminal device 300. When this software starts, the processor 301 of the terminal device 300 starts the processing shown in fig. 4.
In step ST51 of fig. 4, the processor 301 of the terminal apparatus 300 performs a process of accepting specification of a search condition for searching for a shooting position of a real-time video. The viewing user inputs a desired photographing position as a search condition, for example, using the input device 306. The location is input by, for example, an address, a place name, a name of a building, latitude and longitude, or the like. Alternatively, the viewing user may input the position by designating the position on the map displayed on the output device 307.
In step ST52, the processor 301 generates a map request. The map request includes the user ID and the search condition input in step ST51, and requests transmission of a distribution position map including the position specified by the search condition. The distribution position map is a map on which the capturing positions of the real-time videos being distributed, that is, the positions of the imaging devices 100 that are distributing real-time video, are shown. The map request may also include information specifying the range of the map. After generating the map request, the processor 301 instructs the communication interface 305 to transmit the map request to the distribution server 200. Upon receiving this instruction, the communication interface 305 transmits the map request to the distribution server 200, where it is received by the communication interface 205.
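As an illustration of the map request of step ST52, a payload along the following lines would carry the items listed above; the key names are assumptions, not a format defined by the patent.

```python
# Hypothetical map request payload (step ST52).
map_request = {
    "user_id": "view-0042",                              # viewing user
    "search_condition": {"place_name": "Tokyo Station"}, # address, place name, building, lat/lon, ...
    "map_range": None,   # optional; when omitted the server uses a predetermined scale
}
```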
On the other hand, if a map request is received while in the standby state of steps ST31 to ST37 of fig. 3, the processor 201 of the distribution server 200 determines yes in step ST33 and proceeds to step ST40.
In step ST40, the processor 201 generates a distribution position map including the position specified by the search condition in the map request.
Fig. 5 is a diagram showing an example of the distribution position map M1. The distribution position map M1 is formed by displaying an area AR1, an area AR2, an icon I1, an icon I2, an arrow A1, and the like on the map M2, for example.
The map M2 is, for example, a map including the position specified by the search condition. When the map request does not specify a map range, the map M2 is a map of a predetermined scale centered on that position. When a map range is specified in the map request, the range of the map M2 is the specified range. In this case, if the specified range does not contain the position specified by the search condition, the map M2 does not include that position.
The area AR1 is an area in which the position of the map M2, the position specified by the search condition, or the like is represented by a character string or the like.
The icon I1 is an icon indicating, on the map M2, the position of an imaging device 100 that is distributing real-time video within the range shown in the map M2. The processor 201 determines which imaging devices 100 are distributing real-time video within the map M2 using, for example, the position information transmitted by the imaging devices 100. The icon I1 is a button: by touching or clicking an icon I1, the icon is placed in a selected state, and the video captured by the imaging device 100 indicated by that icon can be viewed. Each icon I1 is associated with the video ID of the corresponding video. The icons I1 include two types, icon I1-1 and icon I1-2.
The icon I1-1 is the icon I1 used when the mobile body on which the imaging device 100 is mounted is an automobile. The icon I1-2 is the icon I1 used when the mobile body is a pedestrian. When the mobile body is of another type, such as a motorcycle, an icon I1 corresponding to that type of mobile body is used.
The icon I2 is an icon indicating, on the map M2, the position specified by the search condition. The position indicated by the icon I2 is an example of a position specified using the terminal device 300.
The arrow A1 is an arrow indicating the traveling direction of the mobile body indicated by the icon I1. The processor 201 determines the traveling direction of the mobile body using the navigation information transmitted by the imaging device 100, or alternatively using the position information transmitted by the imaging device 100.
For example, the area AR2 is displayed in the vicinity of the arrow A1 of a mobile body that is expected to pass through the position indicated by the icon I2. In FIG. 5, the area AR2 is displayed near the arrows A1 of icons I1-1a, I1-2a, and I1-2b. The area AR2 shows the time it will take for the mobile body to reach the position indicated by the icon I2. The processor 201 determines this time using the navigation information transmitted by the imaging device 100, or alternatively using the position information transmitted by the imaging device 100.
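A minimal sketch of how the distribution server might assemble the elements of the distribution position map M1 described above (icons I1, arrows A1, areas AR2). The `video_db`, `user_db`, `map_bounds`, and `estimate_eta_minutes` interfaces are assumptions made for illustration.

```python
def build_map_elements(video_db, user_db, map_bounds, target_position, estimate_eta_minutes):
    """Collect, for each camera distributing inside the map range, the data needed to draw
    its icon I1, its direction arrow A1, and (when applicable) the arrival-time area AR2."""
    elements = []
    for video in video_db.active_videos():
        if video.position is None or not map_bounds.contains(video.position):
            continue                                   # only cameras within the map M2
        moving_body = user_db.get(video.user_id).settings.get("moving_body_type", "car")
        eta = estimate_eta_minutes(video, target_position)   # from navigation or position history
        elements.append({
            "video_id": video.video_id,
            "position": video.position,
            "icon_type": "I1-1" if moving_body == "car" else "I1-2",
            "heading_deg": getattr(video, "heading_deg", None),  # traveling direction (arrow A1)
            "eta_to_target_min": eta,                  # shown in area AR2 when not None
        })
    return elements
```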
In step ST41, the processor 201 generates map information. The map information includes the distribution position map M1 generated in step ST 40. The map information is information indicating that the distribution position map M1 is displayed. After generating the map information, the processor 201 instructs the communication interface 205 to transmit the map information to the terminal apparatus 300. Upon receiving the instruction for transmission, the communication interface 205 transmits the map information to the terminal apparatus 300. The map information transmitted is received by the communication interface 305 of the terminal apparatus 300. After the processing of step ST41, the processor 201 returns to step ST31.
The distribution position map is an example of a map image in which the positions of cameras mounted on mobile bodies are shown on the map M2. By performing the processing of step ST41, the processor 201 thus functions, in cooperation with the communication interface 205, as an example of a transmitting unit that transmits the map image to the terminal device.
On the other hand, in step ST53 of fig. 4, the processor 301 of the terminal apparatus 300 waits for the reception of the map information by the communication interface 305. If the map information is received, the processor 301 determines yes in step ST53, and proceeds to step ST54.
In step ST54, the processor 301 instructs the output device 307 to display the distribution position map M1. Upon receiving the instruction for display, the output device 307 displays the distribution position map M1.
The processor 301 performs the processing of step ST54, thereby functioning as an example of a display unit for displaying the distribution position map M1.
The distribution position map M1 may be a map in which the range of the map M2 can be changed. If an operation to change the range of the map M2 is performed, the processor 301 makes a map request corresponding to the changed range, for example. Then, the processor 301 displays the distribution position map M1 with the changed range based on the map information transmitted as a response to the map request.
The viewing user operates the icon I1 corresponding to the real-time video he or she wants to view, placing it in the selected state.
In step ST55, the processor 301 waits for any icon I1 on the distribution position map M1 to be selected. If any icon I1 on the distribution position map M1 is selected, the processor 301 determines YES in step ST55, and proceeds to step ST56.
In step ST56, the processor 301 generates a viewing request. The viewing request includes the user ID of the viewing user and the video ID corresponding to the selected icon I1, and requests transmission of the real-time video identified by that video ID. After generating the viewing request, the processor 301 instructs the communication interface 305 to transmit the viewing request to the distribution server 200. Upon receiving this instruction, the communication interface 305 transmits the viewing request to the distribution server 200, where it is received by the communication interface 205.
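On the terminal side, the viewing request of step ST56 could be assembled and sent as in the sketch below; the `session` and `server` objects are illustrative assumptions.

```python
def on_icon_selected(icon, session, server):
    """Step ST56: request real-time distribution of the video associated with the selected icon I1."""
    viewing_request = {
        "user_id": session.user_id,       # viewing user
        "video_id": icon["video_id"],     # video ID associated with the selected icon
    }
    server.send("viewing_request", viewing_request)
```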
On the other hand, if the viewing request is received while in the standby state of step ST31 to step ST37, the processor 201 of the distribution server 200 determines yes in step ST34, and proceeds to step ST42.
In step ST42, the processor 201 determines whether permission is required for viewing the real-time video identified by the video ID in the viewing request. The processor 201 refers to the viewing permission setting in the user DB, which is included in the settings associated with the user ID of the distribution user distributing that real-time video, and determines that permission is required when that distribution user has set that viewing requires permission. If permission is required for viewing the real-time video, the processor 201 determines Yes in step ST42, and proceeds to step ST43.
In step ST43, the processor 201 performs processing for confirming with the distribution user whether viewing is permitted. That is, the processor 201 generates confirmation information. The confirmation information includes, for example, the user ID and user name of the viewing user, and asks whether that viewing user is allowed to view the real-time video. After generating the confirmation information, the processor 201 instructs the communication interface 205 to transmit the confirmation information to the imaging device 100. The destination imaging device 100 is the imaging device 100 that is capturing the video identified by the video ID corresponding to the selected icon I1. Upon receiving this instruction, the communication interface 205 transmits the confirmation information to the imaging device 100, where it is received by the communication interface 105.
If the confirmation information is received while in the standby state of steps ST11 to ST15, the processor 101 determines Yes in step ST13 and proceeds to step ST18.
In step ST18, the processor 101 generates an image corresponding to the confirmation screen. The processor 101 then instructs the output device 107 to display the generated image. Upon receiving the instruction for display, the output device 107 displays a confirmation screen.
The confirmation screen includes, for example, a user name, a user ID, an allow button, and a reject button. The user name and user ID in the confirmation screen are the user name and user ID in the confirmation information.
The permission button is a button that the distribution user operates to permit the viewing user specified by the user name and user ID on the confirmation screen to view the video.
The reject button is a button that the distribution user operates to refuse that viewing user permission to view the video.
In step ST19, the processor 101 determines whether an operation permitting viewing of the real-time video, such as operating the permission button, has been performed. If such an operation has been performed, the processor 101 determines Yes in step ST19, and proceeds to step ST20.
In step ST20, the processor 101 instructs the communication interface 105 to transmit the permission information to the distribution server 200. The permission information is information indicating that viewing of real-time video is permitted. Upon receiving the transmitted instruction, the communication interface 105 transmits the permission information to the distribution server 200. The transmitted permission information is received by the communication interface 205 of the distribution server 200. After the processing of step ST20, the processor 101 returns to step ST11.
On the other hand, when an operation refusing viewing of the real-time video, such as operating the reject button, is performed, the processor 101 determines that the operation permitting viewing has not been performed. When a predetermined time elapses without either an operation permitting viewing or an operation refusing viewing, the processor 101 likewise determines, as a result of the timeout, that the operation permitting viewing has not been performed. If it is determined that the operation permitting viewing has not been performed, the processor 101 determines No in step ST19, and proceeds to step ST21.
In step ST21, the processor 101 instructs the communication interface 105 to transmit rejection information to the distribution server 200. Upon receiving this instruction, the communication interface 105 transmits the rejection information to the distribution server 200, where it is received by the communication interface 205. After the processing of step ST21, the processor 101 returns to step ST11.
On the other hand, in step ST43 of fig. 3, the processor 201 of the distribution server 200 waits for reception of the permission information or the rejection information by the communication interface 205. If the permission information or the rejection information is received, the processor 201 determines yes in step ST43, and proceeds to step ST44.
As described above, by performing the processing in step ST43, the processor 201 functions as an example of a confirmation unit for confirming whether or not the viewing user is permitted to view the video when the viewing user is about to view the real-time video.
In step ST44, the processor 201 determines whether or not viewing of real-time video is permitted. If the rejection information is received in step ST43, the processor 201 determines that viewing of real-time video is not permitted. If it is determined that the viewing of real-time video is not permitted, the processor 201 determines no in step ST44, and returns to step ST31.
Further, the processor 201 instructs the communication interface 205 to transmit a rejection notification indicating that viewing of real-time video is not permitted to the terminal apparatus 300 before returning from step ST44 to step ST31. Upon receiving the instruction to transmit, the communication interface 205 transmits the rejection notification to the terminal apparatus 300. The rejection notification is received by the communication interface 305 of the terminal apparatus 300.
In contrast, if the permission information is received in step ST43, the processor 201 determines that viewing of real-time video is permitted. If it is determined that the viewing of real-time video is permitted, the processor 201 of the distribution server 200 determines yes in step ST44, and proceeds to step ST45.
In addition, if viewing of real-time video is not required to be permitted, the processor 201 determines no in step ST42, and proceeds to step ST45.
In step ST45, the processor 201 instructs the communication interface 205 to start transmitting the real-time video identified by the video ID in the viewing request to the terminal device 300. Upon receiving this instruction, the communication interface 205 starts transmitting the real-time video to the terminal device 300. The distribution server 200 thus relays the real-time video transmitted by the imaging device 100 to the terminal device 300. The communication interface 305 of the terminal device 300 starts receiving the transmitted real-time video.
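A sketch of the server-side branch of steps ST42 to ST45, under the illustrative assumptions already used above; `ask_device`, `notify`, and `start_relay` are hypothetical helpers, not an API defined by the patent.

```python
def handle_viewing_request(request, user_db, video_db, server):
    """Steps ST42-ST45: check the viewing permission setting, confirm with the distribution
    user if required, then start relaying the real-time video or notify the rejection."""
    video = video_db.get(request["video_id"])
    distributor = user_db.get(video.user_id)

    # ST42: is the distribution user's permission required for viewing?
    if distributor.settings.get("viewing_permission_required", False):
        viewer = user_db.get(request["user_id"])
        # ST43: send confirmation information to the imaging device and wait for the answer
        answer = server.ask_device(video.user_id,
                                   {"user_id": viewer.user_id, "user_name": viewer.user_name})
        # ST44: branch on permission or rejection information (timeout counts as rejection)
        if answer != "permit":
            server.notify(request["user_id"], "viewing_rejected")
            return
    # ST45: start relaying the real-time video to the viewing user's terminal device
    server.start_relay(video_id=video.video_id, to_user=request["user_id"])
```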
The processor 201 subtracts the points required for viewing the real-time video, as set in the viewing point setting, from the viewing user's held points. For example, when a number of points is set per set time, the processor 201 subtracts that number of points from the viewing user's held points each time the set time elapses. When a number of points is set per viewing, the processor 201 subtracts that number of points from the viewing user's held points once for the viewing.
The processor 201 adds the points subtracted from the viewing user's held points to the held points of the distribution user, that is, the distribution user who is distributing the real-time video being watched by the viewing user (the real-time video identified by the video ID in the viewing request). The distribution user can exchange the held points for cash, merchandise, or the like.
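The point handling described in the last two paragraphs amounts to moving points from the viewing user to the distribution user. A minimal sketch, using the illustrative UserRecord fields from earlier:

```python
def transfer_viewing_points(viewer, distributor, points):
    """Subtract the viewing points from the viewing user and add the same amount to the
    distribution user. Called once per viewing, or once each time the set time elapses,
    depending on the viewing point setting."""
    if viewer.held_points < points:
        raise ValueError("viewing user does not hold enough points")
    viewer.held_points -= points        # decrease the viewing user's held points
    distributor.held_points += points   # increase the distribution user's held points by that amount
```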
After the processing of step ST45, the processor 201 returns to step ST31.
By adding the points or money subtracted from the viewing user's held points or money to the distribution user's held points or money, the processor 201 functions as an example of a point unit that decreases the points or money held by the viewing user and increases the points or money held by the distribution user by the decreased amount.
By performing the processing of steps ST34 and ST45 in cooperation with the communication interface 205, the processor 201 functions as an example of a distribution unit that distributes, in real time, to the terminal device 300 the video captured by the camera selected from the cameras shown on the map.
On the other hand, in step ST57 of fig. 4, the processor 301 of the terminal apparatus 300 determines whether or not to start display of real-time video. For example, when the rejection notification is received, the processor 301 determines that the display of the real-time video is not started. If it is determined that the display of the real-time video is not started, the processor 301 determines no in step ST57, and proceeds to step ST58.
In step ST58, the processor 301 displays an image indicating that viewing of real-time video is not permitted on the output device 307. After the processing of step ST58, the processor 301 returns to step ST54.
In contrast, for example, when the reception of the real-time video is started, the processor 301 determines to start the display of the real-time video. If the display of the real-time video is started, the processor 301 determines yes in step ST57, and proceeds to step ST59.
In step ST59, the processor 301 controls the output device 307 to start displaying (playing) the real-time video received by the communication interface 305. The processor 301 plays the real-time video using, for example, live streaming.
In step ST60, the processor 301 determines whether to stop playing of the real-time video being played. For example, when the stop button is operated by the viewing user, the processor 301 determines to stop the playback of the real-time video. If the playing of the live video being played is not stopped, the processor 301 determines no in step ST60, and repeats the processing of step ST 60. In contrast, if the playing of the live video being played is stopped, the processor 301 determines yes in step ST60, and proceeds to step ST61.
In step ST61, the processor 301 controls the output device 307 to stop the play of the real-time video being played. In addition, the processor 301 controls the communication interface 305 to stop the reception of the real-time video.
In step ST62, the processor 301 instructs the communication interface 305 to transmit stop information to the distribution server 200. The stop information includes the user ID of the viewing user and the video ID of the real-time video whose playback is stopped, and indicates that playback and viewing of the real-time video have stopped. Upon receiving this instruction, the communication interface 305 transmits the stop information to the distribution server 200, where it is received by the communication interface 205. After the processing of step ST62, the processor 301 returns to step ST54.
On the other hand, if the stop information is received, the processor 201 of the distribution server 200 determines to stop the transmission of the real-time video. In addition, when the transmission of the real-time video becomes impossible, for example, when the communication with the terminal apparatus 300 is disconnected, the processor 201 also determines to stop the transmission of the real-time video.
If it is determined that the transmission of the real-time video to the terminal apparatus 300 is stopped while in the standby state of step ST31 to step ST37, the processor 201 determines yes in step ST35 and proceeds to step ST46.
In step ST46, the processor 201 controls the communication interface 205 to stop transmitting the real-time video to the terminal device 300. This terminal device 300 is, for example, the terminal device 300 used by the viewing user specified by the user ID in the stop information, and the real-time video is the one identified by the video ID in the stop information. Alternatively, the terminal device 300 is a terminal device 300 to which real-time video can no longer be transmitted, and the real-time video is the one that can no longer be transmitted.
Further, for the viewing user to whom transmission of the real-time video has been stopped, the processor 201 stops the processing of subtracting the points set in the viewing point setting from that user's held points each time the set time elapses.
After the processing of step ST46, the processor 201 returns to step ST31.
When the distribution user is to end the distribution of the real-time video, the distribution user performs an operation of instructing to end the distribution of the real-time video using the input device 106 of the photographing apparatus 100. For example, when this operation is performed, the processor 101 determines to end the distribution of the real-time video. If it is determined that the distribution of the real-time video is completed while in the standby state of step ST11 to step ST15, the processor 101 determines yes in step ST14 and proceeds to step ST22.
In step ST22, the processor 101 performs processing for ending the distribution of the real-time video. That is, the processor 101 instructs the communication interface 105 to end the real-time transmission of the video and the position information to the distribution server 200. Upon receiving this instruction, the communication interface 105 ends the transmission. Further, the processor 101 instructs the communication interface 105 to transmit end information to the distribution server 200. The end information includes, for example, the user ID of the distribution user and the video ID of the real-time video whose distribution is ending, and indicates that distribution of the real-time video has ended. Upon receiving this instruction, the communication interface 105 transmits the end information to the distribution server 200, where it is received by the communication interface 205.
If the end information is received while in the standby state of step ST31 to step ST37, the processor 201 determines yes in step ST36 and proceeds to step ST47.
In step ST47, the processor 201 controls the communication interface 205 to end reception of the real-time video and the position information transmitted from the imaging device 100. Further, the processor 201 controls the communication interface 205 to stop transmission to each terminal device 300 that is receiving the real-time video. In addition, for each viewing user who was viewing the real-time video, the processor 201 stops the processing of subtracting the points set in the viewing point setting from that user's held points each time the set time elapses. After the processing of step ST47, the processor 201 returns to step ST31.
When the distribution user wants to check various information, such as his or her own number of held points, the user performs an operation instructing the display of the various information, for example using the input device 106 of the photographing device 100. When this operation is performed, the processor 101 determines that the various information is to be displayed.
If this determination is made while in the standby state of steps ST11 to ST15 of fig. 2, the processor 101 determines yes in step ST15 and proceeds to step ST23.
In step ST23, the processor 101 performs processing for acquiring the various information. That is, the processor 101 generates an information request. The information request is information requesting transmission of various information about the distribution user, and includes, for example, the user ID of the distribution user. After generating the information request, the processor 101 instructs the communication interface 105 to transmit it to the distribution server 200. Upon receiving the transmission instruction, the communication interface 105 transmits the information request to the distribution server 200, where it is received by the communication interface 205.
On the other hand, if the information request is received while in the standby state of steps ST31 to ST37 of fig. 3, the processor 201 of the distribution server 200 determines yes in step ST37 and proceeds to step ST48.
In step ST48, the processor 201 acquires the various information requested by the information request from the user DB or the like. The processor 201 then instructs the communication interface 205 to transmit the various information to the photographing device 100. Upon receiving the transmission instruction, the communication interface 205 transmits the various information to the photographing device 100, where it is received by the communication interface 105. After this processing, the processor 201 returns to step ST31.
Meanwhile, the processor 101 of the photographing device 100 waits for the various information to be received by the communication interface 105, and acquires the various information once it has been received.
In step ST24, the processor 101 displays the various information acquired in step ST23 on the output device 107. After the processing of step ST24, the processor 101 returns to step ST11.
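The information request and the reply that follows form a simple request/response exchange between the device and the server. The sketch below assumes a dictionary-backed user DB and ad hoc message fields; neither is specified in the patent.

```python
# Illustrative information request/response, under the assumptions above.
user_db = {
    "user-123": {"held_points": 340, "videos_distributed": 12},
}

def build_information_request(user_id: str) -> dict:
    # Device side (step ST23): request various information about this user.
    return {"type": "info_request", "user_id": user_id}

def handle_information_request(request: dict) -> dict:
    # Server side: look up the requested information in the user DB.
    record = user_db.get(request["user_id"], {})
    return {"type": "info_response", "user_id": request["user_id"], **record}

response = handle_information_request(build_information_request("user-123"))
print(response)  # the device would show this on the output device 107 (step ST24)
```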
The distribution system 1 of the embodiment displays the position of the photographing device 100 on a map. Therefore, the viewing user can easily find a photographing device 100 that is shooting at a location he or she wants to see.
In addition, the distribution system 1 of the embodiment distributes, in real time, video captured by the photographing device 100 mounted on a moving body. Thus, the viewing user can learn the current state of various locations.
In addition, in the distribution system 1 of the embodiment, a distribution user whose real-time video is viewed can obtain points or money. Thus, the distribution user can earn revenue by distributing real-time video.
In addition, in the distribution system 1 of the embodiment, a viewing user who views the real-time video pays points or money, and the points paid by the viewing user are then paid to the distribution user. In this way, the distribution system 1 of the embodiment can pay points to the distribution user.
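For illustration only, this payment flow amounts to moving an amount from the viewing user's balance to the distribution user's balance. The balance dictionary, user identifiers, and amounts in the sketch below are assumptions.

```python
# Simplified sketch of the point flow described above: the amount
# subtracted from the viewing user is credited to the distribution user.
held_points = {"viewer-1": 200, "distributor-1": 50}

def transfer_points(viewer_id: str, distributor_id: str, amount: int) -> None:
    held_points[viewer_id] -= amount
    held_points[distributor_id] += amount

transfer_points("viewer-1", "distributor-1", 10)
print(held_points)  # {'viewer-1': 190, 'distributor-1': 60}
```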
In addition, in the distribution system 1 of the embodiment, depending on the settings, the permission of the distribution user is required before a viewing user can view the real-time video. Thus, the distribution user can prevent unwanted viewing users from watching the real-time video, and can prevent the real-time video from being watched at times when he or she does not want it to be seen.
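The confirmation could be modeled as a yes/no question relayed to the distribution user; the callback and prompt text below are hypothetical and only sketch the idea of the confirmation unit, not the patent's concrete implementation.

```python
# Sketch of a viewing-permission check, under the assumptions stated above.
def confirm_viewing(ask_distribution_user, viewer_id: str) -> bool:
    """Ask the distribution user whether viewer_id may watch the real-time video."""
    answer = ask_distribution_user(
        f"Allow {viewer_id} to view the real-time video? (y/n)")
    return answer.strip().lower() == "y"

# Example with a stand-in callback that always grants permission.
allowed = confirm_viewing(lambda prompt: "y", "viewer-1")
print(allowed)  # True
```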
The distribution system 1 of the embodiment displays, on the distribution position map M1, an arrow A1 indicating the traveling direction of the moving body. Thus, the viewing user can tell which location the photographing device 100 will shoot next.
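One plausible way to obtain the direction for arrow A1 is to compute the bearing between two consecutive position fixes reported by the position measuring unit. The great-circle bearing formula below is an assumption for illustration; the patent does not specify how the direction is derived.

```python
# Bearing between two consecutive position fixes (0 deg = north).
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(x, y)) + 360) % 360

# The map could rotate the arrow icon by this angle.
print(round(bearing_deg(35.6812, 139.7671, 35.6896, 139.7006), 1))
```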
The distribution system 1 of the embodiment displays, on the distribution position map M1, the expected time for the moving body to reach the position of the icon I2. Thus, the viewing user can easily find a real-time video that will show a desired location.
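The expected time could be estimated, for example, as the remaining distance divided by the moving body's speed. The haversine distance and the fixed speed in the sketch below are assumptions; the patent leaves the estimation method open.

```python
# Estimated time of arrival at the designated icon position.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def eta_minutes(cam_pos, icon_pos, speed_kmh):
    distance = haversine_km(*cam_pos, *icon_pos)
    return 60.0 * distance / speed_kmh

print(round(eta_minutes((35.6812, 139.7671), (35.6586, 139.7454), 30.0), 1))
```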
The above embodiment can be modified as follows.
In the above embodiment, the photographing device 100 starts transmitting the real-time video even when no one is viewing it. However, the photographing device 100 may instead start transmission when the first viewing user begins watching. For example, the processor 201 of the distribution server 200 instructs the photographing device 100 to start transmission of the real-time video before the processing of step ST45. In response to this instruction, the photographing device 100 starts transmitting the real-time video, and the distribution server 200 starts receiving it.
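A minimal sketch of this modification, assuming hypothetical start/stop callbacks in place of the real communication-interface calls, could track the set of current viewers and trigger transmission only around the first and last of them.

```python
# Lazy start/stop of transmission driven by the viewer count.
class LazyStream:
    def __init__(self, start_transmission, stop_transmission):
        self._start = start_transmission
        self._stop = stop_transmission
        self._viewers = set()

    def add_viewer(self, viewer_id):
        if not self._viewers:
            self._start()          # first viewer: ask the device to transmit
        self._viewers.add(viewer_id)

    def remove_viewer(self, viewer_id):
        self._viewers.discard(viewer_id)
        if not self._viewers:
            self._stop()           # no viewers left: stop transmission

stream = LazyStream(lambda: print("start"), lambda: print("stop"))
stream.add_viewer("viewer-1")
stream.remove_viewer("viewer-1")
```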
In the above embodiment, the distribution system 1 relays the real-time video from the photographing device 100 to the terminal device 300 via the distribution server 200. However, the distribution system 1 may also transmit the real-time video from the photographing device 100 to the terminal device 300 using peer-to-peer (P2P) communication or the like, without relaying it through the distribution server 200.
In the above embodiment, the points obtained by the distribution user are the points subtracted from the viewing user's held points. However, the source of the points obtained by the distribution user is not limited thereto. For example, the distribution user may obtain points paid by the operator or the like of the video distribution service. Such points are paid, for example, out of a portion of the advertising revenue obtained by the operator of the video distribution service. The distribution system determines the points the operator pays to the distribution user based, for example, on the number of viewers of the real-time video. The processor 201 performs the process of paying such points to the distribution user. Money may be used instead of points.
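As a purely illustrative calculation, an operator-funded payout could split a share of advertising revenue across real-time videos in proportion to their viewer counts. The 50% share and the field names below are assumptions, not values from the patent.

```python
# Split a share of ad revenue into points by viewer count per video.
def operator_points(ad_revenue: float, viewers_per_video: dict,
                    share: float = 0.5) -> dict:
    total_viewers = sum(viewers_per_video.values()) or 1  # avoid division by zero
    pool = ad_revenue * share
    return {video_id: pool * n / total_viewers
            for video_id, n in viewers_per_video.items()}

print(operator_points(10000.0, {"video-1": 30, "video-2": 70}))
```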
By performing the process of paying points or money to the distribution user as described above, the processor 201 functions as an example of a point unit that increases the points or money held by the distribution user.
Part or all of the processing performed by programs in the processor 101, the processor 201, and the processor 301 in the above embodiment may instead be implemented by hardware circuitry.
The program for realizing the processing of the embodiment is transferred, for example, while stored in the device. However, the device may also be transferred without the program stored in it, and the program may then be transferred separately and written into the device. The transfer of the program in that case may be realized, for example, by recording it on a removable storage medium, or by downloading it via the Internet or a network such as a local area network (LAN).
The embodiments of the present invention have been described above, but the embodiments are shown as examples and do not limit the scope of the present invention. The embodiments of the present invention can be implemented in various forms within a scope not departing from the gist of the present invention.
Reference numerals
1 distribution system
100 photographing device
101, 201, 301 processor
102, 202, 302 ROM
103, 203, 303 RAM
104, 204, 304 auxiliary storage device
105, 205, 305 communication interface
106, 306 input device
107, 307 output device
108 photographing unit
109 position measuring unit
110, 206, 308 bus
200 distribution server
300 terminal device
Claims (8)
1. A video distribution system comprising:
a display unit that displays, on a terminal device, a map image formed by showing, on a map, the position of a camera mounted on a mobile body; and
a distribution unit configured to distribute, to the terminal device in real time, a video captured by the camera selected from the cameras shown on the map.
2. The video distribution system according to claim 1, further comprising a point unit that increases points or money held by a distribution user who distributes the video when the video is viewed.
3. The video distribution system according to claim 2, wherein the point unit decreases the number of points or money held by a viewing user who views the video, and increases the number of points or money held by the distribution user according to the decrease amount.
4. The video distribution system according to claim 1, further comprising a confirmation unit configured to confirm, with a distribution user who distributes the video, whether or not a viewing user is permitted to view the video when the viewing user is about to view the video.
5. The video distribution system according to claim 1, wherein the map image shows a traveling direction of the moving body.
6. The video distribution system according to claim 1, wherein the map image shows a time taken for the moving body to move to a position designated by the terminal device.
7. A video distribution server comprising:
a display unit that displays, on a terminal device, a map image formed by showing, on a map, the position of a camera mounted on a mobile body; and
a distribution unit configured to distribute, to the terminal device in real time, a video captured by the camera selected from the cameras shown on the map.
8. A video distribution method comprising: displaying a map image formed by showing, on a map, the position of a camera mounted on a mobile body; and
distributing, in real time, a video captured by the camera selected from the aforementioned cameras shown on the aforementioned map.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022014758A JP2023112812A (en) | 2022-02-02 | 2022-02-02 | Video distribution system, video distribution server, and video distribution method |
JP2022-014758 | 2022-02-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116546162A true CN116546162A (en) | 2023-08-04 |
Family
ID=87445801
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211565374.9A Pending CN116546162A (en) | 2022-02-02 | 2022-12-07 | Video distribution system, video distribution server and video distribution method |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2023112812A (en) |
CN (1) | CN116546162A (en) |
Also Published As
Publication number | Publication date |
---|---|
JP2023112812A (en) | 2023-08-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113015012B (en) | Live broadcast data processing method, device, computer equipment and storage medium | |
US10194068B2 (en) | Digital camera that communicates with another digital camera | |
JP2008217831A (en) | Method and system for mediating digital data, and program for digital data mediation service | |
US20080014909A1 (en) | Information communication system | |
KR101974190B1 (en) | Mobile terminal, service server, system for providing tour service including the same and method for providing tour service | |
CN116546162A (en) | Video distribution system, video distribution server and video distribution method | |
JP2002237813A (en) | Information downloading system and data communication terminal | |
KR101432834B1 (en) | Method of providing interactive menus on demand to terminals coupled to a communication network | |
JP2008107927A (en) | Information transfer system for vehicle | |
JP7447399B2 (en) | Video distribution control device, video distribution control method, and program | |
JP2006101279A (en) | Video acquiring system and video imaging apparatus or the like | |
JP2008232967A (en) | Information delivery service system | |
JP2007073055A (en) | Apparatus for distributing satellite image | |
JPH11164284A (en) | Two-way type television system | |
US7502555B2 (en) | Information terminal device | |
CN116647720A (en) | Publishing server, program, and publishing system | |
JP2006109092A (en) | Image distribution system, its control method, and program | |
JP2002353925A (en) | Information receiver and information system for utilization | |
WO2024161605A1 (en) | Information processing device, user terminal, and information processing method | |
JP7459310B2 (en) | Distribution control device, distribution mediation server, and distribution mediation method | |
WO2024232011A1 (en) | Mobile terminal, information processing device, and information processing method | |
JP4255360B2 (en) | Program reservation system | |
KR102278623B1 (en) | Method and system for providing contents using set top box in accommodations | |
JP2007104568A (en) | Communication terminal, data transmitting method, data receiving method, data transmitting program, data receiving program, and recording medium | |
JP2007115077A (en) | Communication terminal, information display method, information display program and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||