JP6548538B2 - Image delivery system and server - Google Patents

Image delivery system and server

Info

Publication number
JP6548538B2
JP6548538B2 (application JP2015182023A)
Authority
JP
Japan
Prior art keywords
moving image
image data
scene
server
client
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2015182023A
Other languages
Japanese (ja)
Other versions
JP2017059953A (en)
Inventor
成幸 宮崎
Original Assignee
キヤノン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by キヤノン株式会社
Priority to JP2015182023A
Publication of JP2017059953A
Application granted
Publication of JP6548538B2
Legal status: Active

Description

  The present invention relates to an image delivery system and a server.
  There are communication systems in which data is stored in a storage unit of a server accessible to a plurality of client devices, and each client device accesses the server as necessary to acquire data from the server's storage unit. In such a communication system, a plurality of devices can refer to the same data (the data stored by the server). A number of conventional techniques for such communication systems have been proposed.
  However, with the spread of the Internet, the increase in storage capacity, and the growing amount of information in multimedia data, the size of the data to be transmitted from the server to the client device keeps increasing. For example, as the resolution of digital cameras increases, the data size of image data generated by photographing also increases. An increase in the amount of communication between the server and the client device is undesirable because it causes restrictions on communication speed, increases in communication charges, and the like.
  In some cases, the client device requires only a part of the scenes of moving image data stored by the server (stored moving image data). In the related art, however, even when the client device needs only a part of the stored moving image data, moving image data including unnecessary scenes is transmitted from the server to the client device. For example, either the entire stored moving image data or the portion from its first frame to the last frame of the required scene is sent from the server to the client device. Transmission of such unnecessary scenes increases the amount of communication.
  Further, as a technology related to the above-described communication system, there is the technology disclosed in Patent Document 1. In the technology of Patent Document 1, data compression processing is applied to moving image data, based on the communication speed between the server and the client device, so that the server transmits the moving image data at the same speed as its reproduction speed. According to the technology of Patent Document 1, the amount of communication can be reduced by the data compression processing. However, even if the client device requires high-quality moving image data, low-quality moving image data may be transmitted from the server to the client device. Moreover, since the data compression processing is performed at the time of transmission, when moving image data is transmitted from the server to a plurality of client devices simultaneously, the compression is performed individually for each client device. These multiple data compression processes may delay the transmission of the moving image data from the server to the plurality of client devices.
[Patent Document 1] JP 2005-117084 A
  An object of the present invention is to provide a technology capable of transmitting desired moving image data from a server to a client device, and reducing the amount of communication between the server and the client device.
The first aspect of the present invention is
An image delivery system comprising a server and a client device, wherein
The server is
Storage means capable of storing first moving image data, and second moving image data smaller in data size than the first moving image data and representing the same moving image as the first moving image data;
First transmitting means for transmitting the second moving image data to the client device;
First receiving means for receiving, from the client device, scene information indicating at least a part of a scene of the second moving image data;
First extraction means for extracting moving image data of a scene indicated by the scene information from the first moving image data;
Second transmission means for transmitting extracted moving image data, which is the moving image data extracted by the first extraction means, to the client device;
Have
The client device is
Second receiving means for receiving the second moving image data from the server;
First generation means for generating the scene information based on the second moving image data;
Third transmitting means for transmitting the scene information to the server;
Third receiving means for receiving the extracted moving image data from the server;
and
The first extraction means extracts, from the first moving image data, moving image data from a frame whose time position is closest to the time position of the start point of the scene indicated by the scene information to a frame whose time position is closest to the time position of the end point of the scene indicated by the scene information.
The second aspect of the present invention is
A computer apparatus connectable to another computer device,
Storage means capable of storing first moving image data, and second moving image data smaller in data size than the first moving image data and representing the same moving image as the first moving image data;
First transmitting means for transmitting the second moving image data to the other computer device;
Receiving means for receiving, from the other computer device, scene information indicating at least a part of a scene of the second moving image data;
Extracting means for extracting moving image data of a scene indicated by the scene information from the first moving image data;
Second transmitting means for transmitting extracted moving image data, which is moving image data extracted by the extracting means, to the other computer device;
Have
The extraction means is for moving image data from a frame whose time position is closest to the time position of the start point of the scene indicated by the scene information to a frame whose time position is closest to the time position of the end point of the scene indicated by the scene information. It is a computer apparatus characterized by extracting it from the first moving image data .
The third aspect of the present invention is
A control method of an image delivery system having a server and a client device, comprising:
A first transmission step in which the server transmits, to the client device, second moving image data having a smaller data size than first moving image data and representing the same moving image as the first moving image data;
A first receiving step in which the client device receives the second moving image data from the server;
A generation step in which the client device generates scene information indicating at least a part of a scene of the second moving image data, based on the second moving image data;
A second transmission step in which the client device transmits the scene information to the server;
A second receiving step in which the server receives the scene information from the client device;
An extraction step in which the server extracts moving image data of a scene indicated by the scene information from the first moving image data;
A third transmission step in which the server transmits, to the client device, extracted moving image data, which is the moving image data extracted in the extraction step;
A third receiving step in which the client device receives the extracted moving image data from the server;
and
In the extraction step, moving image data from a frame whose time position is closest to the time position of the start point of the scene indicated by the scene information to a frame whose time position is closest to the time position of the end point of the scene indicated by the scene information is extracted from the first moving image data.
The fourth aspect of the present invention is
A control method of a computer device connectable to another computer device, comprising:
A first transmission step of transmitting, to the other computer device, second moving image data having a smaller data size than first moving image data and representing the same moving image as the first moving image data;
A receiving step of receiving, from the other computer device, scene information indicating at least a part of a scene of the second moving image data;
An extraction step of extracting moving image data of a scene indicated by the scene information from the first moving image data;
A second transmission step of transmitting extracted moving image data, which is the moving image data extracted in the extraction step, to the other computer device;
and
In the extraction step, moving image data from a frame whose time position is closest to the time position of the start point of the scene indicated by the scene information to a frame whose time position is closest to the time position of the end point of the scene indicated by the scene information is extracted from the first moving image data.
  A fifth aspect of the present invention is a program causing a computer to execute the steps of the control method described above.
  According to the present invention, desired moving picture data can be transmitted from the server to the client device, and the amount of communication between the server and the client device can be reduced.
  FIG. 1 is a block diagram showing an example of the configuration of the image delivery system according to the present embodiment.
FIG. 2 is a flowchart showing an example of the processing flow of moving image generation processing according to the present embodiment.
FIG. 3 is a diagram showing an example of an image management table according to the present embodiment.
FIG. 4 is a flowchart showing an example of the processing flow of the image delivery system according to the present embodiment.
FIG. 5 is a flowchart showing an example of the processing flow of S401 in FIG. 4.
FIG. 6 is a diagram showing an example of a GUI according to the present embodiment.
FIG. 7 is a diagram showing an example of a cutout request according to the present embodiment.
FIG. 8 is a flowchart showing an example of the processing flow of S404 in FIG. 4.
FIG. 9 is a flowchart showing an example of the processing flow of S803 in FIG. 8.
FIG. 10 is a flowchart showing an example of the processing flow of S901 in FIG. 9.
  Hereinafter, embodiments of the present invention will be described. FIG. 1 is a block diagram showing an example of the configuration of an image delivery system according to the present embodiment. As shown in FIG. 1, the image delivery system according to the present embodiment includes a camera (image generation device) 100, a server 110, and a client device 120 that can be connected to one another. As the server 110 and the client device 120, for example, personal computers (PCs) can be used. The image generation device is not limited to the camera 100; any device that can generate image data may be used, and for example, a PC may be used as the image generation device. The camera 100 and the server 110 are connected to each other, and the server 110 and the client device 120 are connected to each other.
The camera 100 includes a central processing unit (CPU) 105, a memory 104, a network board 106, and a charge coupled device (CCD) image sensor 108. The CPU 105 controls the processing of the camera 100 by executing a program stored in the memory 104. The network board 106 is used for communication between the camera 100 and the server 110; for example, the CPU 105 uses the network board 106 to communicate with the server 110 via the Internet. The CCD image sensor 108 generates image data (photographed image data) by photographing; specifically, it generates photographed image data by converting light into an electrical signal. The memory 104 stores the program necessary for the processing of the CPU 105, is used as a working memory of the camera 100, and stores photographed image data.
  The server 110 includes a CPU 115, a network board 116, a memory 114, and an HDD (hard disk drive) 117. The CPU 115 controls the processing of the server 110 by executing programs recorded in the HDD 117. The HDD 117 stores programs, including the program necessary for the processing of the CPU 115, and can also store data other than programs. The memory 114 is used as a working memory of the server 110. The network board 116 is used for communication between the camera 100 and the server 110 and for communication between the client device 120 and the server 110. For example, using the network board 116, the CPU 115 communicates with the camera 100 via the Internet and communicates with the client device 120 via the Internet.
  The client device 120 includes a CPU 125, a network board 126, a memory 124, an HDD 127, a display unit 121, a mouse 122, and a keyboard 123. The CPU 125 controls the processing of the client device 120 by executing programs recorded in the HDD 127. The HDD 127 stores programs, including the program necessary for the processing of the CPU 125, and can also store data other than programs. The memory 124 is used as a working memory of the client device 120. The network board 126 is used for communication between the client device 120 and the server 110; for example, the CPU 125 uses the network board 126 to communicate with the server 110 via the Internet. The display unit 121 displays the processing results of the CPU 125. As the display unit 121, a liquid crystal display panel, an organic EL display panel, a plasma display panel, or the like can be used. The mouse 122 and the keyboard 123 are operation units that receive user operations on the client device 120; for example, they receive user operations for inputting various information used by programs of the client device 120. Note that at least one of the display unit 121, the mouse 122, and the keyboard 123 may or may not be detachable from the client device 120. A functional unit that can be attached to and detached from the client device 120 can be regarded either as part of the client device 120 or as a device separate from the client device 120.
  Although specific hardware is shown in FIG. 1 as the functional units of the camera 100, the server 110, and the client device 120, the configuration of each functional unit is not particularly limited as long as the processing described below can be performed. For example, a storage unit other than a memory or an HDD may be used; specifically, an optical disc may be used as the storage unit. An operation unit other than a mouse or a keyboard may be used; specifically, a touch panel may be used as the operation unit.
In the present embodiment, first moving image data and second moving image data are used as image data (captured image data). The second moving image data is moving image data having a smaller data size than the first moving image data and representing the same moving image as the first moving image data. The memory 104 of the camera 100, the HDD 117 of the server 110, and the HDD 127 of the client device 120 can store the first moving image data and the second moving image data. In the present embodiment, RAW moving image data (moving image data in RAW format), which is an electric signal generated by the CCD image sensor 108 of the camera 100, is used as first moving image data. Then, MP4 moving image data (moving image data in MP4 format) generated by performing data compression processing on RAW moving image data is used as second moving image data. The data compression process can also be said to be a format conversion process for converting the format of moving image data from RAW format to MP4 format.
  The formats of the first moving image data and the second moving image data are not particularly limited. Moving image data generated by performing a first data compression process on the RAW moving image data may be used as the first moving image data, and moving image data generated by performing a second data compression process on the RAW moving image data may be used as the second moving image data.
  In the present embodiment, the CPU 105 of the camera 100 generates the MP4 moving image data from the RAW moving image data. Note that the MP4 moving image data may instead be generated by the CPU 115 of the server 110. The processing flow of the process of generating RAW moving image data and MP4 moving image data (second generation processing; moving image generation processing) will be described using the flowchart of FIG. 2. The processing flow illustrated in FIG. 2 is started with shooting by the camera 100 as a trigger. First, in S201, the CCD image sensor 108 converts light into an electric signal and records the obtained electric signal, that is, the RAW moving image data, in the memory 104. Next, in S202, the CPU 105 reads the RAW moving image data recorded in the memory 104 in S201, converts it into MP4 moving image data using a program recorded in the memory 104, and records the obtained MP4 moving image data in the memory 104. After that, in S203, the CPU 105 adds information on the RAW moving image data generated in S201 and the MP4 moving image data generated in S202 to the image management table recorded in the memory 104. Then, the processing flow of FIG. 2 is ended.
  An example of the image management table (camera management table) used by the camera 100 is shown in FIG. 3(C). In the camera management table, an image ID, which is an identifier indicating a combination of RAW moving image data and MP4 moving image data, is associated with that combination. The image ID may be determined in any manner as long as the same image ID is not associated with a plurality of combinations; for example, a number indicating the shooting order may be used as the image ID. In the camera management table, the file name of the RAW moving image data, the frame rate of the RAW moving image data, and the storage folder of the RAW moving image data (the folder of the memory 104 in which the RAW moving image data is stored) are associated with the RAW moving image data. Likewise, the file name of the MP4 moving image data, the frame rate of the MP4 moving image data, and the storage folder of the MP4 moving image data are associated with the MP4 moving image data; here, the "MP4 moving image data storage folder" is the folder of the memory 104 in which the MP4 moving image data is stored. In the image management table, "-" means that there is no corresponding data.
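  To make the table structure concrete, the following is a minimal Python sketch of how such a management table could be held in memory. The dictionary layout, key names, and sample values (file names, frame rates, folders) are illustrative assumptions and are not taken from FIG. 3.

# Hypothetical in-memory form of the image management table: one entry per
# image ID, pairing a RAW record and an MP4 record. Each record carries the
# file name, frame rate, and storage folder columns described above, and
# None stands for the "-" (no corresponding data) cell.
camera_management_table = {
    "101": {
        "raw": {"file_name": "101.raw", "frame_rate": 60, "folder": "/raw"},
        "mp4": {"file_name": "101.mp4", "frame_rate": 30, "folder": "/mp4"},
    },
    "102": {
        "raw": {"file_name": "102.raw", "frame_rate": 60, "folder": "/raw"},
        "mp4": None,  # "-": no MP4 moving image data for this image ID
    },
}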
  Next, an example of the processing flow of the image delivery system according to the present embodiment will be described using the flowchart of FIG. 4. Before the processing flow of FIG. 4 starts, the RAW moving image data and the MP4 moving image data are transmitted from the camera 100 to the server 110 by the CPU 105 of the camera 100 using the network board 106 (fourth transmission processing). In the server 110, the RAW moving image data and the MP4 moving image data are received from the camera 100 by the CPU 115 using the network board 116 (fourth reception processing), and the CPU 115 records the received RAW moving image data and MP4 moving image data in the HDD 117. Only the MP4 moving image data may be recorded in the HDD 117; the RAW moving image data need not be transmitted from the camera 100 to the server 110 at this timing.
  First, in S401, the CPU 115 of the server 110 transmits the MP4 moving image data to the client device 120 using the network board 116 (first transmission processing). Then, the CPU 125 of the client device 120 receives the MP4 moving image data from the server 110 using the network board 126 (second reception processing). The processing flow of S401 will be described in detail using the flowchart of FIG. 5.
  First, in S501, the CPU 115 uses the network board 116 to refer to the image management table (client management table) recorded in the HDD 127 of the client device 120. An example of the client management table is shown in FIG. 3(A). The client management table has the same configuration as the camera management table, except that it indicates folders in the HDD 127 as the storage folders. In the client management table of FIG. 3(A), only the MP4 moving image data of image ID "103" and the RAW moving image data of image ID "103" are shown; from this client management table, it can be understood that only these data are recorded in the client device 120.
  Next, in S502, the CPU 115 detects unrecorded MP4 moving image data, that is, MP4 moving image data recorded in the server 110 but not recorded in the client device 120. Unrecorded MP4 moving image data is detected by comparing the image management table (server management table) recorded in the HDD 117 of the server 110 with the client management table. An example of the server management table is shown in FIG. 3(B). The server management table has the same configuration as the camera management table and the client management table, except that it indicates folders in the HDD 117 as the storage folders. When the client management table of FIG. 3(A) and the server management table of FIG. 3(B) are compared, the MP4 moving image data of image ID "101" and the MP4 moving image data of image ID "102" are detected as unrecorded MP4 moving image data.
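  The comparison in S502 could look like the following minimal sketch, assuming the table representation sketched above; the function name detect_unrecorded_mp4 is introduced here for illustration only.

def detect_unrecorded_mp4(server_table, client_table):
    """Return the image IDs of MP4 moving image data described in the
    server management table but absent from the client management table
    (the detection of S502)."""
    unrecorded = []
    for image_id, entry in server_table.items():
        client_entry = client_table.get(image_id)
        client_has_mp4 = client_entry is not None and client_entry["mp4"] is not None
        if entry["mp4"] is not None and not client_has_mp4:
            unrecorded.append(image_id)
    return unrecorded

With the tables of FIG. 3(A) and FIG. 3(B), this comparison yields the image IDs "101" and "102".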
  Then, in S503, the CPU 115 switches the subsequent processing according to the detection result of S502. Specifically, when the unrecorded MP4 moving image data is not detected, the transmission of the MP4 moving image data from the server 110 to the client device 120 is omitted, and the processing flow of FIG. 5 is ended. If unrecorded MP4 moving image data is detected, the process proceeds to S504.
  In S504, the CPU 115 transmits the unrecorded MP4 moving image data and the image ID of the unrecorded MP4 moving image data to the client device 120 using the network board 116. Then, the CPU 125 uses the network board 126 to receive the unrecorded MP4 moving image data and the image ID of the unrecorded MP4 moving image data from the server 110.
  Then, in S505, the CPU 125 records the received unrecorded MP4 moving image data in the HDD 127, and adds information on the received unrecorded MP4 moving image data to the client management table. Thereafter, the process flow of FIG. 5 is ended.
  The CPU 125 of the client device 120 may perform display control to display a confirmation screen (dialog) as shown in FIG. 6(A) on the display unit 121, allowing the user to select whether or not to receive the MP4 moving image data. Further, the MP4 moving image data transmitted and received in S401 is not particularly limited. All unrecorded MP4 moving image data may be transmitted and received, or unrecorded MP4 moving image data selected from among a plurality of pieces of unrecorded MP4 moving image data may be transmitted and received. MP4 moving image data already stored in the client device 120 may also be transmitted and received again.
  Returning to the explanation of FIG. 4. When the process of S401 (transmission and reception of the MP4 moving image data) is completed, the process of S402 is performed. In S402, the CPU 125 of the client device 120 reproduces a moving image (MP4 moving image) based on the received MP4 moving image data; that is, the CPU 125 performs display control to display the MP4 moving image on the display unit 121. The reproduction of the MP4 moving image is performed using a GUI (the GUI of an application) as shown in FIG. 6(B). When a plurality of pieces of MP4 moving image data have been received, one of them is selected, and the MP4 moving image based on the selected MP4 moving image data is reproduced. In FIG. 6(B), the image display area 601 is the area in which the reproduced MP4 moving image is displayed.
  When the reproduction of the MP4 moving image starts, the process of S403 is performed. In S403, the CPU 125 generates scene information indicating at least a part of a scene of the MP4 moving image data based on the MP4 moving image data to be reproduced (first generation processing). Specifically, the CPU 125 generates scene information in accordance with the scene selection operation. The scene selection operation is a user operation in which the user selects at least a part of scenes of the MP4 moving image while confirming the reproduced MP4 moving image. The scene selection operation is performed using a GUI as shown in FIG. 6 (B). Thereafter, transmission of scene information (third transmission processing) is performed by the client device 120, and reception of scene information (first reception processing) is performed by the server 110. Specifically, the CPU 125 transmits the cutout request including the scene information to the server 110 using the network board 126. Then, the CPU 115 receives the cutout request from the client device 120 using the network board 116.
  The method of generating scene information is not particularly limited as long as the method uses MP4 moving image data. For example, a scene whose image motion is equal to or greater than a threshold may be detected from the MP4 moving image data, and scene information indicating the detected scene may be generated. In addition, a scene whose volume is equal to or higher than a threshold may be detected from the MP4 moving image data, and scene information indicating the detected scene may be generated. According to these methods, it is possible to automatically generate scene information indicating a scene that is expected to be exciting, without requiring a user operation.
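  As one illustration of such automatic generation, the sketch below derives scene information from a sequence of per-frame audio volumes; the run-detection logic and the name detect_loud_scene are assumptions introduced here, and a motion-based variant would simply replace the volume sequence with a per-frame motion measure.

def detect_loud_scene(frame_volumes, threshold):
    """Return (start frame number, end frame number) of the first run of
    frames whose volume is equal to or higher than the threshold, or None
    if no such scene exists; the returned pair plays the role of the
    scene information."""
    start = None
    for frame_number, volume in enumerate(frame_volumes):
        if volume >= threshold:
            if start is None:
                start = frame_number
        elif start is not None:
            return (start, frame_number - 1)
    return (start, len(frame_volumes) - 1) if start is not None else None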
  In FIG. 6B, the play button 602 is a button operated by the user to start the MP4 moving image reproduction, and the pause button 603 is a button operated by the user for pausing the MP4 moving image reproduction. The seek bar 604 indicates the entire period of the MP4 moving image, and the slider 606 indicates the current reproduction position (the time position of the frame being reproduced). The user can change the playback position by operating the slider 606 using the mouse 122.
  The cutout request described above is a request for transmission of moving image data or still image data. The radio button 608 is a radio button selected by the user to request the server 110 to transmit moving image data, and the radio button 609 is a radio button selected by the user to request the server 110 to transmit still image data. The radio button 608 and the radio button 609 are exclusively selectable; that is, the user can select only one of the radio button 608 and the radio button 609, and cannot select both.
When the moving image radio button 608 is selected, the start point designation button 610 and the end point designation button 612 can be operated, while the still image scene designation button 614 cannot. The start point designation button 610 is a button operated by the user to determine the start point of the scene, and the end point designation button 612 is a button operated by the user to determine the end point of the scene. When the start point designation button 610 is pressed, the start point marker 605 is placed at the position of the slider 606 at that timing; the start point marker 605 indicates the start point of the scene. When the end point designation button 612 is pressed, the end point marker 607 is placed at the position of the slider 606 at that timing; the end point marker 607 indicates the end point of the scene. When the start point marker 605 and the end point marker 607 are arranged on the seek bar 604, the cutout reproduction button 613 can be operated. The cutout reproduction button 613 is a button operated by the user to start reproduction of only the scene indicated by the start point marker 605 and the end point marker 607. When the cutout reproduction button 613 is pressed, the MP4 moving image from the start frame indicated by the start point marker 605 to the end frame indicated by the end point marker 607 is displayed in the image display area 601.
  When the still image radio button 609 is selected, the still image scene designation button 614 can be operated. On the other hand, the start point designation button 610 and the end point designation button 612 can not be operated. In addition, the cutout reproduction button 613 can not be operated. The still image scene designation button 614 is a button operated by the user to determine a frame that is the start point of the scene and the end point of the scene. When the still image scene designation button 614 is pressed, a frame marker is arranged at the position of the slider 606 at the timing when the still image scene designation button 614 is pressed. The frame marker indicates a frame that is the start of the scene and the end of the scene.
  The request button 616 is a button operated by the user to transmit the cutout request to the server 110. When the request button 616 is pressed with the still image radio button 609 selected, scene information indicating the frame indicated by the frame marker is generated, and a cutout request including the generated scene information is transmitted from the client device 120 to the server 110. When the request button 616 is pressed with the moving image radio button 608 selected, scene information indicating the scene from the start frame indicated by the start point marker 605 to the end frame indicated by the end point marker 607 is generated, and a cutout request including the generated scene information is transmitted from the client device 120 to the server 110. The cancel button 615 is a button operated by the user to cancel the transmission of the cutout request; when the cancel button 615 is pressed, all scene designations and the like are discarded, and the application ends without transmitting the cutout request to the server 110.
  An example of the cutout request transmitted from the client device 120 to the server 110 is shown in FIG. 7. In the example of FIG. 7(A), the cutout request includes a start frame number, an end frame number, an image ID, and a format designation. The combination of the start frame number and the end frame number is the scene information: the start frame number is a number indicating the frame at the start point of the scene, and the end frame number is a number indicating the frame at the end point of the scene. When the request button 616 is pressed with the moving image radio button 608 selected, the number of the start frame indicated by the start point marker 605 is set as the start frame number, and the number of the end frame indicated by the end point marker 607 is set as the end frame number. When the request button 616 is pressed with the still image radio button 609 selected, the frame number indicated by the frame marker is set as both the start frame number and the end frame number. The image ID included in the cutout request is the image ID of the MP4 moving image data used to generate the cutout request (scene information). The format designation is format information indicating the format of the image data requested from the server; like the scene information, it is generated based on the MP4 moving image data. The cutout request need not include the format designation (format information), in which case transmission and reception of the format designation are not performed. The format designation, the scene information, and/or the image ID may also be transmitted and received separately from the cutout request.
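  Expressed as data, a cutout request of FIG. 7(A) could be serialized as in the sketch below; the key names and the concrete values are hypothetical, since the text fixes only which four fields the request carries.

# Hypothetical cutout request for a moving image scene: the start and end
# frame numbers of the MP4 moving image (the scene information), the image
# ID of the MP4 data the request was generated from, and the optional
# format designation. For a still image request the two frame numbers are
# equal, and "format" may be omitted entirely.
cutout_request = {
    "start_frame_number": 300,
    "end_frame_number": 450,
    "image_id": "101",
    "format": "MP4",
}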
  Returning to the explanation of FIG. 4. When the process of S403 (transmission and reception of the cutout request) is completed, the process of S404 is performed. In S404, the moving image data of the scene indicated by the cutout request (scene information) is cut out (extracted) from the RAW moving image data. The processing flow of this cutout processing will be described in detail with reference to the flowchart of FIG. 8.
  First, in S801, the CPU 115 of the server 110 determines corresponding RAW moving image data that is RAW moving image data corresponding to the received cutout request. Specifically, the CPU 115 determines that the RAW moving image data associated with the same image ID as the image ID included in the cutout request is the corresponding RAW moving image data.
  Next, in S802, the CPU 115 determines whether or not the server 110 stores the corresponding RAW moving image data. Specifically, the CPU 115 determines whether or not information on the corresponding RAW moving image data is described in the server management table. If it is described, it is determined that the corresponding RAW moving image data is stored in the server 110, and the process proceeds to S803. If it is not described, it is determined that the corresponding RAW moving image data is not stored in the server 110, and the process proceeds to S804.
  For example, when the server management table is the server management table of FIG. 3(B) and the image ID included in the cutout request is "101", it is determined that the server 110 stores the corresponding RAW moving image data, and the process proceeds to S803. When the server management table is the server management table of FIG. 3(B) and the image ID included in the cutout request is "102", it is determined that the corresponding RAW moving image data is not stored in the server 110, and the process proceeds to S804.
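  The lookup of S801 and the determination of S802 could look like the following minimal sketch, assuming the table representation sketched earlier; find_corresponding_raw is an illustrative name, and a None result corresponds to branching to S804.

def find_corresponding_raw(server_table, cutout_request):
    """S801: the corresponding RAW moving image data is the RAW record
    associated with the same image ID as the one in the cutout request.
    S802: returning None means the server does not store it."""
    entry = server_table.get(cutout_request["image_id"])
    if entry is None or entry["raw"] is None:
        return None
    return entry["raw"]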
  When the corresponding RAW moving image data is not stored in the server 110, error information regarding the absence of the corresponding RAW moving image data may be transmitted from the server 110 to the client device 120. The CPU 125 of the client device 120 may then notify the user that the corresponding RAW moving image data does not exist, that the cutout request cannot be accepted, or the like. In this case, the camera 100 is unnecessary in the image delivery system. The notification to the user can be realized by image display, audio output, lighting of a lamp, and the like.
  In S803, the CPU 115 performs cutout processing (first cutout processing; first extraction processing) of cutting out the moving image data of the scene indicated by the cutout request from the corresponding RAW moving image data. The processing flow of the first cutout processing will be described in detail with reference to the flowchart of FIG. 9.
  First, in S901, the CPU 115 detects, from the corresponding RAW moving image data, the scene (corresponding scene) corresponding to the scene indicated by the cutout request. The frame rate of the MP4 moving image data does not necessarily match the frame rate of the RAW moving image data. For example, the use of the MP4 moving image data may differ from that of the RAW moving image data: while the MP4 moving image data is moving image data for distribution or simple viewing, the RAW moving image data may not be. The frame rate of the MP4 moving image data may therefore differ from the frame rate of the RAW moving image data, which is why the process of S901 is required. The processing flow of S901 will be described in detail using the flowchart of FIG. 10.
First, in S1001, the CPU 115 acquires the start frame number and the end frame number from the cutout request. Next, in S1002, the CPU 115 acquires, from the server management table, the frame rate of the MP4 moving image data corresponding to the image ID included in the cutout request and the frame rate of the RAW moving image data corresponding to that image ID, and calculates the ratio of the two acquired frame rates. Thereafter, in S1003, the CPU 115 converts the start frame number and the end frame number into frame numbers indicating frames of the RAW moving image data, based on the ratio calculated in S1002. The processes of S1002 and S1003 are represented, for example, by Equation 1 below. In Equation 1, "RAW number" is a frame number of the RAW moving image data, and "MP4 number" is a frame number of the MP4 moving image data. "RAW rate" is the frame rate of the RAW moving image data, and "MP4 rate" is the frame rate of the MP4 moving image data. "INT(X)" is a function that truncates the decimal part of X. When the cutout request (the start frame number and the end frame number) of FIG. 7(A) is corrected using Equation 1, the cutout request shown in FIG. 7(B) is obtained as the corrected cutout request. Then, the processing flow of FIG. 10 is ended.

RAW number = MP4 number × INT((RAW rate / MP4 rate) + 0.5)    ... (Equation 1)
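  The following is a minimal Python sketch of Equation 1; the function name mp4_to_raw_frame is introduced here for illustration. Note that INT rounds the frame-rate ratio to the nearest integer before the frame number is scaled, exactly as the equation is written.

def mp4_to_raw_frame(mp4_number, raw_rate, mp4_rate):
    """Equation 1: map an MP4 frame number to the RAW frame number whose
    time position is closest. int() truncates the decimal part, so adding
    0.5 rounds the frame-rate ratio to the nearest integer."""
    return mp4_number * int((raw_rate / mp4_rate) + 0.5)

For example, with RAW moving image data at 60 fps and MP4 moving image data at 30 fps (illustrative rates), MP4 frame 150 maps to RAW frame 150 × 2 = 300.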
  According to the processing flow of FIG. 10, the frame of the RAW moving image data whose time position is closest to the time position of the start point of the scene indicated by the scene information is detected as the start point of the corresponding scene, and the frame of the RAW moving image data whose time position is closest to the time position of the end point of that scene is detected as the end point of the corresponding scene. When the scene information indicates one frame of the MP4 moving image data, the frame of the RAW moving image data whose time position is closest to the time position of the frame indicated by the scene information is detected as the corresponding scene (corresponding frame). The method of detecting the corresponding scene is not particularly limited. For example, among the frames whose time positions are earlier than the time position of the start point of the scene indicated by the scene information, the frame whose time position is closest to that start point may be detected as the start point of the corresponding scene. Likewise, among the frames whose time positions are later than the time position of the end point of the scene indicated by the scene information, the frame whose time position is closest to that end point may be detected as the end point of the corresponding scene.
  Returning to the explanation of FIG. 9. After S901, in S902, the CPU 115 cuts out the moving image data of the corresponding scene from the corresponding RAW moving image data. When the corresponding scene consists of a plurality of frames, moving image data representing the plurality of frames in order is cut out. When the corresponding scene is a scene of one frame, still image data representing that one frame is cut out. Hereinafter, the moving image data cut out in S902 is referred to as "first extracted moving image data".
  Then, in S903, the CPU 115 determines whether or not the received cutout request includes a format designation. If the cutout request includes a format designation, the process proceeds to S904; if not, the process proceeds to S905.
In S904, the CPU 115 converts the format of the first extracted moving image data into the format indicated by the format designation, thereby generating first converted moving image data (first conversion processing). Thereafter, the process proceeds to S905. For example, when the cutout request of FIG. 7(C) is received, the cutout request is corrected by the same method (the method using Equation 1) as when the cutout request of FIG. 7(A) is received; as a result, the cutout request of FIG. 7(D) is obtained as the corrected cutout request. Next, the moving image data of the corresponding scene indicated by the corrected cutout request (the scene from the start frame number to the end frame number indicated by the cutout request of FIG. 7(D)) is cut out from the corresponding RAW moving image data. Then, the format of the cut-out moving image data is converted from RAW format to MP4 format. The processes of S903 and S904 may be omitted.
  In S905, the CPU 115 transmits the moving image data to the client device 120 using the network board 116 (second transmission processing). Then, the CPU 125 of the client device 120 receives the moving image data from the server 110 using the network board 126 (third reception processing). If the cutout request does not include a format designation, the first extracted moving image data is transmitted and received; if it does, the first converted moving image data is transmitted and received instead of the first extracted moving image data. Thereafter, the processing flow of FIG. 9 is ended; that is, the process of S803 in FIG. 8 is completed, and the processing flow of FIG. 8 is ended.
  Returning to the explanation of FIG. 8. In S804, the CPU 115 transmits a transmission request instructing transmission of the corresponding RAW moving image data to the camera 100 using the network board 116 (fifth transmission processing). Then, the CPU 105 of the camera 100 receives the transmission request from the server 110 using the network board 106 (fifth reception processing). In the present embodiment, the cutout request before correction is transmitted and received as the transmission request.
  Next, in step S805, the CPU 105 determines corresponding RAW moving image data. The determination method of the corresponding RAW moving image data is the same as in S801. Then, in S806, the CPU 105 determines whether the camera 100 stores the corresponding RAW moving image data. Specifically, the CPU 105 determines whether the information on the corresponding RAW moving image data is described in the camera management table. If the information on the corresponding RAW moving image data is described in the camera management table, it is determined that the corresponding RAW moving image data is stored in the camera 100, and the process proceeds to S807. If the information on the corresponding RAW moving image data is not described in the camera management table, it is determined that the corresponding RAW moving image data is not stored in the camera 100, and the process proceeds to S808.
  For example, when the camera management table is the camera management table shown in FIG. 3C and the image ID included in the transmission request (extraction request) is "102", the camera 100 stores the corresponding RAW moving image data. Then, the process proceeds to S807. When the camera management table is the camera management table of FIG. 3C and the image ID included in the transmission request is "103", it is determined that the corresponding RAW moving image data is not stored in the camera 100. The process proceeds to step S808.
  In S807, the CPU 105 performs cutout processing (second cutout processing; second extraction processing) of cutting out the moving image data of the scene indicated by the transmission request (cutout request) from the corresponding RAW moving image data. Thereafter, the process proceeds to S809. The method of the second cutout processing is the same as that of the first cutout processing. By the processing of S807, the moving image data of the corresponding scene is cut out from the corresponding RAW moving image data; hereinafter, this moving image data is referred to as "second extracted moving image data". When the transmission request includes a format designation, second converted moving image data is generated in S807 by converting the format of the second extracted moving image data into the format indicated by the format designation (second conversion processing). If the transmission request does not include a format designation, the second extracted moving image data is transmitted from the camera 100 to the server 110 in S807; if it does, the second converted moving image data is transmitted from the camera 100 to the server 110.
In step S808, the CPU 105 transmits, to the server 110, error information regarding the absence of the corresponding RAW moving image data, using the network board 106. Then, the CPU 115 of the server 110 receives the error information from the camera 100 using the network board 116. Thereafter, the process proceeds to S809.
  In S809, the CPU 115 of the server 110 transmits the data received in S807 or S808 (the second extracted moving image data, the second converted moving image data, or the error information) to the client device 120 using the network board 116. Then, the CPU 125 of the client device 120 receives the data from the server 110 using the network board 126. Thereafter, the processing flow of FIG. 8 is ended.
  The second extracted moving image data may be transmitted from the camera 100 to the server 110 regardless of whether the transmission request includes a format designation, in which case the second converted moving image data may be generated by the server 110. Alternatively, the entire corresponding RAW moving image data may be transmitted from the camera 100 to the server 110, and the server 110 may then cut out the second extracted moving image data and generate the second converted moving image data.
  Returning to the explanation of FIG. 4. After S404 (the processing flow of FIG. 8), in S405 the CPU 125 of the client device 120 performs processing based on the received data, and the processing flow of FIG. 4 is ended. When error information is transmitted from the server 110 to the client device 120, the CPU 125 notifies the user that the corresponding RAW moving image data does not exist, that the cutout request cannot be accepted, or the like. When moving image data is transmitted from the server 110 to the client device 120, the CPU 125 records the received moving image data in the HDD 127 or reproduces a moving image based on it. Here, the moving image data transmitted from the server 110 to the client device 120 is the first extracted moving image data, the first converted moving image data, the second extracted moving image data, or the second converted moving image data. The moving image based on the received moving image data may be reproduced in the image display area 601 of FIG. 6(B), or may be reproduced using another application.
  As described above, according to the present embodiment, the second moving image data, which has a small data size, is transmitted from the server to the client device, and the client device determines a scene (a desired scene) based on the second moving image data. Scene information indicating the determined scene is then transmitted from the client device to the server, and the server extracts the moving image data of the scene indicated by the scene information from the first moving image data, which has a large data size. Thereafter, the extracted moving image data is transmitted from the server to the client device. Thereby, desired moving image data (moving image data of a desired scene) can be transmitted from the server to the client device. In addition, since the first moving image data transmitted from the server to the client device is limited to the moving image data of the scene indicated by the scene information, the amount of communication between the server and the client device can be reduced. Further, since the second moving image data is prepared in advance, transmission of the second moving image data from the server to the client device is not delayed.
  Although the present invention has been described in detail based on its preferred embodiment, the present invention is not limited to this specific embodiment, and various forms within the scope of the present invention are also included in the present invention. For example, although an example using captured image data has been described in the present embodiment, the moving image data to be processed is not limited to captured image data; it may be, for example, computer-graphics moving image data or animation moving image data.
<Other Embodiments>
The present invention can also be realized by processing in which a program that implements one or more functions of the above-described embodiments is supplied to a system or apparatus via a network or a storage medium, and one or more processors in a computer of the system or apparatus read and execute the program. It can also be realized by a circuit (for example, an ASIC) that implements one or more functions.
110: server; 115, 125: CPU; 116, 126: network board; 117: HDD; 120: client device

Claims (14)

  1. An image delivery system comprising a server and a client device, wherein
    The server is
    Storage means capable of storing first moving image data, and second moving image data smaller in data size than the first moving image data and representing the same moving image as the first moving image data;
    First transmitting means for transmitting the second moving image data to the client device;
    First receiving means for receiving, from the client device, scene information indicating at least a part of a scene of the second moving image data;
    First extraction means for extracting moving image data of a scene indicated by the scene information from the first moving image data;
    Second transmission means for transmitting extracted moving image data, which is the moving image data extracted by the first extraction means, to the client device;
    Have
    The client device is
    Second receiving means for receiving the second moving image data from the server;
    First generation means for generating the scene information based on the second moving image data;
    Third transmitting means for transmitting the scene information to the server;
    Third receiving means for receiving the extracted moving image data from the server;
    and
    The first extraction means extracts, from the first moving image data, moving image data from a frame whose time position is closest to the time position of the start point of the scene indicated by the scene information to a frame whose time position is closest to the time position of the end point of the scene indicated by the scene information.
  2. The client device further includes display control means for performing display control to display a moving image based on the second moving image data on a display unit,
    The first generation means is characterized in that the scene information is generated in response to a user operation of selecting a scene of at least a part of the second moving image data while the user confirms a moving image based on the second moving image data. The image delivery system according to claim 1.
  3. The first generation unit further generates format information indicating a format of moving image data based on the second moving image data,
    The third transmission means further transmits the format information.
    The first receiving means further receives the format information.
    The server further has first conversion means for converting the format of the moving image data extracted from the first moving image data into the format indicated by the format information, when the format information is received by the first receiving means, and
    The second transmission means, when the format information is received by the first receiving means, transmits the moving image data converted by the first conversion means to the client device instead of the moving image data extracted from the first moving image data.
    The image delivery system according to claim 1 or 2.
  4. When the scene information indicates one frame of the second moving image data,
    The first extraction means extracts, from the first moving image data, still image data of one frame corresponding to the one frame indicated by the scene information.
    The image delivery system according to any one of claims 1 to 3.
  5. The image delivery system further comprises an image generating device,
    The image generating device is
    A second generation unit configured to generate the first moving image data and the second moving image data;
    Fourth transmitting means for transmitting the first moving image data and the second moving image data to the server;
    Have
    The image delivery system according to any one of claims 1 to 4, wherein the server further includes fourth receiving means for receiving the first moving image data and the second moving image data from the image generation device.
  6. The server further includes fifth transmission means for transmitting, to the image generation device, a transmission request requesting transmission of the first moving image data, when the first receiving means receives the scene information and the storage means does not store the first moving image data,
    The image generating apparatus further includes fifth receiving means for receiving the transmission request from the server,
    The image delivery system according to claim 5, wherein the fourth transmission means transmits the first moving image data to the server in response to the fifth reception means receiving the transmission request.
  7. The transmission request includes the scene information,
    The image generation apparatus further includes second extraction means for extracting moving image data of a scene indicated by the scene information from the first moving image data,
    The fourth transmission means transmits the moving image data extracted by the second extraction means to the server in response to the fifth reception means receiving the transmission request.
    The second transmission means transmits, to the client device, the moving image data transmitted from the fourth transmission means, instead of the extracted moving image data, in response to the fifth reception means receiving the transmission request.
    The image delivery system according to claim 6.
  8. The first generation unit further generates format information indicating a format of moving image data based on the second moving image data,
    The third transmission means further transmits the format information.
    The first receiving means further receives the format information.
    The transmission request includes the format information when the format information is received by the first receiving means,
    The image generation apparatus further includes second conversion means for converting the format of the moving image data extracted by the second extraction means into the format indicated by the format information, when the transmission request includes the format information, and
    When the format information is included in the transmission request, the fourth transmission unit is the moving image data after conversion by the second conversion unit in response to the fifth reception unit receiving the transmission request. The image delivery system according to claim 7, wherein the image delivery system is sent to the server instead of the moving image data extracted by the second extraction means.
  9. The image delivery system according to claim 7 or 8, wherein the second extraction means extracts, from the first moving image data, moving image data from a frame whose time position is closest to the time position of the start point of the scene indicated by the scene information to a frame whose time position is closest to the time position of the end point of the scene indicated by the scene information.
  10. The image delivery system according to any one of claims 7 to 9, wherein, when the scene information indicates one frame of the second moving image data,
    the second extraction means extracts one frame corresponding to the one frame indicated by the scene information from the first moving image data, and
    the fourth transmission means transmits, to the server, still image data representing the one frame extracted by the second extraction means in response to the fifth receiving means receiving the transmission request.
  11. A computer device connectable to another computer device, comprising:
    storage means capable of storing first moving image data and second moving image data, the second moving image data being smaller in data size than the first moving image data and representing the same moving image as the first moving image data;
    first transmission means for transmitting the second moving image data to the other computer device;
    receiving means for receiving, from the other computer device, scene information indicating a scene of at least a part of the second moving image data;
    extraction means for extracting moving image data of the scene indicated by the scene information from the first moving image data; and
    second transmission means for transmitting extracted moving image data, which is the moving image data extracted by the extraction means, to the other computer device,
    wherein the extraction means extracts, from the first moving image data, moving image data from a frame whose time position is closest to the time position of the start point of the scene indicated by the scene information to a frame whose time position is closest to the time position of the end point of the scene indicated by the scene information.
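  The nearest-frame rule that closes this claim (and recurs in claims 9, 12, and 13) is concrete enough to sketch directly. Assuming the first moving image data is available as (time position, frame) pairs and the scene's start and end points are given in seconds:

      def extract_scene(frames, start, end):
          """Extract the frames spanning the scene [start, end], in seconds.

          The clip runs from the frame whose time position is closest to the
          scene's start point through the frame whose time position is
          closest to its end point, both inclusive.
          """
          def nearest(t):
              return min(range(len(frames)), key=lambda i: abs(frames[i][0] - t))

          return frames[nearest(start):nearest(end) + 1]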
  12. A control method of an image delivery system having a server and a client device, comprising:
    a first transmission step in which the server transmits, to the client device, second moving image data which is smaller in data size than first moving image data and represents the same moving image as the first moving image data;
    a first receiving step in which the client device receives the second moving image data from the server;
    a generation step in which the client device generates, based on the second moving image data, scene information indicating a scene of at least a part of the second moving image data;
    a second transmission step in which the client device transmits the scene information to the server;
    a second receiving step in which the server receives the scene information from the client device;
    an extraction step in which the server extracts moving image data of the scene indicated by the scene information from the first moving image data;
    a third transmission step in which the server transmits, to the client device, extracted moving image data, which is the moving image data extracted in the extraction step; and
    a third receiving step in which the client device receives the extracted moving image data from the server,
    wherein, in the extraction step, moving image data from a frame whose time position is closest to the time position of the start point of the scene indicated by the scene information to a frame whose time position is closest to the time position of the end point of the scene indicated by the scene information is extracted from the first moving image data.
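  The steps of this method can be walked through in a single process, with network hops replaced by assignments and a scene represented, purely for illustration, as a (start, end) pair in seconds (`extract_scene` is the sketch after claim 11):

      # 10 s of video: a 300-frame high-res original and a 100-frame proxy.
      original = [(i / 30, f"hi-res frame {i}") for i in range(300)]
      proxy = [(i / 10, f"proxy frame {i}") for i in range(100)]

      client_copy = proxy      # steps 1-2: only the small proxy is transmitted
      scene_info = (2.0, 4.5)  # step 3: client picks a scene while browsing it
      # steps 4-5: scene_info (a few bytes, not video) travels to the server
      clip = extract_scene(original, *scene_info)  # step 6: server-side extraction
      # steps 7-8: only the extracted clip returns to the client
      print(f"{len(clip)} of {len(original)} frames transmitted")  # 76 of 300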
  13. A control method of a computer device connectable to another computer device, comprising:
    a first transmission step of transmitting, to the other computer device, second moving image data which is smaller in data size than first moving image data and represents the same moving image as the first moving image data;
    a receiving step of receiving, from the other computer device, scene information indicating a scene of at least a part of the second moving image data;
    an extraction step of extracting moving image data of the scene indicated by the scene information from the first moving image data; and
    a second transmission step of transmitting extracted moving image data, which is the moving image data extracted in the extraction step, to the other computer device,
    wherein, in the extraction step, moving image data from a frame whose time position is closest to the time position of the start point of the scene indicated by the scene information to a frame whose time position is closest to the time position of the end point of the scene indicated by the scene information is extracted from the first moving image data.
  14. A program that causes a computer to execute each step of the control method according to claim 12 or 13 .
JP2015182023A 2015-09-15 2015-09-15 Image delivery system and server Active JP6548538B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015182023A JP6548538B2 (en) 2015-09-15 2015-09-15 Image delivery system and server

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2015182023A JP6548538B2 (en) 2015-09-15 2015-09-15 Image delivery system and server

Publications (2)

Publication Number Publication Date
JP2017059953A (en) 2017-03-23
JP6548538B2 (en) 2019-07-24

Family

ID=58390585

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2015182023A Active JP6548538B2 (en) 2015-09-15 2015-09-15 Image delivery system and server

Country Status (1)

Country Link
JP (1) JP6548538B2 (en)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3772502B2 (en) * 1997-12-22 2006-05-10 ソニー株式会社 Receiving apparatus and information processing apparatus
JP4183231B2 (en) * 2002-05-09 2008-11-19 キヤノンマーケティングジャパン株式会社 Image processing server, control method therefor, program, image processing system, and terminal
JP2004289718A (en) * 2003-03-25 2004-10-14 Nippon Hoso Kyokai <Nhk> Photographed video editing method and apparatus therefor
JP2005117084A (en) * 2003-10-02 2005-04-28 Nagoya City Still image information distribution system, server, and client
JP2006166407A (en) * 2004-11-09 2006-06-22 Canon Inc Imaging device and its control method
JP5051218B2 (en) * 2006-04-10 2012-10-17 ヤフー! インコーポレイテッド Video generation based on aggregated user data
US20110206351A1 (en) * 2010-02-25 2011-08-25 Tal Givoli Video processing system and a method for editing a video asset
WO2012011466A1 (en) * 2010-07-20 2012-01-26 シャープ株式会社 Relay device, relay method, communications system, relay control program, and recoding medium
US8768142B1 (en) * 2012-01-26 2014-07-01 Ambarella, Inc. Video editing with connected high-resolution video camera and video cloud server

Also Published As

Publication number Publication date
JP2017059953A (en) 2017-03-23

Similar Documents

Publication Publication Date Title
US20190199973A1 (en) Image capture apparatus, method for setting mask image, and recording medium
US9781356B1 (en) Panoramic video viewer
CN105190511B (en) Image processing method, image processing apparatus and image processing program
US9001229B2 (en) Multiple sensor input data synthesis
US20160227285A1 (en) Browsing videos by searching multiple user comments and overlaying those into the content
US10447874B2 (en) Display control device and display control method for automatic display of an image
EP3151548A1 (en) Video recording method and device
US8253794B2 (en) Image processing apparatus and image display method
US8934627B2 (en) Video event capture, storage and processing method and apparatus
US8294823B2 (en) Video communication systems and methods
US20150208103A1 (en) System and Method for Enabling User Control of Live Video Stream(s)
US10771736B2 (en) Compositing and transmitting contextual information during an audio or video call
WO2017000399A1 (en) Video content retrieval method and device
JP2006086952A (en) Digital camera and program
JPWO2013132828A1 (en) Communication system and relay device
WO2019205872A1 (en) Video stream processing method and apparatus, computer device and storage medium
KR20110043612A (en) Image processing
US20150222815A1 (en) Aligning videos representing different viewpoints
RU2628108C2 (en) Method of providing selection of video material episode and device for this
KR20100011970A (en) Apparatus and method for low bandwidth play position previewing of video content
JP2010531089A (en) Digital camera and method for storing image data including personal related metadata
CN105516755B (en) A kind of video previewing method and device
US20140298179A1 (en) Method and device for playback of presentation file
JP5488056B2 (en) Image processing apparatus, image processing method, and program
JP4978324B2 (en) Image recording apparatus, image recording system, and image reproducing method

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20180911

RD02 Notification of acceptance of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7422

Effective date: 20181116

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20190222

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20190319

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20190514

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20190528

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20190625

R151 Written notification of patent or utility model registration

Ref document number: 6548538

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R151