WO2016031431A1 - Information processing apparatus, information processing method, program, server, and information processing system - Google Patents
- Publication number: WO2016031431A1
- Application number: PCT/JP2015/070685
- Authority: WO (WIPO (PCT))
- Prior art keywords: schedule, image data, information, video file, period
Classifications
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/34—Indicating arrangements
- H04N21/21805—Source of audio or video content enabling multiple viewpoints, e.g. using a plurality of cameras
- H04N21/234336—Reformatting of video signals by media transcoding, e.g. video is transformed into a slideshow of still pictures or audio is converted into text
- H04N21/2353—Processing of additional data specifically adapted to content descriptors, e.g. coding, compressing or processing of metadata
- H04N21/25841—Management of client data involving the geographical location of the client
- H04N21/25866—Management of end-user data
- H04N21/2665—Gathering content from different sources, e.g. Internet and satellite
- H04N21/2743—Video hosting of uploaded data from client
- H04N21/6581—Reference data, e.g. a movie identifier for ordering a movie or a product identifier in a home shopping application
- H04N21/8173—End-user applications, e.g. Web browser, game
- H04N21/8547—Content authoring involving timestamps for synchronizing content
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between a recording apparatus and a television camera
- H04N5/781—Television signal recording using magnetic recording on disks or drums
- H04N5/91—Television signal processing therefor
- H04N9/87—Regeneration of colour television signals
Description
- the present disclosure relates to an information processing apparatus, an information processing method, a program, a server, and an information processing system.
- Patent Document 1 describes that, based on information indicating a period specified by a user, a creation condition corresponding to the period is specified, and content corresponding to the specified creation condition is specified as an output target.
- However, Patent Document 1 only assumes that content corresponding to a creation condition for the user-specified period becomes the output target; it does not envisage editing the photographs taken within the period into an optimal video file based on the schedule information after the schedule ends.
- Nor does Patent Document 1 assume any processing, based on schedule information, of the photographs taken by each user when a plurality of users share a schedule.
- According to the present disclosure, there is provided an information processing apparatus comprising: a schedule period acquisition unit that acquires the period from the start to the end of a schedule; an image data acquisition unit that, after the schedule period has elapsed, acquires image data captured within the period; and a video file creation unit that creates a video file combining the image data.
- There is also provided an information processing method comprising: acquiring the period from the start to the end of a schedule; acquiring, after the schedule period has elapsed, image data captured within the period; and creating a video file combining the image data.
- Further, there is provided a program for causing a computer to function as means for creating a video file combining the image data.
- Further, there is provided a server comprising an image data acquisition unit that acquires, from a first device, image data captured within the period from the start to the end of a schedule based on schedule information relating to the schedule.
- Further, there is provided a program for causing a computer to function as: means for acquiring, from a first device, image data captured within the period based on the schedule information relating to the schedule; and a transmission unit for transmitting the image data to a second device, the second device creating a video file that combines the image data with image data captured by the second device within the period.
- Further, there is provided an information processing system comprising: a server including an image data acquisition unit that acquires, from a first device, first image data captured by the first device within the period from the start to the end of a schedule based on schedule information relating to the schedule, and a transmission unit that transmits the first image data to a second device; and the second device, including an imaging unit that captures an image of a subject, a schedule period acquisition unit that acquires the period from the start to the end of the schedule based on the schedule information, a receiving unit that receives the first image data, and a video file creation unit that creates a video file combining the first image data and second image data captured by the imaging unit within the period.
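The claimed flow (acquire the period from the start to the end of a schedule, collect the image data captured within that period after it has elapsed, and combine the image data into a video file) can be sketched as follows. This is an illustrative sketch only; the `Photo` record, the playlist standing in for the video file, and the function names are assumptions for illustration, not the disclosed implementation.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Tuple

@dataclass
class Photo:
    path: str              # location of the image data (hypothetical record)
    captured_at: datetime  # capture timestamp recorded by the imaging unit

def acquire_schedule_period(schedule: dict) -> Tuple[datetime, datetime]:
    """Schedule period acquisition unit: the period from start to end."""
    return schedule["start"], schedule["end"]

def acquire_image_data(photos: List[Photo],
                       start: datetime, end: datetime) -> List[Photo]:
    """Image data acquisition unit: only image data captured within the period."""
    return sorted((p for p in photos if start <= p.captured_at <= end),
                  key=lambda p: p.captured_at)

def create_video_file(photos: List[Photo]) -> List[str]:
    """Video file creation unit: here, simply an ordered list of frames."""
    return [p.path for p in photos]
```

After the schedule ends, the three units run in sequence: the period is read from the schedule information, out-of-period shots are filtered out, and the remainder is combined in capture order.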
- FIG. 6 is a schematic diagram illustrating an outline of processing in which slide show creation information is sent from a server to a terminal and the terminal creates a slide show based on that information; a schematic diagram showing a case where a slide show is created using images from a plurality of users; a schematic diagram summarizing the process up to slide show creation; and a schematic diagram showing the configuration of the terminal and server for creating a slide show.
- 1. Schedule sharing system configuration
- 2. Association of inviter and invitee information based on the schedule identification ID
- 3. Case where the invitee's terminal has not downloaded the schedule sharing application
- 4. Case where the invitee's terminal has already downloaded the schedule sharing application
- 5. Automatic slide show creation function based on schedule information
- the server 100 includes a communication unit 102, an identification information issuing unit 104, a schedule registration unit 106, a schedule identification information issuing unit 108, and a database 110.
- the terminal 200 includes a database 210, a communication unit 220, an operation input unit 222, a schedule information creation unit 224, an imaging unit 226, a display processing unit 228, and a display unit 230.
- These components shown in FIG. 1 can be configured by a circuit, or by a central processing unit such as a CPU together with a program (software) for causing it to function.
- the program can be stored in a storage unit such as a memory included in the server 100 or the terminals 200 and 300, or a memory inserted from the outside.
- the communication unit 102 of the server 100 is an interface that communicates with the terminals 200 and 300.
- the identification information issuing unit 104 issues a UUID described later.
- the schedule registration unit 106 registers a schedule when schedule information and UUID are transmitted from the terminal 200 in step S12 of FIG.
- the schedule identification information issuing unit 108 issues a schedule identification ID described later.
- the database 110 stores various types of information related to schedule sharing.
- the database 210 of the terminal 200 is a database provided in a schedule sharing application to be described later, or a database such as a hard disk provided in the terminal 200.
- the communication unit 220 is an interface that communicates with the server 100 or the terminal 300.
- The operation input unit 222 is a component, such as a touch sensor or operation buttons, that receives input from user operations.
- the schedule information creation unit 224 creates schedule information to be described later in response to a user operation.
- The imaging unit 226 includes an image sensor such as a CCD or CMOS sensor and an imaging optical system, and photoelectrically converts the subject image formed on the imaging surface of the image sensor by the imaging optical system to obtain image data such as still images or moving images.
- the display processing unit 228 performs processing for performing display on the display unit 230.
- the display unit 230 includes a liquid crystal display (LCD) or the like.
- each user shares a schedule without using personal information using the terminals 200 and 300 owned by each user.
- A unique ID (hereinafter also referred to as a UUID) is assigned to the schedule sharing application (hereinafter also referred to as the schedule sharing app) downloaded to each of the terminals 200 and 300.
- An event identification ID is assigned to a schedule (event) created by an individual for sharing, and a public event ID is assigned to a schedule (event) created by a company.
- These event identification IDs and public event IDs are collectively referred to as schedule identification IDs. Information associated with these schedule identification IDs is managed on the server 100 side, and schedule sharing is realized by exchanging a schedule identification ID between users.
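The idea of keeping the associated information on the server side, keyed by schedule identification IDs, and sharing a schedule by exchanging only the ID can be sketched as follows. The class and method names are hypothetical; the disclosure does not specify an API.

```python
class ScheduleServer:
    """Minimal sketch: schedule information lives on the server,
    and terminals exchange only the schedule identification ID."""

    def __init__(self):
        self._schedules = {}  # schedule identification ID -> schedule info
        self._next = 0

    def register(self, schedule_info: dict) -> str:
        """Store the schedule and issue a schedule identification ID."""
        self._next += 1
        schedule_id = f"evt-{self._next}"
        self._schedules[schedule_id] = schedule_info
        return schedule_id

    def lookup(self, schedule_id: str) -> dict:
        """Any terminal that presents the ID receives the schedule info."""
        return self._schedules[schedule_id]
```

No personal information changes hands between users: the inviter forwards the ID, and the invitee's terminal resolves it against the server.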
- Users can share information on an individual schedule (event) basis with any other user, without going through complicated processes such as registration of personal information or login using an ID/password (PW). Once shared, the sharing members can freely change or add content, and even if content is changed midway, the change is immediately reflected for all sharing users. Since neither registration of personal information nor a login process is required, users can easily share a schedule with each other.
- each user can share his / her schedule with family and friends with simple settings based on the scheduler function.
- The latest information can be delivered automatically simply by the user selecting the desired information, such as new product information from favorite companies, coupons for frequently used stores, transit service information, or currency exchange and stock information (a mode connected to companies).
- It is possible to notify the user of only the information he/she requires in cooperation with a wearable device, and also to reflect the life log held by the device on the scheduler (a mode connected to things).
- Users can create schedules using their own terminals 200 and 300.
- the created schedule is stored in the terminals 200 and 300.
- The users of the terminals 200 and 300 can share the created schedule with users of other terminals and can invite them to participate in the schedule.
- The server 100 manages participants, non-participants, and unanswered invitees in the shared schedule.
- For the schedule, the server 100 issues a unique schedule identification ID that is difficult for users to guess.
- each user sets a nickname so that the user can be identified when sharing the schedule.
- One user has one nickname, and a nickname may overlap with those of other users. Even when nicknames are duplicated among a plurality of users, the UUID is uniquely set for the schedule sharing application downloaded by each terminal, so each user can be uniquely identified by the UUID.
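The relationship between nicknames and UUIDs can be sketched as follows: a UUID is issued per downloaded application instance, so users remain distinguishable even when nicknames collide. The `UserRegistry` class is a hypothetical illustration.

```python
import uuid

class UserRegistry:
    """Sketch: one UUID per schedule sharing app installation,
    so duplicate nicknames do not make users ambiguous."""

    def __init__(self):
        self._users = {}  # UUID -> nickname

    def issue_uuid(self, nickname: str) -> str:
        """Issue a UUID on app download and remember the nickname."""
        user_id = str(uuid.uuid4())
        self._users[user_id] = nickname
        return user_id

    def nickname(self, user_id: str) -> str:
        return self._users[user_id]
```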
- After downloading the schedule sharing application, the terminal 200 performs schedule registration in step S10.
- the UUID issued from the server 100 by downloading the schedule sharing application is stored in the database 210 of the terminal 200.
- FIG. 2 is a schematic diagram showing a screen of the terminal 200.
- a screen 500 shows an initial state.
- When the user presses the button 202 in a state where check marks are added to “My schedule”, “Invited schedule”, and “Company A domestic trip” as “Events to be displayed” on the screen 500, the screen transitions to the screen 502.
- On the screen 502, a calendar is displayed, and below the calendar, event information is displayed for all events related to the events selected as “events to be displayed” (“My schedule”, “Invited schedule”, “Company A domestic trip”).
- all events including events other than the selected event can be displayed by performing a predetermined operation.
- “My schedule” and “Invited schedule” are schedules (events) created by individuals, and “Company A domestic trip” is a schedule (event) created by a company.
- When the user wants to newly register an event, the user presses the button 204 on the screen 502 in FIG. 2, and the screen transitions to the screen 506. On the screen 506, the user inputs schedule information such as the “title”, “description”, “start”, “end”, and “location” of the event, and presses the completion button 206 to register the schedule in the terminal 200. When registering an “image” on the screen 506, the user can select and register one from a plurality of images displayed on the screen 508. The transition from the screen 506 to the screen 508 can be performed by operating a predetermined button.
- The registered “image” is displayed on the screens 502 and 504 in FIG. 2 to represent the schedule. Thereby, the user can visually distinguish each schedule by the image displayed on the screens 502 and 504.
- Thus, step S10 in FIG. 1 is completed.
- Here, “Travel to Izu” is registered as a schedule.
- the registered schedule is stored in the database 210 (see FIG. 1) in the schedule sharing application of the terminal 200.
- the registered schedule is displayed below the calendar on the screen 502 in FIG. 2, and when all events are displayed, it is displayed on the screen 504 in FIG.
- FIG. 4 is a schematic diagram showing a screen for inviting to a registered schedule.
- By selecting an arbitrary schedule displayed on the screen 502 or the screen 504 in FIG. 2 (for example, a schedule to which friends have not yet been invited), the screen can transition to the screen 510 (schedule details screen) in FIG. 4.
- The user of the terminal 200 can share the registered schedule with other users, that is, invite them to the event, by pressing the button 208 (“Invite friends”) displayed at the bottom of the screen 510 in FIG. 4.
- When the button 208 is pressed, the UUID and the schedule information of the terminal 200 are sent to the server 100 in step S12 of FIG. 1, and a schedule for which the user of the terminal 200 plans to invite other users (hereinafter, a “schedule to be invited”) is registered in the server 100.
- the server 100 registers the schedule information sent from the terminal 200 in association with the UUID of the terminal 200, and issues a schedule identification ID for identifying the registered schedule.
- the schedule identification ID is returned to the terminal 200 in step S13.
- the terminal 200 that has received the schedule identification ID stores the schedule identification ID in the database 210.
- At the time of schedule registration in step S10, no schedule identification ID has yet been issued; when the user presses the button 208 and the schedule to be invited is registered in the server 100, the server 100 issues a schedule identification ID and returns it to the terminal 200.
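The issuance in steps S12/S13 of a schedule identification ID that is difficult to guess could be sketched, for example, with a cryptographically random token. The use of `secrets.token_urlsafe` and the registry layout are assumptions for illustration, not the disclosed algorithm.

```python
import secrets

def issue_schedule_identification_id(registry: dict,
                                     owner_uuid: str,
                                     schedule_info: dict) -> str:
    """Register the schedule under the sender's UUID and return a
    hard-to-guess schedule identification ID (illustrative sketch)."""
    schedule_id = secrets.token_urlsafe(16)  # ~128 bits of randomness
    registry[schedule_id] = {"owner_uuid": owner_uuid,
                             "schedule": schedule_info}
    return schedule_id
```

The returned ID is sent back to the terminal 200 (step S13), which stores it in its database 210.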
- In step S14 of FIG. 1, an invitation message is transmitted to an invitee (the terminal 300).
- the invitation message to the invitee is transmitted together with the schedule identification ID.
- the screen 514 in FIG. 4 shows a case where the invitee's terminal 300 has not downloaded the schedule sharing application.
- The invitation message can be sent using e-mail, SMS (Short Message Service), SNS (social networking services), or other existing communication applications.
- the terminal 300 of the invitee who has received the invitation message and the schedule identification ID transmits the schedule identification ID to the server 100 in step S15.
- the server 100 receives the schedule identification ID from the terminal 300 in step S15, the server 100 transmits the schedule information associated with the received schedule identification ID to the terminal 300 in step S16.
- the terminal 300 can acquire schedule information corresponding to the schedule identification ID together with the schedule identification ID already received from the terminal 200.
- the schedule information of the invited schedule is displayed on the screen of the terminal 300. Further, the schedule identification ID and the schedule information are reflected (registered) in the database 310 of the terminal 300 in step S17 of FIG.
- The user of the terminal 300 operates the terminal 300 to input whether to participate in the invited schedule. When the user of the terminal 300 inputs participation or non-participation, in step S18 of FIG. 1, the UUID of the terminal 300 and the schedule identification ID are sent from the terminal 300 to the server 100 together with the participation or non-participation information. Based on this notification, the server 100 registers that the user of the terminal 300, identified by the UUID, is participating or not participating in the schedule corresponding to the schedule identification ID. If the user of the terminal 300 has not answered the invitation, only the UUID and the schedule identification ID of the terminal 300 are transmitted in step S19, and the server 100 registers that the user identified by the UUID has not answered for the schedule corresponding to the schedule identification ID.
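The server-side bookkeeping of steps S18/S19, where each invitee is recorded as participating, not participating, or unanswered per schedule identification ID and UUID, can be sketched as follows (names are hypothetical):

```python
PARTICIPATING = "participating"
NOT_PARTICIPATING = "not participating"
UNANSWERED = "unanswered"

class ParticipationManager:
    """Sketch of steps S18/S19: the server records one of three
    answers per (schedule identification ID, invitee UUID) pair."""

    def __init__(self):
        self._answers = {}  # (schedule_id, uuid) -> status

    def notify(self, schedule_id: str, uuid_: str,
               answer: str = UNANSWERED) -> None:
        # S18 carries an answer; S19 carries only the IDs (unanswered).
        self._answers[(schedule_id, uuid_)] = answer

    def status(self, schedule_id: str, uuid_: str) -> str:
        # Invitees who never notified the server stay "unanswered".
        return self._answers.get((schedule_id, uuid_), UNANSWERED)
```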
- FIG. 5 is a schematic diagram illustrating a state in which the server 100 associates the information of the inviter and the invitees based on the schedule identification ID.
- corresponding schedule information 401 is associated with a certain schedule identification ID 400.
- The UUID 402 of the inviter's (schedule creator's) terminal is associated with the schedule identification ID 400, and the nickname 404 and the photograph 406 of the inviter are associated with the UUID 402.
- For each invitee, the server 100 manages participation information 418, 428, and 438 indicating “participating”, “not participating”, or “unanswered”, according to notifications from the invitee's terminal.
- FIG. 6 is a schematic diagram showing an example of the data structure of the information shown in FIG.
- The information of FIG. 5 is rearranged and managed under the items “schedule actors”, “schedules”, “users”, “event actors”, “events”, and “devices”.
- Schedule information is managed in the “schedules” item.
- user information is managed in the “users” item.
- device information is managed in the “devices” item.
- each item includes information (updated_at DATETIME) regarding schedule information update.
- When the schedule information is updated by a user of the terminal 200 or 300, the updated information is transmitted to the server 100 together with the schedule identification ID, and the server 100 updates the schedule information associated with that schedule identification ID.
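The update flow, where edits arrive with the schedule identification ID and each record carries an updated_at DATETIME, can be sketched with a minimal in-memory stand-in for the database 110 (class and field names are illustrative assumptions):

```python
from datetime import datetime, timezone

class ScheduleStore:
    """Sketch of the update flow: edits arrive keyed by schedule
    identification ID, and each record keeps an updated_at DATETIME."""

    def __init__(self):
        self._rows = {}  # schedule identification ID -> record

    def upsert(self, schedule_id: str, info: dict, now=None) -> None:
        """Insert or overwrite the schedule info and stamp updated_at."""
        self._rows[schedule_id] = {
            "info": info,
            "updated_at": now or datetime.now(timezone.utc),
        }

    def get(self, schedule_id: str) -> dict:
        return self._rows[schedule_id]
```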
- FIG. 7 is a schematic diagram showing in detail the case where the invitee's terminal 300 has not downloaded the schedule sharing application.
- The invitation is sent using another application such as e-mail, SMS, SNS, or another existing communication app, and the schedule identification ID is passed to these other applications.
- information related to the destination of the invitee selected by the user on the screen 512 of FIG. 4 is notified to another application such as e-mail, SMS, or SNS.
- When the invitee follows the download link, the schedule sharing application is installed in the terminal 300 in step S26 of FIG. 7.
- the schedule sharing application is activated from the link in the invitation message.
- the invitation message includes schedule identification ID information. For this reason, when the schedule sharing application is activated in step S28, the schedule identification ID is transmitted to the server 100 in step S29. Upon receiving the schedule identification ID, the server 100 transmits schedule information associated with the schedule identification ID to the terminal 300 in step S30.
- the information of the schedule identification ID is included in the URL information of the download link in the invitation message.
- The schedule sharing application can acquire the schedule identification ID because the URL history is known from the browser's cookie information after the download. Therefore, in step S29, the schedule identification ID can be transmitted to the server 100 as soon as the schedule sharing application starts. Thus, the server 100 can acquire the schedule identification ID when the schedule sharing application is activated from the link in the invitation message.
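Embedding the schedule identification ID in the URL of the download link, and recovering it after installation, can be sketched with a query parameter. The URL format and function names here are assumptions for illustration; the disclosure does not specify the link structure.

```python
from urllib.parse import urlencode, urlparse, parse_qs

BASE = "https://example.com/app-download"  # hypothetical download link

def build_invitation_link(schedule_id: str) -> str:
    """Embed the schedule identification ID in the download link URL."""
    return f"{BASE}?{urlencode({'schedule_id': schedule_id})}"

def extract_schedule_id(url: str) -> str:
    """After installation, recover the ID from the visited URL
    (per the description, via the browser's cookie/history)."""
    return parse_qs(urlparse(url).query)["schedule_id"][0]
```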
- The subsequent processing is the same as in FIG. 1: when the terminal 300 acquires the schedule identification ID and the schedule information, the schedule information of the invited schedule is displayed on the terminal 300.
- the schedule identification ID and the schedule information are reflected (registered) in the database 310 of the terminal 300.
- the terminal 300 transmits the UUID and schedule identification ID of the terminal 300 together with the participation or non-participation information. If the user of terminal 300 does not input an answer for participation or non-participation, only the UUID and schedule identification ID of terminal 300 are transmitted in step S19.
- FIG. 8 is a schematic diagram illustrating a case where the invitee's terminal 300 has already downloaded the schedule sharing application.
- the schedule can be shared using the schedule sharing application that the terminal 300 has already downloaded.
- the schedule can be shared through an existing communication application such as mail, SMS, or SNS.
- The process for sharing a schedule via e-mail, SMS, SNS, or the like is basically the same as in FIG. 7, but differs in that the process (step S26) of clicking the download link to download the schedule sharing application must be performed.
- when registration of the schedule in step S10 is completed, an invitee is selected in step S40. Specifically, the button 208 ("Invite a friend") displayed at the bottom of the screen 510 in FIG. 4 is pressed, and an invitee is selected on the screen 512. These processes are the same as those described above. At this time, if the invitee is a user who has been invited in the past, the schedule sharing application of the terminal 200 already knows the UUID of the invitee.
- the server 100 registers the schedule information sent from the terminal 200 in association with the UUID of the terminal 200, and issues a schedule identification ID for identifying the registered schedule.
- the schedule identification ID is returned to the terminal 200 in step S13.
- the server 100 also associates the schedule identification ID with the UUID of the invitee, and initially registers the invitee as an unanswered person.
- in step S44, the schedule identification ID is transmitted together with the schedule information from the server 100 to the terminal 300 having the UUID of the invitee.
- the invitee's terminal 300 holds the schedule identification ID and schedule information. Therefore, unlike the process of FIG. 7, the terminal 300 does not need to send the schedule identification ID to the server 100 in order to acquire the schedule information.
- the terminal 300 displays the schedule information of the invited schedule.
- in step S17, the schedule identification ID and the schedule information are reflected (registered) in the database 310 of the terminal 300.
- when the user of the terminal 300 inputs whether he or she will participate in the invited schedule, in step S46 the UUID and the schedule identification ID of the terminal 300 are transmitted from the terminal 300 to the server 100 together with the participation or non-participation information. If the user of the terminal 300 does not input an answer, the process of step S46 is not performed, and the server 100 continues to treat the user of the terminal 300 as an unanswered person.
- as described above, the server 100 can transmit the schedule identification ID and the schedule information to the invitee's terminal 300 based on the UUID of the invitee sent from the terminal 200. Therefore, the invitee's terminal 300 does not need to transmit the schedule identification ID to the server 100 in order to acquire the schedule information, which simplifies the process.
- a UUID can be set for the terminals 200 and 300 that have downloaded the schedule sharing application. Then, the server 100 can associate each user's terminals 200 and 300 with the schedule based on the UUID. Therefore, the user does not need to perform processing such as login, and can share the schedule with a simple procedure.
- an automatic slide show creation function based on schedule information will be described.
- when a schedule ends, the memories can be automatically turned into a slide show (or a movie) using the images taken by the terminal 200 between the start and the end of the schedule.
- when the slide show is automatically created, it is created based on the schedule information held by the terminals 200 and 300, the images taken within the scheduled time set in the schedule information, and slide show creation information (including slide show effects and music).
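The central selection step — keeping only photos captured inside the schedule's period — can be sketched as follows. The `(capture_time, path)` pair structure is a simplification of the terminal's database 210, chosen for illustration.

```python
from datetime import datetime

def select_images_for_schedule(photos, start, end):
    """Return the paths of photos whose capture time falls within
    [start, end], i.e. within the schedule's period.

    `photos` is a list of (capture_time, path) pairs - a simplified
    stand-in for the photo records in the terminal's database 210.
    """
    return [path for taken_at, path in photos if start <= taken_at <= end]
```

Both the terminal-local case and the server-aggregated case described later reduce to this same period filter; only the source of the photo records differs.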
- in a conventional calendar, mainly current or future information has value; in this embodiment, by creating a slide show based on past information set in the schedule information, value can also be given to past information. Even a user without technical knowledge can have a slide show created automatically on the terminal 200 or 300. Furthermore, the created slide show can be shared among friends and family or uploaded to social media through an SNS or the like.
- FIG. 9 and FIG. 10 are schematic diagrams showing an example in which a slide show is created for the scheduled “mochitsuki tournament”.
- FIG. 9 shows an example in which a slide show is automatically created after the scheduled “mochitsuki tournament” shown on the screen 510 in FIG. 4 is completed, and a slide show playback screen 532 is automatically displayed within the screen 530 on which the schedule information is displayed.
- FIG. 10 shows the contents of a slide show automatically created by the terminal 200 for the scheduled “mochitsuki tournament”. When the playback button 534 on the playback screen 532 shown in FIG. 9 is pressed, the slide show shown in FIG. 10 is started.
- the slide show shown in FIG. 10 is created based on the schedule information.
- first, the date and time (2014.5.31) when the scheduled “mochitsuki tournament” was held is displayed; after a crossfade, the first image is displayed along with the title and is slowly enlarged over 2 to 3 seconds.
- next, the display crossfades from the first image to the next image, which is slowly reduced over 2 to 3 seconds.
- the subsequent images are sequentially displayed in the same manner, and then the participants are displayed.
- the logo “schedule sharing application!” of the schedule sharing application is displayed.
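The pacing described above — alternating slow zoom-in and zoom-out shots joined by crossfades — can be sketched as a simple timeline builder. The exact durations and the alternation rule are illustrative assumptions; the embodiment only gives the "2 to 3 seconds" pacing and the crossfade transitions.

```python
def build_effect_timeline(images, hold_seconds=2.5, crossfade_seconds=1.0):
    """Assign each image an alternating slow zoom-in / zoom-out effect,
    joined by crossfades, mirroring the pacing described above.

    Durations are illustrative; the embodiment does not fix exact values.
    Returns a list of dicts, one per image, with start time and effect.
    """
    timeline = []
    t = 0.0
    for i, image in enumerate(images):
        effect = "zoom_in" if i % 2 == 0 else "zoom_out"
        timeline.append({"image": image, "effect": effect,
                         "start": t, "duration": hold_seconds})
        t += hold_seconds + crossfade_seconds
    return timeline
```

In a real implementation the slide show effect setting file group would supply these timings, so the terminal only interprets the specification rather than hard-coding it.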
- the linkage between the schedule information and the slide show is performed based on the date and time, event title, participant (optional), event description, and venue (optional) included in the schedule information.
- the terminal 200 extracts photos in the database 210 based on the date and time included in the schedule information, and creates a slide show video with music.
- the terminal 200 can automatically create a slide show by itself using the date and time included in the schedule information and the photos taken within the scheduled period.
- the terminal 300 can automatically create a slide show by itself using the date and time included in the schedule information and the photos taken during the scheduled period.
- FIG. 11 is a schematic diagram showing an outline of processing in which slide show creation information is sent from the server 100 to the terminal 200, and the terminal 200 creates a slide show based on the slide show creation information.
- the server 100 sends a slide show effect setting file group 550 as slide show creation information to the terminal 200.
- the slide show effect setting file group 550 includes the effect target condition (for general users, for a specific company channel, or for a specific (public) event), a frame image (background image), BGM sound, an effect pattern (display time, effect technique such as crossfade, etc.), display information (excerpted from the schedule's detailed information, including text information), and information such as the total running time.
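One file in such a setting file group could be represented as follows. All field names and values here are hypothetical; the patent describes the categories of information but not a concrete file format.

```python
# A hypothetical slide show effect setting file, expressed as a Python
# dict; field names are illustrative, not the patent's actual format.
effect_setting = {
    "target_condition": "general",       # or "company_channel", "public_event"
    "frame_image": "frame_default.png",  # background image
    "bgm": "bgm_default.mp3",
    "effect_pattern": {"display_seconds": 2.5, "technique": "crossfade"},
    "display_info": {"title_from_schedule": True},
    "total_seconds": 60,
}

def is_ad_free(setting):
    """General-user slide shows carry no company or event advertising,
    per the effect target condition described in the text."""
    return setting["target_condition"] == "general"
```

The terminal 200 would branch on `target_condition` when assembling the slide show, inserting company advertising or event-specific styling only for the non-general conditions.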
- based on the slide show effect setting file group 550, if the schedule was created by an individual and the effect target condition is “general”, the terminal 200 creates a normal slide show into which no company or public-event advertisement is inserted. If the schedule was created by a company and the effect target condition is “for a specific company channel”, an advertisement for that company is inserted, or a slide show suited to the company's brand image is created. When the effect target condition is “for a specific (public) event”, a slide show suited to that event is created. For example, when the public event is a soccer game, the effects are chosen so that the slide show has a lively feeling appropriate to the game.
- the frame image (background image), BGM sound, effect pattern (display time, effect technique such as crossfade, etc.), display information (excerpted from the schedule's detailed information), total running time, and so on are also stipulated in detail by the slide show effect setting file group 550. Thus, the slide show effect setting file group 550 functions as a specification when the terminal 200 creates a slide show.
- the slide show effect setting file group 550 is sent from the server 100 to the terminal 200, and the terminal 200 creates a slide show based on it.
- the slide show effect setting file group 550 is registered in advance, and the frame images (background images) included in the setting files are uploaded.
- the person in charge of the company can edit the slide show effect setting file group 550 for a specific company by operating the terminal 580.
- the slide show can be edited so as to match the image of a specific company, for example, by editing the information for the effect target condition “for a specific company channel”.
- the terminal 200 uses the photo data stored in the database 210 of the terminal 200 based on the slide show effect setting file group 550 and the schedule information, and creates a slide show when the schedule ends. As a result, the playback screen 532 is automatically displayed in the screen 530.
- the created slide show can be played on the terminal 200 by pressing a button 534, and can also be shared with other users through social media or the like.
- the server 100 sends the slide show creation information to the terminal 200, but the terminal 200 may hold the slide show creation information in advance.
- the slide show creation information may be downloaded together.
- the terminal 200 can create a slide show without communicating with the server 100.
- the user of the terminal 200 can automatically create a slide show based on the photos taken by the user.
- a plurality of users can share a schedule, and when a plurality of users participate in a schedule, pictures of the schedule (event) are taken by the plurality of participating users.
- the terminals 200 and 300 can also create a slide show based on photos taken by a plurality of participating users.
- by creating a slide show based on photos taken by a plurality of users who participated in the schedule, it is possible to create a slide show using a variety of photos from different photographers.
- FIG. 12 is a schematic diagram showing a case where a slide show is created using images that span a plurality of users.
- the photograph taken by each terminal 200, 300, 400 is transmitted to the server 100 together with the schedule identification ID.
- photos existing in the local database of each terminal 200, 300, 400 of the schedule sharer are uploaded to the server 100 side.
- the server 100 manages the photographs (image data) 406, 416, 426, and 436 taken by the users of the terminals 200, 300, ... in association with the UUIDs of the terminals 200, 300, ....
- the photos 406, 416, 426, and 436 taken by the terminals 200, 300, ... are transmitted from the server 100 to the terminal 200.
- since only the photos 406, 416, 426, and 436 associated with the schedule identification ID of the schedule are transmitted to the terminal 200, transmission of photos unrelated to the schedule can be prevented.
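The server-side filtering just described can be sketched as follows. The tuple layout of the upload store is an illustrative simplification of the association between schedule identification IDs, UUIDs, and photos that the text describes.

```python
def photos_for_schedule(uploaded, schedule_id):
    """Select only the uploads tagged with the given schedule
    identification ID, so photos from unrelated schedules are never
    sent on to terminal 200.

    `uploaded` is a list of (schedule_id, uuid, photo) tuples - a
    simplified stand-in for the server's photo store.
    """
    return [photo for sid, uuid, photo in uploaded if sid == schedule_id]
```

Because each upload already carries the schedule identification ID (step S52 in FIG. 12's flow), this lookup needs no per-user login: the UUID and schedule ID together identify the contributing terminal and the shared event.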
- by the method described above, the terminal 200 creates the slide show when the schedule ends, using the data of the photos 406, 416, 426, and 436 transmitted from the server 100, based on the slide show effect setting file group and the schedule information.
- in step S50, when the user takes pictures between the start and end times in the schedule information on the day of the created schedule, the data of the pictures taken within that time is stored in the database 210 of the terminal 200.
- in step S52, the terminal 200 reads a slide show effect setting file group from the server 100 and stores the data in the database 210.
- FIG. 14 is a schematic diagram showing the configuration of the terminal 200 and the server 100 for automatically generating a slide show.
- the terminal 200 is configured to include a schedule information acquisition unit (schedule period acquisition unit) 232, an image data acquisition unit 234, an automatic slide show creation unit (video file creation unit) 236, and a slide show creation information acquisition unit 238. Further, as shown in FIG. 13, the database 210 of the terminal 200 stores the data of photographs taken by the user.
- the slide show creation information holding unit 112 of the server 100 holds slide show creation information.
- the slide show creation information includes frame image data (n files), music data (n files), slide show effect setting information, and the like.
- the server 100 transmits slide show creation information to the terminal 200.
- the transmitted slide show creation information is acquired by the slide show creation information acquisition unit 238 of the terminal 200.
- the image data acquisition unit 234 of the terminal 200 acquires, from among the image data of the photos stored in the database 210, the image data captured during the period from the start to the end of the schedule.
- the automatic slide show creation unit 236 uses the image data acquired by the image data acquisition unit 234 to automatically create a movie when the schedule ends.
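The cooperation of the image data acquisition unit 234 and the automatic slide show creation unit 236 can be sketched end to end. The ordered frame list below is a stand-in for the actual video file; real movie encoding is outside the scope of this sketch, and the record format is an assumption.

```python
def auto_create_slideshow(db_photos, start, end):
    """After the schedule period elapses, pick the photos taken inside
    the period and join them into a chronologically ordered frame list
    (a stand-in for the video file the embodiment describes).

    `db_photos` is a list of (capture_time, path) pairs, simplified
    from the terminal's database 210.
    """
    selected = sorted((t, p) for t, p in db_photos if start <= t <= end)
    return [path for _, path in selected]
```

Triggering this at the schedule's end time is what makes the playback screen 532 appear automatically within the screen 530, with no user interaction.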
- as described above, according to this embodiment, a video file can be automatically created based on the schedule information after the schedule ends. Therefore, the user can enjoy playing back a video created from memorable images without performing complicated operations.
- (1) An information processing apparatus comprising: a scheduled period acquisition unit that acquires the period from the start to the end of a schedule based on schedule information relating to the schedule; an image data acquisition unit that acquires, after the scheduled period has elapsed, image data captured within the period; and a video file creation unit that creates a video file combining the image data.
- (2) The information processing apparatus according to (1), wherein the video file creation unit creates a slide show as the video file.
- (3) The information processing apparatus according to (1) or (2), wherein the video file creation unit creates the video file by further using information other than the period included in the schedule information.
- (4) The information processing apparatus according to any one of (1) to (3), wherein the video file creation unit creates the video file based on video file creation information that defines specifications for creating the video file.
- (5) The information processing apparatus according to (4), further comprising a reception unit that receives the video file creation information from a server that manages the schedule information.
- (6) The information processing apparatus according to (1), further comprising an imaging unit that images a subject, wherein the image data acquisition unit acquires the image data captured by the imaging unit.
- (7) The information processing apparatus according to (1), wherein the image data acquisition unit acquires, from a server that manages the schedule information, the image data captured by another apparatus within the period.
- (8) The information processing apparatus according to (4), wherein the video file creation information includes an effect target condition indicating whether the video file is intended for general users or for businesses, a background image, sound, an effect pattern including a display time or an effect technique, or detailed information on display.
- (9) The information processing apparatus according to (8), wherein, when the video file is intended for a specific company, the video file creation information includes information for creating the video file for the promotion of the specific company or the video file matching the image of the specific company.
- (10) The information processing apparatus according to (8), wherein, when the video file is intended for a specific event, the video file creation information includes information for creating the video file matching the image of the specific event.
- (11) An information processing method comprising: acquiring the period from the start to the end of a schedule based on schedule information relating to the schedule; acquiring, after the scheduled period has elapsed, image data captured within the period; and creating a video file combining the image data.
- (12) A program for causing a computer to function as: means for acquiring the period from the start to the end of a schedule based on schedule information relating to the schedule; means for acquiring, after the scheduled period has elapsed, image data captured within the period; and means for creating a video file combining the image data.
- (13) A server comprising: an image data acquisition unit that acquires, from a first apparatus, image data captured within the period from the start to the end of a schedule based on schedule information relating to the schedule; and a transmission unit that transmits the image data to a second apparatus so that the second apparatus can create a video file combining the image data and image data captured by the second apparatus within the period.
- (14) An information processing method comprising: acquiring, from a first apparatus, image data captured within the period from the start to the end of a schedule based on schedule information relating to the schedule; and transmitting the image data to a second apparatus so that the second apparatus can create a video file combining the image data and image data captured by the second apparatus within the period.
- (15) A program for causing a computer to function as: means for acquiring, from a first apparatus, image data captured within the period from the start to the end of a schedule based on schedule information relating to the schedule; and means for transmitting the image data to a second apparatus so that the second apparatus can create a video file combining the image data and image data captured by the second apparatus within the period.
- (16) An information processing system comprising: a server having an image data acquisition unit that acquires, from a first apparatus, first image data captured by the first apparatus within the period from the start to the end of a schedule based on schedule information relating to the schedule, and a transmission unit that transmits the first image data to a second apparatus; and the second apparatus having an imaging unit that images a subject, a scheduled period acquisition unit that acquires the period from the start to the end of the schedule based on the schedule information, an image data acquisition unit that acquires, after the scheduled period has elapsed, second image data captured by the imaging unit within the period, a reception unit that receives the first image data transmitted from the server, and a video file creation unit that creates a video file combining the first image data and the second image data.
Description
An information processing method comprising the above is provided.
Note that the above effects are not necessarily limiting; together with or instead of the above effects, any of the effects described in this specification, or other effects that can be understood from this specification, may be achieved.
1. Configuration of the schedule sharing system
2. Linking inviter and invitee information based on the schedule identification ID
3. Case where the invitee's terminal has not downloaded the schedule sharing application
4. Case where the invitee's terminal has downloaded the schedule sharing application
5. Automatic slide show creation function based on schedule information
First, a schematic configuration of a schedule sharing system according to an embodiment of the present disclosure will be described with reference to FIG. 1. FIG. 1 is a schematic diagram showing a configuration example of the system according to this embodiment. As shown in FIG. 1, the system according to this embodiment includes a server 100, a terminal 200, and a terminal 300. The terminals 200 and 300 are devices having a display screen and an operation unit, such as smartphones. In this embodiment, the terminals 200 and 300 are assumed to be terminals provided with a touch panel in which a touch sensor is provided on the display screen, but the terminals 200 and 300 are not limited to this.
The database 210 of the terminal 200 is a database provided in the schedule sharing application described later, or a database such as a hard disk provided in the terminal 200. The communication unit 220 is an interface that communicates with the server 100 or the terminal 300. The operation input unit 222 is a component, such as a touch sensor or operation buttons, through which user operations are input. The schedule information creation unit 224 creates schedule information, described later, in response to user operations. The imaging unit 226 includes an imaging element such as a CCD or CMOS sensor and an imaging optical system, and photoelectrically converts the subject image formed on the imaging surface of the imaging element by the imaging optical system to acquire image data such as still images and moving images. The display processing unit 228 performs processing for display on the display unit 230. The display unit 230 includes a liquid crystal display (LCD) or the like.
The server 100 links inviter and invitee information based on the schedule identification ID. FIG. 5 is a schematic diagram showing a state in which the server 100 has linked inviter and invitee information based on the schedule identification ID. As shown in FIG. 5, the schedule information 401 of the corresponding schedule is linked to one schedule identification ID 400. The UUID 402 of the inviter's (schedule creator's) terminal is also linked to the schedule identification ID 400, and the inviter's nickname 404 and photo 406 are linked to the UUID 402.
As described above, the route by which the terminal 300 acquires the schedule identification ID and the schedule information differs depending on whether the invitee's terminal 300 has already downloaded the schedule sharing application. FIG. 7 is a schematic diagram showing in detail the case where the invitee's terminal 300 has not downloaded the schedule sharing application. In this case, when the user of the terminal 200 registers the schedule to which others are invited in the server 100 in step S12 and the schedule identification ID is returned to the terminal 200 in step S14, an application other than the schedule sharing application (an existing communication application such as mail, SMS, or SNS) is started in step S20, and the schedule identification ID is passed to that application. Then, in the next step S22, information on the invitee's address selected by the user on the screen 512 in FIG. 4 is notified to the other application such as mail, SMS, or SNS.
FIG. 8 is a schematic diagram showing the case where the invitee's terminal 300 has already downloaded the schedule sharing application. In this case, the schedule can be shared using the schedule sharing application that the terminal 300 has already downloaded. As in FIG. 7, the schedule can also be shared through an existing communication application such as mail, SMS, or SNS. The process for sharing a schedule through mail, SMS, SNS, or the like is basically the same as in FIG. 7, but differs from the process of FIG. 7 in that it is not necessary to click the download link and download the schedule sharing application (step S26).
Next, the automatic slide show creation function based on schedule information will be described. In this embodiment, when a schedule ends, the memories can be automatically turned into a slide show (or a movie) using the images taken by the terminal 200 between the start and the end of the schedule. The slide show is automatically created based on the schedule information held by the terminals 200 and 300, the images taken within the scheduled time set in the schedule information, and slide show creation information (including slide show effects and music).
(1) An information processing apparatus comprising:
a scheduled period acquisition unit that acquires the period from the start to the end of a schedule based on schedule information relating to the schedule;
an image data acquisition unit that acquires, after the scheduled period has elapsed, image data captured within the period; and
a video file creation unit that creates a video file combining the image data.
(2) The information processing apparatus according to (1), wherein the video file creation unit creates a slide show as the video file.
(3) The information processing apparatus according to (1) or (2), wherein the video file creation unit creates the video file by further using information other than the period included in the schedule information.
(4) The information processing apparatus according to any one of (1) to (3), wherein the video file creation unit creates the video file based on video file creation information that defines specifications for creating the video file.
(5) The information processing apparatus according to (4), further comprising a reception unit that receives the video file creation information from a server that manages the schedule information.
(6) The information processing apparatus according to (1), further comprising an imaging unit that images a subject, wherein the image data acquisition unit acquires the image data captured by the imaging unit.
(7) The information processing apparatus according to (1), wherein the image data acquisition unit acquires, from a server that manages the schedule information, the image data captured by another apparatus within the period.
(8) The information processing apparatus according to (4), wherein the video file creation information includes an effect target condition indicating whether the video file is intended for general users or for businesses, a background image, sound, an effect pattern including a display time or an effect technique, or detailed information on display.
(9) The information processing apparatus according to (8), wherein, when the video file is intended for a specific company, the video file creation information includes information for creating the video file for the promotion of the specific company or the video file matching the image of the specific company.
(10) The information processing apparatus according to (8), wherein, when the video file is intended for a specific event, the video file creation information includes information for creating the video file matching the image of the specific event.
(11) An information processing method comprising:
acquiring the period from the start to the end of a schedule based on schedule information relating to the schedule;
acquiring, after the scheduled period has elapsed, image data captured within the period; and
creating a video file combining the image data.
(12) A program for causing a computer to function as:
means for acquiring the period from the start to the end of a schedule based on schedule information relating to the schedule;
means for acquiring, after the scheduled period has elapsed, image data captured within the period; and
means for creating a video file combining the image data.
(13) A server comprising:
an image data acquisition unit that acquires, from a first apparatus, image data captured within the period from the start to the end of a schedule based on schedule information relating to the schedule; and
a transmission unit that transmits the image data to a second apparatus so that the second apparatus can create a video file combining the image data and image data captured by the second apparatus within the period.
(14) An information processing method comprising:
acquiring, from a first apparatus, image data captured within the period from the start to the end of a schedule based on schedule information relating to the schedule; and
transmitting the image data to a second apparatus so that the second apparatus can create a video file combining the image data and image data captured by the second apparatus within the period.
(15) A program for causing a computer to function as:
means for acquiring, from a first apparatus, image data captured within the period from the start to the end of a schedule based on schedule information relating to the schedule; and
means for transmitting the image data to a second apparatus so that the second apparatus can create a video file combining the image data and image data captured by the second apparatus within the period.
(16) An information processing system comprising:
a server having an image data acquisition unit that acquires, from a first apparatus, first image data captured by the first apparatus within the period from the start to the end of a schedule based on schedule information relating to the schedule, and a transmission unit that transmits the first image data to a second apparatus; and
the second apparatus having an imaging unit that images a subject, a scheduled period acquisition unit that acquires the period from the start to the end of the schedule based on the schedule information, an image data acquisition unit that acquires, after the scheduled period has elapsed, second image data captured by the imaging unit within the period, a reception unit that receives the first image data transmitted from the server, and a video file creation unit that creates a video file combining the first image data and the second image data.
102 Communication unit
110 Image data acquisition unit
200, 300 Terminal
232 Schedule information acquisition unit (schedule period acquisition unit)
234 Image data acquisition unit
236 Automatic slide show creation unit (video file creation unit)
Claims (16)
- 1. An information processing apparatus comprising: a scheduled period acquisition unit that acquires the period from the start to the end of a schedule based on schedule information relating to the schedule; an image data acquisition unit that acquires, after the scheduled period has elapsed, image data captured within the period; and a video file creation unit that creates a video file combining the image data.
- 2. The information processing apparatus according to claim 1, wherein the video file creation unit creates a slide show as the video file.
- 3. The information processing apparatus according to claim 1, wherein the video file creation unit creates the video file by further using information other than the period included in the schedule information.
- 4. The information processing apparatus according to claim 1, wherein the video file creation unit creates the video file based on video file creation information that defines specifications for creating the video file.
- 5. The information processing apparatus according to claim 4, further comprising a reception unit that receives the video file creation information from a server that manages the schedule information.
- 6. The information processing apparatus according to claim 1, further comprising an imaging unit that images a subject, wherein the image data acquisition unit acquires the image data captured by the imaging unit.
- 7. The information processing apparatus according to claim 1, wherein the image data acquisition unit acquires, from a server that manages the schedule information, the image data captured by another apparatus within the period.
- 8. The information processing apparatus according to claim 4, wherein the video file creation information includes an effect target condition indicating whether the video file is intended for general users or for businesses, a background image, sound, an effect pattern including a display time or an effect technique, or detailed information on display.
- 9. The information processing apparatus according to claim 8, wherein, when the video file is intended for a specific company, the video file creation information includes information for creating the video file for the promotion of the specific company or the video file matching the image of the specific company.
- 10. The information processing apparatus according to claim 8, wherein, when the video file is intended for a specific event, the video file creation information includes information for creating the video file matching the image of the specific event.
- 11. An information processing method comprising: acquiring the period from the start to the end of a schedule based on schedule information relating to the schedule; acquiring, after the scheduled period has elapsed, image data captured within the period; and creating a video file combining the image data.
- 12. A program for causing a computer to function as: means for acquiring the period from the start to the end of a schedule based on schedule information relating to the schedule; means for acquiring, after the scheduled period has elapsed, image data captured within the period; and means for creating a video file combining the image data.
- 13. A server comprising: an image data acquisition unit that acquires, from a first apparatus, image data captured within the period from the start to the end of a schedule based on schedule information relating to the schedule; and a transmission unit that transmits the image data to a second apparatus so that the second apparatus can create a video file combining the image data and image data captured by the second apparatus within the period.
- 14. An information processing method comprising: acquiring, from a first apparatus, image data captured within the period from the start to the end of a schedule based on schedule information relating to the schedule; and transmitting the image data to a second apparatus so that the second apparatus can create a video file combining the image data and image data captured by the second apparatus within the period.
- 15. A program for causing a computer to function as: means for acquiring, from a first apparatus, image data captured within the period from the start to the end of a schedule based on schedule information relating to the schedule; and means for transmitting the image data to a second apparatus so that the second apparatus can create a video file combining the image data and image data captured by the second apparatus within the period.
- 16. An information processing system comprising: a server having an image data acquisition unit that acquires, from a first apparatus, first image data captured by the first apparatus within the period from the start to the end of a schedule based on schedule information relating to the schedule, and a transmission unit that transmits the first image data to a second apparatus; and the second apparatus having an imaging unit that images a subject, a scheduled period acquisition unit that acquires the period from the start to the end of the schedule based on the schedule information, an image data acquisition unit that acquires, after the scheduled period has elapsed, second image data captured by the imaging unit within the period, a reception unit that receives the first image data transmitted from the server, and a video file creation unit that creates a video file combining the first image data and the second image data.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201580044823.7A CN106576150B (zh) | 2014-08-29 | 2015-07-21 | 信息处理设备、信息处理方法、程序、服务器和信息处理系统 |
US15/328,550 US20170213573A1 (en) | 2014-08-29 | 2015-07-21 | Information processing device, information processing method, program,server, and information processing system |
JP2016545051A JP6589869B2 (ja) | 2014-08-29 | 2015-07-21 | 情報処理装置、情報処理方法、プログラム、サーバー及び情報処理システム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-175750 | 2014-08-29 | ||
JP2014175750 | 2014-08-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016031431A1 true WO2016031431A1 (ja) | 2016-03-03 |
Family
ID=55399332
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/070685 WO2016031431A1 (ja) | 2014-08-29 | 2015-07-21 | 情報処理装置、情報処理方法、プログラム、サーバー及び情報処理システム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20170213573A1 (ja) |
JP (1) | JP6589869B2 (ja) |
CN (1) | CN106576150B (ja) |
TW (1) | TWI689199B (ja) |
WO (1) | WO2016031431A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10728443B1 (en) | 2019-03-27 | 2020-07-28 | On Time Staffing Inc. | Automatic camera angle switching to create combined audiovisual file |
US10963841B2 (en) | 2019-03-27 | 2021-03-30 | On Time Staffing Inc. | Employment candidate empathy scoring system |
US11023735B1 (en) | 2020-04-02 | 2021-06-01 | On Time Staffing, Inc. | Automatic versioning of video presentations |
US11127232B2 (en) | 2019-11-26 | 2021-09-21 | On Time Staffing Inc. | Multi-camera, multi-sensor panel data extraction system and method |
US11144882B1 (en) | 2020-09-18 | 2021-10-12 | On Time Staffing Inc. | Systems and methods for evaluating actions over a computer network and establishing live network connections |
US11423071B1 (en) | 2021-08-31 | 2022-08-23 | On Time Staffing, Inc. | Candidate data ranking method using previously selected candidate data |
US11727040B2 (en) | 2021-08-06 | 2023-08-15 | On Time Staffing, Inc. | Monitoring third-party forum contributions to improve searching through time-to-live data assignments |
US11907652B2 (en) | 2022-06-02 | 2024-02-20 | On Time Staffing, Inc. | User interface and systems for document creation |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170171630A1 (en) * | 2015-12-14 | 2017-06-15 | International Business Machines Corporation | Sharing Portions of a Video |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001268417A (ja) * | 2000-03-17 | 2001-09-28 | Ricoh Co Ltd | デジタルカメラ装置 |
JP2004178607A (ja) * | 2003-12-08 | 2004-06-24 | Canon Inc | 情報機器 |
JP2013011928A (ja) * | 2011-06-28 | 2013-01-17 | Nippon Telegr & Teleph Corp <Ntt> | イベント情報収集方法、イベント情報収集装置及びイベント情報収集プログラム |
WO2014073274A1 (ja) * | 2012-11-09 | 2014-05-15 | ソニー株式会社 | 通信端末、通信方法、プログラム、及び通信システム |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7502806B2 (en) * | 2004-08-23 | 2009-03-10 | Quiro Holdings, Inc. | Method and system for providing image rich web pages from a computer system over a network |
US9280545B2 (en) * | 2011-11-09 | 2016-03-08 | Microsoft Technology Licensing, Llc | Generating and updating event-based playback experiences |
-
2015
- 2015-07-21 JP JP2016545051A patent/JP6589869B2/ja active Active
- 2015-07-21 US US15/328,550 patent/US20170213573A1/en not_active Abandoned
- 2015-07-21 WO PCT/JP2015/070685 patent/WO2016031431A1/ja active Application Filing
- 2015-07-21 CN CN201580044823.7A patent/CN106576150B/zh active Active
- 2015-08-18 TW TW104126871A patent/TWI689199B/zh not_active IP Right Cessation
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11457140B2 (en) | 2019-03-27 | 2022-09-27 | On Time Staffing Inc. | Automatic camera angle switching in response to low noise audio to create combined audiovisual file |
US10963841B2 (en) | 2019-03-27 | 2021-03-30 | On Time Staffing Inc. | Employment candidate empathy scoring system |
US11961044B2 (en) | 2019-03-27 | 2024-04-16 | On Time Staffing, Inc. | Behavioral data analysis and scoring system |
US11863858B2 (en) | 2019-03-27 | 2024-01-02 | On Time Staffing Inc. | Automatic camera angle switching in response to low noise audio to create combined audiovisual file |
US10728443B1 (en) | 2019-03-27 | 2020-07-28 | On Time Staffing Inc. | Automatic camera angle switching to create combined audiovisual file |
US11127232B2 (en) | 2019-11-26 | 2021-09-21 | On Time Staffing Inc. | Multi-camera, multi-sensor panel data extraction system and method |
US11783645B2 (en) | 2019-11-26 | 2023-10-10 | On Time Staffing Inc. | Multi-camera, multi-sensor panel data extraction system and method |
US11636678B2 (en) | 2020-04-02 | 2023-04-25 | On Time Staffing Inc. | Audio and video recording and streaming in a three-computer booth |
US11184578B2 (en) | 2020-04-02 | 2021-11-23 | On Time Staffing, Inc. | Audio and video recording and streaming in a three-computer booth |
US11861904B2 (en) | 2020-04-02 | 2024-01-02 | On Time Staffing, Inc. | Automatic versioning of video presentations |
US11023735B1 (en) | 2020-04-02 | 2021-06-01 | On Time Staffing, Inc. | Automatic versioning of video presentations |
US11720859B2 (en) | 2020-09-18 | 2023-08-08 | On Time Staffing Inc. | Systems and methods for evaluating actions over a computer network and establishing live network connections |
US11144882B1 (en) | 2020-09-18 | 2021-10-12 | On Time Staffing Inc. | Systems and methods for evaluating actions over a computer network and establishing live network connections |
US11727040B2 (en) | 2021-08-06 | 2023-08-15 | On Time Staffing, Inc. | Monitoring third-party forum contributions to improve searching through time-to-live data assignments |
US11966429B2 (en) | 2021-08-06 | 2024-04-23 | On Time Staffing Inc. | Monitoring third-party forum contributions to improve searching through time-to-live data assignments |
US11423071B1 (en) | 2021-08-31 | 2022-08-23 | On Time Staffing, Inc. | Candidate data ranking method using previously selected candidate data |
US11907652B2 (en) | 2022-06-02 | 2024-02-20 | On Time Staffing, Inc. | User interface and systems for document creation |
Also Published As
Publication number | Publication date |
---|---|
JP6589869B2 (ja) | 2019-10-16 |
CN106576150B (zh) | 2020-07-14 |
US20170213573A1 (en) | 2017-07-27 |
CN106576150A (zh) | 2017-04-19 |
TW201631979A (zh) | 2016-09-01 |
TWI689199B (zh) | 2020-03-21 |
JPWO2016031431A1 (ja) | 2017-06-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6589869B2 (ja) | 情報処理装置、情報処理方法、プログラム、サーバー及び情報処理システム | |
JP6474932B2 (ja) | 通信端末、通信方法、プログラム、及び通信システム | |
JP5301425B2 (ja) | グループコンテンツプレゼンテーション、およびグループコンテンツプレゼンテーション中のグループ通信を編成するためのシステムおよび方法 | |
US8606872B1 (en) | Method and apparatus for organizing, packaging, and sharing social content and social affiliations | |
KR101668898B1 (ko) | 공식 계정을 이용한 온에어 서비스 방법 및 그 시스템 | |
CN104769564B (zh) | 通信终端、通信方法以及通信系统 | |
JP6194191B2 (ja) | 情報処理システム | |
US20160156874A1 (en) | Methods and Systems for Collaborative Messaging | |
JP6375039B1 (ja) | プログラム、撮影方法及び端末 | |
KR20140099182A (ko) | 온라인 이벤트를 기다리는 가상 모임 로비 | |
JP2023096363A (ja) | サーバ及び方法 | |
JPWO2014073276A1 (ja) | 通信端末、情報処理装置、通信方法、情報処理方法、プログラム、及び通信システム | |
US20140245351A1 (en) | System For Booking Television Programs | |
JP5372288B1 (ja) | サーバ装置、方法、および、プログラム | |
US20080022210A1 (en) | Wedding Ceremony Information Distribution System | |
US11652778B2 (en) | Platform-initiated social media posting with time limited response | |
JP2014048876A (ja) | 集合写真形成装置、集合写真形成方法および集合写真形成プログラム | |
JP6601399B2 (ja) | 情報処理装置、情報処理方法、プログラム、サーバー及び情報処理システム | |
KR102190882B1 (ko) | 커뮤니티 플랫폼 제공 방법 및 장치 | |
CN113553404A (zh) | 信息处理装置、信息处理方法和计算机可读介质 | |
KR101478100B1 (ko) | 영상 이벤트 서비스 제공 방법 및 영상 이벤트 서비스 제공 서버 | |
JP2015022747A (ja) | サーバ装置、方法、および、プログラム | |
JP2019096091A (ja) | イベント管理システム、イベント管理サーバ装置、及びイベント管理プログラム | |
WO2018031438A1 (en) | Method, system, software, engine, and a mobile application platform for video chat |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15835804 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016545051 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15328550 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15835804 Country of ref document: EP Kind code of ref document: A1 |