CN105100679B - Server and method for providing collaboration service and user terminal for receiving collaboration service - Google Patents


Info

Publication number: CN105100679B
Application number: CN201510270712.XA
Authority: CN (China)
Prior art keywords: document, user, server, video, user terminal
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN105100679A
Inventor: 李在根
Current assignee: Samsung Electronics Co Ltd
Original assignee: Samsung Electronics Co Ltd
Priority claimed from US 14/556,616 (US20150341399A1) and US 14/705,147 (US10277643B2)
Application filed by Samsung Electronics Co Ltd
Publication of CN105100679A (application); application granted; publication of CN105100679B (grant)


Abstract

A server, a method and an apparatus for providing a collaboration service are provided. The server includes: a memory configured to store computer-executable instructions; and a processor configured to process the computer-executable instructions to provide a screen including a first area displaying a video of the user and a second area displaying an editable document. The processor is further configured to process the computer-executable instructions to receive a selection of a point in time of the video, and provide the editable document in a state corresponding to the selected point in time of the video.

Description

Server and method for providing collaboration service and user terminal for receiving collaboration service
Cross Reference to Related Applications
This application claims priority from Korean Patent Application No. 10-2014-0062625, filed with the Korean Intellectual Property Office on 5/23/2014; U.S. Non-Provisional Patent Application No. 14/556,616, filed with the U.S. Patent and Trademark Office on 12/1/2014; Korean Patent Application No. 10-2015-0018870, filed with the Korean Intellectual Property Office on 6/2015; and U.S. Non-Provisional Patent Application No. 14/705,147, filed with the U.S. Patent and Trademark Office on 5/6/2015. The entire disclosures of these applications are incorporated herein by reference for all purposes.
Technical Field
Some aspects of exemplary embodiments relate to a method and a server for providing a collaboration service, and to a user terminal for requesting the collaboration service.
Background
Due to advances in science and technology, various types of user terminals, such as smartphones, tablet PCs, desktop computers, and laptop computers, are becoming more capable and sophisticated. User terminals have evolved into high-end multimedia devices that can connect to a network to search for information on the Internet, transmit or receive files, and capture and play back photographs or moving images.
With the development of user terminals, the demand for cloud computing is increasing. Cloud computing refers to a technology that allows a user to store information in a server on the internet and access the server from anywhere at any time via a user terminal to use the stored information. To meet this increased demand for cloud computing, a wide variety of applications using cloud computing are being developed.
Advances in user terminals and cloud computing technology allow multiple users to connect to a server and execute the same application or access the same information using their terminals.
Disclosure of Invention
Some aspects of the exemplary embodiments include a method and a server for providing a collaboration service, and a user terminal for receiving the collaboration service. The collaboration service allows users to collaboratively edit a document while a conference summary, generated based on speech included in the video call image associated with each user, is synchronized with the document being edited, so that the users can recognize context information while collaboratively editing the document.
Additional aspects will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the exemplary embodiments.
According to an aspect of an exemplary embodiment, a server for providing a collaboration service that allows users to edit a document includes: a communication unit for receiving, from a user terminal requesting the collaboration service, a video call image associated with each user who edits the document and editing information about the document being edited; a controller for synchronizing details of a conference summary, generated based on speech included in the video call image associated with each user, with the document edited according to the editing information; and a memory for storing the details of the conference summary and the document.
According to an aspect of an exemplary embodiment, a method of providing a collaboration service that allows users to edit a document includes: receiving, from a user terminal requesting the collaboration service, a video call image associated with each user who edits the document and editing information about the document being edited; synchronizing details of a conference summary, generated based on speech included in the video call image associated with each user, with the document edited according to the received editing information; and storing the details of the conference summary and the document.
According to an aspect of an exemplary embodiment, a user terminal for receiving a collaboration service, from a server providing the collaboration service that allows users to edit a document, includes: an audio/video input unit for inputting the user's voice and video; a user input unit for receiving editing information about the document being edited; a controller that acquires a video call image, obtained by performing signal processing on the user's voice and video, and the editing information; a communication unit that transmits the acquired video call image and editing information to the server, and receives from the server a video call image associated with each user who edits the document, a conference summary generated based on speech included in the video call image associated with each user, and the document synchronized with the details of the conference summary; and an output unit that outputs the received video call image, conference summary, and document associated with each user.
According to an aspect of the exemplary embodiments, a collaboration service is provided that facilitates collaborative editing of a document by users by synchronizing a conference summary generated based on speech contained in a video call image associated with each user with the document being collaboratively edited, thereby allowing the users to recognize contextual information while collaboratively editing the document.
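The synchronization between the conference summary and the collaboratively edited document can be sketched as stamping both speech-derived summary entries and document revisions with a shared timeline, so that the summary entries that accompanied any given revision can be looked up. A minimal sketch in Python; all class and method names here are hypothetical illustrations, not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class SummaryEntry:
    t: float       # seconds into the video call
    speaker: str
    text: str      # speech recognized from the video call image

@dataclass
class SyncedDocument:
    revisions: list = field(default_factory=list)  # (t, document snapshot)
    summary: list = field(default_factory=list)    # SummaryEntry list, sorted by t

    def record_edit(self, t, snapshot):
        self.revisions.append((t, snapshot))

    def record_speech(self, t, speaker, text):
        self.summary.append(SummaryEntry(t, speaker, text))

    def summary_for_revision(self, rev_index):
        """Return the summary entries spoken up to the time of the given revision."""
        t_rev = self.revisions[rev_index][0]
        return [e for e in self.summary if e.t <= t_rev]

doc = SyncedDocument()
doc.record_speech(5.0, "Alice", "Let's start with the agenda.")
doc.record_edit(10.0, "Draft A")
print(doc.summary_for_revision(0)[0].text)  # Let's start with the agenda.
```

Because both streams share one clock, the server can hand a terminal the document together with exactly the summary context that existed at any given edit.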
According to an aspect of an exemplary embodiment, there is provided a server for providing a collaboration service, the server including: a memory configured to store computer-executable instructions; and a processor configured to process the computer-executable instructions to provide a screen including a first area displaying a video of the user and a second area displaying an editable document. The processor is further configured to process the computer-executable instructions to receive a selection of a point in time of the video, and provide the editable document in a state corresponding to the selected point in time of the video.
The processor may be further configured to process the computer-executable instructions to receive a selection of an edit of the editable document and reproduce the video from a point in time corresponding to the selected edit.
The screen may further include a third area displaying a text record of items corresponding to points in time of the video and edits of the editable document.
The processor may be further configured to process the computer-executable instructions to receive a selection of an item from the text record of items, reproduce the video from a point in time corresponding to the selected item, and provide the editable document in a state corresponding to the selected item.
The processor may be further configured to process the computer-executable instructions to generate the text record of items based on the user's speech in the video.
The processor may be further configured to process the computer-executable instructions to generate the text record of items based on the user's edits to the editable document.
The editable document may be displayed in a word processing program.
According to an aspect of an exemplary embodiment, there is provided a method for providing a collaboration service, the method including displaying a screen including a first area displaying a video of a user and a second area displaying an editable document. The method also includes receiving a selection of a point in time of the video and displaying the editable document in a state corresponding to the selected point in time of the video.
The method may further include receiving a selection of an edit of the editable document and reproducing the video from a point in time corresponding to the selected edit.
The screen may further include a third area displaying a text record of items corresponding to points in time of the video and edits of the editable document.
The method may further include receiving a selection of an item from the text record of items, reproducing the video from a point in time corresponding to the selected item, and providing the editable document in a state corresponding to the selected item.
The method may further include generating the text record of items based on the user's speech in the video.
The method may further include generating the text record of items based on the user's edits to the editable document.
A non-transitory computer readable medium may include computer readable instructions executable to perform the method.
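One way to realize "providing the editable document in a state corresponding to the selected point in time of the video" is to keep a timestamped edit log and replay it up to the selected time. A hedged sketch, assuming edits can be modeled as text-transforming functions (the actual edit representation is not specified by the claims):

```python
from bisect import bisect_right

def document_at(base_text, edit_log, t):
    """Replay timestamped edits onto base_text up to time t.

    edit_log: list of (timestamp, function text -> text), sorted by timestamp.
    """
    times = [ts for ts, _ in edit_log]
    cutoff = bisect_right(times, t)  # include edits made at or before t
    text = base_text
    for _, apply_edit in edit_log[:cutoff]:
        text = apply_edit(text)
    return text

log = [
    (10.0, lambda s: s + " Agenda: budget."),
    (42.5, lambda s: s.replace("budget", "Q3 budget")),
]
print(document_at("Meeting notes.", log, 30.0))  # Meeting notes. Agenda: budget.
```

Selecting a point in the video then reduces to calling `document_at` with the video timestamp; the reverse direction (selecting an edit to seek the video) is a lookup of that edit's timestamp.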
According to an aspect of an exemplary embodiment, there is provided a terminal for providing a collaboration service, the terminal including a display configured to display a screen including a first area displaying a video of a user and a second area displaying an editable document. The terminal further includes: an input device configured to receive a selection of a point in time of a video; and a controller configured to control the display to display the editable document in a state corresponding to the selected point in time of the video.
The input device may be further configured to receive a selection of an edit of the editable document, and the controller may be further configured to control the display to reproduce the video from a point in time corresponding to the selected edit.
The input device may be further configured to receive a selection of an item from the text record of items, and the controller may be further configured to control the display to reproduce the video from a point in time corresponding to the selected item and to display the editable document in a state corresponding to the selected item.
The text record of items may be generated based on the user's speech in the video.
The text record of items may be generated based on the user's edits to the editable document.
According to an aspect of an exemplary embodiment, there is provided a server for providing a collaboration service, the server including a communication device configured to transmit a document to be displayed and receive a selection of a portion of the displayed document. The server also includes a controller configured to determine a portion of the user's video to display and a portion of the text item to display, the portion of the video and the portion of the text item corresponding to the selected portion of the displayed document.
The communication device may be further configured to transmit a video to be displayed and receive a selection of another portion of the displayed video, and the controller may be further configured to determine another portion of the document to be displayed and another portion of the text item to be displayed, the another portion of the document and the another portion of the text item corresponding to the selected another portion of the displayed video.
The communication device may be further configured to transmit a text item to be displayed and receive a selection of another portion of the displayed text item, and the controller may be further configured to determine another portion of the video to be displayed and another portion of the document to be displayed, the another portion of the video and the another portion of the document corresponding to the selected another portion of the displayed text item.
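The bidirectional selection described in the last three paragraphs (document to video and text item, video to document and text item, text item to video and document) can be served from one index keyed by a shared timeline. A minimal sketch with hypothetical names:

```python
from bisect import bisect_right

class SyncIndex:
    """Maps a shared timeline to a document revision and a text-item index."""

    def __init__(self):
        self._events = []  # sorted list of (t, doc_revision, item_index)

    def add(self, t, doc_revision, item_index):
        self._events.append((t, doc_revision, item_index))
        self._events.sort()

    def at_time(self, t):
        """Latest (doc_revision, item_index) at or before time t, or None."""
        i = bisect_right(self._events, (t, float("inf"), float("inf")))
        if i == 0:
            return None
        _, rev, item = self._events[i - 1]
        return rev, item

    def time_of_item(self, item_index):
        """Timeline position and document revision for a selected text item."""
        for t, rev, item in self._events:
            if item == item_index:
                return t, rev
        return None

idx = SyncIndex()
idx.add(10.0, 1, 0)
idx.add(40.0, 2, 1)
print(idx.at_time(25.0))  # (1, 0): selecting t=25 in the video resolves both other views
```

A selection in any one view yields a timeline position, and the index resolves the corresponding positions in the other two views.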
Drawings
These and/or other aspects will become apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 illustrates an environment for providing collaboration services;
fig. 2 is a block diagram of a configuration of a user terminal according to an exemplary embodiment;
FIG. 3 is a block diagram of a configuration of a server for providing collaboration services according to an exemplary embodiment;
FIG. 4 illustrates a controller of a server for providing collaboration services in accordance with an exemplary embodiment;
FIG. 5 illustrates an image management processor that may be included in a controller of a server for providing collaboration services, according to an exemplary embodiment;
FIG. 6 illustrates an integrated management Database (DB) that may be stored in a memory of a server for providing collaboration services, in accordance with an exemplary embodiment;
fig. 7 illustrates a user information DB that may be stored in a memory of a server for providing a collaboration service according to an exemplary embodiment;
FIG. 8 illustrates a document DB that may be stored in a memory of a server for providing a collaboration service, according to an exemplary embodiment;
fig. 9 illustrates an image DB that may be stored in a memory of a server for providing a collaboration service according to an exemplary embodiment;
FIG. 10 is a diagram illustrating a server for providing a collaboration service implemented as a plurality of decentralized servers, according to another exemplary embodiment;
FIG. 11 is a block diagram of an integrated management server that is one of the decentralized servers for providing collaboration services, according to another exemplary embodiment;
FIG. 12 is a block diagram of a user management server that is one of the decentralized servers for providing collaboration services, according to another exemplary embodiment;
FIG. 13 is a block diagram of a document management server that is one of the decentralized servers for providing collaboration services in accordance with another exemplary embodiment;
fig. 14 is a block diagram of an image management server that is one of the decentralized servers for providing collaboration services, according to another exemplary embodiment;
FIG. 15 illustrates a process of initiating a collaboration service, in response to a request to execute the collaboration service while a document is being processed, in a server for providing the collaboration service, according to an exemplary embodiment;
fig. 16 illustrates an example in which a user terminal for receiving a collaboration service requests execution of the collaboration service, according to an exemplary embodiment;
fig. 17 illustrates an example of a user terminal requesting to create a group for receiving a collaboration service, according to an exemplary embodiment;
fig. 18 illustrates an example in which a user terminal for receiving a collaboration service performs management of group members according to an exemplary embodiment;
FIG. 19 illustrates the state of each user terminal when a collaboration service is initiated while a document is being processed so that the user terminals are ready to collaboratively edit the document, in accordance with an exemplary embodiment;
fig. 20 illustrates a method of displaying a user interface including an address window and a menu and a video call image on a user terminal for receiving a collaboration service according to an exemplary embodiment;
fig. 21 illustrates a method of displaying a video call image on a user terminal for receiving a collaboration service according to an exemplary embodiment;
fig. 22 illustrates another method of displaying a video call image on a user terminal for receiving a collaboration service according to an exemplary embodiment;
FIG. 23 illustrates a process of initiating a collaboration service during a video conference, in response to a request to perform the collaboration service, in a server for providing the collaboration service, according to an exemplary embodiment;
fig. 24 illustrates an example of performing a video conference in a user terminal for receiving a collaboration service before a request for performing the collaboration service is made in accordance with an exemplary embodiment;
fig. 25 illustrates a process of selecting a document shared during a video conference before making a request for performing a collaboration service in a user terminal for receiving the collaboration service, according to an exemplary embodiment;
fig. 26 illustrates an example of displaying a document shared in a video conference on a user terminal for receiving a collaboration service before a request for execution of the collaboration service is made, according to an exemplary embodiment;
fig. 27 illustrates an example in which a user terminal for receiving a collaboration service requests execution of the collaboration service during a video conference, according to an exemplary embodiment;
FIG. 28 illustrates the state of each user terminal when initiating a collaboration service during a video conference to prepare each user terminal to collaboratively edit a document, in accordance with an exemplary embodiment;
fig. 29 illustrates a process, in a server for providing a collaboration service, of generating an image showing a summary of a conference (hereinafter referred to as a conference summary) from video call images and transmitting the conference summary to each user terminal, according to an exemplary embodiment;
FIG. 30 illustrates an example of displaying a meeting summary on each user terminal for receiving collaboration services in accordance with an exemplary embodiment;
FIG. 31 illustrates another example of displaying a meeting summary on each user terminal for receiving collaboration services in accordance with an exemplary embodiment;
FIG. 32 illustrates a process of requesting management of group members and groups collaborating with each other from a server for providing collaboration services, according to an exemplary embodiment;
fig. 33 illustrates an example of setting information on each group member in a user terminal for receiving a collaboration service according to an exemplary embodiment;
fig. 34 illustrates an example of dividing a current group into a plurality of groups in a user terminal for receiving a collaboration service according to an exemplary embodiment;
fig. 35 illustrates a state of each user terminal when a current group is divided into a plurality of groups;
FIG. 36 illustrates a process of limiting the scope of editing of a document being collaboratively edited in a server for providing collaboration services, in accordance with an exemplary embodiment;
fig. 37 illustrates an example of locking a first edit scope by a first user in a user terminal for receiving a collaboration service according to an exemplary embodiment;
fig. 38 illustrates an example of locking a second edit scope by a second user in a user terminal for receiving a collaboration service according to an exemplary embodiment;
fig. 39 illustrates an example of locking a third edit scope by a third user in a user terminal for receiving a collaboration service according to an exemplary embodiment;
fig. 40 illustrates an example of locking first to third edit scopes for a plurality of areas in a page in a user terminal for receiving a collaboration service according to an exemplary embodiment;
FIG. 41 illustrates a process for collaboratively editing a document in a server for providing collaboration services, according to an exemplary embodiment;
fig. 42 illustrates an example of editing a document being collaboratively edited by a first user in a user terminal for receiving a collaboration service, according to an exemplary embodiment;
fig. 43 illustrates an example of editing a document being collaboratively edited by a second user in a user terminal for receiving a collaboration service, according to an exemplary embodiment;
FIG. 44 illustrates an example of identifying editing information of a document being collaboratively edited by using a meeting summary, in a user terminal for receiving a collaboration service, according to an exemplary embodiment;
FIG. 45 illustrates an example of editing a document being collaboratively edited with a meeting summary in a user terminal for receiving collaboration services, according to an exemplary embodiment;
FIG. 46 illustrates the following example in accordance with an exemplary embodiment: in a user terminal for receiving a collaboration service, setting a device for each of an image of a document being collaboratively edited, a video call image, and a conference summary, so that these images are displayed separately on other devices registered with the user terminal;
FIG. 47 illustrates an example of separately displaying a document, a video call image, and a conference summary being collaboratively edited on multiple devices, according to an exemplary embodiment;
fig. 48 illustrates a process of terminating a collaboration service by requesting termination of the collaboration service from a server for providing the collaboration service, and transmitting the collaboratively edited document to another user terminal, according to an exemplary embodiment;
fig. 49 illustrates an example in which a document for review, a video call image for review, and a conference summary for review are synchronized with each other, according to an exemplary embodiment;
FIG. 50 illustrates a process of requesting a document for review and reviewing edits of the document from a server for providing collaboration services, according to an exemplary embodiment;
FIG. 51 illustrates the following example in accordance with an exemplary embodiment: when an edited portion of a document for review is selected, a conference summary for review and a video call image for review, both synchronized with the selected edited portion, are output in a user terminal for receiving a collaboration service;
fig. 52 illustrates a process of requesting a video call image for review and reviewing editing of a document from a server for providing a collaboration service, according to an exemplary embodiment;
FIG. 53 illustrates the following example in accordance with an exemplary embodiment: when a portion to be reproduced of the video call image for review is selected, a conference summary for review and a document for review, both of which are synchronized with the selected reproduced portion, are output in a user terminal for receiving a collaboration service;
FIG. 54 illustrates a process of requesting a meeting summary for review and reviewing edits of documents from a server for providing collaboration services, according to an exemplary embodiment;
fig. 55 illustrates the following example according to an exemplary embodiment: when a text in the conference summary for review is selected, a video call image for review and a document for review, both synchronized with the selected text, are output in a user terminal for receiving a collaboration service;
FIG. 56 is a flowchart of a method of providing a collaboration service according to an exemplary embodiment; and
Fig. 57 is a flowchart of a method of receiving a collaboration service according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. The present embodiments are provided so that this disclosure will be thorough and complete, and the inventive concept should not be construed as being limited to the descriptions set forth herein. Accordingly, the exemplary embodiments are described below, with reference to the drawings, merely to illustrate aspects of the present description. Variations or combinations that may be readily inferred by one of ordinary skill in the art from the description and the exemplary embodiments are considered to be within the scope of the inventive concept. Expressions such as "at least one of", when preceding a list of elements, modify the entire list rather than the individual elements of the list.
It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated elements, components, steps, and/or operations, but do not preclude the presence or addition of one or more other elements, components, steps, and/or operations.
In addition, it should be understood that although the terms "first," "second," etc. may be used herein to describe various elements and/or components, these elements and/or components should not be limited by these terms. These terms are only used to distinguish one element or component from another element or component.
The present embodiments relate to a method and a server for providing a collaboration service and a user terminal for receiving a collaboration service, and a detailed description of functions or configurations well known to those skilled in the art is omitted herein.
FIG. 1 illustrates an environment for providing collaboration services.
A collaboration service refers to a service provided to multiple users who collaborate with each other to perform a particular task in pursuit of the same goal. In a collaboration service that allows multiple users to collaboratively edit a document, document editing programs or tools for communication between the collaborating users may be provided to the users as part of the collaboration service. In this case, the document being collaboratively edited may be any kind of document executable on the server 100 for providing the collaboration service, regardless of its type or content. For example, the document may contain textual or multimedia content.
The server 100 may be a server for storing various types of applications and data that allow each user to collaborate with other users. The server 100 may perform both local and remote communications. The server 100 may also be connected to a plurality of user terminals 200 via a network.
The user terminal 200 may be various types of user equipment that can be used to connect with the server 100. For example, the user terminal 200 may be a smart phone, a tablet PC, a desktop computer, or a laptop computer capable of performing wired or wireless communication with the server 100. In addition, the user terminal 200 may be a user device configured to capture and reproduce video call images to allow users collaboratively editing a document to participate in a video conference.
In one exemplary embodiment, editing a document may include adding, removing, modifying, and/or formatting text, objects, images, graphics, and the like, in the document, image, video, application, and the like. However, editing is not limited to the above-described exemplary embodiments, but may include other operations performed on a document.
In one exemplary embodiment, collaborative editing may include editing a document by multiple users simultaneously or sequentially, or may include both editing a document by multiple users simultaneously and editing a document by multiple users sequentially. However, the collaborative editing is not limited to the above-described exemplary embodiments, but may include other collaborative editing performed on a document.
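Sequential collaborative editing as described above can be modeled as applying each user's operations to the shared document in arrival order. A toy sketch with insert and delete operations only; this is an illustration, not the claimed mechanism:

```python
def apply_ops(text, ops):
    """Apply (user, kind, pos, payload) operations in order.

    kind is "insert" (payload is a string) or "delete" (payload is a length).
    """
    for user, kind, pos, payload in ops:
        if kind == "insert":
            text = text[:pos] + payload + text[pos:]
        elif kind == "delete":
            text = text[:pos] + text[pos + payload:]
    return text

ops = [
    ("user_a", "insert", 0, "Hello"),
    ("user_b", "insert", 5, ", world"),
    ("user_a", "delete", 0, 5),
]
print(apply_ops("", ops))  # ", world"
```

Simultaneous editing additionally requires resolving concurrent operations, for example by locking per-user edit scopes (as in FIGS. 37 to 40) or by transforming operation positions.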
Fig. 2 is a block diagram of a configuration of a user terminal 200 according to an exemplary embodiment. Those of ordinary skill in the art will appreciate that user terminal 200 may include other common components in addition to those shown in fig. 2. Each of the components shown in fig. 2 may be integrated, added, or omitted according to the specification of the user terminal 200 actually implemented. In other words, two or more components may be combined into a single component, or a single component may be divided into two or more components, if necessary.
Referring to fig. 2, the user terminal 200 according to the present embodiment may include a controller 210, a user input unit 220 (e.g., user input), an audio/video input unit 230 (e.g., audio/video input), a communication unit 240 (e.g., communicator), and an output unit 250 (e.g., output).
The controller 210 may control the display unit 251 to display a portion of the content stored in the memory (not shown). Alternatively, when the user performs a manipulation on a region of the display unit 251, the controller 210 may perform a control operation corresponding to the manipulation.
Although not shown in fig. 2, the controller 210 may include at least one selected from: a Random Access Memory (RAM), a Read Only Memory (ROM), a Central Processing Unit (CPU), a graphics processing unit (not shown), and a data bus. The RAM, ROM, CPU and Graphics Processing Unit (GPU) may be connected to each other via a data bus.
The CPU accesses a memory (not shown) and performs booting using an operating system (O/S) stored in the memory. The CPU also performs various operations using various types of programs, contents, and data stored in the memory.
The ROM stores a set of commands to start the system. For example, when a power-on command is input and power is supplied, the CPU copies the O/S stored in the memory into the RAM according to the command stored in the ROM, executes the O/S and starts the system. When the startup is completed, the CPU copies various programs stored in the memory into the RAM, executes the programs copied into the RAM, and performs various operations. In detail, the GPU may generate a screen on which an electronic document including various objects such as contents, icons, and menus is displayed. The GPU calculates attribute values, such as coordinate values, shapes, sizes, and colors, of each object to be displayed according to the layout of the screen. The GPU may also create screens including various layouts of the objects based on the calculated attribute values. The screen created by the GPU may be provided to the display unit 251 so that it is displayed on each region of the display unit 251.
The controller 210 controls a video processor (not shown) and an audio processor (not shown) to process video data and audio data contained in content received via the communication unit 240 or content stored in the memory, respectively.
The user input unit 220 may receive various commands from a user. The user input unit 220 may include at least one selected from a keypad 221, a touch panel 223, and a pen recognition panel 225.
The keypad 221 may include various types of keys, such as mechanical buttons, a wheel, and the like, provided on various regions of the main body of the user terminal 200, such as a front surface, side surfaces, and a rear surface.
The touch panel 223 may detect a touch input of a user and output a touch event value corresponding to the detected touch signal. When the touch panel 223 is combined with a display panel (not shown) to form a touch screen (not shown), the touch screen may be implemented using various types of touch sensors, such as capacitive, resistive, and piezoelectric touch sensors. A capacitive touch sensor uses a dielectric material coated on the surface of the touch screen. When a portion of a user's body touches the surface of the touch screen, the capacitive touch sensor detects the minute electric current induced by the portion of the user's body and calculates touch coordinates. A resistive touch sensor includes two electrode plates embedded in the touch screen. When a user touches a specific point on the screen, the upper and lower electrode plates make contact at the touch point. The resistive touch sensor detects the current caused by the contact of the two electrode plates and calculates touch coordinates. Touch events on a touch screen are mostly generated by human fingers, but may also occur via an object formed of a conductive material that can cause a change in capacitance.
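The coordinate calculation for a resistive sensor described above can be sketched as follows. This is a minimal illustration, not the patented implementation; the reference voltage, panel resolution, and function name are assumptions for the sketch:

```python
def resistive_touch_coords(v_x, v_y, v_ref=3.3, width=800, height=480):
    """Map voltages read from a 4-wire resistive panel to screen coordinates.

    When the two electrode plates make contact, each axis behaves as a
    voltage divider, so the voltage sampled at the contact point is
    proportional to its position along that axis.
    """
    x = int(v_x / v_ref * (width - 1))
    y = int(v_y / v_ref * (height - 1))
    # Clamp to the visible area to tolerate ADC noise at the panel edges.
    return max(0, min(width - 1, x)), max(0, min(height - 1, y))
```

For instance, a touch reading half the reference voltage on both axes maps to roughly the center of an 800x480 screen.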
The pen recognition panel 225 senses a proximity input or a touch input of a stylus (e.g., a stylus pen or a digitizer pen) according to manipulation of the stylus, and outputs a pen proximity event or a pen touch event corresponding to the sensed input. The pen recognition panel 225 may be implemented using Electro Magnetic Resonance (EMR) technology, and senses a touch input or a proximity input according to a variation in the intensity of an electromagnetic field caused by the proximity or touch of a pen. Specifically, the pen recognition panel 225 may include an electromagnetic induction coil sensor (not shown) having a mesh structure and an electric signal processor (not shown) that sequentially supplies an Alternating Current (AC) signal having a predetermined frequency to each loop coil of the electromagnetic induction coil sensor. When a pen having a resonant circuit therein approaches a loop coil of the pen recognition panel 225, the magnetic field transmitted from the loop coil generates a current in the resonant circuit of the pen based on mutual electromagnetic induction. An induction field is in turn created from the coil of the resonant circuit in the pen based on this current. The pen recognition panel 225 then detects the induction field from the loop coil in a signal receiving state, and thereby senses the position of the point that the pen approaches or touches. The pen recognition panel 225 may be disposed under the display panel and have an area sufficient to cover the display area of the display panel.
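One way to realize the coil scan described above is to drive each loop coil in turn while in the signal-receiving state and record the strength of the signal induced back by the pen's resonant circuit; the pen is nearest the coil with the strongest response. The grid representation and function name below are assumptions for illustration, not the patent's implementation:

```python
def emr_pen_position(signal_grid):
    """Return the (row, col) of the loop coil with the strongest induced
    signal, i.e. the coil closest to the resonant circuit in the pen.

    signal_grid[row][col] is the amplitude measured at each coil of the
    mesh-structured sensor while the panel receives signals.
    """
    best, best_val = (0, 0), float("-inf")
    for r, row in enumerate(signal_grid):
        for c, val in enumerate(row):
            if val > best_val:
                best_val, best = val, (r, c)
    return best
```

A finer position could then be interpolated between neighboring coils; the sketch stops at the strongest coil for clarity.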
The audio/video input unit 230 may include a microphone 231 and a photographing unit 233. The microphone 231 receives and converts the user's voice or other sounds into audio data. The controller 210 may use the user's voice input via the microphone 231 for a video conference, or may store audio data in a memory. The photographing unit 233 may photograph a still or moving image according to the user's control. The photographing unit 233 may be implemented using a plurality of cameras, for example, a front camera and a rear camera.
When the audio/video input unit 230 includes the microphone 231 and the photographing unit 233, the controller 210 may generate a video call image using the voice of the user input via the microphone 231 and the video of the user recognized by the photographing unit 233.
The user terminal 200 may operate in a motion control mode or a voice control mode. When the user terminal 200 operates in the motion control mode, the controller 210 may activate the photographing unit 233 to photograph the user, track a change in the motion of the user, and perform a control operation corresponding to the change. When the user terminal 200 operates in the voice control mode, the controller 210 analyzes the user's voice input via the microphone 231 and performs a control operation according to the analyzed user's voice.
The communication unit 240 may perform communication with different types of external devices according to various types of communication methods. The communication unit 240 may include at least one selected from a wireless fidelity (Wi-Fi) chip 241, a bluetooth chip 243, a Near Field Communication (NFC) chip 245, and a wireless communication chip 247. The controller 210 may communicate with various external devices via the communication unit 240.
The Wi-Fi chip 241 and the Bluetooth chip 243 may perform communication using Wi-Fi and Bluetooth technologies, respectively. The communication unit 240 using the Wi-Fi chip 241 or the Bluetooth chip 243 may transmit or receive various types of information after transmitting or receiving connection information, such as a Service Set Identifier (SSID) or a session key, and establishing a communication connection using the connection information. The NFC chip 245 refers to a chip that performs communication using NFC technology operating in the 13.56 MHz frequency band among various Radio Frequency Identification (RFID) frequency bands including 135 kHz, 13.56 MHz, 433 MHz, 860 to 960 MHz, and 2.45 GHz. The wireless communication chip 247 refers to a chip that performs communication according to various communication standards, such as the Institute of Electrical and Electronics Engineers (IEEE) standards, Zigbee, third generation (3G), Third Generation Partnership Project (3GPP), and Long Term Evolution (LTE) standards.
The output unit 250 may include a display unit 251 and a speaker 253.
The display unit 251 may include a display panel (not shown) and a controller (not shown) for controlling the display panel. Examples of the display panel may include a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED) display, an active matrix OLED (AM-OLED), a Plasma Display Panel (PDP), and other various displays. The display panel may be formed as a flexible, transparent or wearable display. The display unit 251 may be combined with the touch panel 223 of the user input unit 220 to form a touch screen (not shown). For example, the touch screen may include an integrated module in which the display panel is combined with the touch panel 223 to form a layered structure.
The speaker 253 may output audio data generated by an audio processor (not shown).
The above-described components of the user terminal 200 may be given other names. In addition, the user terminal 200 according to the present embodiment may include at least one of the above-described components; it may omit some of these components or include additional components. The user terminal 200 may perform the following operations by using at least one of the above-described components.
The user terminal 200 for receiving the collaboration service from the server 100 for providing the collaboration service receives voice and video of a user via the audio/video input unit 230 and receives edit information on a document being collaboratively edited via the user input unit 220. In an exemplary embodiment, a document is collaboratively edited using one or more of a word processing program, spreadsheet program, slide program, presentation program, animation program, graphics program, note taking program, notepad program, and other similar programs. The editing information may include information for editing the document, and may include, for example, at least one selected from: text information, image information, table information, copy information, paste information, word space information, line space information, word size information, color information, and other various information relating to editing of a document.
The controller 210 of the user terminal 200 may acquire a video call image, obtained by performing signal processing on the voice and video of the user, and edit information, and transmit the video call image and the edit information to the server 100. The communication unit 240 receives, from the server 100, a video call image associated with each user who collaboratively edits a document, an image showing a summary of a conference (hereinafter referred to as a "conference summary") generated based on speech included in the video call image associated with each user, and the document being collaboratively edited, which is synchronized with the conference summary. The output unit 250 outputs the received video call image associated with each user, the conference summary, and the document being collaboratively edited. The conference summary may be information output as an image including at least one selected from text regarding the conference, a document including the text regarding the conference, and a chart. In addition, the conference summary may include text generated based on the user's speech, time information indicating when each text was generated, and page information indicating the page of the collaboratively edited document provided to the user who uttered the text at the time it was generated. In other words, the details of the conference summary may include at least one selected from text, time information, page information, and image information.
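The details of the conference summary listed above (text, speaker, time information, and page information) can be modeled as a simple record. The field names below are illustrative assumptions, not the patent's data layout:

```python
from dataclasses import dataclass


@dataclass
class SummaryEntry:
    text: str         # text generated from the user's speech
    user: str         # account of the user who uttered the text
    timestamp: float  # time at which the text was generated
    page: int         # page of the document shown to the speaker


entry = SummaryEntry("patent policy is important", "user_a", 1520.0, 3)
```

Each such entry carries enough information to later locate the state of the collaboratively edited document when the text was spoken.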
In one exemplary embodiment, the editing of a document may be indicated by a corresponding indicator, which may be one or more of a mark, a highlight, an object, an icon, or an image from a video call image, such as a thumbnail image of a user.
Further, since the conference summary and the document are synchronized with each other, user input to at least one of the conference summary and the document may be accompanied by a change in the other. For example, the communication unit 240 may transmit information about a text selected by the user from the conference summary to the server 100, and, in response, receive information about an edited portion of the document synchronized with the selected text. The output unit 250 may then output the edited portion of the document synchronized with the selected text.
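The round trip just described — the terminal sends a selected summary text and receives the synchronized edited portion — amounts, on the server side, to a lookup keyed by timestamps. A minimal sketch, assuming edits and summary entries carry comparable timestamps and a hypothetical 5-second window:

```python
def edits_for_summary_text(selected, edits, window=5.0):
    """Return the document edits synchronized with a selected summary
    entry, i.e. edits whose receive time falls within `window` seconds
    of the time at which the selected text was generated."""
    return [e for e in edits
            if abs(e["timestamp"] - selected["timestamp"]) <= window]


edits = [{"timestamp": 100.0, "portion": "inserted 'patent policy'"},
         {"timestamp": 400.0, "portion": "adjusted word size"}]
selected = {"text": "very important", "timestamp": 102.5}
```

Here `edits_for_summary_text(selected, edits)` keeps only the first edit, which the terminal would then display as the synchronized portion of the document.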
The conference summary may be a text record of items generated based on the user's speech in the video. The conference summary may also be a text record of items generated based on user edits to the editable document.
Fig. 3 is a block diagram of a configuration of a server 100 for providing a collaboration service according to an exemplary embodiment. Those of ordinary skill in the art will appreciate that the server 100 may include other common components in addition to those shown in fig. 3. Each of the components shown in fig. 3 may be integrated, added, or omitted depending on the actual implementation of the server 100. In other words, two or more components may be combined into a single component, or a single component may be divided into two or more components, if necessary.
The communication unit 110 may perform communication with an external device including a user terminal (200 in fig. 2). For example, the server 100 for providing a collaboration service allowing collaborative editing of a document may receive various requests related to the collaboration service from the user terminal 200, including a request for initiating the collaboration service, setting information necessary for creating a collaborative work environment, and editing information on the document being collaboratively edited. In addition, the server 100 may provide all matters related to the provision of the collaboration service in response to various requests related to the collaboration service.
The controller 130 may perform overall control of the server 100. The controller 130 acquires information and requests received via the communication unit 110 and stores the received information and requests in the memory 150. The memory 150 may include a storage device or a database. The controller 130 may also process received information and requests. For example, the controller 130 may generate an image to be used in the collaboration service or perform a process for managing the received information based on the information received from the fourth user terminal (600 in fig. 48). In addition, in response to the acquired request, the controller 130 may transmit information necessary for providing the collaboration service to the user terminal 200.
The controller 130 may perform overall management of the video call image associated with each user collaboratively editing a document, the conference summary generated based on speech included in the video call images, and the document being collaboratively edited according to the received editing information. The video call images, conference summary, and document are used in the collaboration service. For example, the controller 130 may perform management operations such as generation, storage, processing, and deletion of the video call image associated with each user, the conference summary, and the document being collaboratively edited. The controller 130 of the server 100 will now be described in more detail with reference to figs. 3 and 4.
Fig. 4 illustrates a controller 130 of the server 100 for providing a collaboration service according to an exemplary embodiment. Referring to fig. 4, the controller 130 of the server 100 may include an integrated management processor 131, a user management processor 133, a document management processor 135, and an image management processor 137. Those of ordinary skill in the art will appreciate that the controller 130 may include other common components in addition to those shown in fig. 4. The controller 130 may include multiple processors as shown in fig. 4, or, unlike in fig. 4, some or all of the processors may be integrated into a single controller.
The integrated management processor 131 performs overall control for providing the collaboration service. The integrated management processor 131 may distribute received information and requests related to the collaboration service to the user management processor 133, the document management processor 135, and the image management processor 137, and control the processing of the information and requests. In addition, in response to a request related to the collaboration service, the integrated management processor 131 may transmit information on the collaboration service using at least one selected from the user management processor 133, the document management processor 135, and the image management processor 137. The collaboration service related information used by the integrated management processor 131 in integrated management for providing the collaboration service, the collaboration service related information generated, modified, and deleted according to the integrated management, and the collaboration service support software may be stored in the integrated management Database (DB) 151.
In order to achieve synchronization between the video call image associated with each user collaboratively editing a document, the conference summary generated based on speech included in the video call image associated with each user, and the collaboratively edited document, the integrated management processor 131 may add log data to the video call image received from the user terminal 200 via the communication unit 110, to the editing information of the document, and/or to a result of editing the document according to the editing information. For example, if the editing information is text information for inserting text, the result of the editing of the document may be the document having the text therein, or the text inserted into the document. If the editing information is copy information for copying and pasting text, the result of the editing of the document may be the document into which the text is copied, or the copied text that is pasted into the document. If the editing information is information for adjusting the word size, the result of the editing of the document may be the document or text in which the word size has been adjusted. As described above, the video call images, conference summary, and document are used in the collaboration service. In this case, the log data may be data related to the time when the video call image or the editing information is received by the server 100. In other words, a portion of the video call image and the editing information are synchronized with each other based on the time when the video call image or the editing information is received by the server 100, or within a predetermined range of that time. Thus, the video call image may be synchronized with the document being collaboratively edited. In addition, the conference summary can be synchronized with the collaboratively edited document using the log data added to the video call image.
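The log-data mechanism above can be sketched as tagging every incoming item with its server receive time, then treating two items as synchronized when their tags fall within the synchronization interval. The dictionary representation, field names, and one-second interval below are assumptions for illustration:

```python
import time


def tag_with_log_data(event, clock=time.time):
    """Attach receive-time log data to an incoming event (a video call
    image frame or a piece of editing information)."""
    tagged = dict(event)
    tagged["log_time"] = clock()
    return tagged


def is_synchronized(a, b, interval=1.0):
    """Two tagged events are synchronized when their receive times fall
    within the configured synchronization interval."""
    return abs(a["log_time"] - b["log_time"]) <= interval
```

The injectable `clock` parameter makes the tagging deterministic under test; in the server it would default to the real receive time.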
The attribute and synchronization interval of log data added for synchronization between various types of images and documents being collaboratively edited may be changed.
In an exemplary embodiment, instead of a video call image, there may be a still image of the user editing the document and the user's audio may be synchronized with the document being collaboratively edited. In another exemplary embodiment, the still image of the user corresponds to a portion or all of the audio of the user.
The log data may be a text record of an item generated based on the user's speech in the video. The log data may be a text record of items generated based on user edits to the editable document.
The user management processor 133 may manage information on a plurality of users using the collaboration service. In other words, the user management processor 133 may manage personal information on each user and information on the group members in each group. User information used in user management by the user management processor 133 and user information generated, modified, and deleted according to the user management may be stored in the user information DB 153.
The document management processor 135 performs overall control of the document being collaboratively edited according to the editing information received from the user terminal 200. When a program for processing a document is executed on the server 100 for providing the collaboration service, the document management processor 135 may perform overall control of the document being collaboratively edited according to the editing information received from the user terminal 200 and requests related to the processing of the document. For example, the document management processor 135 may perform management operations such as creation, editing, storage, and deletion of documents. Document information used for document management by the document management processor 135 and documents generated, modified, and deleted according to the document management may be stored in the document DB 155.
The image management processor 137 performs overall control of the video call image associated with each user collaboratively editing the document and the conference summary generated based on the voice included in the video call image associated with each user. For example, the image management processor 137 may perform management operations such as creation, storage, processing, and deletion of video call images and conference summaries associated with each user. Image information used for image management by the image management processor 137 and image information generated, modified, and deleted according to the image management may be stored in the image DB 157. The image management processor 137 will now be described in more detail with reference to fig. 5.
Fig. 5 illustrates an image management processor 137 that may be included in the controller 130 of the server 100 for providing a collaboration service according to an exemplary embodiment.
Referring to fig. 5, the image management processor 137 may include an audio/video processor 138 and an audio converter 139. Those of ordinary skill in the art will appreciate that the image management processor 137 may include other common components in addition to those shown in fig. 5.
The audio/video processor 138 may perform signal processing on the input image signal. In this case, the signal processing may refer to creation or processing of an image. The processing of the image may refer to editing thereof. The resulting image signal including the audio signal and the video signal may be transmitted to the user terminal 200 via the communication unit 110 or stored in the image DB 157.
The audio converter 139 may convert voice included in a video call image associated with each user receiving the collaboration service into information in a text form. The image management processor 137 may receive information in text form from the audio converter 139 in order to create a conference summary.
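Once the audio converter 139 has produced text for each utterance, assembling the conference summary is essentially a matter of ordering the texts by generation time. A sketch under the assumption that utterances arrive as (user, time, text) tuples already converted from speech; the line format is invented for illustration:

```python
def build_conference_summary(utterances):
    """Order recognized utterances by generation time and format each as
    a summary line tagged with its speaker and timestamp."""
    return ["[{:.0f}s] {}: {}".format(t, user, text)
            for user, t, text in sorted(utterances, key=lambda u: u[1])]
```

The image management processor 137 could then render these lines as the image output to each user terminal.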
Referring back to fig. 3, the server 100 may include a memory 150. The memory 150 may include at least one selected from the integrated management DB 151, the user information DB 153, the document DB 155, the security document DB 158, and the image DB 157, as described in more detail below with reference to figs. 6 to 9.
Fig. 6 illustrates an integrated management DB 151 that may be stored in the memory 150 of the server 100 for providing a collaboration service according to an exemplary embodiment.
The integrated management DB 151 may store various software and information necessary for the server 100 to provide the collaboration service. Referring to fig. 6, the integrated management DB 151 may store collaboration service support software and collaboration service related information.
The collaboration service support software may include an OS and applications executed on the server 100 and various types of data for supporting collaboration services. The collaboration service-related information may include information on access to various databases, synchronization information such as attributes and synchronization intervals of log data added for synchronization between various types of images and documents edited in collaboration, and collaboration history information generated when collaboration is performed using a collaboration service provided by the server 100.
Fig. 7 illustrates a user information DB 153 that may be stored in the memory 150 of the server 100 for providing a collaboration service according to an exemplary embodiment.
Referring to fig. 7, the user information DB 153 may store personal information on each user of the collaboration service and information on the group members in each group. The user information DB 153 may store, for each user, a unique identifier, such as an account for obtaining access to the server 100, personal information stored in the server 100, and information on the use of the collaboration service provided by the server 100. The personal information stored in the server 100 may be various types of data and applications uploaded to the server 100. The information on the use of the collaboration service provided by the server 100 may be information representing a period during which the user is allowed to use the collaboration service, or a right to use the collaboration service.
Fig. 8 illustrates a document DB 155 that may be stored in the memory 150 of the server 100 providing a collaboration service according to an exemplary embodiment.
Referring to fig. 8, the document DB 155 may store documents created or edited by the server 100. As shown in fig. 8, the document DB 155 may store attribute information of the documents, which may include information about whether or not each document is shared, and version information. When a document is edited using the collaboration service, the document DB 155 may store editing information about each user, together with the account of the user who edited the document and log data for synchronization.
Further, when collaborative writing requiring security is performed, the server 100 may have a separate security-enhanced area in which the document and the editing information are separately stored. The security document DB 158 may store documents created or edited by the server 100, like the document DB 155, but the security document DB 158 may be a DB in which security is further enhanced. For example, a document stored in the security document DB 158 may be double or triple encrypted for storage, or authentication of a user or a user terminal may be required to open the document or to use the collaboration service that allows editing of the document. Authentication of a user or a user terminal may be performed based on, for example, a password, an authentication key, a Personal Identification Number (PIN), biometric information, a public key certificate, a Media Access Control (MAC) address, and/or approval via a telephone.
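As one hypothetical realization of the password- or PIN-based authentication mentioned above, the server could store a salted key-derivation digest rather than the credential itself, and compare presented credentials in constant time. This sketch is illustrative only and is not the patent's security scheme; the iteration count is an assumption:

```python
import hashlib
import hmac
import os


def make_credential(secret, salt=None, rounds=100_000):
    """Derive a salted digest from a password or PIN for storage."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, rounds)
    return salt, digest


def verify_credential(secret, salt, digest, rounds=100_000):
    """Check a presented secret against the stored salt and digest."""
    candidate = hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, rounds)
    # Constant-time comparison avoids leaking match length via timing.
    return hmac.compare_digest(candidate, digest)
```

Storing only the salt and digest means a compromise of the security document DB 158's credential table would not directly expose user passwords or PINs.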
In addition, a document stored in the security document DB 158 may be provided only when the user pays a separate fee for its use or the user is located at a specific place.
When collaboratively edited documents are stored in the security document DB 158 in this manner, management operations such as creation, editing, storage, and deletion may be performed via the security document DB 158. For example, if editing information is received from the user terminal 200, the server 100 may perform, via the security document DB 158, activities that may occur during editing of a document, such as temporary storage of the previous document and the editing information, storage of the result of the editing, and/or storage of the editing command.
A method in which a user determines a document to be stored in the security document DB 158 will be described in detail below, through an exemplary embodiment of creating a group using the collaboration service and storing a collaboratively edited document.
Fig. 9 illustrates an image DB 157 that may be stored in the memory 150 of the server 100 for providing a collaboration service according to an exemplary embodiment.
Referring to fig. 9, the image DB 157 may store the video call image associated with each user collaboratively editing a document and the conference summary generated based on speech included in the video call image associated with each user. As shown in fig. 9, the image DB 157 may store, for each group collaboratively editing a document, the video call image associated with each user together with the account and log data of each user. The image DB 157 may also store the conference summary for each group, together with the account of the user who uttered each text and log data for synchronization.
Referring back to fig. 3, the above-described components of the server 100 may be given different names than those described above. In addition, the server 100 for providing a collaboration service according to the present embodiment may include at least one of the above-described components. The server 100 may not include some of these components or may include additional components. The server 100 may perform the following operations by using at least one of the above components.
The server 100, which provides a collaboration service allowing collaborative editing of a document, may receive, from a user terminal requesting the collaboration service, a video call image associated with each user collaboratively editing the document and editing information on the document, and may store the received video call image associated with each user, a conference summary generated based on speech included in the video call image associated with each user, and the document collaboratively edited according to the received editing information.
Further, the server 100 may synchronize the document with the conference summary so that a user may identify context information while collaboratively editing the document. In other words, the server 100 may synchronize the document with the conference summary such that the user may identify the status of the document being collaboratively edited through the text contained in the conference summary.
Synchronization refers to matching the time of execution of an event. Synchronization may also include matching the time of occurrence of events that have already been executed. In addition, synchronization may refer to the simultaneous occurrence of events, or the adjustment of the time intervals at which events occur such that they are performed within a predetermined range. For example, text spoken by a user collaborating with other users to edit a document (e.g., text included in a conference summary) may be synchronized with the document being collaboratively edited at the time the text was generated. In this case, synchronizing with the document may include synchronizing the text spoken by the user with editing information about the document edited at the time the text was generated, or with an editing result of the document according to the editing information. The result of the editing may be the edited document, or a portion of the document edited in accordance with the editing information. If the result of the editing is a portion of the document being edited, the editing information may be the same as the result of the editing. For example, if the editing information is input text that the user wants to add to the document, the result of the editing may be the edited document in which the input text is inserted.
Additionally, synchronizing may include correlating multiple tasks that occur simultaneously or nearly simultaneously (e.g., in less than 5 seconds) with one another. In this case, correlating the plurality of tasks may mean that the plurality of tasks are aggregated into a group and managed together, or that times at which the plurality of tasks are executed (e.g., stored, output, or displayed) are matched to each other.
In an exemplary embodiment, for example, if a user edits a document while discussing conference details via speech, the synchronization may include connecting the editing information or results of the editing with text spoken by the user when the editing information is received by the server 100 or when the results of the editing are generated in the server 100. In this case, information on synchronization and connection between the editing information or the result of the editing and the text spoken by the user may be stored in the memory 150.
Additionally, synchronization may include correlating multiple tasks that occur together in a short period of time (e.g., in less than 10 minutes) to one another. Alternatively, synchronizing may include correlating some of the plurality of tasks that occurred in a particular period of time (e.g., within less than 5 minutes) before or after one of the plurality of tasks occurred with one another.
In this case, correlating the plurality of tasks may mean that the plurality of tasks occurring together in a short period of time are aggregated into a group and managed together, or that times at which the plurality of tasks are executed (e.g., stored, output, or displayed) are matched to each other.
For example, when the text "patent policy" is entered or copied into a document being collaboratively edited between 3:10 pm and 3:15 pm, the user may utter the speech "very important" during the same time interval. In this case, the speech may be converted into text and then stored in the conference summary.
In this case, the text "patent policy" entered or copied between 3:10 pm and 3:15 pm may be interlinked with the text "very important" spoken by the user. Interlinked texts may mean, for example, that if one of the texts is displayed later, the other text is displayed together with it. Interconnecting the result of the editing with the text spoken by the user in this manner may be referred to as the two being synchronized with each other.
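The interlinking in the "patent policy" / "very important" example can be sketched as pairing every edit and every spoken text whose times fall within the same window. Times in seconds and the five-minute window echo the intervals mentioned earlier; both, along with the field names, are assumptions:

```python
def interlink(edits, speech, window=300.0):
    """Pair document edits with spoken texts that occurred within the
    same time window, so that displaying one later brings up the other."""
    return [(e["text"], s["text"])
            for e in edits for s in speech
            if abs(e["time"] - s["time"]) <= window]


edits = [{"text": "patent policy", "time": 0.0}]
speech = [{"text": "very important", "time": 120.0}]
```

Here the edit and the utterance fall two minutes apart, so `interlink(edits, speech)` yields the single pair ("patent policy", "very important"); speech outside the window produces no link.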
The server 100 may also generate an edited image based on the conference summary by extracting, from the video call image associated with each user, the portion corresponding to each text in the conference summary, so that users may identify contextual information while collaboratively editing the document. Because the edited image based on the conference summary is synchronized with the conference summary, and the conference summary is synchronized with the document, the edited image, the conference summary, and the document are synchronized with one another.
In addition, the server 100 may synchronize conference schedules and documents to the video call images associated with each user for storage.
Further, during review after collaborative editing of a document, a user may utilize various types of images synchronized with the document to identify contextual information that exists at the time of the collaborative editing.
Fig. 10 is a diagram illustrating a server 100 for providing a collaboration service implemented as a plurality of distributed servers according to another exemplary embodiment. Although omitted, the above description about the server 100 is applicable to the server 100 shown in fig. 10.
Referring to fig. 10, the distributed servers for providing the collaboration service may include an integrated management server 101, a user management server 103, a document management server 105, and an image management server 107. Unlike in the server 100 of fig. 3, the various processors integrated into the controller 130 and the various databases stored in the memory 150 are implemented as a plurality of distributed servers.
Each of the integrated management server 101, the user management server 103, the document management server 105, and the image management server 107 may communicate with the other servers and exchange various types of data with them. For example, the integrated management server 101 may communicate with the user terminal 200 to forward received information and requests related to the collaboration service to at least one server selected from the user management server 103, the document management server 105, and the image management server 107. The integrated management server 101 may also acquire a response to the forwarded request from the other server and provide the collaboration service to the user terminal 200. When the server 100 is implemented as a plurality of distributed servers in this manner, maintenance and management of the server 100 are facilitated.
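The request forwarding performed by the integrated management server 101 can be sketched as follows (an illustrative Python sketch; the handler interface and the request-type names are assumptions, not part of the embodiments):

```python
class IntegratedManagementServer:
    """Forwards user-terminal requests to the distributed server
    responsible for each request type, per fig. 10."""

    def __init__(self, user_server, document_server, image_server):
        # Hypothetical request-type-to-server routing table.
        self.routes = {
            "auth": user_server,       # user management server 103
            "edit": document_server,   # document management server 105
            "image": image_server,     # image management server 107
        }

    def handle(self, request_type, payload):
        """Forward the request and relay the sub-server's response
        back toward the user terminal 200."""
        server = self.routes.get(request_type)
        if server is None:
            return {"status": "error", "reason": "unknown request type"}
        return server.handle(request_type, payload)

class StubServer:
    """Stand-in for one of the distributed servers."""
    def __init__(self, name):
        self.name = name
    def handle(self, request_type, payload):
        return {"status": "ok", "handled_by": self.name}

ims = IntegratedManagementServer(
    StubServer("user_management"),
    StubServer("document_management"),
    StubServer("image_management"),
)
```

For example, an editing request would be routed to the document management stub, while an unrecognized request type produces an error response.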
Fig. 11 is a block diagram of the integrated management server 101, one of the distributed servers for providing the collaboration service, according to another exemplary embodiment.
Referring to fig. 11, the integrated management server 101 may include a communication unit 111, an integrated management processor 131, and an integrated management DB 151. The description of the integrated management processor 131 shown in fig. 4 and the integrated management DB 151 shown in fig. 6 is applicable to the integrated management processor 131 and the integrated management DB 151. Those of ordinary skill in the art will appreciate that the integrated management server 101 may include other common components in addition to those shown in fig. 11.
The integrated management server 101 may request the user management server 103 to confirm whether a user connected to the server 100 for providing the collaboration service is authorized to use the collaboration service. The integrated management server 101 may also request the document management server 105 to edit a document according to the editing information received from the user terminal 200, or acquire a document stored in the document management server 105. The integrated management server 101 may also store a video call image or a conference summary used in the collaboration service in the image management server 107, or acquire an image stored in the image management server 107. The integrated management server 101 may use the user information and the log data to acquire images and/or documents that are synchronized with each other. For example, the integrated management server 101 may acquire all of the texts in a conference summary and the editing information of a document being collaboratively edited that have the same user information and log data, and provide the user terminal 200 with a collaboration service using the conference summary and the document synchronized with each other.
Fig. 12 is a block diagram of the user management server 103, one of the distributed servers for providing the collaboration service, according to another exemplary embodiment.
Referring to fig. 12, the user management server 103 may include a communication unit 113, a user management processor 133, and a user information DB 153. The description of the user management processor 133 shown in fig. 4 and the user information DB 153 shown in fig. 7 is applicable to the user management processor 133 and the user information DB 153. Those of ordinary skill in the art will appreciate that the user management server 103 may include other common components in addition to those shown in fig. 12.
The user management server 103 may manage information on a plurality of users using the collaboration service. The user management processor 133 may manage personal information on each user and information on group members in each group. User information used in user management and user information generated, modified, and deleted according to user management may be stored in the user information DB 153.
Fig. 13 is a block diagram of the document management server 105, one of the distributed servers for providing the collaboration service, according to another exemplary embodiment.
Referring to fig. 13, the document management server 105 may include a communication unit 115, a document management processor 135, and a document DB 155. To support collaborative editing that requires security, the document management server 105 may further include a security document DB 158. The description of the document management processor 135 shown in fig. 4 and the document DB 155 shown in fig. 8 is applicable to the document management processor 135 and the document DB 155. Those of ordinary skill in the art will appreciate that the document management server 105 may include other common components in addition to those shown in fig. 13.
The document management server 105 can perform overall control on the document being collaboratively edited according to the editing information of the document. Document information for document management and documents generated, modified, and deleted according to document management may be stored in the document DB 155. The document management server 105 can also communicate with other servers via the communication unit 115. For example, the document management server 105 may receive a request for editing of a document stored therein or receive the document via the communication unit 115.
Fig. 14 is a block diagram of the image management server 107, one of the distributed servers for providing the collaboration service, according to another exemplary embodiment.
Referring to fig. 14, the image management server 107 may include a communication unit 117, an image management processor 137, and an image DB 157. The description of the image management processor 137 shown in fig. 4 and the image DB 157 shown in fig. 9 is applicable to the image management processor 137 and the image DB 157. Those of ordinary skill in the art will appreciate that the image management server 107 may include other common components in addition to those shown in fig. 14.
The image management server 107 may perform overall control of the video call image associated with each user of the collaborative editing and of the conference summary generated based on the speech included in the video call image associated with each user. The image management server 107 may also communicate with other servers via the communication unit 117. For example, the image management server 107 may receive a request for an image stored therein, or receive an image to be stored therein, via the communication unit 117.
Fig. 15 illustrates a process of initiating a collaboration service according to a request for execution of the collaboration service while processing a document using the server 100 for providing the collaboration service according to an exemplary embodiment. In detail, fig. 15 illustrates a process of preparing to receive a collaboration service. When a first user requests execution of a collaboration service while processing a document with the server 100, the collaboration service is provided so as to collaboratively edit the document through a video conference with group members. However, the exemplary embodiments are not limited to this process of preparing to receive collaboration services.
A program for processing a document may be installed on the server 100. The first user may log in to the server 100 with his or her user account via the first user terminal 300 and request execution of a program for processing a document on the server 100. The second and third users may log into the server 100 using their user accounts and receive the collaboration service provided by the server 100.
Referring to fig. 15, the first user terminal 300 may request the creation of a document from the server 100 (operation S1505). Since a program for processing a document is installed on the server 100, a first user may log into the server 100 and request to create a document having a desired format.
The server 100 creates a document according to a request from the first user terminal 300 (operation S1510).
The first user terminal 300 displays a document created by the server 100 on its screen, and the first user can process the document through the screen (operation S1515). In this case, the document executed on the server 100 may be displayed on the web browser screen executed on the first user terminal 300. In other words, the server 100 provides the first user terminal 300 with the web-based document, and the first user terminal 300 can view the web-based document through the web browser screen.
The first user terminal 300 may transmit information regarding the processing of the document to the server 100 (operation S1520). The time interval at which information regarding the processing of the document is transmitted may be adjusted. The first user terminal 300 may transmit information about the processing of the document to the server 100 every time an event related to the processing of the document occurs.
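The adjustable transmission interval described above can be sketched as follows (an illustrative Python sketch; the class name, callback, and event strings are assumptions, not part of the embodiments):

```python
import time

class EditReporter:
    """Transmits document-processing information to the server either on
    every event (interval 0) or batched at an adjustable interval."""

    def __init__(self, send, interval=0.0):
        self.send = send          # callback standing in for transmission to the server 100
        self.interval = interval  # 0.0 means "transmit on every event"
        self.pending = []
        self.last_sent = None

    def on_edit_event(self, event, now=None):
        """Queue an editing event; flush the queue immediately or once
        the adjustable time interval has elapsed."""
        now = time.monotonic() if now is None else now
        self.pending.append(event)
        if (self.interval == 0.0 or self.last_sent is None
                or now - self.last_sent >= self.interval):
            self.send(list(self.pending))
            self.pending.clear()
            self.last_sent = now

sent = []
reporter = EditReporter(sent.append, interval=5.0)
reporter.on_edit_event("insert 'a'", now=0.0)  # first event: transmitted at once
reporter.on_edit_event("insert 'b'", now=2.0)  # within interval: queued
reporter.on_edit_event("delete 'a'", now=6.0)  # interval elapsed: queue flushed
```

Setting `interval=0.0` corresponds to transmitting information every time an event related to the processing of the document occurs.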
The server 100 may store information about the processing of the document received from the first user terminal 300 (operation S1525).
The first user may select to perform the handover to the cooperative mode in the first user terminal 300 (operation S1530).
The first user terminal 300 requests the server 100 to perform the collaboration service (operation S1535). A state of the first user terminal 300 requesting the execution of the collaboration service will now be described with reference to fig. 16.
Fig. 16 illustrates an example in which the first user terminal 300 for receiving the collaboration service requests the execution of the collaboration service according to an exemplary embodiment. In detail, fig. 16 shows a user interface screen 360 of the first user terminal 300. From top to bottom, the user interface screen 360 includes an address window 361 for inputting an address of the server 100, a menu bar 363, and a ribbon menu bar 365. The menu bar 363 and the ribbon menu bar 365 may have different shapes according to the type of program executed on the server 100.
First, the first user may select the "collaboration" menu in the menu bar 363, click (or touch) "switch to collaboration mode" in the displayed sub-menu, and thereby perform the switch to the collaboration mode.
Then, in accordance with the first user's request to switch to the collaboration mode, a mode indicator 367 indicating that the current mode is the collaboration mode and an access user display window 369 showing the users who have access to the current document appear on the upper right side of the user interface screen 360. A window 370 showing the document being collaboratively edited may be displayed under the ribbon menu bar 365.
Referring back to fig. 15, the first user may execute the group creation menu in the first user terminal 300 (operation S1540).
The first user terminal 300 may request the creation of a group from the server 100 (operation S1545).
Fig. 17 illustrates an example in which the first user terminal 300 for receiving the cooperation service requests the creation of a group according to an exemplary embodiment.
On the user interface screen 360 of the first user terminal 300, the first user may select the "collaboration" menu in the menu bar 363 between the address window 361 and the ribbon menu bar 365, and then select "create group" in the displayed sub-menu.
Selecting "create group" displays a create group window 362 on the user interface screen 360 of the first user terminal 300. To create a group, the first user may enter appropriate values for the group name, accessibility, password, and default permissions, and press the done button. As shown in fig. 17, the first user may enter "group 1", "private", a password (displayed as "x" characters so that other users cannot view it), and "read/write" for the group name, accessibility, password, and default permissions, respectively. When the first user sets a password, group members are allowed to use the collaboration service by joining the created group through input of the password. In addition, the create group window 362 may include a menu 362-1 for setting the security level of a document edited via the collaboration service. For example, if buttons for setting the security level to high, medium, and low, respectively, are provided via the menu 362-1, the user may determine the security level of the document on which the group members collaborate by selecting the corresponding button.
According to an exemplary embodiment, if the user sets the security level to high, a document edited via the collaboration service may be stored in the security document DB 158. If the user sets the security level to medium, a document edited via the collaboration service may be stored in the document DB 155, but further authentication may be required in addition to input of the corresponding password. If the user sets the security level to low, the document on which the created group works may be stored in the document DB 155 and may be accessed by any user without a password.
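The mapping from security level to storage location and access requirements can be sketched as follows (an illustrative Python sketch; the function and field names are assumptions, while the DB identifiers follow the document DB 155 and security document DB 158 of the text):

```python
def storage_policy(security_level):
    """Map a security level to where the document is stored and what a
    user must present to access it, per the high/medium/low levels."""
    if security_level == "high":
        # Stored in the security document DB 158.
        return {"db": "security_document_DB_158",
                "password_required": True,
                "extra_authentication": False}
    if security_level == "medium":
        # Stored in the document DB 155; further authentication is
        # required in addition to the password.
        return {"db": "document_DB_155",
                "password_required": True,
                "extra_authentication": True}
    # "low": stored in the document DB 155, accessible without a password.
    return {"db": "document_DB_155",
            "password_required": False,
            "extra_authentication": False}
```

The text does not specify extra authentication for the high level, so the sketch sets only the storage location apart in that case.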
According to another exemplary embodiment, group members participating in the collaboration service may be restricted if the user sets the security level to high. For example, participation in the group may be limited to only users who are located within a predetermined range from the first user or who are located at a pre-specified place. Alternatively, only users accessed via pre-specified IP segments may be allowed to participate in the group. As another example, only users accessed via pre-specified device identification information (e.g., MAC address, etc.) may be allowed to participate in the group.
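The participation restrictions for a high security level can be sketched as follows (an illustrative Python sketch; the text presents the IP-segment and device checks as alternative policies, and the allowed values below are made-up examples):

```python
import ipaddress

# Hypothetical pre-specified IP segment and device identification
# information (MAC addresses); the concrete values are illustrative.
ALLOWED_SEGMENT = ipaddress.ip_network("192.168.10.0/24")
ALLOWED_DEVICES = {"aa:bb:cc:dd:ee:01"}

def may_join_by_ip(ip):
    """Allow participation only from the pre-specified IP segment."""
    return ipaddress.ip_address(ip) in ALLOWED_SEGMENT

def may_join_by_device(mac):
    """Allow participation only from pre-specified devices."""
    return mac.lower() in ALLOWED_DEVICES
```

Either check (or a combination of them) could gate whether a user is allowed to participate in the high-security group.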
According to another exemplary embodiment, if the user sets the security level to high, execution of another application may be restricted while the document is being edited via the collaboration service. For example, another application may be restricted from executing on the user terminal to prevent the capture, copying, or transmission of a document being edited.
According to another exemplary embodiment, if a user sets a security level to high, the complexity of encryption of a document edited via a collaboration service may be increased when the document is stored. This increased encryption complexity may increase the time it takes to store the edited document, but may further enhance the security of the document.
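The level-dependent encryption complexity can be sketched as follows (an illustrative Python sketch using PBKDF2 key derivation; the iteration counts are assumptions, not taken from the patent):

```python
import hashlib
import os

# Higher security levels use more key-derivation iterations: storing the
# edited document takes longer, but the key is harder to brute-force.
ITERATIONS = {"low": 10_000, "medium": 100_000, "high": 1_000_000}

def derive_key(password: bytes, security_level: str):
    """Derive a document-encryption key whose derivation cost grows with
    the selected security level; returns (salt, key)."""
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password, salt,
                              ITERATIONS[security_level])
    return salt, key
```

The growing iteration count is one concrete way the trade-off described above (longer storage time, stronger protection) could be realized.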
According to another exemplary embodiment, if the user sets the security level to high, the server 100 may transmit the edited document to the user terminal via a separate secure channel. For example, the server 100 may convert the edited document into an image to provide the image to the user terminal in the form of stream data via a secure channel. If the document being edited is an image, the security of the document may be further enhanced because it is difficult for the user terminal to parse the structure, content, or text of the document.
Referring back to fig. 15, the server 100 may create a group according to a request from the first user terminal 300 (operation S1550). According to the above example, a private group with a group name of "group 1" may be registered with the server 100, and the first user (e.g., user a) may be a current group member in group 1.
The first user terminal 300 may execute the group member management menu (operation S1555).
The first user terminal 300 may request the server 100 to invite the group members (operation S1560). The management of group members will now be described in more detail with reference to fig. 18.
Fig. 18 illustrates an example in which the first user terminal 300 for receiving the cooperation service performs management of group members according to an exemplary embodiment.
On the user interface screen 360 of the first user terminal 300, the first user may select the "collaboration" menu in the menu bar 363 between the address window 361 and the ribbon menu bar 365, and then select "manage group members" in the displayed sub-menu.
Then, a manage group members window 364 may be displayed on the user interface screen 360 of the first user terminal 300. The first user may request the server 100 to invite group members by pressing the "add member" button.
Referring back to fig. 15, the server 100 may invite the second user terminal 400 and the third user terminal 500, and the second user terminal 400 and the third user terminal 500 may be connected to the server 100 (operation S1565).
In this case, if the first user sets a password during the creation of the group, the server 100 may receive the password from the second user terminal 400 and the third user terminal 500. In this case, the password may be text entered by the group members, or may be encrypted before being sent to the server 100.
Upon receiving the password, the server 100 may perform authentication of the group members by determining whether the received password is the same as the password created by the first user. When the authentication is confirmed, the server 100 may provide the second user terminal 400 and the third user terminal 500 with the right to use the cooperation service.
Further, while each group is generally assigned one password, each group member may be assigned a different password. For example, the second user terminal 400 and the third user terminal 500 may use different passwords to participate in the group. Alternatively, only one of the second user terminal 400 and the third user terminal 500 may be required to enter its password.
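The password check performed by the server 100 can be sketched as follows (an illustrative Python sketch; storing salted hashes and the per-member password layout are assumptions permitted by the text, which allows each group member a different password):

```python
import hashlib
import hmac
import secrets

class Group:
    """Holds per-member password credentials for one collaboration group."""

    def __init__(self):
        self._salts = {}
        self._hashes = {}

    def set_password(self, member, password):
        # Store a salted hash rather than the plain password.
        salt = secrets.token_bytes(16)
        self._salts[member] = salt
        self._hashes[member] = hashlib.sha256(salt + password.encode()).digest()

    def authenticate(self, member, password):
        """Grant the right to use the collaboration service only when the
        received password matches the one registered for this member."""
        if member not in self._hashes:
            return False
        digest = hashlib.sha256(self._salts[member] + password.encode()).digest()
        return hmac.compare_digest(digest, self._hashes[member])

group1 = Group()
group1.set_password("second_user", "group1-pass")
group1.set_password("third_user", "another-pass")
```

A constant-time comparison (`hmac.compare_digest`) is used so that the check does not leak timing information about the stored hash.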
The second and third user terminals 400 and 500 may request execution of the document being collaboratively edited and receive a web-based document from the server 100 (operation S1570).
The second and third user terminals 400 and 500 may display the documents being executed on the server 100 on the screens of the second and third user terminals 400 and 500, respectively (operation S1575). In this case, the document executed on the server 100 may be displayed on web browser screens executed on the second user terminal 400 and the third user terminal 500.
The first to third user terminals 300, 400 and 500 may exchange video call images with each other via the server 100 (operation S1580). The exchange of video call images may continue until a request to terminate the collaboration service is made.
The video call images of the other parties may be output to the first to third user terminals 300, 400 and 500, respectively (operation S1585). For example, video call images associated with the second and third users may be output to the first user terminal 300.
Fig. 19 illustrates states of the first to third user terminals 300, 400 and 500 when a collaboration service is initiated while a document is being processed so that the first to third user terminals 300, 400 and 500 are ready to collaboratively edit the document, according to an exemplary embodiment. Referring to fig. 19, the first to third user terminals 300, 400 and 500 may be connected to the server 100.
A window 370 showing a document being collaboratively edited and a window 380 showing video call images of second and third users (e.g., "white" and "treble") may be displayed on the first user terminal 300.
Similarly, a window 470 showing the document being collaboratively edited and a window 480 showing video call images of the first and third users (e.g., "ann" and "treble") may be displayed on the second user terminal 400.
Likewise, a window 570 showing the document being collaboratively edited and a window 580 showing video call images of the first and second users (e.g., "ann" and "white") may be displayed on the third user terminal 500. In an exemplary embodiment, the video call image may comprise a motion visual image. In another exemplary embodiment, the motion visual image may be processed using a video codec.
In one exemplary embodiment, the video call image may be a still image of the first and second users while audio provided by the first and second users is played.
Fig. 20 illustrates a method of displaying a user interface including an address window and a menu and a video call image on a first user terminal 300 for receiving a collaboration service according to an exemplary embodiment.
Referring to fig. 20, a window 380 showing a video call image may be displayed on the right side of the first user terminal 300 on which an address window, a menu bar, and a ribbon bar are displayed. As described above with reference to fig. 19, the video call images of the second and third users (e.g., "white" and "treble") may appear in a window 380 on the first user terminal. However, the first user may manipulate the video call images of some of the users collaboratively editing the document so that they are not displayed on the first user terminal 300.
Fig. 21 illustrates a method of displaying a video call image on a first user terminal 300 for receiving a collaboration service according to an exemplary embodiment.
Referring to fig. 21, a window 380 showing a video call image displayed on the first user terminal 300 may be resized. In other words, the window 380 showing the video call image displayed according to the default size preset in the first user terminal 300 may be resized. The window 380 may be resized using a multi-finger gesture, a click and drag method, and so on.
Fig. 22 illustrates another method of displaying a video call image on the first user terminal 300 for receiving the cooperation service according to an exemplary embodiment.
Referring to fig. 22, the position of a window 380 showing a video call image displayed on the first user terminal 300 may be adjusted. In other words, the position of the window 380 showing the video call image displayed according to the default position preset in the first user terminal 300 may be adjusted. The position of the window 380 can be adjusted using a drag-and-drop method or the like.
Fig. 23 illustrates a process of initiating a collaboration service according to a request for execution of the collaboration service during a video conference using the server 100 for providing the collaboration service, according to an exemplary embodiment. In detail, fig. 23 illustrates a process of preparing to receive a collaboration service. When a first user conducts a video conference and processes a document with the server 100, the collaboration service is provided to share the document and collaboratively edit it during the video conference. However, the exemplary embodiments are not limited to this process of preparing to receive a collaboration service.
Referring to fig. 23, the first user terminal 300 may request the server 100 to create a group for the video conference (operation S2305).
The server 100 may create a group according to a request from the first user terminal 300 (operation S2310).
The first user terminal 300 may perform a group member management menu (operation S2315).
The first user terminal 300 may request the server 100 to invite the group members for the video conference (operation S2320).
In this case, creating a group for the videoconference and inviting group members may be performed in a manner similar to those described above with reference to fig. 17 and 18.
The server 100 invites the second user terminal 400 and the third user terminal 500, and the second user terminal 400 and the third user terminal 500 may be connected to the server 100 (operation S2325).
The first to third user terminals 300, 400 and 500 may exchange video call images with each other via the server 100 (operation S2330). The exchange of video call images may continue until a request to terminate the collaboration service is made.
The video call images of the other parties may be output to the first to third user terminals 300, 400 and 500, respectively (operation S2335). For example, video call images associated with the second and third users may be output to the first user terminal 300. In operations S2305 to S2335, a group for the video conference is created, group members are invited, and the video conference is performed. The user interface screen for the video conference will now be described in detail with reference to fig. 24.
Fig. 24 illustrates an example of performing a video conference in the first user terminal 300 for receiving the collaboration service before making a request for performing the collaboration service according to an exemplary embodiment.
In detail, fig. 24 shows a user interface screen 360 of the first user terminal 300. From top to bottom, the user interface screen 360 includes an address window 361 for inputting an address of the server 100, a menu bar 363, and a ribbon menu bar 365. The menu bar 363 and the ribbon menu bar 365 may have different shapes according to the type of program executed on the server 100. The user interface screen 360 shown in fig. 24 has a structure similar to that shown in fig. 16, but differs in that the shapes of the menu bar 363 and the ribbon menu bar 365 are changed to match the program for the video conference.
Further, the mode indicator 367 may indicate that the current mode is a video conference mode, and the access user display window 369 may show the name of the user currently participating in the video conference.
An area for displaying video call images of other parties, a chat area, and simple icons representing basic functions used during the video conference appear on the video conference window 375. Video conference window 375 may also include a memo area or an area for displaying shared documents that are needed to run the video conference.
Referring back to fig. 23, the first user may select a document, from the document list, which he or she wants to share with other users participating in the video conference (operation S2340).
Fig. 25 illustrates a process of selecting a document shared during a video conference before making a request for performing a collaboration service in the first user terminal 300 for receiving the collaboration service according to an exemplary embodiment.
Referring to fig. 25, in a video conference window 375 displayed on the first user terminal 300, the first user may select, from the simple icons representing basic functions used during a video conference, a folder storing shared documents, and open the folder to view a list 376 of sharable documents.
The first user may select at least one document from the list of documents and share the selected document with other users. In this case, the first user may select two files having different document formats to share the two documents at the same time. Referring to fig. 25, the first user may select both "present.doc" and "graph.pdf" from the document list 376.
Referring back to fig. 23, the first user terminal 300 may request the server 100 to execute a document selected from the document list (operation S2345).
The server 100 may execute the document requested to be executed by the first user terminal 300 (operation S2350). A program for executing a document may be installed on the server 100.
The server 100 may transmit the document executed on the server 100 to the first to third user terminals 300, 400 and 500 as a web-based document (operation S2355).
The first to third user terminals 300, 400 and 500 may display documents transmitted as web-based documents via web browsers, respectively (operation S2360).
Fig. 26 illustrates an example of displaying a document shared during a video conference on the first user terminal 300 for receiving a collaboration service before a request for execution of the collaboration service is made, according to an exemplary embodiment.
Referring to fig. 26, an area for displaying a video call image of the other party, a chat area, and a simple icon representing basic functions used during a video conference appear on the video conference window 375 of the first user terminal 300. Unlike in fig. 25, the two types of shared documents selected by the first user may be displayed together on video conference window 375.
Referring back to fig. 23, the first user may select to perform switching to the cooperative mode in the first user terminal 300 (operation S2365). Selection of execution of a switch to the cooperative mode in the first user terminal 300 will now be described with reference to fig. 27.
Fig. 27 illustrates an example in which the first user terminal 300 for receiving the collaboration service requests execution of the collaboration service during a video conference, according to an exemplary embodiment.
In detail, fig. 27 shows a user interface screen 360 of the first user terminal 300. From top to bottom, the user interface screen 360 includes an address window 361 for inputting an address of the server 100, a menu bar 363, and a ribbon menu bar 365. The mode indicator 367 may indicate that the current mode is a video conference mode, and the access user display window 369 may display the names of the users currently participating in the video conference.
Referring to fig. 27, the first user selects the "collaboration" menu in the menu bar 363, clicks (or touches) "switch to collaboration mode" in the displayed sub-menu, and performs the switch to the collaboration mode.
Referring back to fig. 23, the first user terminal 300 may request the server 100 to perform a collaborative service that allows collaborative editing of a document (operation S2370). In this case, a menu for setting the security state of the document being collaboratively edited may be further provided. For example, in response to a user input, i.e., clicking (or touching) "switch to collaboration mode", the security level of the document being collaboratively edited may be set. The security level may be provided via buttons for selecting high, medium, and low, respectively, and the location of the DB on the server 100 where the document is stored may be selectively varied depending on the selected security level. For example, if the user sets the security level to high, the document on which the created group works may be stored in the security document DB 158. On the other hand, if the user sets the security level to medium or low, the document on which the created group works may be stored in the document DB 155.
The states of the first to third user terminals 300, 400 and 500 at the time of requesting execution of a collaboration service allowing collaborative editing of a document will now be described in detail with reference to fig. 28.
Fig. 28 illustrates states of the first to third user terminals 300, 400 and 500 when a collaboration service is initiated during a video conference to make the first to third user terminals 300, 400 and 500 ready to collaboratively edit a document, according to an exemplary embodiment. Referring to fig. 28, the first to third user terminals 300, 400 and 500 may be connected to the server 100.
A window 370 showing a document being collaboratively edited and a window 380 showing video call images of second and third users (e.g., "white" and "treble") may be displayed on the first user terminal 300.
Similarly, a window 470 showing a document being collaboratively edited and a window 480 showing video call images of the first and third users (e.g., "ann" and "treble") may be displayed on the second user terminal 400.
Likewise, a window 570 showing the document being collaboratively edited and a window 580 showing video call images of the first and second users (e.g., "ann" and "white") may be displayed on the third user terminal 500.
Fig. 19 illustrates states of the first to third user terminals 300, 400 and 500 when a collaboration service is initiated while a document is being processed, enabling the first to third user terminals 300, 400 and 500 to collaboratively edit the document. Fig. 28 illustrates states of the first to third user terminals 300, 400 and 500 when a collaboration service is initiated during a video conference, enabling the first to third user terminals 300, 400 and 500 to collaboratively edit a document. Both fig. 19 and fig. 28 show a process of preparing to receive a collaboration service for collaboratively editing a document. After the switch to the collaboration mode is performed and the collaboration service allowing collaborative editing of a document is received, each of the first to third user terminals 300, 400 and 500 shows the same state. The example shown in fig. 28 differs from the example shown in fig. 19 only in that two different types of documents are displayed in the windows 370, 470, and 570, respectively.
Fig. 29 illustrates a process of generating a conference summary from video call images and transmitting the conference summary to each of the first to third user terminals 300, 400 and 500 in the server 100 for providing a collaboration service, according to an exemplary embodiment. As described above, when a document is ready to be collaboratively edited, video call images may be exchanged between the user terminals, and the document may be displayed on the screen of each user terminal. Since the conference summary is generated based on voice included in the video call images, the video call images must be exchanged between the user terminals. Although the transmission and reception of the video call images is not described further, it continues until a request to terminate the collaboration service is made; it is therefore assumed here that the video call images are continuously exchanged between the user terminals.
Referring to fig. 29, the server 100 may convert voice included in a video call image into text (operation S2905). The server 100 may convert the speech contained in the video call images associated with each user into text and generate a conference summary based on the text. In other words, since each of the first to third user terminals 300, 400 and 500 transmits or receives a video call image via the server 100, the server 100 may convert voice included in the video call image into text each time the video call image associated with each user is transmitted or received.
The server 100 may transmit the conference summary to the first to third user terminals 300, 400 and 500 (operation S2910). For example, each time a video call image of each user is transmitted or received, the server 100 may convert voice in the video call image into text and transmit a conference summary including the text to each of the first to third terminals 300, 400 and 500.
Each of the first to third terminals 300, 400 and 500 may display the conference summary received from the server 100 (operation S2915). From the time when the conference summary is displayed on each of the first to third terminals 300, 400 and 500, it can be considered that the collaboration service allowing the collaborative editing of the document is completely provided.
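Operations S2905 to S2915 can be sketched as a simple server-side loop: each time a video call image associated with a user is relayed, its voice is converted to text, the text is appended to the conference summary, and the summary is pushed to every terminal. The class and method names, and the placeholder speech-to-text step, are assumptions for illustration only.

```python
class ConferenceSummaryServer:
    """Sketch of the server-side summary generation (operations S2905-S2915)."""

    def __init__(self, terminals):
        self.terminals = terminals      # each terminal is a list of displayed entries
        self.summary = []

    def speech_to_text(self, voice):
        # Placeholder for a real speech-recognition engine (operation S2905).
        return voice

    def relay_video_call_image(self, user, voice):
        # Convert the voice in the relayed image to text and extend the summary.
        text = self.speech_to_text(voice)
        entry = (user, text)
        self.summary.append(entry)
        # Transmit the updated summary to every terminal (operations S2910-S2915).
        for terminal in self.terminals:
            terminal.append(entry)
```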
Fig. 30 illustrates an example of displaying a conference summary on each of the first to third user terminals 300, 400 and 500 for receiving a collaboration service according to an exemplary embodiment. Referring to fig. 30, the first to third user terminals 300, 400 and 500 may be connected to a server 100 for providing a collaboration service.
A window 370 showing a document being collaboratively edited, a window 380 showing video call images of second and third users (e.g., "white" and "treble"), and a window 390 showing a meeting summary may be displayed on the first user terminal 300.
Similarly, a window 470 showing a document being collaboratively edited, a window 480 showing video call images of the first and third users (e.g., "ann" and "treble"), and a window 490 showing a meeting summary may be displayed on the second user terminal 400.
Similarly, a window 570 showing a document being collaboratively edited, a window 580 showing video call images of the first and second users (e.g., "ann" and "white"), and a window 590 showing a meeting summary may be displayed on the third user terminal 500.
As is clear from fig. 30, the window 390 displayed on the first user terminal 300 displays, as text, what the first user (e.g., "ann") said during the video conference. In detail, the conference summary may include "A" as user information, "(09:01)" as information on the time when the text occurred, "[1]" as information on the page in the document being collaboratively edited that was viewed by the user who spoke the text at the time the text occurred, and "this is a document to be collaboratively edited today" as the text into which voice in the video call image was converted. In other words, the meeting summary may include information about the time at which each text in the meeting summary occurred and information about the page in the document viewed by the user who spoke the text when the text occurred.
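The entry layout just described (user information, time, page viewed, then the converted text) can be rendered by a small helper; the function name and the exact formatting are assumptions based on the example shown in fig. 30.

```python
def format_summary_entry(user_info: str, time_info: str, page: int, text: str) -> str:
    """Compose one conference-summary line: user, time, page viewed, spoken text."""
    return f"{user_info} ({time_info}) [{page}] {text}"
```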
Further, the windows 490 and 590 showing the conference summary displayed on the second and third user terminals 400 and 500 include the same details as the window 390 displayed on the first user terminal 300.
Fig. 31 illustrates another example of displaying a conference summary on each of the first to third user terminals 300, 400 and 500 for receiving a collaboration service according to an exemplary embodiment. Referring to fig. 31, the first to third user terminals 300, 400 and 500 may be connected to a server 100 for providing a collaboration service.
As described above with reference to fig. 30, windows 370, 470, and 570 showing documents being collaboratively edited, windows 380, 480, and 580 showing video call images of the remaining users other than the specific user, and windows 390, 490, and 590 showing a conference summary may be displayed on the first to third user terminals 300, 400, and 500, respectively. In this case, the meeting summary may include information about the time at which each text in the meeting summary occurred and information about the pages in the document viewed by the user who spoken the text at the time the text occurred.
Fig. 30 and 31 show results obtained by performing the processes of fig. 15 and 23, respectively. The example shown in fig. 31 is different from the example shown in fig. 30 only in that two different types of documents are displayed in the windows 370, 470, and 570, respectively.
Fig. 32 illustrates a process of requesting the server 100 for providing a collaboration service to manage the group members and the group that are collaborating with each other, according to an exemplary embodiment. While the server 100 is providing a collaboration service that allows collaborative editing of a document, the first user at the head of the group may find it necessary to manage the group members or the group. In other words, the first user may want to change the information about each member in the group or to split a larger group into several smaller groups.
The management of group members will now be described with reference to fig. 32.
Referring to fig. 32, the first user terminal 300 may execute a group member management menu (operation S3205). In a group member window that pops up when the group member management menu is executed, the first user may set the authority of each group member or the sub-group to which each group member belongs.
The first user terminal 300 may transmit information about each group member to the server 100 (operation S3210). In other words, if the first user terminal 300 executes the group member management menu so that a change is made to information about each group member, the first user terminal 300 may transmit the information about each group member to the server 100 to reflect the change.
The server 100 may store information about each group member (operation S3215). Managing group members by setting information about each group member will now be described in detail with reference to fig. 33.
Fig. 33 illustrates an example of setting information on each group member in the first user terminal 300 for receiving the cooperation service according to an exemplary embodiment.
Referring to fig. 33, on the user interface screen 360 of the first user terminal 300, the first user may select the "collaboration" menu in the menu bar 363 between the address window 361 and the ribbon menu bar 365 and then select "manage group members" in the displayed sub-menu.
Then, a manage group members window 364 may be displayed on the user interface screen 360 of the first user terminal 300. The first user may set information about each group member by changing or setting the information about the current members and pressing a "done" button. Referring to fig. 33, the second user (e.g., "white") has read and write rights and belongs to subgroup G1-1. Similarly, the third user (e.g., "treble") has read and write rights and belongs to subgroup G1-2.
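The per-member settings shown in fig. 33 (rights plus sub-group) amount to a small record per member. The structure below is a hypothetical sketch of what the first user terminal might transmit to the server in operation S3210; all type and field names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class GroupMemberInfo:
    """Illustrative record for one group member, as set in fig. 33."""
    name: str
    rights: set = field(default_factory=lambda: {"read"})
    subgroup: str = ""

# Settings matching the example in fig. 33.
members = [
    GroupMemberInfo("white", {"read", "write"}, "G1-1"),
    GroupMemberInfo("treble", {"read", "write"}, "G1-2"),
]
```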
Referring back to fig. 32, the first user terminal 300 may perform a group management menu (operation S3220).
The first user terminal 300 may request the server 100 to divide the group (operation S3225).
The server 100 may divide the group into smaller groups according to a request from the first user terminal 300 (operation S3230). The management of the group will now be described in detail with reference to fig. 34.
Fig. 34 illustrates an example of dividing a current group into a plurality of groups in the first user terminal 300 for receiving the cooperation service according to an exemplary embodiment.
As described above with reference to fig. 33, information about each group member may be set such that the second user (e.g., "white") and the third user (e.g., "treble") belong to different sub-groups. If users belonging to different sub-groups are in a single group, the first user at the head of the group may split the group.
Referring to fig. 34, on the user interface screen 360 of the first user terminal 300, the first user may select the "collaboration" menu in the menu bar 363 between the address window 361 and the ribbon menu bar 365 and then select "manage group" in the displayed sub-menu.
Then, the submenus "split group" and "merge group" may be further displayed. The submenu "split group" may be activated if multiple users with different sub-groups belong to the current group and thus the current group is to be split. By selecting the submenu "split group," the first user may split the current group into multiple subgroups. The result of segmenting the groups will now be described in detail with reference to fig. 35.
Fig. 35 illustrates a state of each of the first to third user terminals 300, 400 and 500 when the current group is divided into a plurality of groups.
Referring to fig. 35, the first to third user terminals 300, 400 and 500 may be connected to a server 100 for providing a collaboration service.
A window 370 showing a document being collaboratively edited, a window 380 showing video call images of the second and third users (e.g., "user B" and "user C"), and a window 390 showing a conference summary may be displayed on the first user terminal 300. Further, the access user display window 369 indicates that the first user (e.g., "user A") belongs to "group 1-1" and "group 1-2" and that "user B" and "user C", among the users belonging to at least one group in common with the first user, are currently accessing the server 100.
A window 470 showing a document being collaboratively edited, a window 480 showing a video call image of the first user (e.g., "user A"), and a window 490 showing a conference summary may be displayed on the second user terminal 400. Further, the access user display window 469 indicates that the second user (e.g., "user B") belongs to "group 1-1" and that "user A", among the users belonging to the same group as the second user, is currently accessing the server 100.
A window 570 showing a document being collaboratively edited, a window 580 showing a video call image of the first user (e.g., "user A"), and a window 590 showing a conference summary may be displayed on the third user terminal 500. Further, the access user display window 569 indicates that the third user (e.g., "user C") belongs to "group 1-2" and that "user A", among the users belonging to the same group as the third user, is currently accessing the server 100.
In other words, since the first and second users, "user A" and "user B", belong to "group 1-1", and the first and third users, "user A" and "user C", belong to "group 1-2", it can be seen that the second and third users do not belong to the same group due to the segmentation of the current group.
If two types of documents are to be collaboratively edited and collaboration is required for each type of document, efficient collaboration can be achieved by dividing the group into a plurality of subgroups as shown in fig. 35.
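The effect of the split shown in fig. 35 can be sketched as grouping the members by their sub-group label, with the user at the head of the group joining every resulting sub-group. The function and variable names are illustrative assumptions.

```python
def split_group(head: str, member_subgroups: dict) -> dict:
    """Split one group into sub-groups keyed by sub-group label.

    The user at the head of the group belongs to every resulting sub-group,
    while ordinary members keep only their own sub-group (as in fig. 35).
    """
    subgroups = {}
    for member, label in member_subgroups.items():
        subgroups.setdefault(label, {head}).add(member)
    return subgroups
```

With members `{"user B": "group 1-1", "user C": "group 1-2"}` and head "user A", the result places "user A" with "user B" in "group 1-1" and with "user C" in "group 1-2", so "user B" and "user C" no longer share a group, matching the description above.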
Fig. 36 illustrates a process of limiting the editing range of a document being collaboratively edited in the server 100 for providing a collaboration service, according to an exemplary embodiment. Since a plurality of users edit a document together when they collaborate on it, an editing range may be specified in advance for each user to prevent conflicts during editing.
First, an example in which the first editing range is locked by the first user will be described with reference to fig. 36.
The first user may specify the first edit scope by using the first user terminal 300 (operation S3605).
The first user terminal 300 may transmit the first editing range designated by the first user to the server 100 (operation S3610).
The server 100 may lock a portion of the document being collaboratively edited, which corresponds to the first editing scope, based on the first editing scope (operation S3615).
The server 100 may transmit the document in which the first editing scope is locked as a web-based document to the first to third user terminals 300, 400 and 500 (operation S3620).
The first to third user terminals 300, 400 and 500 may each display, via a web browser, the document in which the first editing range is locked and which was transmitted as a web-based document (operation S3625).
Next, an example in which the second editing range is locked by the second user will be described with reference to fig. 36.
The second user may specify the second editing range by using the second user terminal 400 (operation S3630).
The second user terminal 400 may transmit the second editing range designated by the second user to the server 100 (operation S3635).
The server 100 may lock a portion of the document being collaboratively edited, which corresponds to the second editing scope, based on the second editing scope (operation S3640).
The server 100 may transmit the document in which the second editing scope is locked as a web-based document to the first to third user terminals 300, 400 and 500 (operation S3645).
The first to third user terminals 300, 400 and 500 may each display, via a web browser, the document in which the second editing range is locked and which was transmitted as a web-based document (operation S3650).
Finally, an example in which the third editing range is locked by the third user will be described with reference to fig. 36.
The third user may specify the third editing range by using the third user terminal 500 (operation S3655).
The third user terminal 500 may transmit the third editing range designated by the third user to the server 100 (operation S3660).
The server 100 may lock a portion of the document being collaboratively edited, which corresponds to the third editing scope, based on the third editing scope (operation S3665).
The server 100 may transmit the document in which the third editing scope is locked as a web-based document to the first to third user terminals 300, 400 and 500 (operation S3670).
The first to third user terminals 300, 400 and 500 may each display, via a web browser, the document in which the third editing range is locked and which was transmitted as a web-based document (operation S3675).
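Operations S3605 to S3675 repeat the same lock-and-broadcast pattern for each user. The conflict check below is an assumption (the text does not state what happens when two users request the same range), and all names are illustrative.

```python
class EditRangeLocks:
    """Sketch of server-side locking of editing ranges (fig. 36)."""

    def __init__(self):
        self._locks = {}   # editing range (e.g. a page number) -> locking user

    def lock(self, editing_range, user) -> bool:
        """Lock a range for a user; refuse if another user already holds it."""
        holder = self._locks.get(editing_range)
        if holder is not None and holder != user:
            return False
        self._locks[editing_range] = user
        return True

    def holder_of(self, editing_range):
        return self._locks.get(editing_range)
```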
A method of displaying a document in which an editing range is locked for each user will now be described in detail with reference to fig. 37 to 40.
Fig. 37 illustrates an example of locking a first edit scope 371 by a first user in a first user terminal 300 for receiving a collaboration service according to an exemplary embodiment.
Referring to fig. 37, a window 370 showing a document being collaboratively edited, a window 380 showing a video call image, and a window 390 showing a conference summary may be displayed on the first user terminal 300. In this case, when the first user (e.g., "ann") designates the first page in the document displayed in the window 370 and requests the server (100 in fig. 36) for providing the collaboration service to lock the first editing scope 371, the document in which the first editing scope 371 is locked may be displayed as shown in fig. 37. Each user's locked editing scope may be indicated with a predetermined color, pattern, or mark corresponding to that user so that the other users can recognize it.
Further, the video call image of the first user is output to the window 380, and the text spoken by the first user is displayed in the window 390. In other words, video call images associated with users speaking text in a conference summary may be displayed with the conference summary. To this end, only video call images corresponding to text in the conference summary may be displayed in the window 380.
Fig. 38 illustrates an example of locking the second edit scope 372 by the second user in the first user terminal 300 for receiving the collaborative service according to an exemplary embodiment.
Referring to fig. 38, a window 370 showing a document being collaboratively edited, a window 380 showing a video call image, and a window 390 showing a conference summary may be displayed on the first user terminal 300. In this case, when the second user (e.g., "white") specifies the second page in the document being collaboratively edited using the second user terminal 400 and requests the server (100 in fig. 36) to lock the second editing scope 372, the document in which the second editing scope 372 is locked may be displayed as shown in fig. 38.
Further, the video call image of the second user is output to the window 380, and the text spoken by the second user is displayed in the window 390. In other words, video call images associated with users speaking text in a conference summary may be displayed with the conference summary. To this end, only video call images corresponding to text in the conference summary may be displayed in the window 380.
Fig. 39 illustrates an example of locking the third edit scope 373 by the third user in the first user terminal 300 for receiving the collaboration service according to an exemplary embodiment.
Referring to fig. 39, a window 370 showing a document being collaboratively edited, a window 380 showing a video call image, and a window 390 showing a conference summary may be displayed on the first user terminal 300. In this case, when the third user (e.g., "treble") specifies the third page in the document being collaboratively edited using the third user terminal 500 and requests the server (100 in fig. 36) to lock the third editing scope 373, the document in which the third editing scope 373 is locked may be displayed as shown in fig. 39.
In addition, the video call image of the third user is output to the window 380, and the text spoken by the third user is displayed in the window 390. In other words, video call images associated with users speaking text in a conference summary may be displayed with the conference summary. To this end, only video call images corresponding to text in the conference summary may be displayed in the window 380.
Referring to fig. 37 to 39, each of the first to third users may specify an editing range for each page in the document being collaboratively edited. In addition, the page number of the document viewed by each of the first to third users on his or her own user terminal is displayed together with the text in the conference summary. The text may be obtained by converting speech in the video call image of each user. Information about the time at which the text occurred can also be displayed along with the text and page number, and the user's video call image corresponding to the latest text in the conference summary is displayed.
Fig. 40 illustrates an example of locking the first to third editing ranges 371 to 373 for a plurality of regions in a page in the first user terminal 300 for receiving a collaboration service according to an exemplary embodiment.
Referring to fig. 40, a window 370 showing a document being collaboratively edited, a window 380 showing a video call image, and a window 390 showing a conference summary may be displayed on the first user terminal 300.
When a first user (e.g., "ann") designates an area in a first page in a document displayed in the window 370 as a first editing scope 371 and requests a server (100 in fig. 36) for providing a collaboration service to lock the area, the document in which the first editing scope 371 is locked may be displayed as shown in fig. 40.
In addition, when the second user (e.g., "white") designates an area in the first page of the document being collaboratively edited as the second editing scope 372 using the second user terminal 400 and requests the server (100 in fig. 36) to lock the second editing scope 372, the document in which the second editing scope 372 is locked may be displayed as shown in fig. 40.
In this case, when the third user (e.g., "treble") designates an area in the first page of the document being collaboratively edited as the third editing scope 373 using the third user terminal 500 and requests the server (100 in fig. 36) to lock the third editing scope 373, the document in which the third editing scope 373 is locked may be displayed as shown in fig. 40.
In addition, the video call image of the third user is output to the window 380, and the text spoken by the third user is displayed in the window 390. In other words, video call images associated with users speaking text in a conference summary may be displayed with the conference summary. To this end, only video call images corresponding to text in the conference summary may be displayed in the window 380.
Referring to fig. 40, each of the first to third users may specify an editing range for each region of one page within a document being collaboratively edited. In addition, the page number of the document viewed by each of the first to third users on his or her own user terminal is displayed together with the text in the conference summary. The text may be obtained by converting speech in the video call image of each user. Information about the time at which the text occurred can also be displayed along with the text and page number, and the user's video call image corresponding to the latest text in the conference summary is displayed.
Fig. 41 illustrates a process of editing a document being collaboratively edited in the server 100 for providing a collaboration service according to an exemplary embodiment.
First, an example in which a document being collaboratively edited is edited by a first user according to first editing information and displayed will be described in detail with reference to fig. 41.
The first user may edit the document being collaboratively edited in the first user terminal 300 (operation S4105).
The first user terminal 300 may transmit the first editing information to the server 100 (operation S4110).
The server 100 may store the first editing information (operation S4115).
The server 100 may transmit the edited document as a web-based document to the first to third user terminals 300, 400 and 500 (operation S4120).
The first to third user terminals 300, 400 and 500 may display the documents edited according to the first editing information and transmitted as the web-based documents, respectively, via the web browsers (operation S4125).
Next, an example in which the document being collaboratively edited is edited by the second user according to the second editing information and displayed will be described in detail with reference to fig. 41.
The second user may edit the collaborative edited document in the second user terminal 400 (operation S4130).
The second user terminal 400 may transmit the second editing information to the server 100 (operation S4135).
The server 100 may store the second editing information (operation S4140).
The server 100 may transmit the edited document as a web-based document to the first to third user terminals 300, 400 and 500 (operation S4145).
The first to third user terminals 300, 400 and 500 may display the documents edited according to the second editing information and transmitted as the web-based documents, respectively, via the web browsers (operation S4150).
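The two symmetric sequences above (store the editing information, then retransmit the edited web-based document to all three terminals) can be condensed into one sketch. The class, method, and field names are assumptions for illustration.

```python
class DocumentEditRelay:
    """Sketch of operations S4105-S4150: store each user's editing
    information and push the resulting document to every terminal."""

    def __init__(self, terminals):
        self.terminals = terminals   # terminal name -> latest document view
        self.edit_history = []

    def receive_edit(self, user, editing_info):
        self.edit_history.append((user, editing_info))   # S4115 / S4140
        document = list(self.edit_history)               # edited web-based document
        for name in self.terminals:                      # S4120 / S4145
            self.terminals[name] = document
```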
An example in which a document being collaboratively edited is edited by the first and second users sequentially according to the first editing information and the second editing information will now be described with reference to fig. 42 and 43.
Fig. 42 illustrates an example of editing a document being collaboratively edited by a first user in a first user terminal 300 for receiving a collaboration service according to an exemplary embodiment.
Referring to fig. 42, a window 370 showing a document being collaboratively edited, a window 380 showing a video call image, and a window 390 showing a conference summary may be displayed on the first user terminal 300.
When the first user (e.g., "ann") edits a portion of the first page of the document displayed in the window 370, speaks about the edit to the portion through the video call image, and requests the edit from the server (100 in fig. 41) for providing the collaboration service, the document in which the edited portion is indicated by "A", denoting "ann", may be displayed as shown in fig. 42.
The edited portion may be indicated with a predetermined color, pattern, or mark corresponding to each user so that other users can recognize who edited the portion.
Referring to fig. 42, the first user may edit an item in the table of contents of the document being collaboratively edited and speak about the edit to that portion through the video call image, and the text obtained by converting the first user's voice in the video call image may be displayed in the conference summary.
Fig. 43 illustrates an example of editing a document being collaboratively edited by a second user in a second user terminal 400 for receiving a collaboration service according to an exemplary embodiment.
Referring to fig. 43, a window 470 showing a document being collaboratively edited, a window 480 showing a video call image, and a window 490 showing a conference summary may be displayed on the second user terminal 400.
When the second user (e.g., "white") edits a portion of the second page of the document being collaboratively edited, speaks about the edit to the portion through the video call image, and requests the edit from the server (100 in fig. 41) for providing the collaboration service, the document in which the edited portion is indicated by "B", denoting "white", may be displayed as shown in fig. 43.
The edited portion may be indicated with a predetermined color, pattern, or mark corresponding to each user so that other users can recognize who edited the portion.
Referring to fig. 43, the second user may edit a portion of the second page of the document being collaboratively edited using a stylus and speak about the edit to the portion through the video call image, and the text obtained by converting the second user's voice in the video call image may be displayed in the conference summary.
Fig. 44 illustrates an example of editing information for identifying a document being collaboratively edited using a meeting summary in the second user terminal 400 for receiving a collaboration service, according to an exemplary embodiment.
Referring to fig. 44, a window 470 showing a document being collaboratively edited, a window 480 showing a video call image, and a window 490 showing a conference summary may be displayed on the second user terminal 400. When text is accumulated in the window 490, a scroll bar may be created on the right side of the window 490.
The server 100 may receive, from the second user terminal 400, information about the text that the user selected from the conference summary, and transmit, to the second user terminal 400, information about the edited portion of the document being collaboratively edited that is synchronized with the selected text. In detail, if text of another user is selected in the window 490, the editing information of the document corresponding to the selected text may be displayed in the window 470. This is possible when the conference summary is synchronized with the document.
Fig. 45 illustrates an example of editing a document being collaboratively edited with a meeting summary in the second user terminal 400 for receiving a collaboration service, according to an exemplary embodiment.
Referring to fig. 45, a window 470 showing a document being collaboratively edited, a window 480 showing a video call image, and a window 490 showing a conference summary may be displayed on the second user terminal 400.
The server for providing the collaboration service (100 in fig. 41) may receive, from the second user terminal 400, information about the text that the user selected from the conference summary, and determine whether there is an edited portion of the document being collaboratively edited that is synchronized with the selected text. In detail, if the second user selects his or her own text from the window 490 and there is no editing information of the document corresponding to the selected text, the document may be edited using the selected text as editing information. In this way, the document displayed in the window 470 can be edited with the text displayed in the window 490.
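Figs. 44 and 45 both rely on a mapping between summary text and document edits. The sketch below keeps that mapping: selecting text that has a recorded edit recalls the edit (fig. 44), while selecting text with no recorded edit applies the text itself as the edit (fig. 45). All names are illustrative assumptions.

```python
class SummaryDocumentSync:
    """Sketch of the summary-to-document synchronization of figs. 44 and 45."""

    def __init__(self):
        self._edit_for_text = {}   # summary text -> editing information
        self.applied_edits = []

    def record(self, summary_text, editing_info):
        """Synchronize a summary entry with the edit made at the same time."""
        self._edit_for_text[summary_text] = editing_info

    def on_text_selected(self, summary_text):
        edit = self._edit_for_text.get(summary_text)
        if edit is not None:
            return edit                            # fig. 44: show the synchronized edit
        self.applied_edits.append(summary_text)    # fig. 45: use the text as the edit
        return summary_text
```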
FIG. 46 illustrates the following example according to an exemplary embodiment: in a user terminal for receiving a collaboration service, a device is set for each of the document being collaboratively edited, the video call image, and the conference summary, so that the document, the video call image, and the conference summary may be displayed separately on other devices registered with the user terminal.
In detail, fig. 46 illustrates the image-specific device setting window 366. If it is difficult for the user to view the document being collaboratively edited, the conference summary, and the video call image together on a single screen, device settings may be performed for each image so that at least one of the document image, the video call image, and the conference summary is displayed on a device currently registered with the user terminal.
FIG. 47 illustrates an example of separately displaying, on multiple devices, a document being collaboratively edited, a video call image, and a conference summary, according to an exemplary embodiment.
When the device setting is completed for each image as described above with reference to fig. 46, the document, the video call image, and the conference summary may be displayed separately on the PC 300-1, the smart TV 300-2, and the tablet PC 300-3, respectively.
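The per-image device assignment of Figs. 46 and 47 amounts to routing each view to one registered device. A minimal sketch, assuming a simple mapping API (the device names follow Fig. 47; `assign_views` and the dictionary layout are illustrative):

```python
# Devices currently registered with the user terminal (per Fig. 47).
REGISTERED_DEVICES = {"PC 300-1", "smart TV 300-2", "tablet PC 300-3"}

def assign_views(assignment):
    """Validate that each view is routed to a device registered with the user terminal."""
    for view, device in assignment.items():
        if device not in REGISTERED_DEVICES:
            raise ValueError(f"{device} is not registered with the user terminal")
    return assignment

# Route the document, video call image, and conference summary to separate devices.
routing = assign_views({
    "document": "PC 300-1",
    "video_call": "smart TV 300-2",
    "conference_summary": "tablet PC 300-3",
})
```

An assignment to an unregistered device is rejected, mirroring the restriction that views may only be sent to devices registered with the user terminal.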
Fig. 48 illustrates a process of terminating a collaboration service by requesting termination from the server 100 for providing the collaboration service, and transmitting the collaboratively edited document to another user terminal, according to an exemplary embodiment.
Referring to fig. 48, the first user terminal 300 may request termination of the cooperation service from the server 100 (operation S4805).
The server 100 may store the document being collaboratively edited and terminate the program for processing the document (operation S4810).
The server 100 may store the conference summary and the video call image (operation S4815).
If the document, the conference summary, and the video call image are all stored, the server 100 may terminate the video call service (operation S4820). In this case, the server 100 may also store a document for review, a video call image for review, and a conference summary for review. These refer to artifacts in which the editing information is synchronized with the text information, that is, a document that retains an indication of each portion edited during collaborative editing with the collaboration service, together with the images synchronized with those edited portions.
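The synchronization between the document for review, the conference summary for review, and the video call image for review can be modeled as timestamped records linking the three. A minimal sketch under stated assumptions (`SyncRecord` and its fields are illustrative names, and the 09:00 a.m. call start used to derive the offset is assumed):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SyncRecord:
    """One synchronization point linking the three artifacts stored for review."""
    timestamp: float      # seconds from the start of the recorded video call
    edit_text: str        # the portion of the document edited at that moment
    summary_detail: str   # the meeting detail transcribed from speech at that moment

# The scenario used later in the description, assuming the call started at 09:00 a.m.
record = SyncRecord(
    timestamp=180.0,  # 09:03 a.m. is three minutes into the call
    edit_text="enhance the utilization of owned patents",
    summary_detail="I will write down a goal to enhance the utilization of owned patents",
)
```

Storing one such record per edit is enough to answer all three review queries later (by edit, by playback time, or by summary text).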
The first user terminal 300 may request the server 100 to share the collaboratively edited document (operation S4825). For example, in order to share a document collaboratively edited using the collaboration service with a fourth user who did not participate in editing the document, the first user terminal 300 may request sharing of the document from the server 100.
The server 100 may retrieve the document requested by the first user terminal 300 (operation S4830).
The server 100 may transmit the retrieved document to the fourth user terminal 600 (operation S4835).
The fourth user terminal 600 may display the transmitted document (operation S4840). The document displayed on the screen of the fourth user terminal 600 is a web-based document. A document executed on the server 100 may be displayed on a web browser screen executed on the fourth user terminal 600.
Fig. 49 illustrates an example in which a document for review, a video call image for review, and a conference summary for review are synchronized with each other, according to an exemplary embodiment.
Referring to fig. 49, a document for review, a video call image for review, and a conference summary for review may be synchronized with each other.
For example, while the first through third users were engaged in a video call on May 21, 2014, the first user (e.g., "Ann") may have entered the sentence "enhance the utilization of owned patents" in the document at 09:03 a.m. while saying "I will write down a goal to enhance the utilization of owned patents". In this case, the server 100 may synchronize the sentence "enhance the utilization of owned patents" in the document with the meeting detail "I will write down a goal to enhance the utilization of owned patents" in the conference summary. In addition, the video call image that was output when the first user entered the sentence "enhance the utilization of owned patents" in the document may be synchronized with the document and the conference summary.
In other words, the sentence "enhance the utilization of owned patents" may be synchronized with the meeting detail "I will write down a goal to enhance the utilization of owned patents". Thus, the sentence "enhance the utilization of owned patents", the meeting detail "I will write down a goal to enhance the utilization of owned patents", and the video call image output at 09:03 a.m. on May 21, 2014 may be synchronized with one another.
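The video position synchronized with the 09:03 a.m. edit can be derived as an offset from the call's start time. A small sketch (the 09:00 a.m. start time is an assumption for illustration; the function name is not from the patent):

```python
from datetime import datetime

def playback_offset_seconds(call_start, event_time):
    """Offset into the recorded video call at which a synced event occurred."""
    return (event_time - call_start).total_seconds()

call_start = datetime(2014, 5, 21, 9, 0)  # assumed start of the May 21, 2014 call
edit_time = datetime(2014, 5, 21, 9, 3)   # the sentence was entered at 09:03 a.m.
offset = playback_offset_seconds(call_start, edit_time)  # 180.0 seconds
```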
Fig. 50 illustrates a process of requesting a document for review and reviewing editing of the document from the server 100 for providing a collaboration service according to an exemplary embodiment.
Referring to fig. 50, the third user terminal 500 may request a document for review from the server 100 (operation S5005). For example, the third user may log into the server 100 via the third user terminal 500 and select a document for review from among documents in the list of collaboration services that the third user has performed. The third user terminal 500 may then request a document for review from the server 100 according to the selection information of the third user received via the user input unit (220 in fig. 2).
The server 100 may retrieve the document for review in response to the request (operation S5010). For example, the server 100 may retrieve a document for review corresponding to a request from the third user terminal 500 from among documents stored in the document DB (155 in fig. 3).
The server 100 may transmit the document for review to the third user terminal 500 (operation S5015).
The third user terminal 500 may select an edited portion from the received document for review (operation S5020). For example, if the third user terminal 500 displays the document for review and the third user selects a sentence or paragraph in the document for review, an edited portion may be selected from the document for review.
The third user terminal 500 may transmit edit information on the selected edited portion to the server 100 (operation S5025).
The server 100 may identify a conference summary for review and a part of a video call image for review, which are synchronized with the transmitted editing information (operation S5030). For example, the server 100 may determine a conference summary for review and a video call image for review, which are synchronized with the edit information, from the image DB (157 in fig. 3) based on the synchronization information contained in the integrated management DB (151 in fig. 3).
The server 100 may transmit the conference summary for review and the video call image for review synchronized with the editing information to the third user terminal 500 (operation S5035).
The third user terminal 500 may output a conference summary for review and a video call image for review in synchronization with the edit information (operation S5040). For example, the third user terminal 500 may output the received conference summary for review and video call image for review via the output unit (250 in fig. 2).
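Operations S5025 through S5035 amount to a lookup keyed by the selected edit. A minimal sketch (the `sync_index` layout and `find_synced_parts` are illustrative, not the patent's data structures):

```python
# Index built when the review artifacts were stored:
# edit text -> (synchronized summary detail, video offset in seconds).
sync_index = {
    "enhance the utilization of owned patents": (
        "I will write down a goal to enhance the utilization of owned patents",
        180.0,
    ),
}

def find_synced_parts(edit_text):
    """Operation S5030: identify the summary detail and video position for an edit."""
    return sync_index.get(edit_text)

detail, offset = find_synced_parts("enhance the utilization of owned patents")
```

The returned offset would seed the reproduction bar in window 580, and the detail would be indicated in the conference summary 590.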
An example of the synchronized image output in operation S5040 will now be described with reference to fig. 51.
FIG. 51 illustrates the following example in accordance with an exemplary embodiment: when the edited portion of the document for review is selected, the conference summary for review and the video call image for review, both of which are synchronized with the selected edited portion, are output in the third user terminal 500 for receiving the collaboration service.
In detail, fig. 51 illustrates an example in which an image 570 of the document for review, a window 580 showing the video call image for review synchronized with the document image 570, and a conference summary 590 for review synchronized with the document image 570 are output to the third user terminal 500.
In one exemplary embodiment, the window 580 showing the video call image may be displayed in a pop-up window or another window. In another exemplary embodiment, the video call image may be displayed in a pop-up window or another window in response to selecting an edit of the editable document or an indicator of the edit. In another exemplary embodiment, the video call image may be displayed in a pop-up window or another window in response to selecting an item from a text record of items, log data, or a conference summary.
Referring to fig. 51, the third user terminal 500 may select an edited portion, that is, a sentence or paragraph, in the image 570 of the document for review. For example, the third user terminal 500 may select the edited portion indicating the sentence "enhance the utilization of owned patents".
The third user terminal 500 may output the conference summary 590 for review synchronized with the selected edited portion. For example, assume that while the first through third users were engaged in a video call on May 21, 2014, the first user (e.g., "Ann") edited the sentence "enhance the utilization of owned patents" in the document at 09:03 a.m. while saying "I will write down a goal to enhance the utilization of owned patents". In this case, the third user terminal 500 may output the conference summary 590 for review in which the meeting detail "I will write down a goal to enhance the utilization of owned patents" is indicated in bold.
The third user terminal 500 may also output the window 580 showing the video call image for review synchronized with the selected edited portion. A reproduction bar corresponding to the length of the May 21, 2014 video conference may be displayed in the window 580. In addition, the video call image for review may be reproduced from the playback position corresponding to 09:03 a.m. on May 21, 2014. However, the exemplary embodiment is not limited thereto, and the window 580 may instead show a still image at the playback position corresponding to 09:03 a.m. on May 21, 2014.
Fig. 52 illustrates a process of requesting a video call image for review and reviewing editing of a document from the server 100 for providing a collaboration service according to an exemplary embodiment.
Referring to fig. 52, the third user terminal 500 may request a video call image for review from the server 100 (operation S5205). For example, the third user may log into the server 100 via the third user terminal 500 and select a video call image for review from among video call images in the list of collaboration services that the third user has performed. The third user terminal 500 may then request the video call image for review from the server 100 according to the selection information of the third user received via the user input unit (220 in fig. 2).
The server 100 may retrieve a video call image for review (operation S5210). For example, the server 100 may retrieve a video call image for review corresponding to a request from the third user terminal 500 from among images stored in the image DB 157.
The server 100 may transmit the video call image for review to the third user terminal 500 (operation S5215).
The third user terminal 500 may select a portion to be reproduced from the received video call image for review (operation S5220). For example, if the third user terminal 500 displays a reproduction bar for a video call image for review and the third user selects a point on the reproduction bar, a portion to be reproduced may be selected from the video call image for review.
The third user terminal 500 may transmit information about the selected portion to be reproduced to the server 100 (operation S5225).
The server 100 may identify a conference summary for review and a part in a document for review, which are synchronized with the transmitted information on the part to be reproduced (operation S5230). For example, the server 100 may determine a document for review and a conference summary for review, which are synchronized with information on a part to be reproduced, from the document DB155 and the image DB157, respectively, based on the synchronization information contained in the integrated management DB 151.
The server 100 may transmit a conference summary for review and a document for review, which are synchronized with information on a part to be reproduced, to the third user terminal 500 (operation S5235).
The third user terminal 500 may output a conference summary for review and a document for review in synchronization with information on a section to be reproduced (operation S5240). For example, the third user terminal 500 may output the received conference summary for review and the document for review via the output unit 250.
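Operation S5230, identifying the document state and summary detail for a selected playback point, can be sketched as a floor search over sync records sorted by video offset (the record layout and the timestamps other than 180 seconds are illustrative):

```python
import bisect

# Sync records sorted by video offset in seconds:
# (offset, summary_detail, document_edit_at_that_point).
records = [
    (0.0, "Meeting opened", None),
    (180.0, "I will write down a goal to enhance the utilization of owned patents",
     "enhance the utilization of owned patents"),
    (300.0, "Next agenda item", None),
]
offsets = [r[0] for r in records]

def state_at(playback_point):
    """Operation S5230: latest sync record at or before the selected reproduction point."""
    i = bisect.bisect_right(offsets, playback_point) - 1
    return records[i] if i >= 0 else None
```

A point selected anywhere between 180 and 300 seconds resolves to the 09:03 a.m. record, so the document for review and the conference summary for review can be output in that state.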
An example of the synchronized image output in operation S5240 will now be described with reference to fig. 53.
FIG. 53 illustrates the following example in accordance with an exemplary embodiment: when a portion to be reproduced of the video call image for review is selected, a conference summary for review and a document for review, both of which are synchronized with the selected reproduced portion, are output in the third user terminal 500 for receiving the collaboration service.
In detail, fig. 53 illustrates an example in which a window 580 showing a video call image for review and a conference summary 590 for review and an image 570 of a document for review, each of which is synchronized with the video call image for review, are output to the third user terminal 500.
Referring to fig. 53, the third user terminal 500 may select a portion to be reproduced among video call images for review. For example, the third user terminal 500 may select a portion to be reproduced from the video call image for review by selecting a point on a reproduction bar of the video call image for review output to the third user terminal 500.
The third user terminal 500 may output the conference summary 590 for review synchronized with the selected portion to be reproduced. For example, assume that while the first through third users were engaged in a video call on May 21, 2014, the first user (e.g., "Ann") said "I will write down a goal to enhance the utilization of owned patents". If the selected portion to be reproduced includes 09:03 a.m. on May 21, 2014, the third user terminal 500 may output the conference summary 590 for review indicating the meeting detail "I will write down a goal to enhance the utilization of owned patents".
The third user terminal 500 may also output the image 570 of the document for review synchronized with the selected portion to be reproduced. For example, assume that the first user (e.g., "Ann") edited the sentence "enhance the utilization of owned patents" in the document at 09:03 a.m. on May 21, 2014, while the first through third users were recording the conference summary. If the selected portion to be reproduced includes 09:03 a.m. on May 21, 2014, the third user terminal 500 may output the image 570 of the document for review including the sentence "enhance the utilization of owned patents".
A reproduction bar corresponding to the length of the May 21, 2014 video conference may be displayed in the window 580 showing the video call image. In addition, the video call image for review may be reproduced from the playback position corresponding to 09:03 a.m. on May 21, 2014. However, the exemplary embodiment is not limited thereto, and the video call image for review may be a still image at the playback position corresponding to 09:03 a.m. on May 21, 2014.
Fig. 54 illustrates a process of requesting a conference summary for review and reviewing edits of documents from the server 100 for providing a collaboration service, according to an exemplary embodiment.
Referring to fig. 54, the third user terminal 500 may request a conference summary for review from the server 100 (operation S5405). For example, the third user may log into the server 100 via the third user terminal 500 and select a conference summary for review from among conference summaries in the list of collaboration services that the third user has performed. The third user terminal 500 may then request the conference summary for review from the server 100 according to the selection information of the third user received via the user input unit 220.
The server 100 may retrieve the conference summary for review (operation S5410). For example, the server 100 may retrieve a conference summary for review corresponding to a request from the third user terminal 500 from the conference summaries stored in the image DB 157.
The server 100 may transmit a conference summary for review to the third user terminal 500 (operation S5415).
The third user terminal 500 may select a part to be reproduced from the received conference summary for review (operation S5420). For example, if the third user terminal 500 displays a conference summary for review and the third user selects an area in the conference summary for review, text at a time point corresponding to the selected area may be selected.
The third user terminal 500 may transmit information about the selected text to the server 100 (operation S5425).
The server 100 may identify a video call image for review and a part in a document for review in synchronization with the transmitted text information (operation S5430). For example, the server 100 may determine a document for review and a video call image for review, which are synchronized with text information, from the document DB155 and the image DB157, respectively, based on synchronization information contained in the integrated management DB 151.
The server 100 may transmit the video call image for review and the document for review synchronized with the text information to the third user terminal 500 (operation S5435).
The third user terminal 500 may output a video call image for review and a document for review in synchronization with the text information (operation S5440). For example, the third user terminal 500 may output the received video call image for review and the document for review via the output unit 250.
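Operation S5430 can be sketched as a search of the summary entries for the selected text, returning the synchronized video offset and document portion (the entry layout and function name are assumptions for illustration):

```python
# Summary entries stored with their synchronization data.
summary_entries = [
    {"offset": 0.0, "text": "Meeting opened", "edit": None},
    {"offset": 180.0,
     "text": "I will write down a goal to enhance the utilization of owned patents",
     "edit": "enhance the utilization of owned patents"},
]

def find_by_summary_text(fragment):
    """Return the first summary entry containing the selected text, if any."""
    return next((e for e in summary_entries if fragment in e["text"]), None)

hit = find_by_summary_text("enhance the utilization")
```

The matched entry supplies both the playback position for the video call image for review and the edited portion to indicate in the document for review.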
An example of the synchronized image output in operation S5440 will now be described in detail with reference to fig. 55.
Fig. 55 illustrates the following example according to an exemplary embodiment: when the text in the conference summary for review is selected, the video call image for review and the document for review, both of which are synchronized with the selected text, are output in the third user terminal 500 for receiving the collaboration service.
In detail, fig. 55 illustrates an example in which the conference summary 590 for review, together with a window 580 showing the video call image for review and an image 570 of the document for review, each synchronized with the conference summary 590, are output to the third user terminal 500.
Referring to fig. 55, the third user terminal 500 may select text at a particular point in time in the conference summary 590 for review. For example, the third user terminal 500 may select the text "I will write down a goal to enhance the utilization of owned patents", corresponding to the meeting detail of 09:03 a.m. on May 21, 2014.
The third user terminal 500 may output the video call image for review synchronized with the selected text. A reproduction bar corresponding to the length of the May 21, 2014 video conference may be displayed on the video call image for review. In addition, the video call image for review may be reproduced from the playback position corresponding to 09:03 a.m. on May 21, 2014. However, the exemplary embodiment is not limited thereto, and the video call image for review may be a still image at the playback position corresponding to 09:03 a.m. on May 21, 2014.
The third user terminal 500 may also output the image 570 of the document for review synchronized with the selected text. For example, assume that the first user (e.g., "Ann") edited the sentence "enhance the utilization of owned patents" at 09:03 a.m. on May 21, 2014, while the first through third users were recording the conference summary. If the time corresponding to the selected text is 09:03 a.m. on May 21, 2014, the third user terminal 500 may output the image 570 of the document for review indicating the sentence "enhance the utilization of owned patents".
FIG. 56 is a flowchart of a method of providing collaboration services, according to an exemplary embodiment.
Referring to fig. 56, the method of providing a collaboration service according to the present embodiment includes operations performed sequentially by the server for providing a collaboration service of fig. 3. Thus, although omitted below, the above description of the server 100 also applies to the method of fig. 56.
The server 100 receives a video call image associated with each user who edits a document and editing information on the edited document from a user terminal requesting a collaboration service (operation S5610).
The server 100 synchronizes the details of the conference summary generated based on the voice included in the video call image associated with each user with the document edited according to the received editing information (operation S5620).
The server 100 stores the received video call image, the details of the conference summary, and the edited document associated with each user (operation S5630).
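Operations S5610 through S5630 can be read as a three-step server pipeline: receive, synchronize, store. A schematic sketch (the speech-to-text step is passed in as a stub; all names and data shapes are illustrative, not the patent's):

```python
def provide_collaboration_service(video_call, edits, transcribe):
    """Schematic of Fig. 56: S5610 receive, S5620 synchronize, S5630 store."""
    # S5620: generate conference-summary details from the speech in each segment.
    summary = {t: transcribe(speech) for t, speech in video_call.items()}
    # Synchronize each received edit with the detail spoken at the same time.
    document = [{"time": t, "edit": text, "detail": summary.get(t)} for t, text in edits]
    # S5630: store the video call images, summary details, and edited document together.
    return {"video": video_call, "summary": summary, "document": document}
```

With a trivial stand-in transcriber, an edit made while a phrase was spoken ends up linked to that phrase's summary detail.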
Fig. 57 is a flowchart of a method of receiving a collaboration service according to an exemplary embodiment.
Referring to fig. 57, the method of receiving a collaboration service according to the present embodiment includes operations performed sequentially by the user terminal 200 of fig. 2. Thus, although omitted below, the above description of the user terminal 200 also applies to the method of fig. 57.
The user terminal 200 acquires a video call image obtained by performing signal processing on the voice and video of the user and editing information on an edited document (operation S5710).
The user terminal 200 transmits the acquired video call image and the editing information to the server 100 for providing the cooperation service (operation S5720).
The user terminal 200 receives a video call image associated with each user who edits a document, a conference summary generated based on speech included in the video call image associated with each user, and a document synchronized with the details of the conference summary from the server 100 (operation S5730).
The user terminal 200 outputs the received video call image, conference summary, and document associated with each user (operation S5740).
The method of providing a collaboration service or the method of receiving a collaboration service according to an exemplary embodiment may be embodied as a program executable on a computer, and implemented on a general-purpose digital computer with one or more processors that run the program from a computer-readable recording medium. Examples of the computer-readable recording medium include magnetic storage media (e.g., floppy disks, hard disks, etc.) and optical recording media (e.g., CD-ROMs or DVDs).
While one or more exemplary embodiments have been described with reference to the accompanying drawings, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the inventive concept as defined by the following claims. Accordingly, the exemplary embodiments described herein should be considered in a descriptive sense only and not for purposes of limitation. The scope of the inventive concept is defined not by the detailed description of the exemplary embodiments but by the appended claims, and all differences within the scope of the appended claims and their equivalents will be construed as being included in the inventive concept.

Claims (13)

1. A server for providing collaboration services, the server comprising:
a memory configured to store computer-executable instructions; and
a processor configured to process the computer-executable instructions to:
providing a screen including a first area displaying a video call image of the video and a second area displaying an editable document, wherein the editable document is a single real-time collaboratively edited document formed by one or more edits made by each of the conference participants in real time during a conference, and the editable document is synchronized with the video call image,
receiving a selection of a point in time of the video,
displaying the editable document in a state corresponding to the selected point in time of the video,
receiving a selection of an edit of the editable document, and
reproducing the video from a point in time corresponding to the selected edit.
2. The server of claim 1, wherein the screen further comprises a third area displaying a text record of items, each item corresponding to a point in time of the video and to an edit of the editable document.
3. The server of claim 2, wherein the processor is further configured to process the computer-executable instructions to:
receiving a selection of an item from a text record of the item;
reproducing the video from a point of time corresponding to the selected item; and
providing the editable document in a state corresponding to the selected item.
4. The server of claim 2, wherein the processor is further configured to process the computer-executable instructions to:
generating the text record of items based on speech in the video.
5. The server of claim 2, wherein the processor is further configured to process the computer-executable instructions to:
generating the text record of items based on edits of the editable document.
6. The server of claim 1, wherein the editable document is displayed in a word processing program.
7. A method for providing collaboration services, the method comprising:
displaying a screen including a first region displaying a video call image of a video and a second region displaying an editable document, wherein the editable document is a single real-time collaboratively edited document formed by one or more edits made by each of the conference participants in real time during a conference, and the editable document is synchronized with the video call image;
receiving a selection of a point in time of the video;
displaying the editable document in a state corresponding to the selected point in time of the video;
receiving a selection of an edit of the editable document; and
reproducing the video from a point in time corresponding to the selected edit.
8. A terminal for providing a collaboration service, the terminal comprising:
a display configured to display a screen including a first region displaying a video call image of a video and a second region displaying an editable document, wherein the editable document is a single real-time collaboratively edited document formed by one or more edits made by each of the conference participants in real time during the conference, and the editable document is synchronized with the video call image;
an input device configured to receive a selection of a point in time of the video;
a controller configured to control the display to display the editable document in a state corresponding to the selected point in time of the video;
wherein the input device is further configured to receive a selection of an edit of the editable document, and
the controller is further configured to control the display to reproduce the video from a point in time corresponding to the selected edit.
9. The terminal of claim 8, wherein the screen further comprises a third area displaying a text record of items, each item corresponding to a point in time of the video and to an edit of the editable document.
10. The terminal of claim 9, wherein:
the input device is further configured to receive a selection of an item from a text recording of the item, and
the controller is further configured to:
controlling the display to reproduce the video from a point in time corresponding to the selected item, and
controlling the display to display the editable document in a state corresponding to the selected item.
11. The terminal of claim 9, wherein the text record of items is generated based on speech in the video.
12. The terminal of claim 9, wherein the text record of items is generated based on edits of the editable document.
13. The terminal of claim 8, wherein the editable document is displayed in a word processing program.
CN201510270712.XA 2014-05-23 2015-05-25 Server and method for providing collaboration service and user terminal for receiving collaboration service Active CN105100679B (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
KR10-2014-0062625 2014-05-23
KR20140062625 2014-05-23
US14/556,616 2014-12-01
US14/556,616 US20150341399A1 (en) 2014-05-23 2014-12-01 Server and method of providing collaboration services and user terminal for receiving collaboration services
KR10-2015-0018870 2015-02-06
KR1020150018870A KR102319417B1 (en) 2014-05-23 2015-02-06 Server and method for providing collaboration services and user terminal for receiving collaboration services
US14/705,147 2015-05-06
US14/705,147 US10277643B2 (en) 2014-05-23 2015-05-06 Server and method of providing collaboration services and user terminal for receiving collaboration services

Publications (2)

Publication Number Publication Date
CN105100679A CN105100679A (en) 2015-11-25
CN105100679B true CN105100679B (en) 2020-10-20

Family

ID=54580107

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510270712.XA Active CN105100679B (en) 2014-05-23 2015-05-25 Server and method for providing collaboration service and user terminal for receiving collaboration service

Country Status (1)

Country Link
CN (1) CN105100679B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107479811A (en) * 2016-06-08 2017-12-15 南京跃豚智能科技有限公司 The method and mobile terminal with managing meeting, recording meeting are established in ession for telecommunication
CN108377258A (en) * 2018-01-16 2018-08-07 广州市信富信息科技有限公司 A kind of long-range multiterminal collaboration method and system based on Cloud Server
CN110413966A (en) * 2018-04-27 2019-11-05 富士施乐株式会社 Document management apparatus and non-transitory computer-readable medium
CN109151372A (en) * 2018-10-23 2019-01-04 国家电网公司 A kind of video conferencing system based on Internet of Things
CN110446001A (en) * 2019-07-12 2019-11-12 视联动力信息技术股份有限公司 Video conference processing method, device, electronic equipment and medium based on view networking
CN110677614A (en) * 2019-10-15 2020-01-10 广州国音智能科技有限公司 Information processing method, device and computer readable storage medium
CN111953852B (en) * 2020-07-30 2021-12-21 北京声智科技有限公司 Call record generation method, device, terminal and storage medium
CN111818294A (en) * 2020-08-03 2020-10-23 上海依图信息技术有限公司 Method, medium and electronic device for multi-person conference real-time display combined with audio and video
CN111885345B (en) * 2020-08-14 2022-06-24 广州视睿电子科技有限公司 Teleconference implementation method, teleconference implementation device, terminal device and storage medium
CN113836871A (en) * 2021-08-20 2021-12-24 北京仿真中心 Collaborative discussion and collaborative editing integration method

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101330388A (2008-12-24) * 中国科学院自动化研究所 — Collaborative editing method based on a meta-synthesis deliberation hall

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020002562A1 (en) * 1995-11-03 2002-01-03 Thomas P. Moran Computer controlled display system using a graphical replay device to control playback of temporal data representing collaborative activities
JP4438217B2 (en) * 2000-11-10 2010-03-24 ソニー株式会社 Program additional data creation device, video program editing device, and program additional data creation screen display method
JP2005523555A (en) * 2002-04-16 2005-08-04 サムスン エレクトロニクス カンパニー リミテッド Information storage medium on which interactive content version information is recorded, its recording method and reproducing method
US20040107270A1 (en) * 2002-10-30 2004-06-03 Jamie Stephens Method and system for collaboration recording
US20060026502A1 (en) * 2004-07-28 2006-02-02 Koushik Dutta Document collaboration system
US8566301B2 (en) * 2006-05-01 2013-10-22 Steven L. Rueben Document revisions in a collaborative computing environment
US7937663B2 (en) * 2007-06-29 2011-05-03 Microsoft Corporation Integrated collaborative user interface for a document editor program
US20110125560A1 (en) * 2009-11-25 2011-05-26 Altus Learning Systems, Inc. Augmenting a synchronized media archive with additional media resources
US9055089B2 (en) * 2011-06-07 2015-06-09 International Business Machines Corporation Associating communications in collaboration sessions


Also Published As

Publication number Publication date
CN105100679A (en) 2015-11-25

Similar Documents

Publication Publication Date Title
US10810360B2 (en) Server and method of providing collaboration services and user terminal for receiving collaboration services
TWI691849B (en) Server and method of providing collaboration services and user terminal for receiving collaboration services
CN105100679B (en) Server and method for providing collaboration service and user terminal for receiving collaboration service
CN106716954B Method, system, and computer-readable memory for real-time sharing during a telephone call
CN110597774B (en) File sharing method, system, device, computing equipment and terminal equipment
US10431187B2 (en) Terminal apparatus, screen recording method, program, and information processing system
US9071615B2 (en) Shared space for communicating information
EP4130963A1 (en) Object dragging method and device
RU2700188C2 (en) Representing computing environment on multiple devices
US10798153B2 (en) Terminal apparatus and server and method of controlling the same
US20080184115A1 (en) Design and design methodology for creating an easy-to-use conference room system controller
US20070076245A1 (en) Information processing device, information processing system, and information processing method
WO2015079818A1 (en) Terminal device, screen sharing method, and screen sharing system
JP2012194625A (en) Document management device, document editing method and program
KR20140081220A (en) user terminal apparatus and contol method thereof
CN109983451A (en) Text file manager
US20240054455A1 (en) Systems and methods for multi-party distributed active co-browsing
JP5088153B2 (en) CONFERENCE TASK SUPPORT METHOD, CONFERENCE TASK SUPPORT SYSTEM, USER INTERFACE DEVICE, AND PROGRAM
WO2023246723A1 (en) Object access method and apparatus, and electronic device, storage medium and program product
KR101427308B1 System for self-publishing an online album
US20150067056A1 (en) Information processing system, information processing apparatus, and information processing method
WO2024087533A1 (en) Expression image sharing method and apparatus, computer device, and storage medium
KR101648807B1 (en) Method and system for sharing digital contents using user terminal
US20230353802A1 (en) Systems and methods for multi-party distributed active co-browsing of video-based content
McEwan Community Bar: Designing for informal awareness and casual interaction

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant