WO2016166955A1 - Image processing device, image distribution system, and image processing method - Google Patents

Image processing device, image distribution system, and image processing method

Info

Publication number
WO2016166955A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic information
image
video
operation event
information terminal
Application number
PCT/JP2016/001937
Other languages
French (fr)
Inventor
Yoshihiko Shimohira
Kiyoshi Kasatani
Original Assignee
Ricoh Company, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Priority claimed from JP2016026034A external-priority patent/JP2016224907A/en
Application filed by Ricoh Company, Ltd. filed Critical Ricoh Company, Ltd.
Publication of WO2016166955A1 publication Critical patent/WO2016166955A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/27Server based end-user applications
    • H04N21/274Storing end-user multimedia data in response to end-user request, e.g. network recorder
    • H04N21/2743Video hosting of uploaded data from client

Definitions

  • the present invention relates to image processing devices, image distribution systems, and image processing methods.
  • the interactive white board has a configuration combining a display made of a flat panel such as a liquid crystal panel, a touch panel serving as a coordinate detection device that detects coordinates of a position on a display surface of the display where an indicator for input such as an input pen or a finger touches, and a control device that causes an image including a character, a number, or a figure written on the display surface of the display to be displayed based on coordinate data output from the coordinate detection device.
  • conference material and/or the like can be enlarged and displayed on the display, and writing can be performed on the display surface of the display with the indicator such as an input pen or a finger.
  • the interactive white board can be coupled to an electronic information terminal such as a tablet computer via a wireless local area network (LAN) or the like.
  • An image on the display of the interactive white board can be displayed on the electronic information terminal.
  • a writing operation and the like can be performed on the image also with the electronic information terminal.
  • An operation event performed on the electronic information terminal is transmitted to the interactive white board, processed as an operation event of the interactive white board, and displayed on the display.
  • Patent Literature 1 discloses an image distribution system in which, in transmitting video from a server to an electronic information terminal, the server draws a plurality of images on a browser as hyper text markup language (HTML) 5 content, and distributes the drawn images to the electronic information terminal as compressed data.
  • in Patent Literature 1, however, specific processing is not disclosed for each operation event in a case in which operation events are transmitted from a plurality of electronic information terminals for the image displayed on each electronic information terminal.
  • the present invention is made in view of such a situation, and has an object to provide an image processing device, an image distribution system, and an image processing method that can appropriately process operation events from a plurality of electronic information terminals.
  • an image processing device is coupled to electronic information terminals via a communication network.
  • the image processing device includes: an image acquisition unit, a video generation unit, a distribution unit, a communication unit, and an electronic information terminal identification unit.
  • the image acquisition unit is configured to acquire an image.
  • the video generation unit is configured to generate video in a predetermined format from the image acquired by the image acquisition unit.
  • the distribution unit is configured to distribute the video generated by the video generation unit to the electronic information terminals.
  • the communication unit is configured to receive, from each of the electronic information terminals, an operation event related to an operation performed on the video distributed to the electronic information terminals.
  • the electronic information terminal identification unit is configured to identify an electronic information terminal on which an operation event is performed, based on identification information added to the operation event.
  • operation events from a plurality of electronic information terminals can be appropriately processed by an image processing device.
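As a concrete illustration of the units enumerated above, the composition might be sketched as follows; the class and method names are assumptions chosen for illustration only and do not appear in the patent.

```python
# Illustrative sketch of the claimed unit composition; all names are
# hypothetical and not taken from the patent text.
class ImageProcessingDevice:
    def __init__(self):
        self.frames = []     # images acquired so far
        self.terminals = {}  # terminal_id -> per-terminal state

    def acquire_image(self, image):
        """Image acquisition unit: take in one frame image."""
        self.frames.append(image)

    def generate_video(self):
        """Video generation unit: wrap acquired images in a stand-in
        container for video in a predetermined format."""
        return {"format": "H.264/AVC", "frames": list(self.frames)}

    def distribute(self, video):
        """Distribution unit: deliver the same video to every terminal."""
        return {tid: video for tid in self.terminals}

    def receive_operation_event(self, event):
        """Communication unit + electronic information terminal
        identification unit: identify the sender from the identification
        information added to the event."""
        return self.terminals.get(event["terminal_id"])
```

Note that identification happens purely from the identification information carried by the event itself, so the device never needs to infer the sender from the transport connection.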
  • Fig. 1 is a block diagram illustrating a configuration of an image distribution system according to a first embodiment of the present invention.
  • Fig. 2 is a configuration diagram illustrating a hardware configuration of an application server.
  • Fig. 3 is a functional block diagram of the application server used for the image distribution system.
  • Fig. 4 is a functional block diagram of an electronic information terminal used for the image distribution system.
  • Fig. 5 is a diagram illustrating a procedure of preparation for receiving video in the image distribution system.
  • Fig. 6 is a diagram illustrating a procedure for distributing compressed video in the image distribution system.
  • Fig. 7 is a diagram illustrating a procedure for inputting an operation event in the image distribution system.
  • FIG. 8 is a block diagram illustrating a configuration of an electronic information board system according to a second embodiment of the present invention.
  • Fig. 9 is a functional block diagram of an electronic information board used for the electronic information board system.
  • Fig. 10 is a diagram illustrating a procedure for distributing compressed video in the electronic information board system.
  • Fig. 11 is a diagram illustrating a procedure for inputting an operation event in the electronic information board system.
  • FIG. 1 is a block diagram illustrating a configuration of the image distribution system according to the first embodiment of the present invention.
  • the image distribution system according to this embodiment includes an application server 10 and a plurality of electronic information terminals 20 (20-1, 20-2, ..., 20-n) coupled to the application server 10 via a wireless local area network (LAN).
  • the wireless LAN is used for an intra-company network, for example.
  • the application server 10 can be coupled to the electronic information terminal 20 via various wired or wireless communication networks such as a wired LAN and the Internet, not limited to the wireless LAN.
  • Fig. 2 is a configuration diagram illustrating a hardware configuration of the application server according to the first embodiment of the present invention.
  • a central processing unit (CPU) 501, a read only memory (ROM) 502, a random access memory (RAM) 503, a hard disk drive (HDD) 504, a hard disk controller (HDC) 505, a media drive 507, a display (image display device) 508, an interface (I/F) 509, a keyboard 511, a mouse 512, a microphone 513, a speaker 514, and a graphics processing unit (GPU) 515 are coupled to each other via an extension bus line 520 to construct the application server 10.
  • the CPU 501 controls the entire operation of the application server 10.
  • the ROM 502 stores therein a computer program used for driving the CPU 501, such as an initial program loader (IPL).
  • the RAM 503 is used as a work area for the CPU 501.
  • the HDD 504 stores therein various pieces of data such as a computer program.
  • the HDC 505 controls reading or writing of various pieces of data from/to the HDD 504 in accordance with control performed by the CPU 501.
  • the media drive 507 controls reading or writing (storing) of data from/to a recording medium 506 such as a flash memory.
  • the display 508 displays various pieces of information.
  • the I/F 509 is used for transmitting data via a communication network and coupling a dongle.
  • a ROM 516 storing therein a computer program used for driving the GPU 515 and a RAM 517 used as a work area for the GPU 515 are coupled to the GPU 515.
  • the extension bus line 520 includes an address bus, a data bus, and/or the like for electrically coupling the components described above.
  • the application server 10 illustrated in Fig. 1 implements an application 100 and a video distribution service 200 that distributes a plurality of images generated by the application 100 as video.
  • the electronic information terminal 20 is a notebook-type personal computer, a tablet computer, or the like, and operates based on various operating systems (OSs) such as iOS, Android (registered trademark), Windows (registered trademark), and macOS (registered trademark).
  • the application 100 sends out an image to be distributed to the electronic information terminal 20, to the video distribution service 200 (Step S1).
  • the video distribution service 200 transmits a plurality of images as a drawing result of the application 100 to the electronic information terminal 20 (Step S2).
  • the images are distributed from one application server 10 to a plurality of (n) electronic information terminals 20-1, 20-2, ..., 20-n at the same time.
  • Each electronic information terminal 20 displays the distributed video, and performs drawing on the video in response to a touch operation or a mouse operation performed on the displayed video.
  • the operation events such as the touch operation or the mouse operation are transmitted to the video distribution service 200 of the application server 10 (Step S3).
  • the video distribution service 200 that has received the operation event from the electronic information terminal 20 sends out the operation event to the application 100 (Step S4), and the application 100 that has received the operation event performs processing in accordance with the operation. In this way, a result of processing of the operation event is distributed as video, so that a user using a communication terminal can perform an interactive operation.
  • a distribution service input/output unit 210 (refer to Fig. 3) functions as an identification information adder, and adds a terminal ID (for example, 1, 2, 3, ..., n) for identifying the electronic information terminal 20 to the operation event to be sent out to the application 100. Due to this, the application 100 can determine the electronic information terminal 20 from which the operation event is transmitted.
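The terminal-ID tagging performed by the identification information adder can be sketched as follows; the function and field names are hypothetical, chosen only to illustrate how the application tells events from different terminals apart.

```python
# Hypothetical sketch of the identification information adder: the
# distribution service input/output unit tags each incoming operation
# event with the terminal ID of the terminal it arrived from.
def add_terminal_id(event, terminal_id):
    tagged = dict(event)            # do not mutate the incoming event
    tagged["terminal_id"] = terminal_id
    return tagged

# The application can then determine which terminal each event came from.
events = [
    add_terminal_id({"type": "touch", "x": 10, "y": 20}, terminal_id=1),
    add_terminal_id({"type": "mouse", "x": 5, "y": 7}, terminal_id=2),
]
by_terminal = {e["terminal_id"]: e for e in events}
```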
  • Fig. 3 is a functional block diagram of the image processing device used for the image distribution system.
  • the application server 10 includes the application 100 and the video distribution service 200.
  • the application 100 provides the user with a function in cooperation with the video distribution service 200, and generates an image to be distributed.
  • the application 100 includes a function execution unit 110, an application input/output unit 120, and a frame output unit 130.
  • the video distribution service 200 is a general term for modules of the application server 10 in a video distribution system.
  • the video distribution service 200 includes the distribution service input/output unit 210, a distribution control unit 220, a distribution information management unit 230, a communication control unit 240, an encoder control unit 250, a video encoder unit 260, and a capture unit 270.
  • the distribution control unit 220, the distribution information management unit 230, and the communication control unit 240 constitute a distribution module 280, and the encoder control unit 250 and the video encoder unit 260 constitute a codec module 290.
  • the function execution unit 110 is a module that executes a function specific to the application server 10.
  • the application input/output unit 120 receives the operation event that is the data output from the distribution service input/output unit 210.
  • the application input/output unit 120 converts the received data into a format that can be processed by the function execution unit 110 and sends out the converted data.
  • the application input/output unit 120 transmits data regarding service start, service stop, and/or the like to the distribution service input/output unit 210.
  • the frame output unit 130 functions as a video generation unit and outputs images to be distributed as video. For example, the frame output unit 130 outputs each image as an RGB bitmap, which is then compressed into video in H.264/AVC format.
  • the distribution service input/output unit 210 receives data regarding service start, service stop, and/or the like output from the application input/output unit 120.
  • the distribution service input/output unit 210 transmits data such as an operation event to the application input/output unit 120.
  • the distribution information management unit 230 manages video encode setting related to distribution, and information of the electronic information terminal 20 being coupled.
  • the communication control unit 240 functions as a communication unit for communicating with the electronic information terminal 20.
  • the distribution control unit 220 controls the entire distribution, activates the codec module 290, and gives an end instruction.
  • the encoder control unit 250 acquires a frame image at a frame rate designated by the distribution control unit 220, and sends out the acquired frame image to the video encoder unit 260.
  • the video encoder unit 260 generates compressed data from the acquired image.
  • the capture unit 270 functions as an image acquisition unit, and acquires the frame image output from the frame output unit 130 of the application 100.
  • the electronic information terminal 20 basically has a hardware configuration similar to the application server illustrated in Fig. 2, so that the description of the hardware is omitted.
  • Fig. 4 is a functional block diagram of the electronic information terminal used for the image distribution system.
  • the electronic information terminal 20 includes a graphical user interface (GUI) unit 310, a distribution information management unit 320, an overall control unit 330, a communication control unit 340, a reproduction control unit 350, and a video decoder unit 360.
  • the GUI unit 310 displays a user interface (UI).
  • the overall control unit 330 performs overall control such as start and disconnection instructions of communication.
  • the distribution information management unit 320 holds information regarding distribution, for example, the video encode setting.
  • the communication control unit 340 controls communication with the application server 10, and transmits and receives data to/from the video distribution service 200 of the application server 10.
  • the reproduction control unit 350 feeds the compressed data to the video decoder unit 360.
  • the video decoder unit 360 decodes the compressed data to display the decoded data.
  • Fig. 5 is a diagram illustrating a procedure of preparation for receiving video in the image distribution system.
  • Fig. 5 illustrates a processing procedure from when the electronic information terminal 20 is coupled to the video distribution service 200 of the application server 10 until preparation for receiving video is completed.
  • the communication control unit 340 of the electronic information terminal 20 transmits an acquisition request for an encode parameter to the communication control unit 240 of the video distribution service 200 (Step S10).
  • when receiving the acquisition request for an encode parameter, the communication control unit 240 of the video distribution service 200 acquires the encode parameter from the distribution information management unit 230, and transmits the encode parameter as a response to the electronic information terminal 20 (Step S20). Subsequently, the communication control unit 340 of the electronic information terminal 20 requests the communication control unit 240 of the video distribution service 200 to establish a session for upload (Step S30). When receiving the request to establish the session for upload, the communication control unit 240 of the video distribution service 200 transmits a response on completion of the establishment of the session for upload (Step S40). In this case, the session for upload is a network connection for transmitting the operation event and/or the like from the electronic information terminal 20 to the video distribution service 200, and maintains a coupled state at all times.
  • the communication control unit 340 of the electronic information terminal 20 requests the communication control unit 240 of the video distribution service 200 to establish a session for download (Step S50).
  • the communication control unit 240 of the video distribution service 200 transmits a response on completion of the establishment of the session for download (Step S60).
  • the session for download is a network connection for transmitting video and/or the like from the video distribution service 200 to the electronic information terminal 20, and maintains a coupled state at all times.
  • when the session for download is completely established, the preparation for receiving video performed by the electronic information terminal 20 is completed.
  • the session for download and the session for upload are separately established in this embodiment, but can be implemented without being separated.
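The preparation procedure above (Steps S10 to S60) can be modeled as a toy handshake; all class names, method names, and return values here are assumptions for illustration, not part of the patent.

```python
# Toy model of the Fig. 5 preparation handshake: the terminal fetches
# the encode parameters, then establishes an upload session (for
# operation events) and a download session (for video).
class VideoDistributionServiceStub:
    def __init__(self, encode_params):
        self.encode_params = encode_params
        self.upload_sessions = set()    # always-coupled upload links
        self.download_sessions = set()  # always-coupled download links

    def get_encode_params(self):        # Steps S10/S20
        return self.encode_params

    def open_upload_session(self, tid):    # Steps S30/S40
        self.upload_sessions.add(tid)
        return "upload-ok"

    def open_download_session(self, tid):  # Steps S50/S60
        self.download_sessions.add(tid)
        return "download-ok"

def prepare_terminal(service, tid):
    """Run the three-step preparation for one terminal."""
    params = service.get_encode_params()
    assert service.open_upload_session(tid) == "upload-ok"
    assert service.open_download_session(tid) == "download-ok"
    return params  # terminal is now ready to receive video
```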
  • Fig. 6 is a diagram illustrating the procedure for distributing the compressed data in the image distribution system.
  • the frame output unit 130 of the application 100 sends out the image to be distributed as video to the capture unit 270 of the video distribution service 200 (Step S100).
  • for example, the frame output unit 130 directly sends out the image to the capture unit 270.
  • alternatively, a buffer for storing an image, shared by the frame output unit 130 and the capture unit 270, may be disposed; the frame output unit 130 may write the image to the buffer, and the capture unit 270 may acquire the image from the buffer.
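The shared image buffer mentioned above can be sketched with a simple lock-protected holder; this is an illustrative assumption, not the patent's implementation.

```python
import threading

# Sketch of a buffer shared by the frame output unit (writer) and the
# capture unit (reader); the lock keeps each access atomic.
class SharedFrameBuffer:
    def __init__(self):
        self._lock = threading.Lock()
        self._frame = None  # latest frame image, or None before first write

    def write(self, frame):
        """Called by the frame output unit to publish the latest frame."""
        with self._lock:
            self._frame = frame

    def read(self):
        """Called by the capture unit to acquire the latest frame."""
        with self._lock:
            return self._frame
```

Because only the latest frame is kept, a slow reader simply skips frames instead of falling behind, which suits a live video feed.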
  • the image acquired by the capture unit 270 of the video distribution service 200 is encoded into compressed frame data by the video encoder unit 260 (Step S110).
  • the video encoder unit 260 outputs the compressed frame data to the communication control unit 240, and the communication control unit 240 transmits the compressed data to the electronic information terminal 20 (Step S120).
  • the compressed data is transmitted to all of the electronic information terminals 20 being coupled, so that the electronic information terminals 20 being coupled can share the same video.
  • Steps S100, S110, and S120 described above represent a series of processing for distributing one frame of the video, and this series of processing is repeated so that the compressed data can be continuously distributed (Steps S130, S140, S150).
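The per-frame pipeline of Fig. 6 can be sketched as follows; the encoder is a stand-in placeholder and all names are illustrative assumptions.

```python
# Sketch of one frame of the Fig. 6 pipeline: capture (S100), encode
# (S110), and transmit the compressed data to every coupled terminal
# (S120); repeating the three steps streams the video continuously.
def encode(frame):
    # Placeholder for the video encoder unit; a real system would emit
    # H.264/AVC-compressed frame data here.
    return b"compressed:" + frame

def distribute_frame(frame, terminals):
    data = encode(frame)
    for inbox in terminals.values():  # same data to all coupled terminals
        inbox.append(data)
    return data

terminals = {1: [], 2: []}            # two coupled terminals
for frame in [b"f0", b"f1", b"f2"]:   # S100 -> S110 -> S120, repeated
    distribute_frame(frame, terminals)
```

Sending identical compressed data to every terminal is what lets all coupled terminals share the same video.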
  • Fig. 7 is a diagram illustrating a procedure for inputting the operation event in the image distribution system.
  • Fig. 7 illustrates a processing procedure in a case in which the user of the electronic information terminal 20 performs a mouse operation on the video displayed on the electronic information terminal 20.
  • the electronic information terminal 20 transmits operation event information including a type of the operation event, coordinates, and/or the like to the communication control unit 240 of the video distribution service 200 (Step S200).
  • the communication control unit 240 sends out the received operation event information to the distribution service input/output unit 210, and the distribution service input/output unit 210 adds a specific terminal ID (for example, 1, 2, ..., n) for identifying the electronic information terminal 20 to the operation event information and sends out the operation event information to the application input/output unit 120 (Step S210).
  • the application input/output unit 120 converts the operation event information into a function execution command in a format that can be executed by the function execution unit 110, and sends out the function execution command to the function execution unit 110 (Step S220).
  • the application input/output unit 120 processes the operation event information for each terminal ID added to the operation event information, and generates the function execution command for each terminal ID. That is, the application input/output unit 120 functions as an electronic information terminal identification unit for identifying the electronic information terminal.
  • the function execution unit 110 executes the given function execution command (Step S230). Accordingly, operation events performed on a plurality of electronic information terminals 20 (20-1 to 20-n) are executed by the application 100 while being divided into operation events of respective electronic terminals. With the image distribution system according to this embodiment, the operation events from a plurality of electronic information terminals can be divided and appropriately processed.
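The per-terminal division of operation events described above (Steps S210 to S230) can be sketched as a simple demultiplexing step; the event and command shapes used here are hypothetical.

```python
from collections import defaultdict

# Sketch of the application input/output unit's role as an electronic
# information terminal identification unit: group incoming operation
# events by the terminal ID attached to each one, and emit function
# execution commands per terminal so operations are never mixed up.
def to_commands(events):
    per_terminal = defaultdict(list)
    for event in events:
        per_terminal[event["terminal_id"]].append(
            {"command": "draw", "at": (event["x"], event["y"])}
        )
    return dict(per_terminal)
```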
  • the electronic information terminal 20 adds a terminal ID of the electronic information terminal 20 itself (for example, 1, 2, 3, ..., n) as identification information to the operation event. That is, as illustrated in Fig. 4 with a virtual line, the electronic information terminal 20 includes an identification information adder 370 serving as the identification information adder. The terminal ID added by the identification information adder 370 is transmitted together with the operation event to the application server 10. In the application server 10, the distribution service input/output unit 210 (refer to Fig. 3) does not add the terminal ID.
  • the application input/output unit 120 of the application server 10 determines which electronic information terminal 20 executes the operation event based on the terminal ID from the electronic information terminal 20. Subsequent processing is the same as of the application server 10 according to the first embodiment. Also in this modification, the operation events performed on a plurality of electronic information terminals 20 (20-1 to 20-n) are executed by the application 100 while being divided into operation events of respective electronic terminals. Accordingly, the operation events from a plurality of electronic information terminals can be divided and appropriately processed.
  • the image processing device is applied to an electronic information board system.
  • the electronic information board includes a display such as a liquid crystal panel, a touch panel serving as a coordinate detection device that detects coordinates of a position on a display surface of the display where an indicator for input such as an input pen or a finger touches, and a control device that causes an image, whose examples include characters, numbers, and figures, written on the display surface of the display to be displayed based on coordinate data output from the coordinate detection device.
  • conference material and/or the like can be enlarged and displayed on the display, and writing can be performed on the display surface of the display with the indicator such as an input pen or a finger.
  • Fig. 8 is a block diagram illustrating a configuration of the electronic information board system according to the second embodiment of the present invention.
  • the electronic information board system according to this embodiment includes an interactive white board 40 serving as the electronic information board and a plurality of electronic information terminals 20 (20-1, 20-2, ..., 20-n) coupled to the interactive white board 40 via a wireless local area network (LAN).
  • the wireless LAN is used for an intra-company network, for example.
  • the interactive white board 40 can be coupled to the electronic information terminal 20 via various wired or wireless communication networks such as a wired LAN and the Internet, not limited to the wireless LAN.
  • the interactive white board 40 includes a touch panel 519 illustrated in Fig. 2 with a dashed line in addition to the hardware configuration of the application server 10 illustrated in Fig. 2.
  • with the touch panel 519, coordinates of a position on the display 508 where an input pen or a finger touches are detected.
  • in the interactive white board 40, the display 508 is a large-screen display such as a liquid crystal display.
  • when the CPU 501 executes predetermined software, a white board application 400 and the video distribution service 200 are implemented.
  • the electronic information terminal 20 is a notebook-type personal computer, a tablet computer, or the like, and operates based on various operating systems (OSs) such as iOS, Android (registered trademark), Windows (registered trademark), and macOS (registered trademark).
  • the configuration of the electronic information terminal 20 is the same as in the first embodiment (refer to Fig. 4).
  • the white board application 400 detects, through the touch panel, a stroke written on the display 508 with an input pen or a finger, or an operation performed on a page.
  • the white board application 400 draws a stroke 51 or the like on the display 508 based on the operation event detected through the touch panel.
  • the white board application 400 outputs an image (white board image) on the display 508 to be distributed to the electronic information terminal 20, and transmits a drawing result to the video distribution service 200 (Step S11).
  • the video distribution service 200 converts a plurality of images as the drawing result of the white board application 400 into compressed data, and distributes the compressed data to each electronic information terminal 20 (Step S12).
  • the images are distributed from one interactive white board 40 to a plurality of (n) electronic information terminals 20-1, 20-2, ..., 20-n at the same time.
  • the electronic information terminal 20 displays the acquired white board video; a touch operation or a mouse operation can be performed on the displayed video, and the electronic information terminal 20 can thereby draw a stroke or the like on the displayed video (operation event).
  • the operation event is transmitted to the video distribution service 200 of the interactive white board 40 (Step S13).
  • the video distribution service 200 that has received the operation event sends out the operation event to the white board application 400 (Step S14).
  • the white board application 400 that has received the operation event performs processing corresponding to the operation, that is, drawing of the stroke 51 or the like on the display 508.
  • the operation event input to the white board application 400 is accompanied with coordinate information.
  • the function execution unit 410 generates a stroke from the coordinate information to perform drawing on the image on the display 508. Accordingly, the electronic information terminal 20 can draw a stroke on the display 508 of the interactive white board 40 through remote control.
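Generating a stroke from coordinate information can be illustrated with a toy polyline builder; the function and field names are assumptions.

```python
# Sketch of stroke generation: successive coordinates carried by the
# operation events from one terminal are joined into a single polyline
# that the white board application draws on the display.
def build_stroke(coordinate_events):
    return [(e["x"], e["y"]) for e in coordinate_events]

moves = [{"x": 0, "y": 0}, {"x": 3, "y": 4}, {"x": 6, "y": 8}]
stroke = build_stroke(moves)  # polyline drawn as one stroke
```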
  • a result of processing of the operation event is distributed as video to the electronic information terminal 20 again, so that the user using the electronic information terminal 20 can perform an interactive operation.
  • the distribution service input/output unit 210 adds the terminal ID for identifying the electronic information terminal 20 to the operation event and sends out the operation event to the white board application 400, so that the white board application 400 can determine the electronic information terminal 20 from which the operation event is transmitted.
  • operations in a plurality of electronic information terminals 20 are prevented from being mixed up.
  • Fig. 9 is a functional block diagram of the electronic information board.
  • the interactive white board 40 includes the white board application 400 and the video distribution service 200 as described above.
  • the white board application 400 provides the user of the interactive white board 40 with a function as a white board in cooperation with the video distribution service 200.
  • the white board application 400 implements the function of the interactive white board 40, and generates an image to be distributed to the electronic information terminal 20.
  • the white board application 400 includes the function execution unit 410, an application input/output unit 420, a frame output unit 430, an image display control unit 440, and a touch panel control unit 450.
  • the video distribution service 200 is a general term for modules of the interactive white board 40 in the video distribution system.
  • the video distribution service 200 has the same configuration as the video distribution service 200 according to the first embodiment. That is, the video distribution service 200 includes the distribution service input/output unit 210, the distribution control unit 220, the distribution information management unit 230, the communication control unit 240, the encoder control unit 250, the video encoder unit 260, and the capture unit 270.
  • the distribution control unit 220, the distribution information management unit 230, and the communication control unit 240 constitute the distribution module 280, and the encoder control unit 250 and the video encoder unit 260 constitute the codec module 290.
  • the function execution unit 410 is a module that executes a function specific to the interactive white board. That is, the function execution unit 410 draws, on the display 508, an image to be displayed or a stroke based on a drawing command.
  • the application input/output unit 420 receives the data output from the distribution service input/output unit 210, for example, the operation event.
  • the application input/output unit 420 converts the received data into a format that can be processed by the function execution unit 410, and sends out the converted data.
  • the application input/output unit 420 transmits data regarding service start, service stop, and/or the like to the distribution service input/output unit 210.
  • the frame output unit 430 outputs an image to be distributed as video.
  • for example, the frame output unit 430 outputs an image, such as an RGB bitmap, to be compressed into H.264/AVC format data.
  • the image display control unit 440 receives a drawing signal from the function execution unit 410, and controls image display of the display 508.
  • the touch panel control unit 450 receives a detection signal from the touch panel, and outputs the detection signal to the function execution unit 410.
  • the following describes processing in the electronic information board system according to this embodiment.
  • the operation is performed in the following order: preparation for receiving video, distribution of the compressed data, and input of the operation event.
  • a procedure of preparation for receiving video in the electronic information board system is the same as the procedure according to the first embodiment illustrated in Fig. 5.
  • Fig. 10 is a diagram illustrating a procedure for distributing the compressed data in the electronic information board system.
  • the frame output unit 430 of the white board application 400 sends out the image to be distributed as video to the capture unit 270 of the video distribution service 200 (Step S300).
  • the frame output unit 430 may directly transmit the image to the capture unit 270.
  • alternatively, a buffer for storing an image may be shared by the frame output unit 430 and the capture unit 270; the frame output unit 430 writes the image to the buffer, and the capture unit 270 acquires the image from the buffer.
  • the image acquired by the capture unit 270 of the video distribution service 200 is encoded into compressed frame data by the video encoder unit 260 (Step S310).
  • the video encoder unit 260 passes the compressed frame data to the communication control unit 240, and the communication control unit 240 transmits the compressed data to the electronic information terminal 20 (Step S320).
  • the compressed data is transmitted to all of the coupled electronic information terminals 20, so that the coupled electronic information terminals 20 can share the same white board video.
  • Steps S300, S310, and S320 described above represent a series of processing for distributing one frame of the video, and the series of processing is repeated so that the compressed data generated from the white board image can be continuously distributed (Steps S330, S340, S350).
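The repeated sequence of Steps S300 to S350 can be sketched as follows. This is an illustrative sketch only, not part of the disclosed implementation: the `capture`, `encoder`, and `comm` collaborators stand in for the capture unit (270), the video encoder unit (260), and the communication control unit (240), and all class, method, and parameter names are hypothetical.

```python
import time

class FrameDistributor:
    """Illustrative sketch of the per-frame distribution loop (Steps S300-S350)."""

    def __init__(self, capture, encoder, comm, frame_rate=15):
        self.capture = capture      # stands in for the capture unit (270)
        self.encoder = encoder      # stands in for the video encoder unit (260)
        self.comm = comm            # stands in for the communication control unit (240)
        self.interval = 1.0 / frame_rate

    def distribute_one_frame(self):
        image = self.capture.acquire()        # Step S300: obtain the white board image
        frame = self.encoder.encode(image)    # Step S310: compress one frame
        for terminal in self.comm.coupled_terminals():
            self.comm.send(terminal, frame)   # Step S320: same data to every coupled terminal

    def run(self, stop_event):
        # Steps S330-S350: repeat the sequence so the white board video
        # is distributed continuously at the configured frame rate.
        while not stop_event.is_set():
            self.distribute_one_frame()
            time.sleep(self.interval)
```

Because every coupled terminal receives the same compressed frame, all terminals share the same white board video, as described above.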
  • the white board image distributed from the interactive white board 40 to each electronic information terminal 20 is not the screen itself displayed on the display 508 by the white board application 400, but a canvas image in which a stroke or the like is drawn.
  • the white board image does not include UI elements for operating the white board, such as a pen mode change button or an eraser button. This setting can be changed as needed.
  • an arbitrary image output from the white board application 400 can be distributed, so that an image not including a UI can be distributed. In this way, this embodiment can prevent an image unneeded by the user of the electronic information terminal 20 from being displayed.
  • Fig. 11 is a diagram illustrating the procedure for inputting the operation event in the electronic information board system.
  • Fig. 11 illustrates a processing procedure in a case in which the user of the electronic information terminal 20 performs a mouse operation to the white board video displayed on the electronic information terminal 20.
  • the electronic information terminal 20 transmits operation event information including a type of the operation event, coordinates, and/or the like to the communication control unit 240 of the video distribution service 200 (Step S400).
  • the communication control unit 240 sends out the received operation event information to the distribution service input/output unit 210, and the distribution service input/output unit 210 adds the terminal ID (for example, 1, 2, ..., n) for identifying the electronic information terminal 20 to the operation event information and sends out the operation event information to the application input/output unit 420. Subsequently, when receiving the operation event information, the application input/output unit 420 converts the operation event information into a function execution command in a format that can be executed by the function execution unit 410, and sends out the function execution command to the function execution unit 410 (Step S420).
  • the application input/output unit 420 processes the operation event information for each terminal ID added to the operation event information, and generates the function execution command for each terminal ID. That is, the application input/output unit 420 functions as an electronic information terminal identification unit for identifying the electronic information terminal.
  • the function execution unit 410 executes the given function execution command (Step S430). Accordingly, operation events performed on a plurality of electronic information terminals 20 (20-1 to 20-n) are executed by the white board application 400 while being divided into operation events of respective electronic terminals.
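The per-terminal handling in Steps S420 and S430 can be sketched as follows. This is an illustrative sketch only: the dictionary keys and the command format are hypothetical stand-ins for operation event information carrying an added terminal ID, and `execute_command` stands in for the function execution unit (410).

```python
from collections import defaultdict

def dispatch_operation_events(events, execute_command):
    """Illustrative sketch: group operation event information by the added
    terminal ID and issue a function execution command per terminal, so that
    strokes from different users are not mixed up. Field names are hypothetical."""
    per_terminal = defaultdict(list)
    for event in events:
        per_terminal[event["terminal_id"]].append(event)

    for terminal_id, terminal_events in per_terminal.items():
        for event in terminal_events:
            # Step S420: convert the event information into an executable command.
            command = {"terminal_id": terminal_id,
                       "type": event["type"],
                       "point": (event["x"], event["y"])}
            # Step S430: the function execution unit runs the command.
            execute_command(command)
```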
  • the stroke 51 is displayed on the display 508 of the interactive white board 40 in which the white board application 400 operates, and the displayed result is displayed on the electronic information terminal 20 as video.
  • the terminal ID is added to the operation event information, so that the application input/output unit can generate a stroke drawing command for each electronic information terminal 20, that is, for each user. Accordingly, strokes are not mixed up even when a plurality of users transmit operation events at the same time, and the strokes intended by all the users that have input the operation events are drawn.
  • the video of the white board on which the strokes are displayed is distributed to the other electronic information terminals 20, so that material can be displayed on the interactive white board 40, a plurality of users can share the video, and the users can perform writing on the material.
  • the electronic information terminal 20 adds a terminal ID of the electronic information terminal 20 itself (for example, 1, 2, 3, ..., n) as identification information to the operation event. That is, as illustrated in Fig. 4 with a virtual line, the electronic information terminal 20 includes the identification information adder 370 serving as the identification information adder. The terminal ID added by the identification information adder 370 is transmitted together with the operation event to the interactive white board 40. In the interactive white board 40, the distribution service input/output unit 210 (refer to Fig. 9) does not add the terminal ID. The application input/output unit 420 of the interactive white board 40 determines which electronic information terminal 20 executes the operation event based on the terminal ID from the electronic information terminal 20. Subsequent processing is the same as that of the interactive white board 40 according to the second embodiment.
  • This aspect has a feature that the application server 10 coupled to the electronic information terminals 20 via a wireless LAN is provided.
  • the application server 10 includes the capture unit 270 that acquires an image, the frame output unit 130 that generates video in a predetermined format from the image acquired by the capture unit 270, the distribution control unit 220 that distributes the video generated by the frame output unit 130 to the electronic information terminals 20, the communication control unit 240 that receives the operation event related to the operation performed to the video distributed to the electronic information terminals 20, from each electronic information terminal 20, and the application input/output unit 120 that identifies an electronic information terminal 20 on which an operation event is performed based on the identification information added to the operation event.
  • the application input/output unit 120 determines the electronic information terminal 20 from which the operation event is transmitted based on the identification information. Accordingly, operation events from a plurality of electronic information terminals can be appropriately processed on the display.
  • This aspect has a feature that the application input/output unit 120 that processes each operation event for each electronic information terminal based on the identification information is disposed. According to this aspect, each operation event is processed for each electronic information terminal. Accordingly, operation events from a plurality of electronic information terminals can be divided and processed.
  • This aspect has a feature that the distribution service input/output unit 210 that adds the terminal ID indicating which electronic information terminal 20 has performed the operation, to the operation event, is disposed. According to this aspect, the distribution service input/output unit 210 can add the terminal ID to the operation event. Accordingly, operation events from a plurality of electronic information terminals can be appropriately processed.
  • the application server 10 includes the capture unit 270 that acquires an image, the frame output unit 130 that generates video in a predetermined format from the image acquired by the capture unit 270, and the distribution control unit 220 that distributes the video generated by the frame output unit 130 to the electronic information terminals 20.
  • Each electronic information terminal 20 includes the communication control unit 240 that receives the operation event related to the operation performed to the video distributed to the electronic information terminals 20, from each electronic information terminal 20, and the application input/output unit 120 that identifies an electronic information terminal 20 on which an operation event is performed based on the identification information added to the operation event.
  • the application server 10 is coupled to the electronic information terminal 20 via a wireless LAN to construct the image distribution system. Accordingly, the application server 10 and the electronic information terminal 20 can share the processing based on the operation event on the electronic information terminal 20 and the application server 10.
  • each electronic information terminal 20 includes the identification information adder 370 that adds the terminal ID to the operation event.
  • the application server 10 is coupled to the electronic information terminal 20 via a wireless LAN to construct the electronic information board system, and the identification information adder 370 of the electronic information terminal 20 adds the terminal ID to the operation event. Accordingly, the application server 10 can perform processing while recognizing the electronic information terminal 20 from which the operation event is transmitted.
  • This aspect has a feature that an image processing method performed by the application server 10 coupled to the electronic information terminals 20 via a wireless LAN is provided.
  • the method includes acquiring an image, generating video in a predetermined format from the image acquired at the acquiring, distributing the video generated at the generating to the electronic information terminals 20, receiving the operation event related to the operation performed to the video distributed to the electronic information terminals 20, from each electronic information terminal 20, and identifying an electronic information terminal 20 on which an operation event is performed based on the identification information added to the operation event.
  • the application input/output unit 120 can recognize the electronic information terminal 20 on which the operation event is performed. Accordingly, operation events from a plurality of electronic information terminals can be appropriately processed.
  • 10 Application server (image processing device)
  • 20 Electronic information terminal
  • 100 Application
  • 110 Function execution unit
  • 120 Application input/output unit (electronic information terminal identification unit, event processing unit)
  • 130 Frame output unit
  • 200 Video distribution service
  • 210 Distribution service input/output unit (identification information adder)
  • 220 Distribution control unit
  • 230 Distribution information management unit
  • 240 Communication control unit (communication unit)
  • 250 Encoder control unit
  • 260 Video encoder unit (video generation unit)
  • 270 Capture unit (image acquisition unit)
  • 370 Identification information adder (identification information adder)

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

An image processing device is coupled to electronic information terminals via a communication network. The image processing device includes: an image acquisition unit configured to acquire an image; a video generation unit configured to generate video in a predetermined format from the image acquired by the image acquisition unit; a distribution unit configured to distribute the video generated by the video generation unit to the electronic information terminals; a communication unit configured to receive an operation event related to an operation performed to the video distributed to the electronic information terminals, from each of the electronic information terminals; and an electronic information terminal identification unit configured to identify an electronic information terminal on which an operation event is performed, based on identification information added to the operation event.

Description

IMAGE PROCESSING DEVICE, IMAGE DISTRIBUTION SYSTEM, AND IMAGE PROCESSING METHOD
The present invention relates to image processing devices, image distribution systems, and image processing methods.
In recent years, an electronic information board called an interactive white board (IWB) has been widely used. The interactive white board has a configuration combining a display made of a flat panel such as a liquid crystal panel, a touch panel serving as a coordinate detection device that detects coordinates of a position on a display surface of the display where an indicator for input such as an input pen or a finger touches, and a control device that causes an image including a character, a number, or a figure written on the display surface of the display to be displayed based on coordinate data output from the coordinate detection device. With such a device, conference material and/or the like can be enlarged and displayed on the display, and writing can be performed on the display surface of the display with the indicator such as an input pen or a finger.
The interactive white board can be coupled to an electronic information terminal such as a tablet computer via a wireless local area network (LAN) or the like. An image on the display of the interactive white board can be displayed on the electronic information terminal. A writing operation and the like can be performed on the image also with the electronic information terminal. An operation event performed on the electronic information terminal is transmitted to the interactive white board, processed as an operation event of the interactive white board, and displayed on the display.
For such an interactive white board, there has been a demand for coupling the interactive white board to the electronic information terminal to distribute an image, performing an operation such as drawing with the electronic information terminal, and displaying drawing content in the electronic information terminal on the display as an operation event in the interactive white board based on the operation performed with the electronic information terminal.
Patent Literature 1 discloses an image distribution system in which, in transmitting video from a server to an electronic information terminal, the server draws a plurality of images on a browser as hypertext markup language (HTML) 5 content, and distributes the drawn images to the electronic information terminal as compressed data.
However, the invention disclosed in Patent Literature 1 does not disclose specific processing for individual operation events in a case in which operation events are transmitted from a plurality of electronic information terminals for the image displayed on each electronic information terminal.
The present invention is made in view of such a situation, and has an object to provide an image processing device, an image distribution system, and an image processing method that can appropriately process operation events from a plurality of electronic information terminals.
In order to solve the abovementioned problem, an image processing device is coupled to electronic information terminals via a communication network. The image processing device includes: an image acquisition unit, a video generation unit, a distribution unit, a communication unit, and an electronic information terminal identification unit. The image acquisition unit is configured to acquire an image. The video generation unit is configured to generate video in a predetermined format from the image acquired by the image acquisition unit. The distribution unit is configured to distribute the video generated by the video generation unit to the electronic information terminals. The communication unit is configured to receive an operation event related to an operation performed to the video distributed to the electronic information terminals, from each of the electronic information terminals. The electronic information terminal identification unit is configured to identify an electronic information terminal on which an operation event is performed, based on identification information added to the operation event.
According to the present invention, operation events from a plurality of electronic information terminals can be appropriately processed by an image processing device.
Fig. 1 is a block diagram illustrating a configuration of an image distribution system according to a first embodiment of the present invention. Fig. 2 is a configuration diagram illustrating a hardware configuration of an application server. Fig. 3 is a functional block diagram of the application server used for the image distribution system. Fig. 4 is a functional block diagram of an electronic processing terminal used for the image distribution system. Fig. 5 is a diagram illustrating a procedure of preparation for receiving video in the image distribution system. Fig. 6 is a diagram illustrating a procedure for distributing compressed video in the image distribution system. Fig. 7 is a diagram illustrating a procedure for inputting an operation event in the image distribution system. Fig. 8 is a block diagram illustrating a configuration of an electronic information board system according to a second embodiment of the present invention. Fig. 9 is a functional block diagram of an electronic information board used for the electronic information board system. Fig. 10 is a diagram illustrating a procedure for distributing compressed video in the electronic information board system. Fig. 11 is a diagram illustrating a procedure for inputting an operation event in the electronic information board system.
The following describes an image processing device, an image distribution system, and an image processing method according to embodiments of the present invention.
First embodiment
First, the following describes a first embodiment of the present invention. In the image distribution system, an application server serving as an image processing device draws HTML content, and distributes a resulting image to an electronic processing terminal as video.
Fig. 1 is a block diagram illustrating a configuration of the image distribution system according to the first embodiment of the present invention. The image distribution system according to this embodiment includes an application server 10 and a plurality of electronic information terminals 20 (20-1, 20-2, ..., 20-n) coupled to the application server 10 via a wireless local area network (LAN). The wireless LAN is used for an intra-company network, for example. The application server 10 can be coupled to the electronic information terminal 20 via various wired or wireless communication networks such as a wired LAN and the Internet, not limited to the wireless LAN.
Fig. 2 is a configuration diagram illustrating a hardware configuration of the application server according to the first embodiment of the present invention. As illustrated in Fig. 2, a central processing unit (CPU) 501, a read only memory (ROM) 502, a random access memory (RAM) 503, a hard disk drive (HDD) 504, a hard disk controller (HDC) 505, a media drive 507, a display (image display device) 508, an interface (I/F) 509, a keyboard 511, a mouse 512, a microphone 513, a speaker 514, and a graphics processing unit (GPU) 515 are coupled to each other via an extension bus line 520 to construct the application server 10.
The CPU 501 controls the entire operation of the application server 10. The ROM 502 stores therein a computer program used for driving the CPU 501 such as an IPL. The RAM 503 is used as a work area for the CPU 501. The HDD 504 stores therein various pieces of data such as a computer program. The HDC 505 controls reading or writing of various pieces of data from/to the HDD 504 in accordance with control performed by the CPU 501. The media drive 507 controls reading or writing (storing) of data from/to a recording medium 506 such as a flash memory. The display 508 displays various pieces of information. The I/F 509 is used for transmitting data via a communication network and coupling a dongle. A ROM 516 storing therein a computer program used for driving the GPU 515 and a RAM 517 used as a work area for the GPU 515 are coupled to the GPU 515. The extension bus line 520 includes an address bus, a data bus, and/or the like for electrically coupling the components described above.
When the CPU 501 executes predetermined video distribution software, the application server 10 illustrated in Fig. 1 implements an application 100 and a video distribution service 200 that distributes a plurality of images generated by the application 100 as video.
The electronic information terminal 20 is a notebook-type personal computer, a tablet computer, or the like, and operates based on various operating systems (OSs) such as iOS, Android (registered trademark), Windows (registered trademark), and macOS (registered trademark).
The following schematically describes an operation of the image distribution system based on Fig. 1. The application 100 sends out an image to be distributed to the electronic information terminal 20, to the video distribution service 200 (Step S1). The video distribution service 200 transmits a plurality of images as a drawing result of the application 100 to the electronic information terminal 20 (Step S2). In this example, the images are distributed from one application server 10 to a plurality of (n) electronic information terminals 20-1, 20-2, ..., 20-n at the same time.
Each electronic information terminal 20 displays the distributed video, and performs drawing on the video in response to a touch operation or a mouse operation performed to the displayed video. The operation events such as the touch operation or the mouse operation are transmitted to the video distribution service 200 of the application server 10 (Step S3).
The video distribution service 200 that has received the operation event from the electronic information terminal 20 sends out the operation event to the application 100 (Step S4), and the application 100 that has received the operation event performs processing in accordance with the operation. In this way, a result of processing of the operation event is distributed as video, so that a user using a communication terminal can perform an interactive operation. In this case, a distribution service input/output unit 210 (refer to Fig. 3) functions as an identification information adder, and adds a terminal ID (for example, 1, 2, 3, ..., n) for identifying the electronic information terminal 20 to the operation event to be sent out to the application 100. Due to this, the application 100 can determine the electronic information terminal 20 from which the operation event is transmitted.
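The terminal ID tagging performed by the distribution service input/output unit 210 in its role as the identification information adder can be sketched as follows. This is an illustrative sketch only; the class and method names, and the representation of a connection and an event as Python objects, are hypothetical.

```python
import itertools

class IdentificationInfoAdder:
    """Illustrative sketch of the identification information adder role of the
    distribution service input/output unit (210): assign each coupled
    electronic information terminal a terminal ID (1, 2, 3, ...) and tag every
    operation event with it before the event is sent out to the application."""

    def __init__(self):
        self._next_id = itertools.count(1)
        self._ids_by_connection = {}

    def register(self, connection):
        # Called when a terminal couples to the video distribution service.
        terminal_id = next(self._next_id)
        self._ids_by_connection[connection] = terminal_id
        return terminal_id

    def tag(self, connection, operation_event):
        # The application can now determine which terminal sent the event.
        tagged = dict(operation_event)
        tagged["terminal_id"] = self._ids_by_connection[connection]
        return tagged
```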
Next, the following describes a configuration of the image processing device in the image distribution system. Fig. 3 is a functional block diagram of the image processing device used for the image distribution system.
As described above, the application server 10 includes the application 100 and the video distribution service 200. The application 100 provides the user with a function in cooperation with the video distribution service 200, and generates an image to be distributed. The application 100 includes a function execution unit 110, an application input/output unit 120, and a frame output unit 130.
The video distribution service 200 is a general term for modules of the application server 10 in a video distribution system. The video distribution service 200 includes the distribution service input/output unit 210, a distribution control unit 220, a distribution information management unit 230, a communication control unit 240, an encoder control unit 250, a video encoder unit 260, and a capture unit 270. The distribution control unit 220, the distribution information management unit 230, and the communication control unit 240 constitute a distribution module 280, and the encoder control unit 250 and the video encoder unit 260 constitute a codec module 290.
The function execution unit 110 is a module that executes a function specific to the application server 10. The application input/output unit 120 receives the operation event that is the data output from the distribution service input/output unit 210. The application input/output unit 120 converts the received data into a format that can be processed by the function execution unit 110 and sends out the converted data. The application input/output unit 120 transmits data regarding service start, service stop, and/or the like to the distribution service input/output unit 210.
The frame output unit 130 functions as a video generation unit and outputs an image to be distributed as video. For example, the frame output unit 130 outputs an image, such as an RGB bitmap, to be compressed into H.264/AVC format video.
The video distribution service 200 is a general term for modules of the server in the video distribution system. The distribution service input/output unit 210 receives data regarding service start, service stop, and/or the like output from the application input/output unit 120. The distribution service input/output unit 210 transmits data such as an operation event to the application input/output unit 120. The distribution information management unit 230 manages video encode setting related to distribution, and information of the electronic information terminal 20 being coupled. The communication control unit 240 functions as a communication unit for communicating with the electronic information terminal 20. The distribution control unit 220 controls the entire distribution, activates the codec module 290, and gives an end instruction.
The encoder control unit 250 acquires a frame image at a frame rate designated by the distribution control unit 220, and sends out the acquired frame image to the video encoder unit 260. The video encoder unit 260 generates compressed data from the acquired image. The capture unit 270 functions as an image acquisition unit, and acquires the frame image output from the frame output unit 130 of the application 100.
The following describes the electronic information terminal 20. The electronic information terminal 20 basically has a hardware configuration similar to the application server illustrated in Fig. 2, so that the description of the hardware is omitted. Fig. 4 is a functional block diagram of the electronic processing terminal used for the image distribution system. The electronic information terminal 20 includes a graphical user interface (GUI) unit 310, a distribution information management unit 320, an overall control unit 330, a communication control unit 340, a reproduction control unit 350, and a video decoder unit 360.
The GUI unit 310 displays a user interface (UI). The overall control unit 330 performs overall control such as start and disconnection instructions of communication. The distribution information management unit 320 holds information regarding distribution, for example, the video encode setting. The communication control unit 340 controls communication with the application server 10, and transmits and receives data to/from the video distribution service 200 of the application server 10. The reproduction control unit 350 passes the compressed data to the video decoder unit 360. The video decoder unit 360 decodes the compressed data for display.
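The terminal-side receive, decode, and display pipeline can be sketched as follows. This is an illustrative sketch only; the parameter names are hypothetical, with `receive_frame` standing in for the communication control unit (340), `decoder` for the video decoder unit (360), and `display` for the GUI unit (310).

```python
def playback_loop(receive_frame, decoder, display, should_stop):
    """Illustrative sketch of the terminal side: the reproduction control unit
    (350) feeds each received compressed frame to the video decoder unit, and
    the decoded image is handed to the GUI for display."""
    while not should_stop():
        compressed = receive_frame()        # via the communication control unit (340)
        if compressed is None:
            continue
        image = decoder.decode(compressed)  # video decoder unit (360)
        display(image)                      # GUI unit (310) shows the frame
```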
The following describes an operation of the image distribution system according to the first embodiment. The operation is performed in the following order: preparation for receiving video, distribution of the compressed data, and input of the operation event. Fig. 5 is a diagram illustrating a procedure of preparation for receiving video in the image distribution system. Fig. 5 illustrates a processing procedure from when the electronic information terminal 20 is coupled to the video distribution service 200 of the application server 10 until preparation for receiving video is completed. First, the communication control unit 340 of the electronic information terminal 20 transmits an acquisition request for an encode parameter to the communication control unit 240 of the video distribution service 200 (Step S10). When receiving the acquisition request for an encode parameter, the communication control unit 240 of the video distribution service 200 acquires the encode parameter from the distribution information management unit 230, and transmits the encode parameter as a response to the electronic information terminal 20 (Step S20).
Subsequently, the communication control unit 340 of the electronic information terminal 20 requests the communication control unit 240 of the video distribution service 200 to establish a session for upload (Step S30). When receiving the request to establish the session for upload, the communication control unit 240 of the video distribution service 200 transmits a response on completion of the establishment of the session for upload (Step S40). In this case, the session for upload is a network connection for transmitting the operation event and/or the like from the electronic information terminal 20 to the video distribution service 200, and maintains a coupled state at all times.
The communication control unit 340 of the electronic information terminal 20 requests the communication control unit 240 of the video distribution service 200 to establish a session for download (Step S50). When receiving the request to establish the session for download, the communication control unit 240 of the video distribution service 200 transmits a response on completion of the establishment of the session for download (Step S60). In this case, the session for download is a network connection for transmitting video and/or the like from the video distribution service 200 to the electronic information terminal 20, and maintains a coupled state at all times.
When the session for download is completely established, the preparation for receiving video performed by the electronic information terminal 20 is completed. The session for download and the session for upload are established separately in this embodiment, but the two can also be implemented as a single session.
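The preparation procedure above (Steps S10 to S60) can be sketched as a simple request/response exchange. The message names and the in-memory service object below are assumptions for illustration only; the actual protocol and encode-parameter format are not specified here:

```python
class VideoDistributionService:
    """Illustrative server-side stand-in for the video distribution service 200."""
    def __init__(self, encode_parameter):
        self.encode_parameter = encode_parameter
        self.upload_sessions = set()
        self.download_sessions = set()

    def handle(self, terminal_id, request):
        if request == "get_encode_parameter":        # Step S10 -> S20
            return self.encode_parameter
        if request == "establish_upload_session":    # Step S30 -> S40
            self.upload_sessions.add(terminal_id)
            return "upload_established"
        if request == "establish_download_session":  # Step S50 -> S60
            self.download_sessions.add(terminal_id)
            return "download_established"
        raise ValueError(request)


def prepare_to_receive_video(service, terminal_id):
    """Terminal-side preparation: runs Steps S10, S30, and S50 in order."""
    params = service.handle(terminal_id, "get_encode_parameter")
    service.handle(terminal_id, "establish_upload_session")
    service.handle(terminal_id, "establish_download_session")
    return params
```

Once both sessions are registered, the terminal is ready to receive the distributed video.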
The following describes a procedure for distributing the compressed data from the application server 10 to the electronic information terminal 20. Fig. 6 is a diagram illustrating the procedure for distributing the compressed data in the image distribution system. The frame output unit 130 of the application 100 sends out the image to be distributed as video to the capture unit 270 of the video distribution service 200 (Step S100). As a method of sending out the image, for example, the frame output unit 130 may directly transmit the image to the capture unit 270. Alternatively, a buffer that is for storing an image and shared by the frame output unit 130 and the capture unit 270 may be disposed, the frame output unit 130 may write the image to the buffer for storing an image, and the capture unit 270 may acquire the image from the buffer for storing an image.
Subsequently, the image acquired by the capture unit 270 of the video distribution service 200 is encoded into compressed frame data by the video encoder unit 260 (Step S110). When the encoding is completed, the video encoder unit 260 outputs the compressed frame data to the communication control unit 240, and the communication control unit 240 transmits the compressed data to the electronic information terminal 20 (Step S120). In this case, the compressed data is transmitted to all of the electronic information terminals 20 being coupled, so that the electronic information terminals 20 being coupled can share the same video.
Steps S100, S110, and S120 described above represent a series of processing for distributing one frame of the video, and the series of processing is repeated so that the compressed data can be continuously distributed (Steps S130, S140, S150...).
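The per-frame loop (Steps S100 to S120) can be sketched as follows. Here `encode_frame` stands in for the video encoder unit 260 (a real system would emit, e.g., H.264/AVC frame data), and each coupled terminal is modeled as a plain list that receives the broadcast:

```python
def encode_frame(image):
    """Stand-in for the video encoder unit 260 (Step S110); a real encoder
    would produce compressed frame data such as H.264/AVC."""
    return f"enc:{image}"


def distribute_frame(image, coupled_terminals):
    """One iteration of the distribution loop: encode the captured image,
    then transmit the compressed data to every coupled terminal (Step S120)
    so that all terminals share the same video."""
    compressed = encode_frame(image)
    for terminal in coupled_terminals:
        terminal.append(compressed)
    return compressed
```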
The following describes a procedure for inputting the operation event in the electronic information terminal 20. Fig. 7 is a diagram illustrating a procedure for inputting the operation event in the image distribution system. Fig. 7 illustrates a processing procedure in a case in which the user of the electronic information terminal 20 performs a mouse operation on the video displayed on the electronic information terminal 20.
First, the electronic information terminal 20 transmits operation event information including a type of the operation event, coordinates, and/or the like to the communication control unit 240 of the video distribution service 200 (Step S200). The communication control unit 240 sends out the received operation event information to the distribution service input/output unit 210, and the distribution service input/output unit 210 adds a specific terminal ID (for example, 1, 2, ..., n) for identifying the electronic information terminal 20 to the operation event information and sends out the operation event information to the application input/output unit 120 (Step S210).
Subsequently, when receiving the operation event information, the application input/output unit 120 converts the operation event information into a function execution command in a format that can be executed by the function execution unit 110, and sends out the function execution command to the function execution unit 110 (Step S220). The application input/output unit 120 processes the operation event information for each terminal ID added to the operation event information, and generates the function execution command for each terminal ID. That is, the application input/output unit 120 functions as an electronic information terminal identification unit for identifying the electronic information terminal. The function execution unit 110 executes the given function execution command (Step S230).
Accordingly, operation events performed on a plurality of electronic information terminals 20 (20-1 to 20-n) are executed by the application 100 while being divided into operation events of the respective electronic information terminals.
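A minimal sketch of this event-routing path (Steps S200 to S230), with illustrative function names and event shapes. The point shown is that the terminal ID attached in Step S210 travels with the event all the way into the function execution command, so commands from different terminals stay separate:

```python
def add_terminal_id(event, terminal_id):
    """Distribution service input/output unit 210 (Step S210): tag the
    operation event with the ID of the terminal it came from."""
    tagged = dict(event)
    tagged["terminal_id"] = terminal_id
    return tagged


def to_function_command(tagged_event):
    """Application input/output unit 120 (Step S220): convert the tagged
    event into a function execution command, generated per terminal ID so
    events from different terminals are kept apart."""
    return {
        "command": tagged_event["type"],
        "args": tagged_event["coordinates"],
        "source_terminal": tagged_event["terminal_id"],
    }
```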
With the image distribution system according to this embodiment, the operation events from a plurality of electronic information terminals can be divided and appropriately processed.
Modification of first embodiment
The following describes a modification of the first embodiment. In the image distribution system according to the modification, the electronic information terminal 20 adds a terminal ID of the electronic information terminal 20 itself (for example, 1, 2, 3, ..., n) as identification information to the operation event. That is, as illustrated in Fig. 4 with a virtual line, the electronic information terminal 20 includes an identification information adder 370 serving as the identification information adder. The terminal ID added by the identification information adder 370 is transmitted together with the operation event to the application server 10.
In the application server 10, the distribution service input/output unit 210 (refer to Fig. 3) does not add the terminal ID. The application input/output unit 120 of the application server 10 determines which electronic information terminal 20 executes the operation event based on the terminal ID from the electronic information terminal 20. Subsequent processing is the same as that of the application server 10 according to the first embodiment.
Also in this modification, the operation events performed on a plurality of electronic information terminals 20 (20-1 to 20-n) are executed by the application 100 while being divided into operation events of the respective electronic information terminals. Accordingly, the operation events from a plurality of electronic information terminals can be divided and appropriately processed.
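The terminal-side tagging in this modification can be sketched as below; the class models the identification information adder 370, and the event shape is an illustrative assumption:

```python
class IdentificationInfoAdder:
    """Illustrative stand-in for the identification information adder 370:
    the terminal tags its own ID onto each outgoing operation event."""
    def __init__(self, own_terminal_id):
        self.own_terminal_id = own_terminal_id

    def tag(self, event):
        tagged = dict(event)
        tagged["terminal_id"] = self.own_terminal_id
        return tagged
```

The server side then reads `terminal_id` from the received event instead of attaching one itself.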
Second embodiment
The following describes a second embodiment of the present invention. In the second embodiment of the present invention, the image processing device is applied to an electronic information board system. The electronic information board includes a display such as a liquid crystal panel, a touch panel serving as a coordinate detection device that detects coordinates of a position on a display surface of the display where an indicator for input such as an input pen or a finger touches, and a control device that causes an image, whose examples include characters, numbers, and figures, written on the display surface of the display to be displayed based on coordinate data output from the coordinate detection device. In such a device, conference material and/or the like can be enlarged and displayed on the display, and writing can be performed on the display surface of the display with the indicator such as an input pen or a finger.
Fig. 8 is a block diagram illustrating a configuration of the electronic information board system according to the second embodiment of the present invention. The electronic information board system according to this embodiment includes an interactive white board 40 serving as the electronic information board and a plurality of electronic information terminals 20 (20-1, 20-2, ..., 20-n) coupled to the interactive white board 40 via a wireless local area network (LAN). The wireless LAN is used for an intra-company network, for example. The interactive white board 40 can be coupled to the electronic information terminal 20 via various wired or wireless communication networks such as a wired LAN and the Internet, not limited to the wireless LAN.
The interactive white board 40 includes a touch panel 519 illustrated in Fig. 2 with a dashed line in addition to the hardware configuration of the application server 10 illustrated in Fig. 2. With the touch panel 519, coordinates of a position on the display 508 where an input pen or a finger touches are detected. In the interactive white board 40, the display 508 has a large screen such as a liquid crystal display. When the CPU 501 executes predetermined software, a white board application 400 and the video distribution service 200 are implemented.
The electronic information terminal 20 is a notebook-type personal computer, a tablet computer, or the like, and operates based on various operating systems (OSs) such as iOS, Android (registered trademark), Windows (registered trademark), and macOS (registered trademark). The configuration of the electronic information terminal 20 is the same as in the first embodiment (refer to Fig. 4).
The following schematically describes an operation of the electronic information board system based on Fig. 8. In the interactive white board 40, the white board application 400 detects, through the touch panel, a stroke written on the display 508 with an input pen or a finger, or an operation on a page. The white board application 400 draws the stroke 51 or the like on the display 508 based on the operation event detected through the touch panel.
The white board application 400 outputs an image (white board image) on the display 508 to be distributed to the electronic information terminal 20, and transmits a drawing result to the video distribution service 200 (Step S11). The video distribution service 200 converts a plurality of images as the drawing result of the white board application 400 into compressed data, and distributes the compressed data to each electronic information terminal 20 (Step S12). In this example, the images are distributed from one interactive white board 40 to a plurality of (n) electronic information terminals 20-1, 20-2, ..., 20-n at the same time.
The electronic information terminal 20 displays the acquired white board video that has been distributed. When a touch operation or a mouse operation is performed on the displayed video, the electronic information terminal 20 can perform drawing of a stroke or the like on the displayed video (operation event). The operation event is transmitted to the video distribution service 200 of the interactive white board 40 (Step S13).
The video distribution service 200 that has received the operation event sends out the operation event to the white board application 400 (Step S14). The white board application 400 that has received the operation event performs processing corresponding to the operation, that is, drawing of the stroke 51 or the like on the display 508. The operation event input to the white board application 400 is accompanied by coordinate information. Thus, the function execution unit 410 generates a stroke from the coordinate information to perform drawing on the image on the display 508. Accordingly, the electronic information terminal 20 can draw a stroke on the display 508 of the interactive white board 40 through remote control.
A result of processing of the operation event is distributed as video to the electronic information terminal 20 again, so that the user using the electronic information terminal 20 can perform an interactive operation.
The distribution service input/output unit 210 adds the terminal ID for identifying the electronic information terminal 20 to the operation event and sends out the operation event to the white board application 400, so that the white board application 400 can determine the electronic information terminal 20 from which the operation event is transmitted. Thus, operations in a plurality of electronic information terminals 20 are prevented from being mixed up.
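The stroke generation described above — the function execution unit 410 turning coordinate information from successive operation events into a stroke — can be sketched as collecting a pen-down/move/up sequence into a polyline. The event type names are assumptions for illustration:

```python
def build_stroke(events):
    """Collect the coordinates of a pen-down ... pen-up event sequence
    into one stroke (a list of points), which the function execution
    unit would then draw on the display."""
    stroke = []
    for ev in events:
        if ev["type"] in ("pen_down", "pen_move", "pen_up"):
            stroke.append(ev["coordinates"])
    return stroke
```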
The following describes a configuration of the interactive white board 40. Fig. 9 is a functional block diagram of the electronic information board.
The interactive white board 40 includes the white board application 400 and the video distribution service 200 as described above. The white board application 400 provides the user of the interactive white board 40 with a function as a white board in cooperation with the video distribution service 200. The white board application 400 implements the function of the interactive white board 40, and generates an image to be distributed to the electronic information terminal 20. The white board application 400 includes the function execution unit 410, an application input/output unit 420, a frame output unit 430, an image display control unit 440, and a touch panel control unit 450. The video distribution service 200 is a general term for modules of the interactive white board 40 in the video distribution system. The video distribution service 200 has the same configuration as the video distribution service 200 according to the first embodiment. That is, the video distribution service 200 includes the distribution service input/output unit 210, the distribution control unit 220, the distribution information management unit 230, the communication control unit 240, the encoder control unit 250, the video encoder unit 260, and the capture unit 270. The distribution control unit 220, the distribution information management unit 230, and the communication control unit 240 constitute the distribution module 280, and the encoder control unit 250 and the video encoder unit 260 constitute the codec module 290.
The function execution unit 410 is a module that executes functions specific to the interactive white board. That is, the function execution unit 410 draws an image to be displayed on the display 508, or a stroke based on a drawing command. The application input/output unit 420 receives the data output from the distribution service input/output unit 210, for example, the operation event, converts the received data into a format that can be processed by the function execution unit 410, and sends out the converted data. The application input/output unit 420 also transmits data regarding service start, service stop, and/or the like to the distribution service input/output unit 210. The frame output unit 430 outputs an image to be distributed as video, for example, as an RGB bitmap or as data compressed in the H.264/AVC format.
The image display control unit 440 receives a drawing signal from the function execution unit 410, and controls image display of the display 508. The touch panel control unit 450 receives a detection signal from the touch panel, and outputs the detection signal to the function execution unit 410.
The following describes processing in the electronic information board system according to this embodiment. The operation is performed in the following order: preparation for receiving video, distribution of the compressed data, and input of the operation event. A procedure of preparation for receiving video in the electronic information board system is the same as the procedure according to the first embodiment illustrated in Fig. 5.
The following describes a procedure for distributing the compressed data from the interactive white board 40 to the electronic information terminal 20. Fig. 10 is a diagram illustrating a procedure for distributing the compressed data in the electronic information board system. The frame output unit 430 of the white board application 400 sends out the image to be distributed as video to the capture unit 270 of the video distribution service 200 (Step S300). As a method of sending out the image, for example, the frame output unit 430 may directly transmit the image to the capture unit 270. Alternatively, a buffer that is for storing an image and shared by the frame output unit 430 and the capture unit 270 may be disposed, the frame output unit 430 may write the image to the buffer for storing an image, and the capture unit 270 may acquire the image of the buffer for storing an image.
Subsequently, the image acquired by the capture unit 270 of the video distribution service 200 is encoded into compressed frame data by the video encoder unit 260 (Step S310). When the encoding is completed, the video encoder unit 260 passes the compressed frame data to the communication control unit 240, and the communication control unit 240 transmits the compressed data to the electronic information terminal 20 (Step S320). In this case, the compressed data is transmitted to all of the electronic information terminals 20 being coupled, so that the electronic information terminals 20 being coupled can share the same white board video.
Steps S300, S310, and S320 described above represent a series of processing for distributing one frame of the video, and the series of processing is repeated so that the compressed data generated from the white board image can be continuously distributed (Steps S330, S340, S350...).
The white board image distributed from the interactive white board 40 to each electronic information terminal 20 is not the screen itself displayed on the display 508 by the white board application 400, but a canvas image in which a stroke or the like is drawn. The white board image does not include UI elements for operating the white board, such as a pen mode change button or an eraser button.
This setting can be changed as needed. In this embodiment, an arbitrary image output from the white board application 400 can be distributed, so that an image not including a UI can be distributed. In this way, this embodiment can prevent an image unneeded by the user of the electronic information terminal 20 from being displayed.
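Selecting a canvas-only image for distribution can be sketched as follows; the layered screen model and the layer names are illustrative assumptions, not the white board application's actual data structure:

```python
def frame_to_distribute(screen, include_ui=False):
    """screen is modeled as a dict of named layers; by default only the
    'canvas' layer (strokes, material) is handed to the capture unit,
    so UI elements unneeded by terminal users are never distributed.
    The setting can be flipped to distribute the full screen."""
    if include_ui:
        return dict(screen)
    return {"canvas": screen["canvas"]}
```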
The following describes a procedure for inputting the operation event in the electronic information terminal 20. Fig. 11 is a diagram illustrating the procedure for inputting the operation event in the electronic information board system. Fig. 11 illustrates a processing procedure in a case in which the user of the electronic information terminal 20 performs a mouse operation on the white board video displayed on the electronic information terminal 20.
First, the electronic information terminal 20 transmits operation event information including a type of the operation event, coordinates, and/or the like to the communication control unit 240 of the video distribution service 200 (Step S400). The communication control unit 240 sends out the received operation event information to the distribution service input/output unit 210, and the distribution service input/output unit 210 adds the terminal ID (for example, 1, 2, ..., n) for identifying the electronic information terminal 20 to the operation event information and sends out the operation event information to the application input/output unit 420 (Step S410).
Subsequently, when receiving the operation event information, the application input/output unit 420 converts the operation event information into a function execution command in a format that can be executed by the function execution unit 410, and sends out the function execution command to the function execution unit 410 (Step S420). The application input/output unit 420 processes the operation event information for each terminal ID added to the operation event information, and generates the function execution command for each terminal ID. That is, the application input/output unit 420 functions as an electronic information terminal identification unit for identifying the electronic information terminal. The function execution unit 410 executes the given function execution command (Step S430).
Accordingly, operation events performed on a plurality of electronic information terminals 20 (20-1 to 20-n) are executed by the white board application 400 while being divided into operation events of the respective electronic information terminals.
As described above, when the user of the electronic information terminal 20 performs an operation such as a mouse operation on the white board video displayed on the electronic information terminal 20, the stroke 51 is displayed on the display 508 of the interactive white board 40 in which the white board application 400 operates, and the displayed result is displayed on the electronic information terminal 20 as video. The terminal ID is added to the operation event information, so that the application input/output unit can generate a stroke drawing command for each electronic information terminal 20, that is, for each user.
Accordingly, strokes are not mixed up even when a plurality of users transmit operation events at the same time, and the strokes intended by all the users that have input the operation events are drawn.
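Keeping simultaneous strokes apart can be sketched by bucketing in-progress points by the terminal ID carried on each event: interleaved events from different users then never merge into one stroke. The class and event shapes below are illustrative:

```python
class StrokeSeparator:
    """Illustrative sketch of per-terminal stroke separation: points are
    accumulated per terminal ID, so concurrent users' strokes stay apart."""
    def __init__(self):
        self.in_progress = {}   # terminal_id -> list of points
        self.finished = []      # (terminal_id, stroke) pairs

    def on_event(self, event):
        tid = event["terminal_id"]
        self.in_progress.setdefault(tid, []).append(event["coordinates"])
        if event["type"] == "pen_up":
            # stroke complete: move it to the finished list
            self.finished.append((tid, self.in_progress.pop(tid)))
```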
The video of the white board on which the strokes are displayed is distributed to the other electronic information terminals 20, so that material can be displayed on the interactive white board 40, a plurality of users can share the video, and the users can perform writing on the material.
Modification of second embodiment
The following describes a modification of the second embodiment. In the electronic information board system according to this modification, the electronic information terminal 20 adds a terminal ID of the electronic information terminal 20 itself (for example, 1, 2, 3, ..., n) as identification information to the operation event. That is, as illustrated in Fig. 4 with a virtual line, the electronic information terminal 20 includes the identification information adder 370 serving as the identification information adder. The terminal ID added by the identification information adder 370 is transmitted together with the operation event to the interactive white board 40.
In the interactive white board 40, the distribution service input/output unit 210 (refer to Fig. 9) does not add the terminal ID. The application input/output unit 420 of the interactive white board 40 determines which electronic information terminal 20 executes the operation event based on the terminal ID from the electronic information terminal 20. Subsequent processing is the same as that of the interactive white board 40 according to the second embodiment.
Also in this modification, when the user of the electronic information terminal 20 performs an operation such as a mouse operation on the white board video displayed on the electronic information terminal 20, the stroke 51 is displayed on the display 508 of the interactive white board 40 in which the white board application 400 operates, and the displayed result is displayed on the electronic information terminal 20 as video. The terminal ID is added to the operation event information, so that the application input/output unit can generate a stroke drawing command for each electronic information terminal 20, that is, for each user.
Accordingly, strokes are not mixed up even when a plurality of users transmit operation events at the same time, and the strokes intended by all the users that have input the operation events are drawn.
The video of the white board on which the strokes are displayed is distributed to the other electronic information terminals 20, so that material can be displayed on the interactive white board 40, a plurality of users can share the video, and the users can perform writing on the material.
Configuration, action, and effect of aspects of present invention
First aspect
This aspect has a feature that the application server 10 coupled to the electronic information terminals 20 via a wireless LAN is provided. The application server 10 includes the capture unit 270 that acquires an image, the video encoder unit 260 that generates video in a predetermined format from the image acquired by the capture unit 270, the distribution control unit 220 that distributes the video generated by the video encoder unit 260 to the electronic information terminals 20, the communication control unit 240 that receives the operation event related to the operation performed on the video distributed to the electronic information terminals 20, from each electronic information terminal 20, and the application input/output unit 120 that identifies the electronic information terminal 20 on which an operation event is performed based on the identification information added to the operation event.
According to this aspect, the application input/output unit 120 determines the electronic information terminal 20 from which the operation event is transmitted based on the identification information. Accordingly, operation events from a plurality of electronic information terminals can be appropriately processed.
Second aspect
This aspect has a feature that the application input/output unit 120 that processes each operation event for each electronic information terminal based on the identification information is disposed.
According to this aspect, each operation event is processed for each electronic information terminal. Accordingly, operation events from a plurality of electronic information terminals can be divided and processed.
Third aspect
This aspect has a feature that the distribution service input/output unit 210 that adds the terminal ID indicating which electronic information terminal 20 has performed the operation, to the operation event, is disposed.
According to this aspect, the distribution service input/output unit 210 can add the terminal ID to the operation event. Accordingly, operation events from a plurality of electronic information terminals can be appropriately processed.
Fourth aspect
This aspect has a feature that an image distribution system including the electronic information terminals 20 and the application server 10 coupled to the electronic information terminals 20 via a wireless LAN is provided. The application server 10 includes the capture unit 270 that acquires an image, the video encoder unit 260 that generates video in a predetermined format from the image acquired by the capture unit 270, and the distribution control unit 220 that distributes the video generated by the video encoder unit 260 to the electronic information terminals 20. Each electronic information terminal 20 includes the communication control unit 240 that receives the operation event related to the operation performed on the video distributed to the electronic information terminals 20, from each electronic information terminal 20, and the application input/output unit 120 that identifies the electronic information terminal 20 on which an operation event is performed based on the identification information added to the operation event.
According to this aspect, the application server 10 is coupled to the electronic information terminal 20 via a wireless LAN to construct the image distribution system. Accordingly, the application server 10 and the electronic information terminal 20 can share the processing based on the operation event on the electronic information terminal 20 and the application server 10.
Fifth aspect
In this aspect, each electronic information terminal 20 includes the identification information adder 370 that adds the terminal ID to the operation event.
According to this aspect, the application server 10 is coupled to the electronic information terminal 20 via a wireless LAN to construct the electronic information board system, and the identification information adder 370 of the electronic information terminal 20 adds the terminal ID to the operation event. Accordingly, the application server 10 can perform processing while recognizing the electronic information terminal 20 from which the operation event is transmitted.
Sixth aspect
This aspect has a feature that an image processing method performed by the application server 10 coupled to the electronic information terminals 20 via a wireless LAN is provided. The method includes acquiring an image, generating video in a predetermined format from the image acquired at the acquiring, distributing the video generated at the generating to the electronic information terminals 20, receiving the operation event related to the operation performed on the video distributed to the electronic information terminals 20, from each electronic information terminal 20, and identifying the electronic information terminal 20 on which an operation event is performed based on the identification information added to the operation event.
According to this aspect, the application input/output unit 120 can recognize the electronic information terminal 20 on which the operation event is performed. Accordingly, operation events from a plurality of electronic information terminals can be appropriately processed.
10 Application server (image processing device)
20 Electronic information terminal
100 Application
110 Function execution unit
120 Application input/output unit (electronic information terminal identification unit, event processing unit)
130 Frame output unit
200 Video distribution service
210 Distribution service input/output unit (identification information adder)
220 Distribution control unit
230 Distribution information management unit
240 Communication control unit (communication unit)
250 Encoder control unit
260 Video encoder unit (video generation unit)
270 Capture unit (image acquisition unit)
370 Identification information adder (identification information adder)
Japanese Laid-open Patent Publication No. 2014-200076

Claims (6)

  1. An image processing device coupled to electronic information terminals via a communication network, the image processing device comprising:
    an image acquisition unit configured to acquire an image;
    a video generation unit configured to generate video in a predetermined format from the image acquired by the image acquisition unit;
    a distribution unit configured to distribute the video generated by the video generation unit to the electronic information terminals;
    a communication unit configured to receive an operation event related to an operation performed to the video distributed to the electronic information terminals, from each of the electronic information terminals; and
    an electronic information terminal identification unit configured to identify an electronic information terminal on which an operation event is performed, based on identification information added to the operation event.
  2. The image processing device according to claim 1, further comprising an event processing unit configured to process each operation event for each electronic information terminal based on the identification information.
  3. The image processing device according to claim 1 or 2, further comprising an identification information adder configured to add the identification information to the operation event.
  4. An image distribution system comprising:
    electronic information terminals; and
    an image processing device coupled to the electronic information terminals via a communication network,
    the image processing device comprising:
    an image acquisition unit configured to acquire an image;
    a video generation unit configured to generate video in a predetermined format from the image acquired by the image acquisition unit;
    a distribution unit configured to distribute the video generated by the video generation unit to the electronic information terminals;
    a communication unit configured to receive, from each of the electronic information terminals, an operation event related to an operation performed on the video distributed to the electronic information terminals; and
    an electronic information terminal identification unit configured to identify an electronic information terminal on which an operation event is performed, based on identification information added to the operation event.
  5. The image distribution system according to claim 4, wherein each of the electronic information terminals comprises an identification information adder configured to add the identification information to the operation event.
  6. An image processing method performed by an image processing device coupled to electronic information terminals via a communication network, the method comprising:
    acquiring an image;
    generating video in a predetermined format from the image acquired at the acquiring;
    distributing the video generated at the generating to the electronic information terminals;
    receiving, from each of the electronic information terminals, an operation event related to an operation performed on the video distributed to the electronic information terminals; and
    identifying an electronic information terminal on which an operation event is performed, based on identification information added to the operation event.
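The event flow claimed above can be sketched in ordinary code. The following is a minimal illustration only, not the patented implementation: all names (`ImageProcessingDevice`, `Terminal`, `OperationEvent`, and their methods) are hypothetical. It shows the claimed roles — each terminal's identification information adder tags outgoing operation events with the terminal's identifier, and the device's identification unit reads that tag to identify the originating terminal and file the event for per-terminal processing.

```python
from dataclasses import dataclass, field

@dataclass
class OperationEvent:
    terminal_id: str                      # identification information added by the terminal
    kind: str                             # e.g. "touch", "stroke"
    payload: dict = field(default_factory=dict)

class ImageProcessingDevice:
    """Plays the role of the claimed image processing device: distributes video
    and identifies terminals from the identification information on each event."""
    def __init__(self):
        self.events_by_terminal = {}      # per-terminal event queues

    def distribute(self, frame, terminals):
        # Distribution unit: send one generated video frame to every terminal.
        for t in terminals:
            t.receive_frame(frame)

    def on_operation_event(self, event: OperationEvent) -> str:
        # Identification unit: read the identification information added to the
        # event, then queue the event for per-terminal processing.
        self.events_by_terminal.setdefault(event.terminal_id, []).append(event)
        return event.terminal_id

class Terminal:
    """Plays the role of an electronic information terminal."""
    def __init__(self, terminal_id, device):
        self.terminal_id = terminal_id
        self.device = device
        self.frames = []

    def receive_frame(self, frame):
        self.frames.append(frame)

    def send_event(self, kind, payload=None):
        # Identification information adder: tag the event with this terminal's ID
        # before sending it to the device.
        event = OperationEvent(self.terminal_id, kind, payload or {})
        return self.device.on_operation_event(event)
```

Because every event carries its terminal's identifier, the device can handle concurrent operations from many terminals on the same distributed video without confusing their origins.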
PCT/JP2016/001937 2015-04-16 2016-04-06 Image processing device, image distribution system, and image processing method WO2016166955A1 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2015-084484 2015-04-16
JP2015084484 2015-04-16
JP2015-122779 2015-06-18
JP2015122779 2015-06-18
JP2016-026034 2016-02-15
JP2016026034A JP2016224907A (en) 2015-04-16 2016-02-15 Image processing device, image distribution system, and image processing method

Publications (1)

Publication Number Publication Date
WO2016166955A1 (en)

Family

ID=57126433

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/001937 WO2016166955A1 (en) 2015-04-16 2016-04-06 Image processing device, image distribution system, and image processing method

Country Status (1)

Country Link
WO (1) WO2016166955A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022252600A1 (en) * 2021-06-01 2022-12-08 刘启成 Data processing method and apparatus

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013065125A (en) * 2011-09-16 2013-04-11 Ricoh Co Ltd Screen sharing system, screen sharing terminal, electronic blackboard system and program
WO2015045787A1 (en) * 2013-09-27 2015-04-02 株式会社リコー Distribution management device, terminal, and distribution management method


Similar Documents

Publication Publication Date Title
US10521879B2 (en) Overlaying multi-source media in VRAM
US10332296B2 (en) Overlaying multi-source media in VRAM
US11003353B2 (en) Method and system of enhanced interaction with a shared screen
CN109309842B (en) Live broadcast data processing method and device, computer equipment and storage medium
US20230049197A1 (en) Screen sharing method, apparatus, and device, and storage medium
US10637895B2 (en) Communication terminal, communication system, communication control method and program
US20140361991A1 (en) Method and electronic device for controlling mouse module
JP6089454B2 (en) Image distribution apparatus, display apparatus, and image distribution system
CN113655975B (en) Image display method, image display device, electronic apparatus, and medium
JP6458581B2 (en) Information processing system, display position determination method, terminal device, information processing device, and program
JP2015035996A (en) Server and method for providing game
CN113867580B (en) Display control method and device for pointer in window, equipment and storage medium
US20160117140A1 (en) Electronic apparatus, processing method, and storage medium
CN113721876A (en) Screen projection processing method and related equipment
WO2016166955A1 (en) Image processing device, image distribution system, and image processing method
JP2017068683A (en) Information processing apparatus, image transmission method, and program
CN111143017A (en) Cloud operating system interaction processing method, client and cloud operating system
JP2017033543A (en) Image processing device, image processing system, and image processing method
US11656834B2 (en) Information processing device, non-transitory recording medium, and information processing system
JP2017062645A (en) Image distribution system, image distribution method, and program
JP2015089485A (en) Server and method for providing game
CN114615535A (en) Synchronous display method and device, electronic equipment and readable storage medium
CN109960562B (en) Information display method and device and computer readable storage medium
JP2016224907A (en) Image processing device, image distribution system, and image processing method
CN112131539A (en) Watermark information adding method and device

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 16779752; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: PCT application non-entry in European phase
    Ref document number: 16779752; Country of ref document: EP; Kind code of ref document: A1