CN104796658A - Display apparatus, display system, and display method - Google Patents

Display apparatus, display system, and display method

Publication number
CN104796658A
Authority
CN
China
Prior art keywords
object data
data
display
script
display unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410708521.2A
Other languages
Chinese (zh)
Other versions
CN104796658B (en)
Inventor
太田浩一郎
北林一良
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Seiko Epson Corp
Original Assignee
Seiko Epson Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Seiko Epson Corp filed Critical Seiko Epson Corp
Publication of CN104796658A publication Critical patent/CN104796658A/en
Application granted granted Critical
Publication of CN104796658B publication Critical patent/CN104796658B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412Digitisers structurally integrated in a display
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M3/00Automatic or semi-automatic exchanges
    • H04M3/42Systems providing special services or facilities to subscribers
    • H04M3/56Arrangements for connecting several subscribers to a common circuit, i.e. affording conference facilities
    • H04M3/567Multimedia conference systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • G06F9/452Remote windowing, e.g. X-Window System, desktop virtualisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/101Collaborative creation, e.g. joint development of products or services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1827Network arrangements for conference optimisation or adaptation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
    • H04L51/10Multimedia information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40Support for services or applications
    • H04L65/403Arrangements for multi-party communication, e.g. for conferences

Abstract

Provided are a display apparatus, a display system, and a display method. A client (4) can return the display state of objects to the past. When a pointing body moves on a screen, a projector (1) draws a line object representing the movement trajectory. The projector (1) generates object data representing the object and transmits the generated object data to tablet terminals (4). The object data contains order information representing the generation order of the object. The tablet terminals (4) display the objects represented by the transmitted object data on a touch panel (43). When an operation for returning the display is performed, the tablet terminals (4) return the display object by object, based on the order information contained in the object data.

Description

Display apparatus, display system, and display method
The entire disclosure of Japanese Patent Application No. 2014-005751, filed on January 16, 2014, is incorporated herein by reference.
Technical field
The present invention relates to a display apparatus, a display system, and a display method.
Background art
Patent document 1 discloses a system in which a screen is shared by a plurality of teleconference clients. In this system, the data of drawn objects is transmitted from a teleconference server to the teleconference clients in units of objects. Each teleconference client stores the object data, which includes attribute information such as color, position, and thickness, and displays the drawn objects on its display according to the stored data.
Patent document 1: Japanese Unexamined Patent Application Publication No. 2013-65125
When a meeting is held while sharing a screen, it is sometimes desirable to return the display to a past point in time. The system of patent document 1, however, has no mechanism for returning drawn objects to a past display state, so a past display state cannot be recreated.
Summary of the invention
The present invention was made in view of the above circumstances, and one of its objects is to make it possible to return the display state of objects to the past on a client.
To achieve the above object, a display apparatus according to the invention includes: a display unit that displays an image; an acquisition unit that acquires coordinates on the display area of the display unit; an object display unit that displays an object at the position of the coordinates acquired by the acquisition unit; a generation unit that generates object data, that is, data representing the object and containing order information representing the generation order of the object; and a script transmission unit that transmits, to a client, a script for acquiring the object data.
According to the invention, the client displays objects according to the object data so that the objects up to a past order specified by the user are shown, and the display state of the objects can thus be returned to the past.
In the invention, the display apparatus may further include: a second-object-data receiving unit that receives second object data generated and transmitted by the client; and a second-object display unit that displays the received second object data. The generation unit generates third object data by adding the order information to the received second object data, and the script transmission unit transmits the script for acquiring object data that includes the third object data.
With this configuration, object data generated by the client can be displayed, so the display apparatus and the client share the display of the same objects.
The display apparatus may also be configured to include: a receiving unit that receives order information transmitted by the client; and a data transmission unit that transmits the object data generated by the generation unit to the client, wherein, of the object data generated by the generation unit, the data transmission unit transmits to the client the object data from the first-generated object data up to the order represented by the order information received by the receiving unit.
With this configuration, object data beyond the specified order information is not transmitted to the client, and the amount of communication traffic can therefore be kept down.
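The traffic-limiting behavior just described amounts to a prefix filter over the generation order: only records whose order information is at or before the client-specified order are transmitted. A minimal sketch, with illustrative field names not taken from the patent:

```python
def objects_up_to(object_data, max_order):
    """Return only the object-data records generated at or before
    the order specified by the client (order info is 1-based)."""
    return [rec for rec in object_data if rec["order"] <= max_order]

history = [
    {"order": 1, "type": "line",   "color": "black"},
    {"order": 2, "type": "circle", "color": "red"},
    {"order": 3, "type": "delete", "target_order": 1},
]

# A client whose specified order is 2 receives only the first two records;
# the later deletion record is never sent, saving traffic.
subset = objects_up_to(history, 2)
```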
The receiving unit may also be configured to receive order information specified through a user interface in the form of a slider bar displayed on the client.
With this configuration as well, object data beyond the specified order information is not transmitted to the client, and the amount of communication traffic can therefore be kept down.
The receiving unit may also be configured to receive the order information contained in the object data of an object specified by the user, from among the objects that are displayed on the client and correspond to the object data transmitted by the data transmission unit.
With this configuration as well, object data beyond the specified order information is not transmitted to the client, and the amount of communication traffic can therefore be kept down.
The display apparatus may further include an audio data generation unit that generates audio data representing collected sound, and the data transmission unit may transmit to the client the portion of the audio data recorded after the generation time of the object data whose order is represented by the order information acquired by the receiving unit.
With this configuration, the sound from the time of a past display state of the objects can be reproduced.
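The audio behavior can be sketched the same way: the apparatus keeps timestamped audio chunks and, given the generation time of the object data at the specified order, sends only the chunks recorded from that time onward. The record shape below is an illustrative assumption, not the patent's format:

```python
def audio_after(chunks, start_time):
    """Select the audio chunks recorded at or after the generation
    time of the object data at the client-specified order."""
    return [c for c in chunks if c["t"] >= start_time]

# Hypothetical capture: one chunk every 1.5 seconds.
chunks = [{"t": 0.0, "pcm": b""}, {"t": 1.5, "pcm": b""}, {"t": 3.0, "pcm": b""}]

# The object at the specified order was generated at t = 1.5,
# so only the audio from that moment onward is transmitted.
selected = audio_after(chunks, 1.5)
```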
A display system according to the invention has a display apparatus and a client. The display apparatus includes: a display unit that displays an image; an acquisition unit that acquires coordinates on the display area of the display unit; an object display unit that displays an object at the position of the acquired coordinates; a generation unit that generates object data, that is, data representing the object and containing order information representing the generation order of the object; a data transmission unit that transmits the generated object data to the client; and a script transmission unit that transmits a script for acquiring the object data to the client. The client includes: a script receiving unit that receives the script; a script execution unit that executes the script; a data receiving unit that receives the object data; and a display control unit that controls, according to the object data, a display unit of the client so as to display the objects up to a past order specified by the user.
According to the invention, the client displays objects according to the object data so that the objects up to a past order specified by the user are shown, and the display state of the objects can thus be returned to the past.
A display method according to the invention includes: an object display step of displaying an object in the display area of a display unit; a generation step of generating object data, that is, data representing the object and containing order information representing the generation order of the object; a data transmission step of transmitting the object data generated in the generation step to a client; a script transmission step of transmitting a script for acquiring the object data to the client; a script receiving step of receiving the script; a script execution step of executing the script; a data receiving step of receiving the object data; and a display control step of controlling, according to the object data, a display unit of the client so as to display the objects up to a past order specified by the user.
According to the invention, the client displays objects according to the object data so that the objects up to a past order specified by the user are shown, and the display state of the objects can thus be returned to the past.
Brief description of the drawings
Fig. 1 shows the overall structure of the display system PS1.
Fig. 2 shows the hardware configuration of the pointing body 3.
Fig. 3 shows the hardware configuration of the tablet terminal 4.
Fig. 4 shows the configuration of the functions implemented in the tablet terminal 4.
Fig. 5 shows the hardware configuration of the projector 1.
Fig. 6 shows the configuration of the functions implemented in the projector 1.
Fig. 7 illustrates the operation of the embodiment.
Fig. 8 illustrates the operation of the embodiment.
Fig. 9 shows an example of an image projected on the screen SC.
Fig. 10 shows an example of an image projected on the screen SC.
Fig. 11 shows an example of an image projected on the screen SC.
Fig. 12 shows an example of an image projected on the screen SC.
Fig. 13 shows an example of an image projected on the screen SC.
Fig. 14 shows an example of an image projected on the screen SC.
Fig. 15 shows an example of an image projected on the screen SC.
Fig. 16 shows an example of an image projected on the screen SC.
Fig. 17 shows the hardware configuration of the tablet terminal 4 of the second embodiment.
Fig. 18 shows the hardware configuration of the projector 1 of the second embodiment.
Embodiments
[First embodiment]
(Overall structure)
Fig. 1 shows the overall structure of a display system PS1 according to an embodiment of the invention. The display system PS1 has a projector 1, a pointing body 3, tablet terminals 4A and 4B each having a touch panel, a controller RC, and a screen SC. The projector 1 and the tablet terminals 4A, 4B are connected via a wireless LAN (Local Area Network). Since the tablet terminals 4A and 4B have the same structure, they are referred to simply as the tablet terminal 4 below when they need not be distinguished from each other.
The projector 1, an example of the display apparatus of the invention, projects an image represented by a video signal supplied from an external device (for example, a personal computer) onto the flat screen SC. The projector 1 is a short-focus, front-projection projector and is placed near the screen SC; in the example of Fig. 1, it is placed above the screen SC. The projector 1 of this embodiment has a whiteboard function: when the pointing body 3 is moved on the surface of the screen SC, the projector detects the position of the pointing body 3 and projects an image representing its movement trajectory onto the screen SC. With this function, the user can draw objects (for example, lines, characters, or figures) on the screen SC in the same way as writing on a whiteboard with a pen.
The pointing body 3 is a pen-shaped or rod-shaped device that the user uses as a writing tool when drawing objects on the screen SC. The controller RC is a remote controller for controlling the projector 1 by wireless communication (for example, infrared communication).
The tablet terminal 4 is an example of a terminal apparatus that displays the image projected on the screen SC. When the user draws an object with the pointing body 3 using the whiteboard function of the projector 1, the projector 1 communicates with the tablet terminal 4 via the wireless LAN, and the same image as that projected on the screen SC is displayed on the tablet terminal 4. When the user moves a stylus or a finger on the touch panel, the tablet terminal 4 detects the position of the stylus or finger and displays an image representing its movement trajectory. Objects can thus be written into the image displayed on the tablet terminal 4. After an object has been drawn in this way, the tablet terminal 4 communicates with the projector 1, and the image of the object drawn on the tablet terminal 4 is projected onto the screen SC.
(Structure of the pointing body 3)
Fig. 2 is a block diagram showing the hardware configuration of the pointing body 3. The pointing body 3 has a control unit 30, a pressure sensor 31, and a communication unit 32. The pressure sensor 31 is provided at the tip of the pointing body 3; it detects the pressure applied to the tip and supplies a signal representing the detected pressure to the control unit 30. The control unit 30 controls the communication unit 32 according to the signal supplied from the pressure sensor 31. The communication unit 32 has an LED (Light Emitting Diode) that emits light of a predetermined wavelength. When the pressure represented by the signal from the pressure sensor 31 exceeds a predetermined threshold, the control unit 30 controls the communication unit 32 to turn the LED on; when the pressure is at or below the threshold, the control unit 30 controls the communication unit 32 to turn the LED off.
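The control rule of the control unit 30 can be sketched as a simple threshold comparison. The threshold value and units below are arbitrary assumptions for illustration:

```python
LED_THRESHOLD = 0.5  # assumed pressure threshold (arbitrary units)

def led_state(pressure, threshold=LED_THRESHOLD):
    """Control unit 30's rule: the LED is lit only while the pressure
    at the tip exceeds the threshold (i.e. the pen is pressed down)."""
    return pressure > threshold

# The LED turns on while the tip is pressed against the screen
# and off again when the pointing body is lifted.
states = [led_state(p) for p in (0.0, 0.7, 0.9, 0.2)]
```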
(Structure of the tablet terminal 4)
Fig. 3 is a block diagram showing the hardware configuration of the tablet terminal 4.
The control unit 40 has a CPU (Central Processing Unit), RAM (Random Access Memory), and a nonvolatile memory. When the CPU executes a program stored in the nonvolatile memory, the operating system of the tablet terminal 4 runs, and application programs can then be executed.
The touch panel 43 is a display device (for example, a liquid crystal display) integrated with a capacitive position input device that detects the position the user touches on the display. The operation unit 41 has buttons for operating the tablet terminal 4. The communication unit 45 is a communication interface for wireless communication via the wireless LAN.
The storage unit 42 has a nonvolatile memory and stores various application programs and the data they use. The storage unit 42 stores a program implementing a Web browser; when the CPU executes this program, a Web browser runs on the tablet terminal 4. The Web browser can interpret scripts, and by executing the script obtained from the projector 1 it implements a function of displaying the image projected on the screen SC, a function of writing objects into the image displayed in the browser, a function of editing the objects displayed in the browser, and the like.
Fig. 4 is a block diagram showing the configuration of the functions of the invention among the functions implemented in the tablet terminal 4. The coordinate acquisition unit 400 serves as an acquisition unit; it acquires the position (coordinates) at which the stylus or finger, acting as a pointing body, touches the touch panel 43. The object generation unit 401 serves as a generation unit; it generates object data for an object displayed at the position acquired by the coordinate acquisition unit 400. The object display unit 402 serves as a display control unit; it controls the touch panel 43 so that the object is displayed at the position acquired by the coordinate acquisition unit 400, and also controls the touch panel 43 so that the objects represented by object data sent from the projector 1 are displayed.
The data transmission unit 403 serves as a unit that transmits various data to the projector 1. The script receiving unit 404 receives Web pages sent from the projector 1; since the Web pages contain a script, it serves as a script receiving unit. The data receiving unit 405 serves as a receiving unit that receives object data sent from the projector 1. The script execution unit 406 serves as a unit that executes the script received by the script receiving unit 404.
(Structure of the projector 1)
Fig. 5 is a block diagram showing the hardware configuration of the projector 1. The image sensor 16 is a solid-state imaging device, such as a CMOS or CCD image sensor, that photographs the screen SC and generates image data. The light receiving unit 17 receives the infrared signal sent from the controller RC and supplies an electrical signal representing the received infrared signal to the control unit 10. The communication unit 18 is a communication interface for wireless communication via the wireless LAN.
The operation unit 11 has a plurality of buttons for operating the projector 1. By controlling each part according to the operated buttons, the control unit 10 can adjust the image projected on the screen SC, set the various functions of the projector 1, and so on.
The drawing unit 15, under the control of the control unit 10, generates menu images for setting the various functions of the projector 1, menu images for adjusting the image, images of the objects displayed by the projector 1, and the like, and supplies signals representing the generated images to the image processing unit 13.
The image processing unit 13 receives the video signal supplied from the external device and the signal supplied from the drawing unit 15. The image processing unit 13 has a plurality of image processing functions and applies various kinds of processing to the supplied video signal. For example, under the control of the control unit 10, it performs image-quality adjustments of the image projected on the screen SC, such as brightness, contrast, color density, tint, and color temperature. The image processing unit 13 also supplies the display unit 14 with a video signal obtained by superimposing the signal supplied from the drawing unit 15 on the video signal.
The display unit 14 has a light source 141, light valves 142, a drive circuit 144, and a projection lens 143, and is an example of a display unit that displays an image. The light source is a lamp; the light it emits is split by dichroic mirrors (not shown) into red, green, and blue light, and the split red, green, and blue light is guided to the light valves 142 by mirrors (not shown).
The drive circuit 144 receives the video signal supplied from the image processing unit 13. The video signal contains grayscale data representing the gray levels of the red component, the green component, and the blue component of the image to be projected. The drive circuit 144 extracts the grayscale data of each of red, green, and blue, and drives the light valves 142 according to the extracted grayscale data of each color.
The light valves 142 comprise a liquid crystal light valve into which the red light enters, one into which the green light enters, and one into which the blue light enters. Each liquid crystal light valve is a transmissive liquid crystal panel with pixels arranged in a matrix of rows and columns. The valve for red light is driven according to the red grayscale data, the valve for green light according to the green grayscale data, and the valve for blue light according to the blue grayscale data. The drive circuit 144 controls each pixel of each light valve, changing the pixel's transmittance. By controlling the transmittance of the pixels, the light of each color passing through the light valve forms an image corresponding to the grayscale data. The red, green, and blue images passing through the light valves are combined by a dichroic prism (not shown) and enter the projection lens 143. The projection lens 143 magnifies the incident image and projects the enlarged image onto the screen SC.
The control unit 10 has a CPU, RAM, and a nonvolatile memory. When the CPU executes the program stored in the nonvolatile memory, the projector 1 implements functions such as the following: projecting the image represented by the video signal input from the external device onto the screen; adjusting the aspect ratio of the projected image; adjusting the image quality of the projected image; drawing or editing objects according to the operation of the pointing body 3; and displaying the objects drawn on the tablet terminal 4. When the control unit 10 executes the program stored in the nonvolatile memory, the projector also serves as the server of a client-server system and implements the function of communicating with the tablet terminals 4 as clients. When operating as a server, the projector 1 implements a so-called server-push (Comet) function: after accepting a request from a client, it withholds the response, and when an event occurs in the projector 1, it releases the withheld response and returns the response to the client that sent the request.
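The server-push flow just described — hold the client's request open until an event occurs, then release the withheld response — can be sketched with a condition flag. This is an in-process sketch of the control flow only, under assumed names, not the projector's actual HTTP stack:

```python
import threading

class CometChannel:
    """Holds a client's request open until the server publishes an event."""

    def __init__(self):
        self._event = threading.Event()
        self._payload = None

    def wait_for_update(self, timeout=5.0):
        # Client side: block (the "withheld response") until an event occurs.
        if self._event.wait(timeout):
            return self._payload
        return None  # timed out; a real client would simply re-issue the request

    def publish(self, payload):
        # Server side: an event occurred; release the withheld response.
        self._payload = payload
        self._event.set()

channel = CometChannel()
results = []

# The client's long-poll request runs on its own thread.
client = threading.Thread(target=lambda: results.append(channel.wait_for_update()))
client.start()
channel.publish({"order": 3, "type": "line"})  # e.g. new object data was generated
client.join()
```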
The storage unit 12 has a nonvolatile memory. The storage unit 12 stores user identifiers that uniquely identify users, together with passwords, as information for authenticating the users of the projector 1.
The storage unit 12 also stores the data of the image displayed by the whiteboard function. This data is stored as a Web page and includes: a script that implements the object writing and editing functions and the like; and object data representing the objects contained in the displayed image. The object data contains the information needed to draw the object; for example, when the object is a line, it contains data such as the line's color, thickness, and coordinates. The object data also contains order information representing the generation order of the object. For example, when a line is first drawn with the pointing body 3 and a circle is then drawn with the pointing body 3, the object data of the line contains the order information "1", indicating that it was generated first, and the object data of the circle contains the order information "2", indicating that it was generated second.
Object data may also contain data related to editing. For example, when an operation deleting the line drawn in the first operation is performed after the circle is drawn, making the deletion the third operation, object data representing the deletion of the object is generated. When the first-drawn line is deleted, the object data for this deletion contains "1" as the order information of the deleted object, and also contains "3" as its own order information, indicating that this object data was generated third in the generation order of the object data.
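The object data described above — drawing attributes plus order information, with deletions recorded as object data in their own right — might be represented as follows. The field names and shapes are illustrative assumptions, not the patent's actual format:

```python
# Order 1: a line drawn with the pointing body.
line = {"order": 1, "type": "line",
        "color": "black", "thickness": 2,
        "points": [(10, 10), (120, 40)]}

# Order 2: a circle drawn next.
circle = {"order": 2, "type": "circle",
          "color": "black", "thickness": 2,
          "center": (80, 80), "radius": 30}

# Order 3: deletion of the first-generated object ("target_order": 1).
# The deletion itself carries its own order information.
deletion = {"order": 3, "type": "delete", "target_order": 1}

history = [line, circle, deletion]
```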
Fig. 6 is a block diagram showing the configuration of the functions of the invention among the functions implemented in the projector 1. The coordinate acquisition unit 100 serves as an acquisition unit; it analyzes the image data supplied from the image sensor 16 and acquires the position (coordinates) of the pointing body 3. The object generation unit 101 serves as a generation unit; it generates object data for an object displayed at the position of the pointing body 3 acquired by the coordinate acquisition unit 100. The object display unit 102 serves as an object display unit; it controls the drawing unit 15 so that the object is displayed at the position acquired by the coordinate acquisition unit 100. The data transmission unit 103 serves as a unit that transmits object data to the tablet terminals 4. The script transmission unit 104 transmits Web pages to the tablet terminals 4; since the Web pages contain a script, it serves as a script transmission unit. The data receiving unit 105 serves as a receiving unit that receives various information sent from the tablet terminals 4.
(Operation example of the embodiment)
Next, an operation example of this embodiment is described with reference to Figs. 7 to 16. Figs. 7 and 8 are sequence charts illustrating the operation of this embodiment, and Figs. 9 to 16 show examples of the displayed images.
(Operation of the projector 1 when generating objects)
When the user performs an operation on the controller RC to start the whiteboard function, the projector 1 displays the image illustrated in Fig. 9 on the screen SC. Here, a toolbar TB showing the functions available in the whiteboard mode is displayed on the screen SC. The toolbar TB contains icons such as the following: an icon IC1 representing the function of drawing lines; an icon IC2 representing the function of deleting objects; an icon IC3 representing the function of saving the drawn content as a page; and an icon IC4 representing the function of displaying a saved page.
A slider bar S is also displayed on the screen SC. The slider bar S is a tool for moving the display of the objects backward or forward along the time line. The left end of the slider bar S corresponds to the generation time of the first object, and the right end corresponds to the generation time of the newest object. When the knob of the slider bar S is moved one step to the left, the state before the last operation is displayed; when it is moved one step to the right, the state after the next operation is displayed.
When the user touches the position of the icon IC1 with the pointing body 3 while the image shown in Fig. 9 is displayed, the projector 1 enters a state of displaying an image of the movement trajectory of the pointing body 3. When the user brings the pointing body 3 into contact with the screen SC and moves it, the image sensor 16 captures the light emitted by the pointing body 3. The control unit 10 (coordinate acquisition section 100) analyzes the image data supplied from the image sensor 16, obtains the position of the light emitted by the pointing body 3, and determines the movement trajectory of the pointing body 3 on the screen SC. When the control unit 10 (object display section 102) controls the drawing unit 15 according to the determined trajectory, a line L1 is displayed on the screen SC at the positions traversed by the pointing body 3, as illustrated in Fig. 10. When the control unit 10 (object generation section 101) detects that the pointing body 3 has left the screen SC, it generates object data representing the displayed line L1. Since the line L1 is the first object generated, the order information contained in this object data is "1".
Similarly, when the user moves the pointing body 3 in contact with the screen SC so as to draw a circle, a line L2 is displayed at the positions traversed by the pointing body 3, as illustrated in Fig. 11. Since the line L2 is the second object generated, the order information contained in the generated object data is "2".
(Operation example when the tablet terminal 4A logs in)
Next, an operation example in which the image projected on the screen SC is also displayed on a tablet terminal 4 will be described with reference to Figs. 7 and 8. In the tablet terminal 4A, which is running a Web browser program, when the user performs an operation on the touch panel 43 to access the projector 1, the tablet terminal 4A communicates with the projector 1 and displays a page for logging in to the projector 1 on the touch panel 43. When the user enters a user identifier and a password on the displayed page and performs an operation to send them to the projector 1, the control unit 40 controls the communication unit 45 and transmits the entered user identifier and password to the projector 1 (step S1). When the communication unit 18 receives the transmitted identifier and password, and the combination of the received user identifier and password is stored in the storage unit 12, the control unit 10 (script transmission section 104) controls the communication unit 18 and transmits the data of the Web page for the projected image to the tablet terminal 4A (step S2).
When the communication unit 45 receives the data of the Web page transmitted by the projector 1, the control unit 40 (script reception section 404, script execution section 406) of the tablet terminal 4A executes the script contained in the Web page. The control unit 40 (data transmission section 403) controls the communication unit 45 and transmits to the projector 1 a message requesting the object data of all objects on the page (step S3). When the communication unit 18 receives this message, the control unit 10 (data transmission section 103) controls the communication unit 18 and transmits the object data of the displayed lines L1 and L2 to the tablet terminal 4A (step S4).
When the communication unit 45 receives the object data transmitted by the projector 1, the control unit 40 (data reception section 405, object display section 402) of the tablet terminal 4A controls the touch panel 43 according to the received object data so as to display the lines L1 and L2. As a result, the touch panel 43 displays the same image as the one displayed on the screen SC (step S5). When the display of the objects is completed, the control unit 40 obtains the largest order information among the received object data, controls the communication unit 45, and sends the projector 1 a request for object data whose order information is larger than the obtained value (step S6). When the communication unit 18 receives the request from the tablet terminal 4A, the control unit 10 withholds the response to the received request (step S7).
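The exchange in steps S6 and S7 is essentially long polling: the tablet asks for object data whose order information exceeds the largest value it holds, and the projector withholds the response until such data exists. A compact simulation of this pattern; the class, method, and field names are assumptions for illustration, not taken from the patent.

```python
import threading

class Projector:
    """Minimal stand-in for the projector 1: stores object data and answers
    'give me everything with order information greater than N', blocking
    (withholding the response) until such data exists."""
    def __init__(self):
        self.objects = []
        self.cond = threading.Condition()

    def add_object(self, obj):
        with self.cond:
            obj["order"] = len(self.objects) + 1   # assign order information
            self.objects.append(obj)
            self.cond.notify_all()                 # release withheld responses

    def poll_after(self, order, timeout=5.0):
        with self.cond:
            self.cond.wait_for(
                lambda: self.objects and self.objects[-1]["order"] > order,
                timeout=timeout)
            return [o for o in self.objects if o["order"] > order]

projector = Projector()
projector.add_object({"kind": "line", "name": "L1"})
projector.add_object({"kind": "line", "name": "L2"})

# Tablet 4A: initial fetch of all objects (steps S3-S5), then a poll (step S6).
have = projector.poll_after(0)
assert [o["name"] for o in have] == ["L1", "L2"]

result = []
t = threading.Thread(target=lambda: result.extend(projector.poll_after(2)))
t.start()                                            # response withheld (S7)...
projector.add_object({"kind": "line", "name": "L3"})  # ...until L3 is drawn (S8)
t.join()
assert [o["name"] for o in result] == ["L3"]
```

When a second tablet logs in it runs the same loop, and `notify_all` releases every withheld response at once, which is the behavior the later steps rely on when both terminals are waiting.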
(Operation example when the projector 1 generates an object while the tablet terminal 4A is logged in)
Next, when the user brings the pointing body 3 into contact with the screen SC and moves it, and the control unit 10 controls the drawing unit 15 according to the movement trajectory of the pointing body 3, a line L3 is displayed at the positions traversed by the pointing body 3 (Fig. 12), and object data representing the line L3 is generated (step S8). Since the line L3 is the third object generated, the order information contained in the object data is "3".
After generating the object data, the control unit 10 releases the response withheld in step S7, controls the communication unit 18, and transmits the generated object data of the line L3 to the tablet terminal 4A as the response to the request of step S6 (step S9). When the communication unit 45 receives the object data of the line L3 transmitted by the projector 1, the control unit 40 of the tablet terminal 4A controls the touch panel 43 so as to display the line L3. As a result, the touch panel 43 displays the same image as the one displayed on the screen SC (step S10). When the display of the object is completed, the control unit 40 obtains the largest order information among the received object data, controls the communication unit 45, and sends the projector 1 a request for object data whose order information is larger than the obtained value (step S11). After sending the request, the control unit 40 waits for the response from the projector 1. When the communication unit 18 receives the request transmitted from the tablet terminal 4A, the control unit 10 withholds the response to the received request (step S12).
(Operation example when the tablet terminal 4A generates an object)
Next, when the user of the tablet terminal 4A touches the position of the icon IC1 in the image displayed on the touch panel 43 with a stylus and then moves the stylus in contact with the touch panel 43, the control unit 40 (coordinate acquisition section 400) of the tablet terminal 4A determines the movement trajectory of the stylus on the touch panel 43. When the control unit 40 (object display section 402) controls the touch panel 43 according to the determined trajectory, the touch panel 43 draws a line L4 at the positions traversed by the stylus (step S13). When the drawing of the line L4 is finished, the control unit 40 (object generation section 401) generates the object data of the line L4 (step S14), controls the communication unit 45, and transmits the generated object data to the projector 1 (step S15). The object data generated here corresponds to the second object data. The object data generated by the control unit 40 does not contain order information.
When the communication unit 18 receives the object data transmitted from the tablet terminal 4A, the control unit 10 controls the drawing unit 15 according to the received object data. As a result, the screen SC displays the same image as the one displayed on the tablet terminal 4A (Fig. 13) (step S16). When the drawing of the object is completed, the control unit 10 appends order information to the object data of the line L4 (step S17). Since the line L4 is the fourth object generated, the order information contained in the object data is "4". The object data to which this order information has been appended corresponds to the third object data.
After appending the order information to the received object data, the control unit 10 releases the response withheld in step S12 and transmits the object data of the line L4, to which the order information has been appended, to the tablet terminal 4A as the response to the request of step S11 (step S18). When the communication unit 45 receives the object data transmitted by the projector 1, the control unit 40 of the tablet terminal 4A controls the touch panel 43 so as to display the line L4. As a result, the touch panel 43 displays the same image as the one displayed on the screen SC (step S19). When the display of the object is completed, the control unit 40 obtains the largest order information among the received object data, controls the communication unit 45, and sends the projector 1 a request for object data whose order information is larger than the obtained value (step S20). When the communication unit 18 receives the request transmitted from the tablet terminal 4A, the control unit 10 withholds the response to the received request (step S21).
(Operation example when the tablet terminal 4B logs in)
Next, in the tablet terminal 4B, which is running a Web browser program, when the user performs an operation on the touch panel 43 to access the projector 1, the tablet terminal 4B communicates with the projector 1 and displays a page for logging in to the projector 1 on the touch panel 43. When the user enters a user identifier and a password on the displayed page and performs an operation to send them to the projector 1, the control unit 40 controls the communication unit 45 and transmits the entered user identifier and password to the projector 1 (step S22). When the communication unit 18 receives the transmitted identifier and password, and the combination of the received user identifier and password is stored in the storage unit 12, the control unit 10 controls the communication unit 18 and transmits the data of the Web page for the projected image to the tablet terminal 4B (step S23).
When the communication unit 45 of the tablet terminal 4B receives the data of the Web page transmitted by the projector 1, the control unit 40 of the tablet terminal 4B controls the communication unit 45 and transmits to the projector 1 a message requesting the object data of all objects on the page (step S24). When the communication unit 18 receives this message, the control unit 10 controls the communication unit 18 and transmits the object data of the displayed lines L1 to L4 to the tablet terminal 4B (step S25).
When the communication unit 45 receives the object data transmitted by the projector 1, the control unit 40 of the tablet terminal 4B controls the touch panel 43 according to the received object data so as to display the lines L1 to L4. As a result, the touch panel 43 displays the same image as the one displayed on the screen SC (step S26). When the display of the objects is completed, the control unit 40 obtains the largest order information among the received object data, controls the communication unit 45, and sends the projector 1 a request for object data whose order information is larger than the obtained value (step S27).
When the communication unit 18 receives the request from the tablet terminal 4B, the control unit 10 withholds the response to the received request (step S28).
(Operation example when the tablet terminal 4B generates an object)
Next, when the user of the tablet terminal 4B touches the position of the icon IC1 in the image displayed on the touch panel 43 with a stylus and then moves the stylus in contact with the touch panel 43, the control unit 40 (coordinate acquisition section 400) of the tablet terminal 4B determines the movement trajectory of the stylus on the touch panel 43. When the control unit 40 controls the touch panel 43 according to the determined trajectory, the touch panel 43 draws a line L5 at the positions traversed by the stylus (step S29). When the drawing of the line L5 is finished, the control unit 40 generates the object data of the line L5 (step S30), controls the communication unit 45, and transmits the generated object data to the projector 1 (step S31). This object data also corresponds to the second object data, and it does not contain order information.
When the communication unit 18 receives the object data transmitted from the tablet terminal 4B, the control unit 10 controls the drawing unit 15 according to the received object data. As a result, the screen SC displays the same image as the one displayed on the tablet terminal 4B (Fig. 14) (step S32). When the drawing of the object is completed, the control unit 10 appends order information to the object data of the line L5 (step S33). Since the line L5 is the fifth object generated, the order information contained in the object data is "5". The object data to which this order information has been appended also corresponds to the third object data.
After appending the order information to the received object data, the control unit 10 releases the responses withheld in steps S21 and S28. The control unit 10 transmits the object data of the line L5, to which the order information has been appended, to the tablet terminal 4A as the response to the request of step S20 (step S34), and transmits the same object data to the tablet terminal 4B as the response to the request of step S27 (step S35).
When the communication units 45 receive the object data transmitted by the projector 1, the control units 40 of the tablet terminals 4A and 4B control the touch panels 43 so as to display the line L5. As a result, the touch panels 43 display the same image as the one displayed on the screen SC (steps S36 and S37). When the display of the object is completed, the control units 40 of the tablet terminals 4A and 4B obtain the largest order information among the received object data, control the communication units 45, and send the projector 1 requests for object data whose order information is larger than the obtained value (steps S38 and S39). When the communication unit 18 receives the requests from the tablet terminals 4A and 4B, the control unit 10 withholds the responses to the received requests (step S40).
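Steps S29 to S35 repeat, for the tablet terminal 4B, the pattern of steps S13 to S21: object data drawn on a tablet reaches the projector without order information, the projector appends the next number in the sequence, and the numbered object data is returned through every withheld response. A sketch of that server-side step; the record layout and all names are assumed for illustration.

```python
def register_remote_object(objects, waiting, obj):
    """objects: numbered object data held by the projector 1.
    waiting: one callback per terminal whose response is being withheld.
    obj: object data from a tablet; carries no order information yet."""
    obj = dict(obj, order=len(objects) + 1)  # step S33: append order information
    objects.append(obj)
    for respond in waiting:                  # steps S34/S35: release responses
        respond([obj])
    waiting.clear()
    return obj

# State just before step S31: L1-L4 numbered, both tablets polling.
objects = [{"name": n, "order": i + 1}
           for i, n in enumerate(["L1", "L2", "L3", "L4"])]
delivered = {}
waiting = [lambda objs: delivered.setdefault("4A", objs),
           lambda objs: delivered.setdefault("4B", objs)]

l5 = register_remote_object(objects, waiting, {"name": "L5"})
assert l5["order"] == 5
assert delivered["4A"][0]["name"] == "L5"
assert delivered["4B"][0]["name"] == "L5"
```

Numbering on the projector rather than on the tablets is what keeps the sequence globally consistent even when several terminals draw concurrently.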
(Operation example when an object is deleted)
Next, an operation example for deleting a displayed object will be described. When the user touches the position of the icon IC2 with the pointing body 3 while the image shown in Fig. 14 is displayed, the projector 1 enters a state in which objects can be deleted with the pointing body 3.
When the user brings the pointing body 3 into contact with the screen SC and moves it, the image sensor 16 captures the light emitted by the pointing body 3. The control unit 10 (coordinate acquisition section 100) analyzes the image data supplied from the image sensor 16, determines the position of the light emitted by the pointing body 3, and obtains the position of the pointing body 3 on the screen SC. When an object exists at the determined position, the control unit 10 controls the drawing unit 15, deletes the object displayed at the position of the pointing body 3 (step S41), and generates object data (step S42).
For example, when the user moves the pointing body 3 to the position of the line L3, the line L3 is deleted, and the screen SC displays the lines L1, L2, L4, and L5, as shown in Fig. 15. The object data for deleting the line L3 contains the order information "3" of the object data of the deleted line L3. Furthermore, since the object data for deleting the line L3 is the sixth object data generated, the order information contained in it is "6".
When the control unit 10 generates this object data, it releases the responses withheld in step S40, controls the communication unit 18, and transmits the object data for deleting the line L3 to the tablet terminals 4A and 4B as the responses to the requests of steps S38 and S39 (steps S43 and S44). When the communication units 45 receive the object data transmitted by the projector 1, the control units 40 of the tablet terminals 4A and 4B control the touch panels 43 so as to delete the line L3 (steps S45 and S46). As a result, the touch panels 43 display the same image as the one displayed on the screen SC. When the deletion of the object is completed, the control units 40 of the tablet terminals 4A and 4B obtain the largest order information among the received object data, control the communication units 45, and send the projector 1 requests for object data whose order information is larger than the obtained value (steps S47 and S48).
(Operation example when the slider bar S is operated)
Next, the operation when the user operates the slider bar S will be described. For example, suppose the knob of the slider bar S is at the right end. When the user places the pointing body 3 at the position of the knob and then moves the pointing body 3 one step of the scale of the slider bar S to the left along its axis, the control unit 10 displays the knob at the position one step to the left. The control unit 10 also obtains the largest order information among the object data and subtracts from it the number of steps the knob was moved. For example, when the knob of the slider bar S is moved one step to the left from the state in which the line L3 has been deleted, as described above, the largest order information is the "6" of the object data that deleted the line L3, so the result is 6 - 1 = 5.
The control unit 10 controls the drawing unit 15 according to the object data whose order information runs from 1 up to the value obtained by this calculation, and redraws the image projected on the screen SC. Here, since the calculated value is 5, the image projected on the screen SC is redrawn according to the object data whose order information is 1 to 5. Because the object data with order information 1 to 5 are the object data of the lines L1 to L5, the image projected on the screen SC becomes the image before the line L3 was deleted, as shown in Fig. 16.
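The slider computation described here reduces to: take the largest order information (6, the object data that deleted the line L3), subtract the number of steps moved, and redraw from the object data numbered 1 up to the result. Treating a deletion as an object data record of its own, a hedged sketch with the record layout assumed for illustration:

```python
def visible_objects(objects, up_to):
    """Replay object data with order information 1..up_to and return the
    names of the lines that remain visible, honoring deletion records."""
    shown = []
    for obj in sorted(objects, key=lambda o: o["order"]):
        if obj["order"] > up_to:
            break
        if obj["kind"] == "delete":
            shown = [name for name in shown if name != obj["target"]]
        else:
            shown.append(obj["name"])
    return shown

objects = [
    {"order": 1, "kind": "line", "name": "L1"},
    {"order": 2, "kind": "line", "name": "L2"},
    {"order": 3, "kind": "line", "name": "L3"},
    {"order": 4, "kind": "line", "name": "L4"},
    {"order": 5, "kind": "line", "name": "L5"},
    {"order": 6, "kind": "delete", "target": "L3"},  # step S42: deleting L3
]

assert visible_objects(objects, 6) == ["L1", "L2", "L4", "L5"]       # Fig. 15
assert visible_objects(objects, 6 - 1) == ["L1", "L2", "L3", "L4", "L5"]  # Fig. 16
```

Because the full history is kept as object data, moving the knob never destroys information; it only changes how far the replay runs.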
When an operation touching an icon of the toolbar TB is performed after the knob of the slider bar S has been moved from the right end, the control unit 10 returns the knob of the slider bar S to the right end and restores the image to the state before the slider bar S was operated. Likewise, when an operation moving the knob of the slider bar S is performed on a tablet terminal 4, the tablet terminal 4 redraws the image displayed on the touch panel 43 in the same manner as the projector 1.
As described above, according to the present embodiment, when an object is added, the object data of the added object is transmitted to the tablet terminals 4 and their images are updated. Because the image can be updated without transmitting all the data of the displayed image, the amount of information communicated can be kept small.
Furthermore, since objects are displayed on the basis of object data, the display can be moved backward or forward, object by object, to a position the user desires. And because display is handled in units of objects, the objects drawn on the multiple tablet terminals 4 can be merged, and a common image can be shared among the multiple devices.
[Second Embodiment]
Next, a second embodiment of the invention will be described. The display system PS1 of the second embodiment is made up of the same devices as the first embodiment, but the configurations of the projector 1 and the tablet terminals 4 differ from the first embodiment. In the following, description of the parts of the configuration identical to the first embodiment is omitted, and the differences from the first embodiment are described.
Fig. 17 shows the hardware configuration of a tablet terminal 4 according to the second embodiment. The tablet terminal 4 of the present embodiment has a sound processing unit 46. The sound processing unit 46 has a microphone and a loudspeaker. The sound processing unit 46 converts audio data representing sound into an analog signal; the analog signal is supplied to the loudspeaker, and sound is emitted from the loudspeaker on the basis of the analog signal. The sound processing unit 46 also has a function of converting the sound picked up by the microphone into a digital signal and generating audio data representing the picked-up sound. That is, the sound processing unit 46 is an example of an audio data generation unit that generates audio data.
Fig. 18 shows the hardware configuration of the projector 1 according to the second embodiment. The projector 1 of the present embodiment has a sound processing unit 19. The sound processing unit 19 has a microphone and a loudspeaker. The sound processing unit 19 converts audio data representing sound into an analog signal; the analog signal is supplied to the loudspeaker, and sound is emitted from the loudspeaker on the basis of the analog signal. The sound processing unit 19 also has a function of converting the sound picked up by the microphone into a digital signal and generating audio data representing the picked-up sound. The audio data is stored in the storage unit 12.
In the present embodiment, when the user performs an operation on the controller RC instructing the whiteboard function to start, the control unit 10 stores the audio data representing the sound picked up by the microphone in the storage unit 12. The control unit 10 also measures the elapsed time since the storage of the audio data began and, when generating object data, includes in the object data the elapsed time measured at the moment of generation. The later a piece of object data is generated, the larger its elapsed time, so the generation sequence of the object data can be obtained from the elapsed time of each piece of object data; the elapsed time can therefore also be regarded as an example of order information.
In the projector 1, when an operation moving the knob of the slider bar S is performed, the control unit 10 reproduces the stored audio data according to the position of the knob. For example, as described in the first embodiment, when the knob of the slider bar S is moved one step to the left in the state of Fig. 15, the control unit 10 obtains the largest order information among the object data and subtracts from it the number of steps the knob was moved. Since the largest order information is the "6" of the object data that deleted the line L3, the calculation result is 6 - 1 = 5. The control unit 10 identifies the object data whose order information is the calculation result 5 and obtains the elapsed time contained in the identified object data. After obtaining the elapsed time, the control unit 10 reads the audio data from the obtained elapsed time onward out of the storage unit 12 and supplies it to the sound processing unit 19. The sound processing unit 19 causes the loudspeaker to emit the sound represented by the supplied audio data. With this configuration, the past sound at the time the object was added can be reproduced, so the content of the conversation at the time the object was added can easily be grasped.
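Since each piece of object data in this embodiment carries the elapsed time since recording began, seeking the audio amounts to finding the object data with the computed order information, reading its elapsed time, and playing the stored audio from that offset. A sketch under assumed names, taking 16-bit mono PCM for the byte arithmetic (the patent does not specify an audio format):

```python
def audio_from(object_data, pcm, sample_rate, order, sample_width=2):
    """Return the stored audio from the elapsed time recorded in the
    object data whose order information equals `order`."""
    elapsed = next(o["elapsed"] for o in object_data if o["order"] == order)
    offset = int(elapsed * sample_rate) * sample_width  # seconds -> bytes
    return pcm[offset:]

objects = [{"order": 5, "elapsed": 2.0},   # line L5 drawn 2.0 s into recording
           {"order": 6, "elapsed": 3.5}]   # L3 deleted 3.5 s into recording
sample_rate = 8000
pcm = bytes(8000 * 2 * 4)            # 4 seconds of silent 16-bit samples

tail = audio_from(objects, pcm, sample_rate, order=5)
assert len(tail) == 8000 * 2 * 2     # the last 2 seconds remain
```

The same lookup works whether the offset is an elapsed time or, as in the modification below the second embodiment, a generation date and time converted to an offset.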
Similarly, when an operation moving the knob of the slider bar S is performed on a tablet terminal 4, the audio data stored in the projector 1 is reproduced. For example, when the knob of the slider bar S is moved one step to the left on the tablet terminal 4A in the state of Fig. 15, the control unit 40 obtains the largest order information among the object data and subtracts from it the number of steps the knob was moved. Since the largest order information is the "6" of the object data that deleted the line L3, the calculation result is 6 - 1 = 5. The control unit 40 identifies the object data whose order information is the calculation result 5 and obtains the elapsed time contained in the identified object data.
The control unit 40 controls the communication unit 45 and sends the projector 1 the elapsed time together with a message requesting the audio data from the obtained elapsed time onward. The projector 1, having received this message, reads from the storage unit 12 the audio data from the elapsed time contained in the message onward, controls the communication unit 18, and transmits the obtained audio data to the tablet terminal 4. The control unit 40 of the tablet terminal 4 supplies the transmitted audio data to the sound processing unit 46. The sound processing unit 46 causes the loudspeaker to emit the sound represented by the supplied audio data. With this configuration, the past sound at the time the object was added is reproduced on the tablet terminal 4, so the content of the conversation at the time the object was added can easily be grasped.
In a configuration in which audio data is stored, the tablet terminals may also transmit audio data generated from the sound they pick up to the projector 1, and the audio data generated by the projector 1 may be combined with the audio data generated by the tablet terminals.
In the configuration described above, the object data contains the elapsed time, but it may instead contain the date and time at which the object data was generated. In that configuration, the projector 1 or the tablet terminal 4 obtains the date and time contained in the object data and reproduces the audio data from the corresponding point onward. Since the generation sequence of the object data can be obtained from the date and time contained in each piece of object data, the date and time can also be regarded as an example of order information.
[Modifications]
Embodiments of the invention have been described above, but the invention is not limited to the above embodiments and can be carried out in various other forms. For example, the above embodiments may be modified as follows. The above embodiments and the following modifications may also be combined with one another.
In the above embodiments, when an operation moving the knob of the slider bar S is performed on any one of the projector 1, the tablet terminal 4A, and the tablet terminal 4B, the devices other than the one on which the knob was moved may also display the image corresponding to the position of the moved knob.
For example, when an operation moving the knob of the slider bar S is performed on the projector 1, the projector 1 transmits to the tablet terminals 4A and 4B the object data whose order information runs from 1 up to the object data corresponding to the position of the knob. The tablet terminals 4A and 4B that receive these object data update the objects displayed on their touch panels 43 according to the transmitted object data.
When an operation moving the knob of the slider bar S is performed on a tablet terminal 4, the tablet terminal 4 on which the operation was performed requests from the projector 1 the object data whose order information runs from 1 up to the object data corresponding to the position of the knob; this request contains the order information. According to the request received from the tablet terminal 4 by the data reception section 105, the projector 1 transmits the object data corresponding to the requested order information to the tablet terminals 4A and 4B. The tablet terminals 4A and 4B that receive these object data update the objects displayed on their touch panels 43 according to the transmitted object data, and the projector 1 updates the projected image according to the requested object data.
According to this modification, the image displayed by the projector 1 and the images displayed by the tablet terminals 4 can be kept identical.
In the second embodiment described above, audio data is reproduced in response to an operation of the slider bar S, but the configuration of the second embodiment is not limited to this. For example, when the user touches an object with the pointing body 3 or a stylus after touching an icon of the toolbar TB with the pointing body 3 or the stylus, the elapsed time contained in the object data of the touched object may be obtained, and sound may be emitted according to the audio data from the obtained elapsed time onward. In this configuration, the elapsed time may also be obtained from the object data of the object touched with the pointing body 3 or the stylus, the object data containing elapsed times earlier than the obtained elapsed time may be identified, and the displayed image may be updated according to the identified object data. Furthermore, when such an operation is performed on a tablet terminal 4, the tablet terminal 4 may request from the projector 1 the object data containing elapsed times earlier than the obtained elapsed time; the projector 1 transmits the requested object data to the tablet terminal 4, and the tablet terminal 4 updates the display of the touch panel 43 according to the transmitted object data.
In the above embodiments, when the state at the time an object was generated is displayed in response to an operation of the slider bar S, the following configuration may also be adopted. In this modification, the left end of the slider bar S represents the generation time of the first object, and the right end represents the generation time of the most recent object.
When an operation moving the knob of the slider bar S is performed on the projector 1, the control unit 10 obtains the elapsed time from the generation time of the first object to the time corresponding to the position of the knob. The control unit 10 identifies the object data containing elapsed times earlier than the obtained elapsed time and updates the projected image according to the identified object data.
When an operation moving the knob of the slider bar S is performed on a tablet terminal 4, the control unit 40 obtains the elapsed time from the generation time of the first object to the time corresponding to the position of the knob. The control unit 40 identifies the object data containing elapsed times earlier than the obtained elapsed time and updates the image displayed on the touch panel 43 according to the identified object data. Alternatively, the tablet terminal 4 may request from the projector 1 the object data containing elapsed times earlier than the obtained elapsed time. In that configuration, the projector 1 transmits the requested object data to the tablet terminal 4, and the tablet terminal 4 updates the display of the touch panel 43 according to the transmitted object data.
In this modification, too, the display of objects can be moved backward or forward.
In the embodiment described above, the front-projection projector 1 is described as an example of the display device, but a rear-projection projector may also be used. Moreover, the light valves of the projector 1 are not limited to liquid crystal; for example, a configuration using digital micromirror devices may be adopted. Furthermore, the display device of the invention is not limited to the projector 1, as long as it is a device that displays images, and may be a direct-view display device. The direct-view display device may be a liquid crystal display device, a display device using a CRT (cathode-ray tube), a plasma display device, an organic EL display device, or the like. In addition, a smartphone, a personal computer, or a projector provided with a whiteboard function may be used in place of the tablet terminal. The invention is described using a technique that employs Web pages and a Web browser as the technique for sharing display data between the projector and the tablet terminal, but the technique for sharing display data is not limited thereto and may be realized by various other methods. The invention is also described using an example in which the indication body 3 is used as the means for drawing objects on the screen SC, but the means for inputting coordinates on the screen SC is not limited thereto; a method using the user's finger or a method using the controller RC may also be adopted. A method using the light of a laser pointer may be used as well. That is, the user's finger, the controller RC, and the light of a laser pointer are also examples of the indication body.
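As one illustration of the order information that the generation unit attaches to each object, a client script could restore the display up to a chosen order from a shared serialization. The JSON transport and field names below are assumptions for the sketch, not the patent's format:

```typescript
// Hypothetical shared object datum: the order field is the genesis
// sequence (order information) the generation unit attaches.
interface SharedObject { order: number; payload: string }

// Serialize the object data for sharing with the client.
function serializeObjects(objects: SharedObject[]): string {
  return JSON.stringify(objects);
}

// Restore the objects from the first one up to the specified order,
// sorted by their order information.
function objectsUpToOrder(json: string, maxOrder: number): SharedObject[] {
  return (JSON.parse(json) as SharedObject[])
    .filter(o => o.order <= maxOrder)
    .sort((a, b) => a.order - b.order);
}
```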

Claims (10)

1. A display device, comprising:
a display unit that displays an image;
an acquisition unit that obtains coordinates on a display area of the display unit;
an object display unit that causes an object to be displayed at the position of the coordinates obtained by the acquisition unit;
a generation unit that generates object data, the object data being data representing the object and containing order information representing the generation order of the object; and
a script transmitting unit that sends to a client a script for obtaining the object data.
2. The display device according to claim 1, wherein
the display device further comprises:
a 2nd object data receiving unit that receives 2nd object data generated and sent by the client; and
a 2nd object display unit that displays the received 2nd object data,
the generation unit generates 3rd object data obtained by adding the order information to the received 2nd object data, and
the script transmitting unit sends the script for obtaining object data including the 3rd object data.
3. The display device according to claim 1, wherein
the display device further comprises:
a receiving unit that receives order information sent by the client; and
a data transmitting unit that sends the object data generated by the generation unit to the client,
the data transmitting unit sends to the client, among the object data generated by the generation unit, the object data from the first generated object data up to the order represented by the order information received by the receiving unit.
4. The display device according to claim 3, wherein
the receiving unit receives order information specified in a user interface in the form of a slider bar displayed by the client.
5. The display device according to claim 3, wherein
the receiving unit receives order information contained in the object data of an object specified by the user, among the objects displayed by the client that correspond to the object data sent by the data transmitting unit.
6. The display device according to claim 3, wherein
the display device further comprises a sound data generation unit that generates sound data representing collected sound, and
the data transmitting unit sends to the client, of the sound data, the portion after the generation time of the object data of the order represented by the order information received by the receiving unit.
7. A display system comprising a display device and a client,
the display device comprising:
a display unit that displays an image;
an acquisition unit that obtains coordinates on a display area of the display unit;
an object display unit that causes an object to be displayed at the position of the coordinates obtained by the acquisition unit;
a generation unit that generates object data, the object data being data representing the object and containing order information representing the generation order of the object;
a data transmitting unit that sends the object data generated by the generation unit to the client; and
a script transmitting unit that sends to the client a script for obtaining the object data,
the client comprising:
a script receiving unit that receives the script;
a script executing unit that executes the script;
a data receiving unit that receives the object data; and
a display control unit that controls, according to the object data, a display unit of the client so as to display the objects up to a past order specified by the user.
8. The display system according to claim 7, wherein
the client further comprises:
a 2nd object data generation unit that generates 2nd object data; and
a 2nd object data transmitting unit that sends the 2nd object data to the display device,
the display device comprises:
a 2nd object data receiving unit that receives the 2nd object data; and
a 2nd object display unit that displays the received 2nd object data,
the generation unit generates 3rd object data obtained by adding the order information to the received 2nd object data, and
the script transmitting unit sends the script for obtaining object data including the 3rd object data.
9. A display method, comprising the steps of:
an object display step of displaying an object in a display area of a display device;
a generation step of generating object data, the object data being data representing the object and containing order information representing the generation order of the object;
a data transmitting step of sending the object data generated in the generation step to a client;
a script transmitting step of sending to the client a script for obtaining the object data;
a script receiving step of receiving the script;
a script executing step of executing the script;
a data receiving step of receiving the object data; and
a display control step of controlling, according to the object data, a display unit of the client so as to display the objects up to a past order specified by the user.
10. The display method according to claim 9, further comprising the steps of:
a 2nd object data receiving step of receiving 2nd object data generated and sent by the client; and
a 2nd object data display step of displaying the received 2nd object data,
wherein in the generation step, the order information is added to the received 2nd object data, and
in the script transmitting step, a script for obtaining object data including the 2nd object data with the order information added is sent.
CN201410708521.2A 2014-01-16 2014-11-28 Display device, display system, and display method Active CN104796658B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014005751A JP6307889B2 (en) 2014-01-16 2014-01-16 Display device, display system, and display method
JP2014-005751 2014-01-16

Publications (2)

Publication Number Publication Date
CN104796658A true CN104796658A (en) 2015-07-22
CN104796658B CN104796658B (en) 2018-04-10

Family

ID=52396441

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410708521.2A Active CN104796658B (en) 2014-01-16 2014-11-28 Display device, display system and display methods

Country Status (4)

Country Link
US (2) US9489075B2 (en)
EP (1) EP2897043B1 (en)
JP (1) JP6307889B2 (en)
CN (1) CN104796658B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019012499A (en) * 2017-07-03 2019-01-24 Necネッツエスアイ株式会社 Electronic writing board system

Families Citing this family (8)

Publication number Priority date Publication date Assignee Title
JP6307889B2 (en) * 2014-01-16 2018-04-11 セイコーエプソン株式会社 Display device, display system, and display method
CN104932700B (en) * 2015-07-17 2017-01-25 焦点教育科技有限公司 Methods for achieving object projection by means of intelligent terminal
CN113918072A (en) 2015-08-04 2022-01-11 株式会社和冠 Display control method, computer, storage medium, and method
CN106990650B (en) * 2016-01-20 2020-05-22 中兴通讯股份有限公司 Method and device for adjusting projection picture
JP6619308B2 (en) * 2016-09-07 2019-12-11 株式会社ワコム Handwritten data drawing method, handwritten data drawing apparatus, and program
JP7238594B2 (en) * 2019-05-21 2023-03-14 セイコーエプソン株式会社 Display system control method and display system
JP6849774B2 (en) * 2019-11-14 2021-03-31 株式会社ワコム Methods, programs, and computers
JP7388159B2 (en) 2019-11-29 2023-11-29 株式会社リコー Display device, display method

Citations (6)

Publication number Priority date Publication date Assignee Title
US6144991A (en) * 1998-02-19 2000-11-07 Telcordia Technologies, Inc. System and method for managing interactions between users in a browser-based telecommunications network
US20040172588A1 (en) * 1996-08-21 2004-09-02 Mattaway Shane D. Collaborative multimedia architecture for packet-switched data networks
US7043529B1 (en) * 1999-04-23 2006-05-09 The United States Of America As Represented By The Secretary Of The Navy Collaborative development network for widely dispersed users and methods therefor
CN102685174A (en) * 2011-05-18 2012-09-19 上海华博信息服务有限公司 Man-machine interaction information processing method based on large-scale display system
CN102780757A (en) * 2011-05-12 2012-11-14 索尼公司 Information processing apparatus, information processing method and computer program
JP2013065125A (en) * 2011-09-16 2013-04-11 Ricoh Co Ltd Screen sharing system, screen sharing terminal, electronic blackboard system and program

Family Cites Families (17)

Publication number Priority date Publication date Assignee Title
JP3526067B2 (en) * 1993-03-15 2004-05-10 株式会社東芝 Reproduction device and reproduction method
JPH0764893A (en) * 1993-08-31 1995-03-10 Canon Inc Network system
US20020133611A1 (en) * 2001-03-16 2002-09-19 Eddy Gorsuch System and method for facilitating real-time, multi-point communications over an electronic network
DE102006001607B4 (en) * 2005-01-14 2013-02-28 Mediatek Inc. Methods and systems for the transmission of sound and image data
JP2008544383A (en) * 2005-06-25 2008-12-04 インテル・コーポレーション Apparatus, system, and method for supporting service call
JP4976975B2 (en) * 2007-10-03 2012-07-18 株式会社東芝 Server apparatus, server apparatus control method, and server apparatus control program
JP4916428B2 (en) * 2007-12-20 2012-04-11 パナソニック株式会社 CONNECTION DEVICE, ITS CONNECTION METHOD, AND PROGRAM
JP5147742B2 (en) * 2009-01-15 2013-02-20 三菱電機株式会社 Screen transmission system and screen transmission method
JP4676014B2 (en) * 2009-06-30 2011-04-27 株式会社東芝 Information processing apparatus and capture image transfer processing method
US20110066971A1 (en) * 2009-09-14 2011-03-17 Babak Forutanpour Method and apparatus for providing application interface portions on peripheral computing devices
US20120154511A1 (en) * 2010-12-20 2012-06-21 Shi-Ping Hsu Systems and methods for providing geographically distributed creative design
JP5741821B2 (en) * 2011-03-24 2015-07-01 コニカミノルタ株式会社 Data processing transmission apparatus, data processing transmission program, and method
JP6171319B2 (en) * 2012-12-10 2017-08-02 株式会社リコー Information processing apparatus, information processing method, information processing system, and program
KR20140090297A (en) * 2012-12-20 2014-07-17 삼성전자주식회사 Image forming method and apparatus of using near field communication
US20140267077A1 (en) * 2013-03-15 2014-09-18 Amazon Technologies, Inc. User Device with a Primary Display and a Substantially Transparent Secondary Display
JP5871876B2 (en) * 2013-09-30 2016-03-01 シャープ株式会社 Information processing apparatus and electronic conference system
JP6307889B2 (en) * 2014-01-16 2018-04-11 セイコーエプソン株式会社 Display device, display system, and display method

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
US20040172588A1 (en) * 1996-08-21 2004-09-02 Mattaway Shane D. Collaborative multimedia architecture for packet-switched data networks
US6144991A (en) * 1998-02-19 2000-11-07 Telcordia Technologies, Inc. System and method for managing interactions between users in a browser-based telecommunications network
US7043529B1 (en) * 1999-04-23 2006-05-09 The United States Of America As Represented By The Secretary Of The Navy Collaborative development network for widely dispersed users and methods therefor
CN102780757A (en) * 2011-05-12 2012-11-14 索尼公司 Information processing apparatus, information processing method and computer program
CN102685174A (en) * 2011-05-18 2012-09-19 上海华博信息服务有限公司 Man-machine interaction information processing method based on large-scale display system
JP2013065125A (en) * 2011-09-16 2013-04-11 Ricoh Co Ltd Screen sharing system, screen sharing terminal, electronic blackboard system and program


Also Published As

Publication number Publication date
JP2015135544A (en) 2015-07-27
JP6307889B2 (en) 2018-04-11
EP2897043A1 (en) 2015-07-22
US20150199059A1 (en) 2015-07-16
US20170052621A1 (en) 2017-02-23
CN104796658B (en) 2018-04-10
US9939943B2 (en) 2018-04-10
EP2897043B1 (en) 2017-03-22
US9489075B2 (en) 2016-11-08

Similar Documents

Publication Publication Date Title
CN104796658A (en) Display apparatus, display system, and display method
CN104777991A (en) Remote interactive projection system based on mobile phone
JP6379521B2 (en) Transmission terminal, transmission system, transmission method and program
JP7413693B2 (en) Communication terminals, communication systems, data sharing methods and programs
JP6720513B2 (en) Communication terminal, communication system, communication control method, and program
JP2018093361A (en) Communication terminal, communication system, video output method, and program
JP6464692B2 (en) Information processing apparatus, information processing system, information processing method, and program
JP2015070543A (en) Transmission terminal, transmission method, and program
JP7400345B2 (en) Communication terminals, communication systems, data sharing methods and programs
JP6260201B2 (en) Transmission terminal, transmission method, and program
JP2017068329A (en) Communication management system, communication system, communication management method, and program
JP6361728B2 (en) Transmission control system, transmission system, transmission control method, and recording medium
JP6314539B2 (en) Transmission terminal, transmission system, transmission method and program
JP6544117B2 (en) Terminal, communication system, communication method, and program
JP7196951B2 (en) Information processing device, program, method, system
TW202016904A (en) Object teaching projection system and method thereof
JP2022049507A (en) Communication system, communication terminal, screen sharing method, and program
CN113316011A (en) Control method, system, equipment and storage medium of electronic whiteboard system
JP7358934B2 (en) Communication management system, communication system, communication method, and program
JP6880572B2 (en) Mobile terminals, communication systems, communication methods, and programs
JP2020154437A (en) Communication management system, communication system, communication management device, communication management method, and program
JP7452602B2 (en) Terminal device, communication system, communication method, and program
JP2013232124A (en) Electronic conference system
JP2020161120A (en) Information transmission system, information processing apparatus, transmission method, and program
CN114745505A (en) Shooting method, shooting device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant