US20080005283A1 - Remote instruction system, remote instruction method, and program product therefor - Google Patents


Info

Publication number
US20080005283A1
US20080005283A1 (Application No. US11/589,176)
Authority
US
United States
Prior art keywords
image
attention
annotation
captured
attention image
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/589,176
Inventor
Jun Shingu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Business Innovation Corp
Original Assignee
Fuji Xerox Co Ltd
Application filed by Fuji Xerox Co., Ltd.
Assigned to FUJI XEROX CO., LTD. (assignor: SHINGU, JUN)
Publication of US20080005283A1
Priority claimed by application US13/137,583 (published as US8587677B2)
Current legal status: Abandoned


Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 — Television systems
    • H04N 7/14 — Systems for two-way working
    • H04N 7/141 — Systems for two-way working between two video terminals, e.g. videophone
    • H04N 7/147 — Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals

Definitions

  • In the example of FIG. 16A and FIG. 16B, the attention image 10 is moved and concentrated on the annotation image 8, so that the viewer can follow it to the projection position.
  • The attention image outputting portion 11 is capable of analyzing the complexity of the image captured by the camcorder 7, by means of a differential histogram or the like, and of changing the shape of the attention image 10 according to the analysis result.
  • If the attention image 10 is a circle, for example, it is desirable to change the number of circles, the radius at the start of drawing, or the line thickness.
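  • The complexity analysis suggested above can be approximated with a simple mean-of-adjacent-differences score. This is a Python sketch standing in for the differential histogram; the function names and the two thresholds are illustrative assumptions, not taken from the patent.

```python
def image_complexity(gray):
    """Score a grayscale image (a list of rows of pixel values) by the mean
    absolute difference between horizontally and vertically adjacent pixels.
    Busier scenes (e.g. a densely populated circuit board) score higher."""
    diffs = []
    for r, row in enumerate(gray):
        for c, v in enumerate(row):
            if c + 1 < len(row):
                diffs.append(abs(row[c + 1] - v))   # horizontal difference
            if r + 1 < len(gray):
                diffs.append(abs(gray[r + 1][c] - v))  # vertical difference
    return sum(diffs) / len(diffs) if diffs else 0.0

def attention_circle_count(complexity, low=2.0, high=8.0):
    """Map the complexity score to a number of attention circles, per the
    suggestion that busier backgrounds warrant a more conspicuous image.
    The thresholds `low` and `high` are illustrative."""
    if complexity < low:
        return 1
    if complexity < high:
        return 2
    return 3
```

  • A flat image scores 0.0 and gets a single circle; a high-contrast checkered patch scores high and gets three.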
  • FIG. 17 is a flowchart showing an alternative operation of the attention image outputting portion 11 when the remote terminal 3 is connected to the server 1.
  • FIG. 17 shows another exemplary embodiment of the processing at step S5 shown in FIG. 4 or at steps S24 and S27 shown in FIG. 11.
  • At those steps, the attention image outputting portion 11 makes the projector 9 start projecting the attention image 10.
  • In FIG. 17, the attention image outputting portion 11 makes the projector 9 project the attention image 10 (S41). While the attention image 10 is being projected (S42: No), transmission of the captured image is suspended (S44). After the projection of the attention image 10 has finished (S42: Yes), transmission of the captured image is resumed (S43).
  • While this happens, the attention image 10 is projected onto the annotation image 8 that is projected onto the object 5.
  • In FIG. 18A, while the attention image 10 is being projected and its projection has not finished, the captured image is not transmitted to the display 13 at the remote site.
  • As a result, the captured image of the object 5 onto which the attention image 10 is projected is never displayed; only the captured image of the object 5 onto which the annotation image 8 alone is projected is displayed.
  • In FIG. 18B, the projection of the attention image 10 onto the object 5 has finished, and the captured image is transmitted to the display 13 and displayed.
  • The captured image thus need not be transmitted to the remote terminal while the attention image 10 is being projected, which reduces the load on the network.
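  • The suspend-while-projecting behavior can be sketched as follows. This is a minimal Python sketch with hypothetical names; frames are logged rather than sent over the network 2, and the flag would in practice be driven by the projection state of the attention image 10.

```python
class FrameTransmitter:
    """Sketch of the FIG. 17 behavior: hold back captured frames while the
    attention image is on the object (S42: No -> S44), and resume
    transmission once its projection has finished (S42: Yes -> S43)."""

    def __init__(self):
        self.attention_active = False  # True while the attention image is projected
        self.sent = []                 # stands in for frames sent to the remote terminal

    def on_frame(self, frame):
        if not self.attention_active:
            self.sent.append(frame)    # transmit to the remote terminal
        # otherwise drop the frame: the remote display keeps its last image

tx = FrameTransmitter()
tx.on_frame("f0")            # transmitted
tx.attention_active = True
tx.on_frame("f1")            # held back while the attention image is projected
tx.attention_active = False
tx.on_frame("f2")            # transmitted again
```

  • The remote display therefore only ever shows frames in which the annotation image alone appears on the object.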
  • FIG. 19A illustrates the annotation image 8 projected onto the object 5 .
  • FIG. 19B illustrates displays 13a and 13b respectively provided for remote terminals 3a and 3b.
  • The attention image outputting portion 11 can refrain from displaying the attention image 10 on the remote terminal 3b from which the instruction to project the annotation image 8 was given, while displaying it on the remote terminal 3a from which no such instruction was given.
  • The remote terminal 3b that gives the instruction to project the annotation image 8 already knows the position on the object 5 at which the annotation image 8 is to be projected, so displaying the attention image 10 there is unnecessary.
  • The remote terminal 3a that does not give the instruction, however, cannot recognize the position on the object 5 instructed by the remote terminal 3b.
  • An attention image 10a therefore needs to be displayed on the remote terminal 3a.
  • The attention image outputting portion 11 thus displays the attention image 10a on the remote terminal 3a from which the instruction was not given, thereby making the annotation image 8 on the captured image noticeable on the remote terminal 3a.
  • The attention image 10a displayed on the remote terminal is not necessarily identical to the attention image captured by the camcorder 7. For example, as shown in FIG. 19B, a rectangular attention image 10a may be displayed while a circular attention image 10 is being projected.
  • FIG. 20 schematically shows the remote instruction system in which a laser pointer 17 is connected.
  • In the exemplary embodiments described above, the annotation image 8 is directly projected onto the object 5, and the attention image 10 is projected onto the annotation image 8.
  • When the laser pointer 17 is used, the trajectory of the laser beam made by the laser pointer 17 is captured by the camcorder 7, the annotation image 8 is projected onto the object 5 according to the captured trajectory, and the attention image 10 is projected onto the object 5 according to the annotation image 8.
  • In the above exemplary embodiments, the attention image outputting portion 11 is provided in the server 1.
  • The attention image outputting portion 11 may instead be provided in the remote terminal 3, although providing it in the server 1 imposes a lighter load on the network than providing it in the remote terminal 3.
  • A remote instruction method according to an aspect of the present invention is realized with a Central Processing Unit (CPU), Read Only Memory (ROM), Random Access Memory (RAM), and the like, by installing a program from a portable memory device or a storage device such as a hard disk device, CD-ROM, DVD, or flexible disk, or by downloading the program through a communications line. The steps of the program are then executed as the CPU runs the program.

Abstract

A remote instruction system includes an attention image outputting portion that projects an annotation image and an attention image from a projecting portion onto a captured area of an image capturing portion that captures an image of an object, the annotation image being created on the basis of the image captured, the attention image being provided for attracting attention to the annotation image.

Description

    BACKGROUND
  • 1. Technical Field
  • This invention generally relates to a remote instruction system, by which an instruction can be given to an object from a remote site.
  • 2. Related Art
  • There are systems by which communications are made between remote sites. For example, while an object at a remote site is being captured by a camera and the captured image is being transmitted to a monitoring site at another remote site, a pointer created on the basis of the captured image (hereinafter referred to as an annotation image) is transmitted back to the remote site, and the annotation image is projected onto the object from a video projector. This allows a monitoring person to give instructions, by use of the annotation image, to detailed portions of the object, rather than only by telephone or on the captured image.
  • SUMMARY
  • An aspect of the present invention provides a remote instruction system including an attention image outputting portion that projects an annotation image and an attention image from a projecting portion onto a captured area of an image capturing portion that captures an image of an object, the annotation image being created on the basis of the image captured, the attention image being provided for attracting attention to the annotation image.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments of the present invention will be described in detail based on the following figures, wherein:
  • FIG. 1 schematically shows a remote instruction system according to an aspect of the present invention;
  • FIG. 2 is a view of an annotation image projected onto an object;
  • FIG. 3 is a functional block diagram of a server;
  • FIG. 4 is a flowchart showing an operation of an attention image outputting portion provided in the server;
  • FIG. 5 is a flowchart showing the operation of a remote terminal;
  • FIG. 6A through FIG. 6D show an exemplary embodiment of the method of projecting an attention image;
  • FIG. 7 shows an alternative exemplary embodiment of the method of projecting the attention image;
  • FIG. 8 shows an alternative exemplary embodiment of the method of projecting the attention image;
  • FIG. 9 shows an alternative exemplary embodiment of the method of projecting the attention image;
  • FIG. 10 shows an alternative exemplary embodiment of the method of projecting the attention image;
  • FIG. 11 is an alternative flowchart showing the operation of the attention image outputting portion provided in the server;
  • FIG. 12A through FIG. 12D show an alternative exemplary embodiment of the method of projecting the attention image;
  • FIG. 13 is an alternative flowchart showing the operation of the attention image outputting portion provided in the server;
  • FIG. 14A through FIG. 14C show an alternative exemplary embodiment of the method of projecting the attention image;
  • FIG. 15A and FIG. 15B schematically show circuit boards on which electronics parts are mounted;
  • FIG. 16A and FIG. 16B show an alternative exemplary embodiment of the method of projecting the attention image;
  • FIG. 17 is an alternative flowchart showing the operation of the attention image outputting portion provided in the server;
  • FIG. 18A and FIG. 18B are display examples on a display at a remote terminal;
  • FIG. 19A and FIG. 19B are alternative display examples on a display at a remote terminal; and
  • FIG. 20 schematically shows the remote instruction system in which a laser pointer is connected.
  • DETAILED DESCRIPTION
  • A description will now be given, with reference to the accompanying drawings, of exemplary embodiments of the present invention.
  • FIG. 1 schematically shows a remote instruction system according to an aspect of the present invention. FIG. 2 is a view of an annotation image 8 projected onto an object 5. FIG. 3 is a functional block diagram of a server 1. Referring to FIG. 1, the remote instruction system includes: the server 1; and a remote terminal 3 connected through a network 2 to the server 1. The server 1 is provided with: a camcorder 7 serving as an image capturing portion that captures an image of an object 5; and a projector 9 serving as a projecting portion that projects the annotation image 8 onto the object 5. The annotation image 8 may be any type of image, such as a line, a character, or a drawing.
  • Meanwhile, the remote terminal 3 includes: a display 13 that displays the image captured by the camcorder 7; and a mouse 15 used for giving an instruction to project the annotation image 8 onto the object 5. The annotation image 8 is projected onto the object 5 located in a captured area 4, as shown in FIG. 2.
  • Referring now to FIG. 3, the server 1 is provided with: a capturing portion 12 that controls the camcorder 7; and a transmitting portion 14 that transmits the image captured by the camcorder 7 to the remote terminal 3. The server 1 is also provided with: a receiving portion 18 that receives the annotation image 8; and a projecting portion 16 that controls the projector 9 to project the annotation image 8. The server 1 is also provided with an attention image outputting portion 11 that outputs an attention image 10 to the projecting portion 16.
  • A description will now be given of the operation of the remote instruction system configured as described above. The captured image of the object 5 captured by the camcorder 7 is transmitted from the server 1 through the network 2 to the remote terminal 3. Then, the captured image is displayed on the display 13. This allows an operator who operates the remote terminal 3 to give an instruction to draw a desired annotation image 8 by means of the mouse 15, according to the captured image displayed on the display 13.
  • The annotation image 8 instructed by the mouse 15 is transmitted from the remote terminal 3 through the network 2 to the server 1, and the annotation image 8 is projected onto the object 5 by the projector 9. While the annotation image 8 is being projected or after the annotation image 8 is projected, an attention image 10 is projected onto the object 5 by the projector 9.
  • FIG. 4 is a flowchart showing an operation of the attention image outputting portion 11 provided in the server 1. Firstly, the attention image outputting portion 11 determines whether or not the annotation image 8 has started to be drawn on the captured image (S1). If the annotation image 8 is being drawn on the captured image (S1: Yes), the attention image outputting portion 11 makes the projector 9 start projecting the annotation image 8 (S2). If the annotation image 8 is not drawn on the captured image (S1: No), step S2 is skipped.
  • Next, the attention image outputting portion 11 determines whether or not the drawing of the annotation image 8 on the captured image is finished (S3). If it is (S3: Yes), the attention image outputting portion 11 makes the projector 9 stop projecting the annotation image 8 (S4) and makes the projector 9 start projecting the attention image 10 (S5). Meanwhile, if the drawing is not finished (S3: No), processing returns to step S1.
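  • The FIG. 4 flow can be sketched in a few lines. This is a Python sketch with hypothetical names; projector commands are appended to a log rather than sent to a real projector 9.

```python
class AttentionImageOutputter:
    """Sketch of the FIG. 4 flow: project the annotation image while it is
    being drawn (S1-S2), then swap in the attention image once drawing is
    finished (S3-S5)."""

    def __init__(self):
        self.commands = []  # log of commands that would go to the projector

    def on_drawing_started(self, annotation):
        # S1: drawing started -> S2: start projecting the annotation image
        self.commands.append(("project_annotation", annotation))

    def on_drawing_finished(self, annotation):
        # S3: drawing finished -> S4: stop the annotation image,
        # then S5: start projecting the attention image
        self.commands.append(("stop_annotation", annotation))
        self.commands.append(("project_attention", annotation))
```

  • Called in order for one annotation, the log reads: project the annotation, stop it, then project the attention image.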
  • FIG. 5 is a flowchart showing the operation of the remote terminal 3. Upon receiving the captured image (S11), the remote terminal 3 displays the captured image on the display 13 (S12). Next, if the annotation image 8 is drawn on the captured image (S13: Yes), the remote terminal 3 transmits the annotation image 8 to the server 1 (S14). If not (S13: No), the remote terminal 3 waits until the annotation image 8 is drawn. The annotation image 8 keeps being transmitted until its drawing is finished (S15: No). When the drawing of the annotation image 8 is finished (S15: Yes), the remote terminal 3 finishes processing.
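  • A minimal simulation of the FIG. 5 loop, with queues standing in for the network 2 and callables standing in for display 13 and mouse 15 (all names are hypothetical stand-ins, not from the patent):

```python
import queue

def run_remote_terminal(incoming, outgoing, next_stroke, show):
    """Sketch of the FIG. 5 loop: `incoming`/`outgoing` are queues playing
    the role of the network, `show` plays the role of the display, and
    `next_stroke` plays the role of mouse input (None means drawing ended)."""
    captured = incoming.get()        # S11: receive the captured image
    show(captured)                   # S12: display it
    while True:
        stroke = next_stroke()       # S13: wait for an annotation stroke
        if stroke is None:           # S15: drawing is finished
            break
        outgoing.put(stroke)         # S14: transmit the annotation to the server

# usage sketch
inbox, outbox = queue.Queue(), queue.Queue()
inbox.put("frame-0")
strokes = iter(["line", "arrow", None])
shown = []
run_remote_terminal(inbox, outbox, lambda: next(strokes), shown.append)
```

  • After the run, one frame was displayed and both strokes were forwarded to the server side before the terminal finished.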
  • The attention image 10 projected onto the object 5 by the attention image outputting portion 11 may be any of the images shown in FIG. 6A through FIG. 10. FIG. 6A through FIG. 6D show an exemplary embodiment of the method of projecting the attention image 10. Firstly, as shown in FIG. 6A, the attention image outputting portion 11 displays the attention image 10 so as to wholly include the annotation image 8. Next, as shown in FIG. 6B, the attention image 10 is projected so as to gradually concentrate on the annotation image 8. Then, as shown in FIG. 6C, the attention image 10 is projected still closer to the annotation image 8. Lastly, as shown in FIG. 6D, the projection of the attention image 10 is stopped, so that the attention image 10 does not remain projected. The attention image outputting portion 11 performs the above-described processing of changing the output state while the attention image is being output. This makes the annotation image 8 noticeable.
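  • The gradual concentration of FIG. 6B and FIG. 6C amounts to interpolating the attention circle's radius over a few frames. A sketch, assuming a simple linear shrink (the function name and linear schedule are illustrative):

```python
def shrinking_attention_radii(start_radius, end_radius, steps):
    """Radii for an attention circle that starts large enough to enclose the
    annotation image and shrinks toward it frame by frame, after which
    projection stops (FIG. 6A-6D)."""
    if steps < 2:
        return [float(end_radius)]
    step = (start_radius - end_radius) / (steps - 1)
    return [start_radius - i * step for i in range(steps)]
```

  • Projecting the circle at each radius in turn, then blanking it, reproduces the enclose-concentrate-stop sequence.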
  • FIG. 7 and FIG. 8 show alternative exemplary embodiments of the method of projecting the attention image 10. The attention image 10 may be composed of triple circles, as shown in FIG. 7, or may be composed of, for example, a dotted line, as shown in FIG. 8. The method of projecting the attention image 10 is not limited to those described above. FIG. 9 also shows an alternative method of projecting the attention image 10. If the shape of the annotation image 8 to be output is, for example, horizontally long, the attention image 10 may be projected in accordance with the shape of the annotation image 8. In that case, it is desirable that the horizontal-to-vertical ratio of the attention image 10 be determined by that of the annotation image 8.
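  • The aspect-ratio rule of FIG. 9 can be sketched as deriving the attention rectangle from the annotation's bounding box; enlarging each side by the same proportion preserves the horizontal-to-vertical ratio. The margin ratio is an illustrative parameter, not taken from the patent.

```python
def attention_bbox(annotation_bbox, margin_ratio=0.5):
    """Given the annotation's bounding box (x, y, width, height), return an
    attention rectangle with the same aspect ratio, enlarged by a
    proportional margin on every side (FIG. 9)."""
    x, y, w, h = annotation_bbox
    mx, my = w * margin_ratio, h * margin_ratio
    return (x - mx, y - my, w + 2 * mx, h + 2 * my)
```

  • For a horizontally long annotation, the attention rectangle comes out horizontally long as well, since width and height are scaled by the same factor.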
  • FIG. 10 also shows an alternative method of projecting the attention image 10. The attention image 10 may overlap the annotation image 8. Such projection can make the annotation image 8 noticeable. The output state of the attention images shown in FIG. 7 through FIG. 10 may be changed while the attention images are being projected.
  • FIG. 11 is a flowchart showing an alternative operation of the attention image outputting portion 11. In FIG. 11, firstly, the attention image outputting portion 11 determines whether or not the annotation image 8 has started to be drawn on the captured image (S21). If the annotation image 8 has started to be drawn on the captured image (S21: Yes), the attention image outputting portion 11 makes the projector 9 start projecting the annotation image 8 (S22). Next, the attention image outputting portion 11 determines whether or not the annotation image 8 has just started to be drawn (S23). If the annotation image 8 has just started to be drawn (S23: Yes), the attention image outputting portion 11 makes the projector 9 project the attention image 10 (S24). If the annotation image 8 is not drawn on the captured image (S21: No), steps S22, S23, and S24 are skipped. If the annotation image 8 has not just started to be drawn, step S24 is skipped.
  • Next, the attention image outputting portion 11 determines whether or not the annotation image 8 is finished drawing on the captured image (S25). If the annotation image 8 is finished drawing on the captured image (S25: Yes), the attention image outputting portion 11 makes the projector 9 finish projecting the annotation image 8 (S26), and makes the projector 9 start projecting the attention image 10 (S27). Meanwhile, if the annotation image 8 is not finished drawing on the captured image (S25: No), processing returns to step S21.
  • Specifically, referring now to FIG. 12A, the attention image outputting portion 11 projects the attention image 10 when the annotation image 8 starts to be projected. While the annotation image 8 is being drawn, the attention image 10 is not projected, as shown in FIG. 12B. When the drawing of the annotation image 8 is finished, the attention image 10 is displayed again, as shown in FIG. 12C. Lastly, the projection of the attention image 10 is stopped, as shown in FIG. 12D. In this manner, the attention image 10 is projected not only when the annotation image 8 is finished being drawn but also when it starts to be drawn.
  • FIG. 13 is a flowchart showing an alternative operation of the attention image outputting portion 11. FIG. 13 shows another exemplary embodiment of the processing at steps S4 and S5 shown in FIG. 4 or at steps S24 and S27 shown in FIG. 11. At step S5, for example, after making the projector 9 finish projecting the annotation image 8, the attention image outputting portion 11 makes the projector 9 start projecting the attention image 10. In the flowchart of FIG. 13, however, the attention image outputting portion 11 makes the projector 9 start projecting the attention image 10 (S33) only after a given period of time has passed since the latest attention image 10 was projected (S31: Yes) and when the new projection position is at least a given distance from the position at which the latest attention image 10 was projected (S32: Yes).
  • Specifically, a description will be given with reference to FIG. 14A through FIG. 14C. When the annotation images 8a through 8c are continuously projected onto the object 5, projecting the attention image 10 every time an annotation image is added, as in FIG. 14A through FIG. 14C, would result in excessive projection of the attention image 10. This may annoy the viewer of the annotation images 8a through 8c. To avoid such annoyance, the system is configured to project the attention image 10 only after a given period of time has passed since the latest annotation image 8 was projected, so that the viewer of the annotation image 8 no longer feels annoyed. In FIG. 14A, the first annotation image 8a is drawn and the attention image 10a is projected. Next, in FIG. 14B, when the annotation image 8b is drawn within the given period of time, the attention image is not projected, to avoid excessive projection. Then, when the annotation image 8c is drawn after the given period of time has passed since the annotation image 8a was projected, as shown in FIG. 14C, the attention image 10c is projected.
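The time-and-distance gate of FIG. 13 can be sketched as a small helper. This is a hypothetical illustration; the class name, parameter names, and the threshold values are assumptions, not from the patent.

```python
import math
import time

# Hypothetical gate for re-projecting the attention image (FIG. 13):
# project only if enough time has passed since the last attention image
# (S31) AND the new position is far enough from the previous one (S32).
class AttentionGate:
    def __init__(self, min_interval_s=5.0, min_distance_px=100.0):
        self.min_interval_s = min_interval_s    # assumed threshold
        self.min_distance_px = min_distance_px  # assumed threshold
        self.last_time = None
        self.last_pos = None

    def should_project(self, pos, now=None):
        """Return True when the attention image should be projected at pos."""
        now = time.monotonic() if now is None else now
        if self.last_time is None:
            ok = True  # first annotation always gets an attention image
        else:
            elapsed = now - self.last_time
            dist = math.dist(pos, self.last_pos)
            ok = elapsed >= self.min_interval_s and dist >= self.min_distance_px
        if ok:
            # only a projected attention image becomes the new reference point
            self.last_time, self.last_pos = now, pos
        return ok
```

With the thresholds above, an annotation added shortly after the previous one (as in FIG. 14B) is suppressed, while one added after the interval has elapsed (FIG. 14C) is accompanied by the attention image.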
  • Next, a description will be given of a case where the object 5 has a complex structure. FIG. 15A is a plan view of a circuit board 19 on which electronic parts 21 are mounted. If the object 5 is the circuit board 19 in this manner, the annotation image 8 is projected between the electronic parts 21, as shown in FIG. 15B. In this case, an electronic part 21 may block the line of sight to the annotation image 8, making it difficult for the viewer of the annotation image 8 to find the projection position of the annotation image 8.
  • Referring now to FIG. 16A and FIG. 16B, the attention image 10 is moved so as to converge on the annotation image 8. This has the advantage that the viewer of the annotation image 8 can easily recognize the position indicated by the annotation image 8.
  • If the object 5 is the circuit board 19 as described above, the attention image outputting portion 11 is capable of analyzing the complexity of the image captured by the camcorder 7 by means of a differential histogram or the like, and of changing the shape of the attention image 10 to be output according to the analysis result. For example, if the attention image 10 is a circle, it is desirable to change the number of circles, the initial radius thereof, or the thickness thereof.
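The idea of adapting the attention image to the visual complexity of the scene can be sketched as follows. This is a minimal stand-in for a differential-histogram analysis, not the patent's implementation; the edge threshold and the circle parameters are assumed values.

```python
# Hypothetical complexity measure: count how often horizontally adjacent
# pixel values differ sharply (a crude stand-in for a differential
# histogram), then pick a more prominent attention circle for busy scenes
# such as a densely populated circuit board.
def image_complexity(gray):
    """gray: 2-D list of intensity ints; returns fraction of strong edges."""
    edges = total = 0
    for row in gray:
        for a, b in zip(row, row[1:]):
            total += 1
            if abs(a - b) > 32:  # assumed edge threshold
                edges += 1
    return edges / total if total else 0.0


def attention_circle_params(complexity):
    """Return (num_circles, start_radius, thickness) for the attention image."""
    if complexity > 0.3:       # busy scene: more circles, larger, thicker
        return (3, 120, 5)
    return (1, 60, 2)          # plain scene: a single thin circle suffices
```

A flat image yields complexity 0.0 and the modest single circle, while a high-frequency image yields complexity near 1.0 and the more conspicuous parameters.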
  • FIG. 17 is a flowchart showing an alternative operation of the attention image outputting portion 11 when the remote terminal 3 is connected to the server 1. FIG. 17 shows another exemplary embodiment of the processing at step S5 shown in FIG. 4 or at steps S24 and S27 shown in FIG. 11. At step S5, for example, after making the projector 9 finish projecting the annotation image 8, the attention image outputting portion 11 makes the projector 9 start projecting the attention image 10. In the flowchart of FIG. 17, however, the attention image outputting portion 11 makes the projector 9 project the attention image 10 (S41). While the attention image 10 is being projected (S42: No), transmission of the captured image is stopped (S44). After projection of the attention image 10 is finished (S42: Yes), transmission of the captured image starts again (S43).
  • Specifically, the attention image 10 is projected onto the annotation image 8 projected onto the object 5. Referring to FIG. 18A, while the attention image 10 is being projected and its projection has not finished, the captured image is not transmitted to the display 13 at the remote site. The captured image of the object 5 onto which the attention image 10 is projected is therefore not displayed; only the captured image of the object 5 onto which the annotation image 8 alone is projected is displayed. Referring to FIG. 18B, once projection of the attention image 10 onto the object 5 is finished, the captured image is transmitted to the display 13 and displayed. By performing the afore-mentioned process, the captured image need not be transmitted to the remote terminal while the attention image 10 is being projected, thereby reducing the load on the network.
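The transmission-suppression behavior of FIG. 17 and FIG. 18 can be sketched as a small server-side relay. This is a hypothetical illustration; the class and method names are assumptions, not from the patent.

```python
# Hypothetical server-side relay (FIG. 17/18): captured frames are not
# forwarded to remote terminals while the attention image is being
# projected, reducing the load on the network.
class CapturedImageRelay:
    def __init__(self):
        self.attention_active = False
        self.sent = []  # frames actually forwarded to the remote display

    def set_attention(self, active):
        """Mark attention-image projection as started (S41) or finished (S42)."""
        self.attention_active = active

    def on_frame(self, frame):
        """Forward a captured frame, or drop it during attention projection."""
        if self.attention_active:  # S44: suppress transmission
            return None
        self.sent.append(frame)    # S43: (re)transmit the captured image
        return frame
```

Frames captured while the attention image is active are simply dropped, so the remote display 13 only ever shows the object 5 with the annotation image 8 alone, as in FIG. 18A and FIG. 18B.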
  • Referring now to FIG. 19A and FIG. 19B, a description will be given of a case where multiple remote terminals 3 are connected in the remote instruction system. FIG. 19A illustrates the annotation image 8 projected onto the object 5. FIG. 19B illustrates displays 13a and 13b respectively provided for remote terminals 3a and 3b.
  • As shown in FIG. 19B, the attention image outputting portion 11 is capable of not displaying the attention image 10 on the remote terminal 3b from which the instruction to project the annotation image 8 is given, while displaying the attention image 10 on the remote terminal 3a from which that instruction is not given. The remote terminal 3b that gives the instruction to project the annotation image 8 already recognizes the position on the object 5 at which the annotation image 8 is to be projected, eliminating the necessity of displaying the attention image 10 there. The remote terminal 3a, which does not give the instruction, cannot recognize the position on the object 5 instructed by the remote terminal 3b, so an attention image 10a needs to be displayed on the remote terminal 3a. The attention image outputting portion 11 displays the attention image 10a on the remote terminal 3a from which the instruction to project the annotation image 8 is not given, thereby making the annotation image 8 projected on the captured image noticeable on the remote terminal 3a. The attention image 10a displayed on the remote terminal is not necessarily identical to the captured attention image. For example, as shown in FIG. 19B, a rectangular attention image 10a may be displayed while a circular attention image 10 is being projected.
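The per-terminal routing described for FIG. 19 amounts to excluding the instructing terminal from the set of recipients. The function below is a hypothetical sketch; its name and signature are illustrative, not from the patent.

```python
# Hypothetical routing of the attention image in a multi-terminal setup
# (FIG. 19): the terminal that issued the annotation instruction already
# knows the target position, so only the other terminals are shown the
# attention image.
def terminals_to_show_attention(all_terminals, instructing_terminal):
    """Return the terminals on which the attention image should be displayed."""
    return [t for t in all_terminals if t != instructing_terminal]
```

For the configuration of FIG. 19B, where remote terminal 3b gives the instruction, only remote terminal 3a receives the attention image.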
  • FIG. 20 schematically shows the remote instruction system to which a laser pointer 17 is connected. By use of the laser pointer 17, the annotation image 8 is directly projected onto the object 5, and the attention image 10 is projected onto the annotation image 8. Specifically, the trajectory of the laser beam from the laser pointer 17 is captured by the camcorder 7, and the annotation image 8 is projected onto the object 5 according to the captured image. The attention image 10 is also projected onto the object 5 according to the annotation image 8.
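One simple way to recover the laser trajectory from the captured frames is to take the brightest pixel in each frame as the laser spot. This is a hypothetical sketch only; the patent does not specify how the trajectory is extracted, and the function names are illustrative.

```python
# Hypothetical laser-spot tracker (FIG. 20): the brightest pixel in each
# captured grayscale frame approximates the laser pointer position; the
# accumulated positions form the trajectory used as the annotation image.
def brightest_pixel(gray):
    """gray: 2-D list of intensity ints; returns (x, y) of the brightest pixel."""
    best, pos = -1, (0, 0)
    for y, row in enumerate(gray):
        for x, v in enumerate(row):
            if v > best:
                best, pos = v, (x, y)
    return pos


def laser_trajectory(frames):
    """Return the per-frame laser positions across a sequence of frames."""
    return [brightest_pixel(f) for f in frames]
```

A real system would additionally threshold on the laser's color and intensity to reject other bright regions; the sketch keeps only the core idea of tracking the spot frame by frame.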
  • It should be appreciated that modifications and adaptations to those exemplary embodiments may occur to one skilled in the art without departing from the scope of the present invention. For example, in the above-described exemplary embodiments, the attention image outputting portion 11 is provided in the server 1. The attention image outputting portion 11, however, may be provided in the remote terminal 3, although providing it in the server 1 imposes less network load than providing it in the remote terminal 3. A remote instruction method employed as an aspect of the present invention is realized with a Central Processing Unit (CPU), Read Only Memory (ROM), Random Access Memory (RAM), and the like, by installing a program from a portable memory device or a storage device such as a hard disc device, CD-ROM, DVD, or flexible disc, or by downloading the program through a communications line. The steps of the program are then executed as the CPU runs the program.
  • The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The exemplary embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.

Claims (11)

1. A remote instruction system comprising an attention image outputting portion that projects an annotation image and an attention image from a projecting portion onto a captured area of an image capturing portion that captures an image of an object, the annotation image being created on the basis of the image captured, the attention image being provided for attracting attention to the annotation image.
2. The remote instruction system according to claim 1, wherein the attention image outputting portion changes an output state of the attention image, while the attention image is being output.
3. The remote instruction system according to claim 1, wherein the attention image outputting portion projects the attention image to wholly include the annotation image, and then moves the attention image to concentrate on the annotation image.
4. The remote instruction system according to claim 1, wherein the attention image outputting portion changes an output state of the attention image according to the output state of the annotation image.
5. The remote instruction system according to claim 1, wherein the attention image outputting portion changes a shape of the attention image according to the shape of the annotation image.
6. The remote instruction system according to claim 1, wherein the attention image outputting portion does not project the attention image until a given period of time passes.
7. The remote instruction system according to claim 1, further comprising:
a server that controls the image capturing portion and the projecting portion; and
one or more remote terminals that receive the captured image from the server and give an instruction to project the annotation image,
wherein the attention image outputting portion does not transmit the captured image to the one or more remote terminals, while the attention image is being projected onto the object.
8. The remote instruction system according to claim 7, wherein the attention image outputting portion displays the attention image on the remote terminal by which the annotation image is not instructed to project.
9. The remote instruction system according to claim 1, wherein the attention image outputting portion forms the attention image according to an area onto which the annotation image is projected.
10. A remote instruction system comprising projecting an annotation image and an attention image onto a captured area in which an image of an object is captured, the annotation image being created on the basis of the image captured, the attention image being provided for attracting attention to the annotation image.
11. A computer readable medium storing a program causing a computer to execute a process for remote instruction, the process comprising projecting an annotation image and an attention image onto a captured area in which an image of an object is captured, the annotation image being created on the basis of the image captured, the attention image being provided for attracting attention to the annotation image.
US11/589,176 2006-06-28 2006-10-30 Remote instruction system, remote instruction method, and program product therefor Abandoned US20080005283A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/137,583 US8587677B2 (en) 2006-06-28 2011-08-26 Remote instruction system, remote instruction method, and program product therefor

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006178281A JP5200341B2 (en) 2006-06-28 2006-06-28 Remote indication system and remote indication method
JP2006-178281 2006-06-28

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/137,583 Division US8587677B2 (en) 2006-06-28 2011-08-26 Remote instruction system, remote instruction method, and program product therefor

Publications (1)

Publication Number Publication Date
US20080005283A1 true US20080005283A1 (en) 2008-01-03

Family

ID=38878093

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/589,176 Abandoned US20080005283A1 (en) 2006-06-28 2006-10-30 Remote instruction system, remote instruction method, and program product therefor
US13/137,583 Active US8587677B2 (en) 2006-06-28 2011-08-26 Remote instruction system, remote instruction method, and program product therefor

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/137,583 Active US8587677B2 (en) 2006-06-28 2011-08-26 Remote instruction system, remote instruction method, and program product therefor

Country Status (3)

Country Link
US (2) US20080005283A1 (en)
JP (1) JP5200341B2 (en)
CN (1) CN100594727C (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5061924B2 (en) * 2008-01-25 2012-10-31 富士ゼロックス株式会社 Instruction system, instruction program and instruction apparatus
JP4816704B2 (en) * 2008-09-25 2011-11-16 富士ゼロックス株式会社 Instruction system, instruction program
CN104243898A (en) * 2013-06-18 2014-12-24 鸿富锦精密工业(深圳)有限公司 Remote guidance system and guidance terminal and help terminal thereof
US11500917B2 (en) 2017-11-24 2022-11-15 Microsoft Technology Licensing, Llc Providing a summary of a multimedia document in a session

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5473343A (en) * 1994-06-23 1995-12-05 Microsoft Corporation Method and apparatus for locating a cursor on a computer screen
US5767897A (en) * 1994-10-31 1998-06-16 Picturetel Corporation Video conferencing system
US6339431B1 (en) * 1998-09-30 2002-01-15 Kabushiki Kaisha Toshiba Information presentation apparatus and method
US6392694B1 (en) * 1998-11-03 2002-05-21 Telcordia Technologies, Inc. Method and apparatus for an automatic camera selection system
US20040054295A1 (en) * 2002-09-18 2004-03-18 Ramseth Douglas J. Method and apparatus for interactive annotation and measurement of time series data with automatic marking
US20040070674A1 (en) * 2002-10-15 2004-04-15 Foote Jonathan T. Method, apparatus, and system for remotely annotating a target
US6989801B2 (en) * 2001-03-22 2006-01-24 Koninklijke Philips Electronics N.V. Two-way presentation display system
US7224847B2 (en) * 2003-02-24 2007-05-29 Microsoft Corp. System and method for real-time whiteboard streaming
US20070234220A1 (en) * 2006-03-29 2007-10-04 Autodesk Inc. Large display attention focus system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05225303A (en) * 1992-02-14 1993-09-03 Hitachi Ltd Remote indication system
JP3793987B2 (en) * 2000-09-13 2006-07-05 セイコーエプソン株式会社 Correction curve generation method, image processing method, image display apparatus, and recording medium
JP3719411B2 (en) * 2001-05-31 2005-11-24 セイコーエプソン株式会社 Image display system, projector, program, information storage medium, and image processing method
JP3735708B2 (en) * 2002-01-11 2006-01-18 独立行政法人産業技術総合研究所 Remote environment guide device
JP2003323610A (en) * 2002-03-01 2003-11-14 Nec Corp Color correcting method and device, for projector
JP3959354B2 (en) * 2003-01-10 2007-08-15 株式会社東芝 Image generation apparatus, image generation method, and image generation program
US7193608B2 (en) * 2003-05-27 2007-03-20 York University Collaborative pointing devices
JP2006041884A (en) 2004-07-27 2006-02-09 Sony Corp Information processing apparatus and method therefor, recording medium and program

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090164899A1 (en) * 2007-12-21 2009-06-25 Brian Hernacki Providing Image-Based Guidance for Remote Assistance
US8151193B2 (en) * 2007-12-21 2012-04-03 Symantec Corporation Providing image-based guidance for remote assistance
WO2014206622A1 (en) * 2013-06-27 2014-12-31 Abb Technology Ltd Method and data presenting device for assisting a remote user to provide instructions
US9829873B2 (en) 2013-06-27 2017-11-28 Abb Schweiz Ag Method and data presenting device for assisting a remote user to provide instructions

Also Published As

Publication number Publication date
US20110310122A1 (en) 2011-12-22
JP2008011067A (en) 2008-01-17
CN101098460A (en) 2008-01-02
CN100594727C (en) 2010-03-17
JP5200341B2 (en) 2013-06-05
US8587677B2 (en) 2013-11-19

Legal Events

Date Code Title Description
AS Assignment

Owner name: FUJI XEROX CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SHINGU, JUN;REEL/FRAME:018484/0432

Effective date: 20061027

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION