JP2008276219A - Digital platform device - Google Patents

Digital platform device

Info

Publication number
JP2008276219A
Authority
JP
Japan
Prior art keywords
image data
camera
means
image
position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2008106102A
Other languages
Japanese (ja)
Inventor
Osamu Nonaka
Kazuhiro Sato
Original Assignee
Olympus Imaging Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Imaging Corp
Priority to JP2008106102A
Publication of JP2008276219A
Legal status: Pending

Abstract

PROBLEM TO BE SOLVED: To provide a digital platform device having a user interface that makes it easy to handle image data from a camera or the like.

SOLUTION: A table type screen device 10 is constituted by providing a display device unit 11; a control unit 20 composed of a microcontroller; a display control unit 21; a position detection unit 23; an operation determination unit 24; a communication unit 25 serving as interface means; a recording unit 26; and a printer 27. The position of a camera 31 placed on a screen 15 is detected by the position detection unit 23; image data are transferred from the camera 31 via the communication unit 25 in response to a predetermined instruction operation, and displayed on the screen 15 by a projector under the control unit 20 via the display control unit 21. All the image data recorded in the camera 31 are displayed in reduced form near the camera 31, and when specified image data are transferred to a recording area on the screen, a reduced image of the data being transferred is displayed near the recording area.

COPYRIGHT: (C)2009,JPO&INPIT

Description

  The present invention relates to a digital platform device, and more particularly to an improvement in a user interface for making image data easier to handle.

  In recent years, table type display devices have been proposed in which image data is displayed on a table-like screen using a projector, providing display means that can be enjoyed by a plurality of people.

For example, there is known a table-type display apparatus in which a light beam emitted from a projector is reflected by a mirror onto a predetermined area of a horizontally installed screen, the screen comprising a transparent base member for reducing bending and a diffusing screen for forming an image (see, for example, Patent Document 1).
Japanese Patent Laid-Open No. 2004-12712

  In such a table type display, a device can be placed on the table. However, the apparatus described in Patent Document 1 does not address this point. It is therefore desirable that a device placed on the table be able to display the image data stored in it nearby, so that the data can be checked easily.

  In addition, conventional data exchange merely runs over signal lines, radio waves, or optical communication between a transmitting device and a receiving device, so the transmission/reception relationship is invisible and difficult for the user to understand.

  Accordingly, an object of the present invention is to provide a digital platform device in which signal transmission and reception are clear and easy for the user to understand, by making explicit where devices and data are placed on the table and how the transmitted and received signals flow.

  That is, the invention described in claim 1 comprises a plane portion on which an information device storing image data can be placed; detecting means for detecting the placement position of the information device; receiving means for receiving image data from the information device; and display means for displaying the image data received by the receiving means on the plane portion around the information device position detected by the detecting means.

  According to a second aspect of the present invention, there are provided a plane portion on which an information device storing image data can be placed; detection means for detecting the placement position of the information device; receiving means for receiving image data stored in the information device; display means for displaying the image data received by the receiving means on the plane portion around the information device position detected by the detection means; and storage means for storing the image data received by the receiving means.

  The invention according to claim 3 is the invention according to claim 2, further comprising selection means for selecting image data, wherein the storage means stores the image data selected by the selection means.

  According to a fourth aspect of the present invention, in the invention according to the second or third aspect, the display means displays the image data stored in the storage means at a position different from that of the image data received by the receiving means.

  According to a fifth aspect of the present invention, there are provided a plane portion on which a plurality of information devices storing image data can be placed; detection means for detecting the placement positions of the information devices; receiving means for receiving image data stored in one information device; display means for displaying the image data received by the receiving means on the plane portion around the position of the information device detected by the detection means; and transmitting means for transmitting the image data received by the receiving means to another of the information devices.

  The invention described in claim 6 is the invention described in claim 5, further comprising selection means for selecting image data, wherein the transmission means transmits the image data selected by the selection means to the other information device.

  According to the present invention, it is possible to provide a digital platform device in which signal transmission and reception are clear and easy for the user to understand, because the places where devices and data sit on the table and the course of the signals being sent and received are made explicit.

  In addition, data operations can be performed intuitively without requiring complicated settings.

  Hereinafter, embodiments of the digital platform device according to the present invention will be described with reference to the drawings.

(First embodiment)
FIG. 1 is a block diagram showing a configuration of a digital platform apparatus according to the first embodiment of the present invention.

  In FIG. 1, a table type screen device 10 includes a display device unit 11; a control unit 20, composed of a microcontroller, serving as control means that governs the overall control operation of the table type screen device 10; a display control unit 21; a position detection unit 23 as position detection means; an operation determination unit 24 as operation determination means; a communication unit 25 as interface means; a recording unit 26; and a printer 27. The display device unit 11 projects an image, and is composed of a projector 13 as projection means (reception device, reception unit), a mirror 14 for guiding the projected image to the screen 15, the screen 15 as display means, and a touch panel 16 configured on its surface as a sensor unit (plane portion).

  The screen 15 is actually composed of a transparent base portion having a predetermined thickness and a diffusion screen portion, so that the table does not bend when a force is applied to it. Because it has this predetermined strength, the screen 15 neither bends nor breaks even when the touch panel 16 is pressed, or when a device such as the camera 31 is placed on it and pressed, as will be described later.

  The touch panel 16 is a touch switch for detecting the movement of a device or a user's finger on the screen 15. As the touch panel 16, a panel whose electrical characteristics change under a pressing force on the panel is used.

  The display control unit 21 controls the image projected by the projector 13. The position detection unit 23 detects the position on the screen 15 at which the camera 31 is placed, as described later, and the operation determination unit 24 detects and determines the position at which the touch panel 16 is pressed. The communication unit 25 performs data communication with the camera 31 placed on the screen 15 and receives image data compressed and stored by an image processing circuit in the camera.

  The recording unit 26 is for recording an image captured from the camera 31 via the communication unit 25, and constitutes a recording unit for an album, for example. The printer 27 is for outputting the acquired image on paper or the like.

  In such a configuration, the user 30 places the camera 31, which holds captured image data and serves as the transmission device, on the table-like screen 15. The touch panel 16 then reacts, and the position of the camera 31 is detected. The communication unit 25 of the table-like screen starts communication with the communication unit of the camera 31, and the captured data can be displayed on the display device unit 11 under the control unit 20.

  That is, in the display device unit 11, the image is projected by the projector 13 under the control unit 20 via the display control unit 21, reflected by the mirror 14, and displayed on the screen 15. Here, what the projector 13 projects is not merely the image acquired from the camera 31 but a picture that takes the camera 31 itself into account: as shown in FIG. 2, the projected image is laid out so as to be visible in the vicinity of the camera 31.

  The touch panel 16 on the screen 15 can also detect the position of the camera 31, and images corresponding to the image data acquired from the camera 31 are controlled so as to be displayed near the camera 31. In this case, as shown in FIG. 2, an album storage recording area 26a is provided at a predetermined position. When the user 30 drags an arbitrary image 33 on the screen 15 with a finger and brings it to the position of the recording area 26a, the image can be recorded in the album recording unit 26. At this time, the image 33 is moved from the camera 31 to the recording area 26a so as to flow; for example, when a plurality of images are moved, they are moved sequentially from the camera 31 toward the recording area 26a like a slide show. Such an application is also possible.
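
The drag-to-record interaction described above can be summarized in code. The following is a minimal sketch, assuming a simple rectangular hit test on the recording area 26a; the names Rect, ImageTile, and handle_drag are hypothetical and not part of the patent.

```python
# Minimal sketch of the drag-to-album interaction (assumptions noted above).
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

@dataclass
class ImageTile:
    image_id: str
    x: float = 0.0
    y: float = 0.0

def handle_drag(tile: ImageTile, touch_x: float, touch_y: float,
                album_area: Rect, album_storage: list) -> None:
    """Move the tile with the finger; record it when dropped on the album area."""
    tile.x, tile.y = touch_x, touch_y           # image 33 follows the finger
    if album_area.contains(touch_x, touch_y):   # finger reached recording area 26a
        album_storage.append(tile.image_id)     # record into recording unit 26
```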

  In order to realize such a drag function, the operation determination unit 24 detects and determines the position at which the touch panel is pressed, so that the position of the user's finger on the screen 15 can be tracked. When the operation determination unit 24 determines that the touch panel 16 is pressed, the control unit 20 controls the display control unit 21 on the basis of the determination result so that the display is switched as if the image 33 were being dragged across the screen 15.

  When the image 33 is stored in the album, control is performed so that the image data corresponding to the image 33 are recorded in the recording unit 26. That is, the control unit 20 detects the position of the camera 31 and the operations input by the user 30 by means of the touch panel 16, the position detection unit 23, and the operation determination unit 24. The image data received through the communication unit 25 are then handled according to the position of the device such as the camera 31 and the operations described above, and the display control unit 21 is controlled so that they are displayed at positions corresponding to those operations.

  FIG. 3 is a flowchart for explaining the operation according to the program controlled by the control unit 20.

  When a power switch (not shown) of the table type screen device 10 is turned on, it is determined in step S1 whether or not the camera 31 is placed on the screen 15. When the user 30 places the camera 31 on the table-like screen 15, the placement is detected from the change in pressing force at that time. Then, according to the pressed position of the touch panel 16, the camera position is detected by the position detection unit 23 in step S2.

  Next, in step S3, data in the camera 31 is read by the communication unit 25. In step S4, the display control unit 21 is controlled so that an image corresponding to the data read at that time is projected to be reproduced near the camera position detected in step S2.

  Thereafter, in step S5, it is determined whether or not the user 30 has performed an operation. If no operation is performed, this routine ends. On the other hand, if any operation is performed, the process proceeds to step S6 and the corresponding processing is carried out.

  That is, the pressed state of the touch panel 16 and changes in the pressed position are determined. For example, in step S6, a double click (double tap) on a predetermined image is detected. If no double click is performed, the process proceeds to step S10 described later; if a double click is performed, the process proceeds to step S7 to determine whether the image has not yet been enlarged.

  Here, if the image has not yet been enlarged, the process proceeds to step S9 and an enlarged display is performed. On the other hand, if it is determined that the image has already been enlarged, the process proceeds to step S8 and a reduced display is performed. In this way, in step S8 or S9 the image is presented at a size convenient for the user 30, for example enlarged for checking and appreciation or reduced for arrangement; this is achieved by the control unit 20 controlling the display control unit 21. The proportion of the entire screen occupied by the predetermined image may also be increased.

  In step S10, it is determined whether a finger has been traced across the table-like screen 15 from a predetermined image position to another position, that is, whether the operation position has moved. If it has not moved, the process proceeds to step S12 described later; if it has moved, the process proceeds to step S11.

  In step S11, the display position is switched so that the image moves to that different position. In step S12, the destination is determined, for example whether it is the album portion shown in FIG. 2, that is, the recording area 26a. If the destination is the recording area 26a, the process proceeds to step S13 and the image data are recorded in the recording area 26a.

  FIG. 4 shows an example of an album created in this way. Here, an example of a display form of an electronic album created in the recording unit 26 of the table type screen device is shown.

  In this way, the moved image is displayed like a simple album 26b placed on the table, and can be presented quite differently from a paper album. When the user 30 touches the triangular switch section 16a or 16b, the pages of the album 26b can be turned back and forth for viewing. The display position of the album 26b can also be moved to an arbitrary position on the table by a drag operation.

  When the print (PRINT) button 16c is pressed, the image is printed on paper and discharged from the print output discharge port 27a (see FIG. 2).

  When the destination is not the recording area 26a in step S12, and after the recording is completed in step S13, the process returns to step S5 and the subsequent processing operations are repeated.
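
The FIG. 3 flow just described can be outlined as follows. This is a hedged sketch only: the panel, comm, display, and album objects and their methods (camera_placed, read_images_from_camera, and so on) are stand-ins for the means described in the text, not a real device API.

```python
# Sketch of the FIG. 3 control flow (steps S1 to S13), under the
# assumptions stated in the lead-in above.
def main_routine(panel, comm, display, album):
    if not panel.camera_placed():                     # S1: camera on screen 15?
        return
    cam_pos = panel.detect_camera_position()          # S2: position detector 23
    images = comm.read_images_from_camera()           # S3: communication unit 25
    display.show_thumbnails(images, near=cam_pos)     # S4: reproduce near camera
    while True:
        op = panel.get_operation()                    # S5: user operation?
        if op is None:
            return                                    # no operation: routine ends
        if op.kind == "double_tap":                   # S6
            if not op.target.enlarged:                # S7: not yet enlarged?
                display.enlarge(op.target)            # S9: enlarged display
            else:
                display.shrink(op.target)             # S8: reduced display
        elif op.kind == "drag":                       # S10: operation position moved
            display.move(op.target, op.position)      # S11: follow the finger
            if album.area_contains(op.position):      # S12: destination check
                album.record(op.target.image_id)      # S13: record in area 26a
```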

  As the communication method of the communication unit 25, optical communication is also conceivable, but a method without directivity is preferable. It is therefore preferable to perform communication using weak radio waves such as Bluetooth.

  Furthermore, the touch panel 16 may be configured as shown in FIG. 5, for example.

That is, a plurality of switches composed of transparent electrodes, P1 to P12, may be provided on the screen 15, and it is determined which switch is pressed and turned on. The switch control unit 41 turns the switches of the switch units 42a and 42b on and off sequentially in time series, and whether a current flows at each moment is determined by the detection unit 43b from the output of the ammeter 43a. If a current is detected, it can be determined which switch, and thus which location, is being pressed.

For example, when a finger or a device placed at the position of the switch P1 turns that switch on, a current is detected by the ammeter 43a at the moment the switch S1 of the switch unit 42a and the switch SA of the switch unit 42b are both turned on.

Here, the explanation has been simplified to twelve switches P1 to P12, but in practice a very large number of such switches would be laid out.
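
The time-sequential scan of FIG. 5 can be sketched as below. Closing row switch r of the unit 42a and column switch c of the unit 42b and checking the ammeter 43a is modeled by a current_flows callback; the grid size and the callback itself are illustrative assumptions.

```python
# Sketch of the matrix scan for FIG. 5 (assumptions noted above).
def scan_matrix(rows, cols, current_flows):
    """Return every (row, col) crossing where a current was detected."""
    pressed = []
    for r in range(rows):               # switch unit 42a: S1, S2, ...
        for c in range(cols):           # switch unit 42b: SA, SB, ...
            if current_flows(r, c):     # detection unit 43b reads ammeter 43a
                pressed.append((r, c))
    return pressed

# Example: a finger on electrode P1 closes only the crossing (0, 0).
print(scan_matrix(3, 4, lambda r, c: (r, c) == (0, 0)))  # -> [(0, 0)]
```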

  Further, the display device unit 11 described above may be configured as shown in FIG. 6.

  That is, in the display unit 11′, a panel 16′ whose reflectance changes when pressed by a finger is disposed on the screen 15, and the mirror that guides the light of the projector 13 to the screen 15 is a half mirror 14a or a wavelength-selective mirror. The change in shadow caused by the change in reflectance is detected by the image sensor 46 via the light receiving lens 45, and the position of the finger is thereby detected.

  Of course, the display device unit 11 according to the first embodiment described above may also be configured by combining the touch panel shown in FIG. 5 with the display device unit 11′ shown in FIG. 6.

  By using such a touch panel screen, the position of a device or a finger can be determined. Whether it is a device or a finger is determined from the size and shape of the pressed area or from the result of communication by the communication unit 25 described above.

  That is, in the touch panel example shown in FIG. 5, a finger presses only a small number of switches and the pressed position changes with time, for example during a drag. When the camera 31 is placed, on the other hand, a large number of switches are pressed, there is no change with time, and predetermined communication with the communication unit 25 is established.
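
A sketch of this discrimination follows. The thresholds (4 and 8 switches) are invented for illustration; the text only states that a finger presses few switches and moves over time, while a device presses many, stays still, and answers the communication handshake.

```python
# Sketch of finger-versus-device discrimination (thresholds are assumed).
def classify_press(pressed_count: int, moved_recently: bool,
                   handshake_ok: bool) -> str:
    if handshake_ok and pressed_count > 8 and not moved_recently:
        return "device"    # e.g. camera 31 resting on the panel
    if pressed_count <= 4:
        return "finger"    # small contact area, possibly dragging
    return "unknown"
```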

(Second Embodiment)
Next, a second embodiment of the present invention will be described.

  In the first embodiment described above, an example in which one device (camera) is placed on the screen has been described. However, in the second embodiment, a plurality of devices are placed on the screen. An example of the case will be described.

  FIG. 7 is a diagram illustrating a configuration of a digital platform apparatus according to the second embodiment of the present invention.

  In the embodiment described below, the configuration and basic operation of the digital platform device are the same as those of the first embodiment shown in FIGS. 1 to 6. The same parts are therefore denoted by the same reference numerals, their illustration and description are omitted, and only the differing configurations and operations are described.

  For example, consider a case where a camera 31 in which captured image data are recorded and a printer 35 are placed on the screen 15. In order to obtain the correct image print from the printer 35, the images in the camera 31, the transmission device, are displayed on the screen 15 as images 33, as shown in FIG. 7. Among them, only the selected image 33a is displayed in a form showing the data entering the printer 35, the reception device, so that printing of an image different from the photograph the user intended can be prevented.

  In order to realize such an embodiment, in the second embodiment the camera 31 and the printer 35 communicate wirelessly with the communication unit 25 of the table type screen device, and a device type determination unit 24a for determining the type of device is provided between the communication unit 25 and the control unit 20.

  This can be easily realized by allowing each device to transmit a signal indicating its function and allowing the position detector 23 to detect the position of each device.

  For example, when the camera 31 is placed at the position shown in FIG. 7, the images 33 are displayed in its vicinity. If the user drags one of the images 33, it is displayed as moving with the finger, like the image 33a. If the display control unit 21 determines that the moving image has come close to the printer 35, it is judged that the user wants to print that image on the printer 35. The control unit 20 then transmits the image data to the printer 35 via the display control unit 21, and the printer 35 prints out the image corresponding to the transmitted data.

  In FIG. 7, only the image 33a is shown, but it goes without saying that an image obtained from the camera 31 can also be displayed near the printer 35.

  As described above, in order to print out the photographic image selected by the user without fail, the control unit 20 must always know which image data is displayed at which position.

  FIG. 8 is a flowchart illustrating an image data display operation performed by the digital platform device according to the second embodiment. Hereinafter, a method of grasping the image display position will be described with reference to this flowchart.

First, in step S21, the subroutine "camera position detection PC" is executed, and then, in step S22, the subroutine "printer position detection PP" is executed. Details of the subroutines of steps S21 and S22 will be described later.

Next, in step S23, the image data read from the camera 31 are reproduced at screen positions P1 to P3, a distance L in front of the detected camera position PC; as shown in FIG. 7, three images 33 are arranged there. Furthermore, in the subsequent step S24, further captured images are displayed at positions P4 to P6, a further distance L in front. The distance L corresponds to the vertical size of an image.
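
A minimal sketch of this layout rule follows, assuming a planar coordinate system in which "in front of" the camera means increasing y; the horizontal spacing `width` and the function name are assumptions, not from the patent.

```python
# Sketch of the layout of steps S23 and S24: rows of three thumbnails a
# distance L (one image height) in front of camera position PC, then 2L.
def thumbnail_positions(pc_x, pc_y, l, width, rows=2):
    positions = []
    for row in range(rows):                  # row 0 -> P1..P3, row 1 -> P4..P6
        y = pc_y + (row + 1) * l             # L, then 2L, in front of the camera
        for col in range(3):                 # three images per row
            positions.append((pc_x + (col - 1) * width, y))
    return positions
```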

  As described above, the control unit 20 first determines the display position in consideration of the camera position.

  Next, in step S25, it is determined whether or not the display position is pressed. If the pressed area roughly matches the size and shape of a fingertip, it is determined that a finger is pressing the display position, and the process proceeds to step S26. The pressed portion is monitored sequentially, and it is determined in step S26 whether or not it has moved.

If the image has not moved in step S26, or if there is no press in step S25, the process returns to step S25. On the other hand, if it is determined in step S25 that there is a press and in step S26 that it has moved, the process proceeds to step S27, and the image displayed at that portion is moved along with the movement. Then, in the subsequent step S28, each time the image moves, it is associated with its shifted display position P7.

Based on this association, it is determined in step S29 whether or not the current position is within a predetermined distance of the printer position PP detected in step S22. If it is not within the predetermined distance range, the process returns to step S25. On the other hand, if the printer position PP is within the predetermined distance, the process proceeds to step S30.

  In step S30, the display position is shifted so as to present an effect in which the image 33a is sucked into the printer 35. In step S31, the digital image data corresponding to the displayed image are input to the printer 35. In step S32, a command to print out the input image is transmitted. As a result, the printer 35 prints the transmitted image data as an image.
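
Steps S25 to S32 can be outlined as below. This is a hedged sketch: the panel, display, and printer objects, their methods, and the proximity `threshold` are assumptions standing in for the means described in the text.

```python
# Sketch of the FIG. 8 drag-to-print flow (steps S25 to S32), under the
# assumptions stated in the lead-in above.
import math

def drag_to_print(panel, display, printer, pp, threshold):
    while True:
        press = panel.get_press()                       # S25: press on an image?
        if press is None or not press.on_image:
            continue
        if not press.moved:                             # S26: has it moved?
            continue
        display.move(press.image, press.position)       # S27/S28: follow, associate
        dist = math.hypot(press.position[0] - pp[0],
                          press.position[1] - pp[1])
        if dist < threshold:                            # S29: near printer PP?
            display.animate_into_printer(press.image)   # S30: "sucked in" effect
            printer.send(press.image.data)              # S31: transmit image data
            printer.print_command()                     # S32: command the printout
            return
```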

  In this way, the user can select a favorite image from the captured image data in the camera as if handling physical photographs, and can reliably print it out from the printer.

  Next, with reference to FIG. 9, a method of detecting each device in the digital platform apparatus according to the second embodiment will be described.

When the camera 31, whose quadrangular bottom surface measures Δxc × Δyc, is placed on the touch panel screen 15 of the table type screen device 10, the touch panel is pressed over the range Δxc × Δyc. For example, with switches of the configuration shown in FIG. 5, the switches under the bottom surface are turned on, so the position of the camera can be detected from the range of turned-on switches according to the flowchart of FIG. 10.

FIG. 10 is a flowchart for explaining the operation of the subroutine "camera position detection PC" in step S21 of the flowchart of FIG. 8.

When this subroutine is entered, first, in step S41, a turned-on portion spanning the range Δxc × Δyc is searched for among the switches P1 to P12. In step S42, it is determined whether or not such a turned-on portion exists. If there is none, the process returns to step S41. On the other hand, if there is such a portion, the process proceeds to step S43.

In step S43, the center-of-gravity coordinates xc and yc of the turned-on switch range are taken as the camera position PC. However, since the object placed in this switch range may be something other than the camera, in the subsequent step S44 it is confirmed whether or not the communication unit 25 has completed an appropriate handshake with the camera 31.

If it is determined that the object is not the camera, the process returns to step S41. On the other hand, if it is determined to be the camera, the process proceeds to step S45, the camera position PC is finalized, and this subroutine is exited.
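
The FIG. 10 subroutine reduces to a centroid plus a handshake check, sketched below. The region is given as switch coordinates; the function and parameter names are assumptions.

```python
# Sketch of the FIG. 10 camera position detection (assumptions noted above).
def detect_camera_position(pressed_region, handshake_with_camera):
    """pressed_region: (x, y) coordinates of the turned-on switches."""
    if not pressed_region:                       # S41/S42: nothing camera-sized yet
        return None
    xc = sum(p[0] for p in pressed_region) / len(pressed_region)   # S43: centroid
    yc = sum(p[1] for p in pressed_region) / len(pressed_region)
    if not handshake_with_camera():              # S44: may be a box, not the camera
        return None
    return (xc, yc)                              # S45: camera position PC confirmed
```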

The printer 35 is provided with leg portions 35a at four places on its bottom surface, as shown in FIG. 9. If its width and depth are Δxp and Δyp, the printer position is determined based on whether or not there are panel pressing portions that satisfy these dimensions.

FIG. 11 is a flowchart for explaining the operation of the subroutine "printer position detection PP" in step S22 of the flowchart of FIG. 8.

  When this subroutine is entered, first, in step S51, it is determined whether or not there are pressing portions corresponding to the four legs 35a. If there are four pressing portions, it is then determined in step S52 whether or not the spacing of the leg portions 35a satisfies a predetermined condition. If there are not four pressing portions in step S51, or if the spacing does not satisfy the condition in step S52, the process returns to step S51.

On the other hand, when there are four pressing portions in step S51 and their spacing satisfies the predetermined condition in step S52, the process proceeds to step S53. In step S53, the coordinates xp and yp of the center of gravity of the quadrangle whose four corners are the legs 35a are detected, and that position is taken as the position of the printer.

However, as in step S44 of the flowchart of FIG. 10, the object at that position may not be a printer, so in step S54 it is determined whether or not communication with the printer is possible. If communication is possible, the object is judged to be the printer, the process proceeds to step S55, and the printer position PP is determined. On the other hand, if it is determined in step S54 that the object is not a printer, the process returns to step S51.
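
A sketch of this subroutine follows: exactly four leg-sized pressing points whose spread matches the printer footprint Δxp × Δyp, the centroid as the printer position, confirmed by communication. The 10 % tolerance is an invented illustration value, as are the names.

```python
# Sketch of the FIG. 11 printer position detection (assumptions noted above).
def detect_printer_position(leg_points, dxp, dyp, can_talk_to_printer, tol=0.1):
    if len(leg_points) != 4:                                   # S51: four legs?
        return None
    xs = [p[0] for p in leg_points]
    ys = [p[1] for p in leg_points]
    width, depth = max(xs) - min(xs), max(ys) - min(ys)
    if abs(width - dxp) > tol * dxp or abs(depth - dyp) > tol * dyp:  # S52
        return None
    xp, yp = sum(xs) / 4, sum(ys) / 4                          # S53: centroid
    if not can_talk_to_printer():                              # S54: really a printer?
        return None
    return (xp, yp)                                            # S55: position PP
```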

  Although a camera and a printer have been described as examples here, data can be exchanged just as reliably by the same combination of position detection and communication when inputting images from a camera into a storage, that is, a large-capacity recording unit, or when projecting and displaying camera data with a projector or the like.

  For example, as shown in FIG. 12, an image stored in the storage 50, the transmission device, can be displayed on the screen by the projector 51, the reception device, and an image 52 can be appropriately magnified by the projector 51. In this case, the image 33 displayed on the screen may be placed in the vicinity of the storage 50, in the vicinity of the projector 51, or near both.

  Further, in the camera position detection and the printer position detection shown in FIGS. 9 to 11, the touch panel function is used, but the following examples are also conceivable in order to distinguish individual devices on the screen.

  That is, as shown in FIG. 13A, a two-dimensional barcode 31a may be provided on the lower surface of the camera 31 or the like. As shown in FIG. 13B, an IC tag 31b may be provided on the lower surface of the camera 31. Further, as shown in FIG. 13C, an infrared communication device 31c may be provided on the lower surface of the camera 31.

  As described above, various application modifications can be considered.

  FIG. 14 is a diagram showing a configuration example of a camera position detection system in a digital platform device of a type that reads a barcode 31a as shown in FIG. 13 (a), for example.

  Here, the screen 15a is constituted by an LCD, and the LCD control circuit 15b can electrically switch the screen 15a between a transmissive state and a light-diffusing state. That is, when reading the barcode on the lower surface of the camera 31, the LCD 15a is made transmissive, and the light from the projector 13 is directed onto the barcode 31a by the half mirror 14a. The reflected light is detected through the light receiving lens 45 by the monitoring image sensor 46.

  FIG. 15 is a flowchart for explaining an operation of barcode detection by the camera position detection system having such a configuration.

  When entering this routine, first, the LCD 15a is made transparent in step S61. In step S62, when uniform light is projected from the projector 13, the light is reflected by the half mirror 14a and guided to the barcode 31a disposed on the lower surface of the camera 31.

  In step S63, the light reflected by the barcode 31a passes through the half mirror 14a and is read by the image sensor 46 through the light receiving lens 45. The position of the camera 31 is thereby detected. Then, in step S64, communication with the camera 31 is performed by the communication unit 25 (see FIG. 1).

  Next, in step S65, the LCD 15a is switched to the light diffusion mode by the LCD control circuit 15b so that the output image of the projector 13 can be displayed. Thereafter, in step S66, the images in the camera 31 are projected from the projector 13 and reproduced in the vicinity of the camera 31.
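
The FIG. 15 sequence can be sketched as follows. All objects stand in for the hardware blocks of FIG. 14; the method names are assumptions, not a real device API.

```python
# Sketch of the FIG. 15 barcode-based detection (steps S61 to S66),
# under the assumptions stated in the lead-in above.
def locate_by_barcode(lcd, projector, sensor, comm, display):
    lcd.set_mode("transparent")                     # S61: LCD 15a transmissive
    projector.emit_uniform_light()                  # S62: via half mirror 14a
    frame = sensor.capture()                        # S63: sensor 46 reads reflection
    cam_pos = frame.find_barcode()                  #      -> camera position
    images = comm.read_images_from_camera()         # S64: communication unit 25
    lcd.set_mode("diffuse")                         # S65: screen can display again
    display.show_thumbnails(images, near=cam_pos)   # S66: reproduce near camera 31
    return cam_pos
```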

  As described above, according to the second embodiment, device information and device position can be determined more accurately by supplementing the radio-wave communication means used for the image data with code detection that identifies the device itself.

  In the first and second embodiments described above, examples have been described in which a device is placed on a table having a display function. However, the present invention is not limited to this.

  For example, as shown in FIG. 16, a flat panel display 10a is placed on a general table 10b. In this state, when the camera 31 or the printer 35 is placed on the table 10b, each device is identified via the communication units 25a and 25b. Then, the image data recorded in the internal memory of the camera 31 are displayed as images 33 on a virtual table 10c on the flat panel display 10a, as shown in the figure.

  Even with this configuration, effects similar to those of the first and second embodiments described above can be obtained. In this example, the object of the present invention is achieved without using a display with a special table configuration.

  According to the embodiments described above, the images in a device can be displayed directly, and when image data are transferred, the transfer process can be shown clearly to the user, so that the data can be transmitted and received reliably and without error.

  Although the embodiments of the present invention have been described above, the present invention is not limited to the above-described embodiments, and various modifications can be made without departing from the scope of the present invention.

FIG. 1 is a block diagram showing the configuration of a digital platform apparatus according to the first embodiment of the present invention.
FIG. 2 is a diagram showing an example of a user operating the digital platform apparatus of FIG. 1.
FIG. 3 is a flowchart explaining the main operation of the digital platform apparatus according to the first embodiment.
FIG. 4 is a diagram showing an example of an album created in the recording area of FIG. 2.
FIG. 5 is a diagram showing a configuration example of the touch panel in the first embodiment.
FIG. 6 is a diagram showing another configuration example of the display device unit in the first embodiment.
FIG. 7 is a diagram showing the configuration of a digital platform apparatus according to the second embodiment of the present invention.
FIG. 8 is a flowchart explaining the image data display operation of the digital platform apparatus of the second embodiment.
FIG. 9 is a diagram explaining the method of detecting each device in the digital platform apparatus according to the second embodiment.
FIG. 10 is a flowchart explaining the operation of the subroutine "camera position detection PC" in step S21 of the flowchart of FIG. 8.
FIG. 11 is a flowchart explaining the operation of the subroutine "printer position detection PP" in step S22 of the flowchart of FIG. 8.
FIG. 12 is a diagram showing a modification of the second embodiment of the present invention, in which an image stored in the storage is displayed on the screen by the projector 51.
FIG. 13 shows other examples of position detection for a device placed on the screen: (a) a two-dimensional barcode provided on the lower surface of the camera, (b) an IC tag provided on the lower surface of the camera, and (c) an infrared communication device provided on the lower surface of the camera.
FIG. 14 is a diagram showing a configuration example of a camera position detection system in a digital platform device of the type that reads a barcode as shown in FIG. 13(a).
FIG. 15 is a flowchart explaining the barcode detection operation of the camera position detection system configured as shown in FIG. 14.
FIG. 16 is a diagram showing a further modification of the first and second embodiments of the present invention, in which a flat panel display is used as a virtual table.

Explanation of symbols

  10 ... Table type screen device, 11 ... Display device unit, 13 ... Projector, 14 ... Mirror, 14a ... Half mirror, 15 ... Screen, 16 ... Touch panel, 16a, 16b ... Triangular switch sections, 20 ... Control unit, 21 ... Display control unit, 23 ... Position detection unit, 24 ... Operation determination unit, 25 ... Communication unit, 26 ... Recording unit, 26a ... Recording area, 26b ... Album, 27 ... Printer, 27a ... Print output discharge port, 30 ... User, 31 ... Camera, 33 ... Image, 35 ... Printer, 41 ... Switch control unit, 42a, 42b ... Switch units, 43a ... Ammeter, 43b ... Detection unit, 45 ... Light receiving lens, 46 ... Image sensor

Claims (6)

  1. A flat surface on which an information device storing image data can be placed;
    Detecting means for detecting the mounting position of the information device;
    Receiving means for receiving image data from the information device;
    Display means for displaying the image data received by the receiving means on the flat portion around the information device position detected by the detecting means;
    A digital platform device comprising:
  2. A flat surface on which an information device storing image data can be placed;
    Detecting means for detecting the mounting position of the information device;
    Receiving means for receiving image data stored in the information device;
    Display means for displaying the image data received by the receiving means on the flat portion around the information device position detected by the detecting means;
    Storage means for storing image data received by the receiving means;
    A digital platform device comprising:
  3. The digital platform apparatus according to claim 2, further comprising selection means for selecting image data,
    wherein the storage means stores the image data selected by the selection means.
  4. The digital platform apparatus according to claim 2 or 3, wherein the display means displays the image data stored in the storage means at a position different from that of the image data received by the receiving means.
  5. A plane portion on which a plurality of information devices storing image data can be placed;
    Detecting means for detecting the mounting position of the information device;
    Receiving means for receiving image data stored in the information device;
    Display means for displaying the image data received by the receiving means on the flat portion around the information device position detected by the detecting means;
    Transmitting means for transmitting the image data received by the receiving means to the other information devices;
    A digital platform device comprising:
  6. The digital platform apparatus according to claim 5, further comprising selection means for selecting image data,
    wherein the transmission means transmits the image data selected by the selection means to the other information device.
JP2008106102A 2008-04-15 2008-04-15 Digital platform device Pending JP2008276219A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008106102A JP2008276219A (en) 2008-04-15 2008-04-15 Digital platform device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2008106102A JP2008276219A (en) 2008-04-15 2008-04-15 Digital platform device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
JP2004371678 Division 2004-12-22

Publications (1)

Publication Number Publication Date
JP2008276219A true JP2008276219A (en) 2008-11-13

Family

ID=40054178

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008106102A Pending JP2008276219A (en) 2008-04-15 2008-04-15 Digital platform device

Country Status (1)

Country Link
JP (1) JP2008276219A (en)

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07175587A (en) * 1993-10-28 1995-07-14 Hitachi Ltd Information processor
JPH1157216A (en) * 1997-08-15 1999-03-02 Sony Corp Game device
JP2001075733A (en) * 1999-08-31 2001-03-23 Nec Corp Information input and output device
JP2001109570A (en) * 1999-10-08 2001-04-20 Sony Corp System and method for inputting and outputting information
JP2001136504A (en) * 1999-11-08 2001-05-18 Sony Corp System and method for information input and output
JP2001175374A (en) * 1999-12-21 2001-06-29 Sony Corp Information input/output system and information input/ output method
JP2002204239A (en) * 2000-10-24 2002-07-19 Sony Corp Device and method for information processing, electronic equipment, information processing system, and recording medium
WO2003021875A1 (en) * 2001-08-28 2003-03-13 Sony Corporation Information processing apparatus and method, and recording medium
JP2004005402A (en) * 2002-01-21 2004-01-08 Mitsubishi Electric Research Laboratories Inc Method and system for visualizing a plurality of images in circular graphical user interface
JP2004012712A (en) * 2002-06-05 2004-01-15 Olympus Corp Table type display unit
JP2004164069A (en) * 2002-11-11 2004-06-10 Nippon Telegr & Teleph Corp <Ntt> Information control system, information control method, program for the same, recording medium recording that program
JP2004262091A (en) * 2003-02-28 2004-09-24 Seiko Epson Corp Printer
JP2004274152A (en) * 2003-03-05 2004-09-30 Canon Inc Pico-net building method
JP2004282189A (en) * 2003-03-13 2004-10-07 Konica Minolta Holdings Inc Image output system, printer, digital camera, display, and image output method
JP2004297509A (en) * 2003-03-27 2004-10-21 Minolta Co Ltd Image forming apparatus

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2237139A2 (en) * 2009-04-01 2010-10-06 Samsung Electronics Co., Ltd. Method for providing GUI and multimedia device using the same
EP2237139A3 (en) * 2009-04-01 2013-05-15 Samsung Electronics Co., Ltd. Method for providing GUI and multimedia device using the same
JP2010273253A (en) * 2009-05-25 2010-12-02 Canon Inc Display device, communication device and system
JP2010277552A (en) * 2009-06-01 2010-12-09 Canon Inc Output controller, control method therefor, and program
JP2011238064A (en) * 2010-05-11 2011-11-24 Canon Inc Display control apparatus and display control method
JP2011253025A (en) * 2010-06-02 2011-12-15 Canon Inc Display device and system


Legal Events

Code  Title  Description  Effective date

A131  Notification of reasons for refusal  (JAPANESE INTERMEDIATE CODE: A131)  2011-03-29
A521  Written amendment  (JAPANESE INTERMEDIATE CODE: A523)  2011-05-02
A131  Notification of reasons for refusal  (JAPANESE INTERMEDIATE CODE: A131)  2011-11-15
A521  Written amendment  (JAPANESE INTERMEDIATE CODE: A523)  2012-01-16
A02   Decision of refusal  (JAPANESE INTERMEDIATE CODE: A02)  2012-03-13