JP2010068272A - Portable device, display, and interdevice communication system - Google Patents

Portable device, display, and interdevice communication system

Info

Publication number
JP2010068272A
Authority
JP
Japan
Prior art keywords
communication
unit
external device
portable device
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2008232993A
Other languages
Japanese (ja)
Inventor
Masao Shimao
雅夫 島尾
Original Assignee
Olympus Imaging Corp
オリンパスイメージング株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Imaging Corp (オリンパスイメージング株式会社)
Priority to JP2008232993A
Publication of JP2010068272A
Application status: Pending

Abstract

[PROBLEMS] To provide a portable device, a display device, and an inter-device communication system that can exchange images reliably and easily with a simple operation on an external device, while giving the user a tangible sense of the communication status and of the data moving.
A portable device 10 that communicates data in close proximity to an external device includes a communication unit 12 for communicating with the external device 20, a posture determination unit 7b that detects the posture state of the portable device 10, and an electromagnet 5b that controls the distance between the portable device 10 and the external device 20 according to the communication state of the communication unit 12 or the posture state when a captured image is transmitted to the external device 20.
[Selection] Figure 2

Description

  The present invention relates to a portable device, a display device, and an inter-device communication system, and more particularly to a portable device, a display device, and an inter-device communication system capable of transmitting data such as images by non-contact proximity communication.

  In recent years, personal computers and large-screen televisions have become widespread, and it is now common to reproduce and display camera images on these devices. Product categories such as the digital photo frame, which reproduces and displays images recorded on a memory card or the like on a liquid crystal display as if they were framed photographs, are also known. In this way, the reproduction and display of captured photographs has become part of everyday life.

  However, when image data is transmitted from an imaging device such as a camera to these external devices, exchanging the data via a cable connection or a memory card is troublesome. It is therefore desirable to transfer images easily by wireless communication, infrared communication, or the like. For image transfer, there is a demand for a method that is easy to operate and that lets the user easily confirm that the image has been transferred reliably.

For example, Patent Documents 1 and 2 disclose mobile communication terminals that vibrate a vibrator to notify the user whether or not a mail has been transmitted normally. Patent Document 3 discloses a recording medium that controls the reading and writing of data: a non-contact portable recording medium capable of reading and writing data, in which a non-contact switch is provided to prohibit the writing of stored data, in the same manner as in a contact-type recording medium.
[Patent Document 1] JP 2000-270355 A
[Patent Document 2] JP 2002-185584 A
[Patent Document 3] JP 2007-214045 A

  The mobile communication terminals disclosed in Patent Documents 1 and 2 notify the end of communication or an abnormality by vibration, but give no notification during communication, so the user gets no feel for the communication operation. The recording medium disclosed in Patent Document 3 merely performs positioning and does not make the communication itself feel intuitive.

  With wireless communication, infrared communication, and the like, the communication itself cannot be perceived and there is no tactile feedback, which makes it difficult for the user to understand. Transmitting reliably through precise operation is important, but if the user could transmit while actually enjoying the sense of the data moving, communication could be performed comfortably and without a feeling of difficulty.

  In view of the above, the present invention provides a portable device, a display device, and an inter-device communication system that deliver images reliably with a simple operation on an external device, in a way that is easy to understand and lets the user enjoy a real sense of the communication status and of the data moving. The invention also provides a portable device, a display device, and an inter-device communication system that can prevent the devices from colliding with each other because they come too close. The invention further provides a portable device, a display device, and an inter-device communication system that allow the user to find, by touch, a position where communication can be performed reliably, without having to look at the communication position.

  In order to achieve the above object, a portable device according to a first aspect of the present invention is a portable device that communicates data in close proximity to an external device, comprising: a communication unit for communicating with the external device; a detection unit for detecting the posture state of the portable device; and an electromagnetic drive unit that controls the distance between the portable device and the external device according to the communication state of the communication unit or the posture state when a captured image is transmitted to the external device.

In the portable device according to a second aspect, in the first aspect, when the detection unit determines that the posture is tilted, an attractive force toward the external device is applied by the electromagnetic drive unit.
In the portable device according to a third aspect, in the first aspect, the communication state is obtained by detecting the intensity of radio waves or infrared rays, and the distance between the portable device and the external device is controlled based on the detected intensity.

  A portable device according to a fourth aspect of the present invention is a portable device that performs data communication in close proximity to an external device, comprising: a communication unit that communicates with the external device; a communication state detection unit that detects the communication state of the communication unit; a vibration unit that vibrates the portable device; and a vibration control unit that, when a captured image is transmitted to the external device, causes the vibration unit to vibrate at a position where the positional relationship between the communication units of the portable device and the external device is appropriate, based on the output of the communication state detection unit.

In the portable device according to a fifth aspect, in the fourth aspect, the communication unit performs the data communication by radio waves or infrared rays, the communication state detection unit determines changes in the intensity of the radio waves or infrared rays, and the vibration control unit determines the positional relationship based on the determination result of the communication state detection unit.
In the portable device according to a sixth aspect, in the fourth aspect, the vibration control unit strengthens the vibration of the vibration unit within the range where the positional relationship between the communication units of the portable device and the external device is appropriate, and weakens the vibration outside that range.
In the portable device according to a seventh aspect, in the fourth aspect, the vibration control unit acquires audio information from the external device and vibrates the vibration unit in a pattern based on the acquired audio information.

  A display device according to an eighth aspect of the present invention is a display device that performs data communication in close proximity to a portable device, comprising: a communication unit for communicating with the portable device; an electromagnet that attracts or repels a magnet unit or electromagnet unit disposed in the portable device; and an electromagnetic drive unit that controls the distance between the portable device and the external device according to the communication state of the communication unit or the posture state when the data is transmitted from the portable device.

  An inter-device communication system according to a ninth aspect of the present invention is an inter-device communication system in which data communication is performed in close proximity to an external device, comprising: a communication determination unit that determines the end of the data communication; an electromagnetic unit that repels in the direction that separates the devices from the external device; and a control unit that, when the communication determination unit determines that the communication has ended, ends the proximity communication after separating the devices by means of the electromagnetic unit.

  According to the present invention, it is possible to provide a portable device, a display device, and an inter-device communication system capable of delivering images reliably with a simple operation on an external device, while letting the user enjoy a real sense of the communication state and of the data moving. According to the present invention, it is also possible to provide a portable device, a display device, and an inter-device communication system that can prevent the devices from colliding because they come too close. Furthermore, in situations where a collision could occur, it is possible to provide a portable device, a display device, and an inter-device communication system that allow the optimum communication position to be found, for example by sliding the device.

  A preferred embodiment will be described below with reference to the drawings, using a system that includes a camera 10 to which the present invention is applied and an external device 20 such as a personal computer (PC) or a television. FIG. 1 is a block diagram showing the configuration of the digital camera 10 and the external device 20 according to the first embodiment of the present invention.

  The image processing and control unit 1 includes a signal processing LSI dedicated to the camera 10 and controls the entire camera 10 and performs image processing of image data output from the imaging unit 2. The imaging unit 2 includes a photographing lens 2a (see FIG. 2A), an imaging element that converts a subject image formed by the photographing lens 2a into image data, and the like.

  The recording unit 4 records the image data acquired by the imaging unit 2 after the image processing and control unit 1 has applied image compression processing to it. In addition to the image data, the recording unit 4 records shooting conditions such as the shooting date and time, and also records protocol information used in the inter-device communication described later, information on the partner device, and the like.

  The change determination unit 3 determines camera shake and the like while the camera 10 is held, based on the image data acquired by the imaging unit 2. Similarly, the motion determination unit 7 determines camera shake based on the outputs of an angular velocity sensor and an acceleration sensor. The posture determination unit 7b includes a tilt sensor and the like, and determines the posture of the camera 10 based on the detection output of the tilt sensor.

  The operation unit 6 includes operation members such as a release button and a menu button. When the user operates the operation unit 6, the image processing and control unit 1 outputs a control signal to each unit according to the operation. The display unit 8 performs live view display, in which the subject image is displayed as a moving image in real time for framing based on the image data from the imaging unit 2, and playback display, in which image data recorded in the recording unit 4 is read out and reproduced.

  The clock unit 9 has a clock function and outputs shooting date and time information. When image data is recorded in the recording unit 4, this shooting date and time information is recorded together with it.

  The communication unit 12 includes an antenna and a transmission / reception unit, and communicates image data and the like with an external device 20 such as a PC or a television. Communication is performed by non-contact proximity wireless communication or infrared communication. As will be described later, the external device 20 performs reproduction display and recording of the captured image based on the image data received from the camera 10.

  The activation unit 12b applies IC card technology: when radio waves are received from the counterpart device, it turns on the power of the camera 10 using the energy generated by electromagnetic induction. When the activation unit 12b detects that the external device 20 has come close, it may activate the image processing and control unit 1 to control the communication flow.

  The strength determination unit 12c determines the strength of the communication radio waves or communication infrared rays during communication by the communication unit 12. As will be described later, in the present embodiment, the strength of the communication radio waves is optimized based on the determination result of the strength determination unit 12c, and control is performed so that the camera 10 and the external device 20 do not come so close that they could be damaged.

  The vibrator 5a generates vibration and shakes the camera 10 to notify the user of the communication state and the like. The electromagnet 5b generates an electromagnetic force when energized. As will be described later, a magnet unit 27 is disposed in the external device 20, and the electromagnet 5b generates an electromagnetic force that repels or attracts the magnetic force of the magnet unit 27.

  An image processing and control unit 21 is disposed in the external device 20. The image processing and control unit 21 includes a signal processing LSI dedicated to the external device 20 and the like; it controls the entire external device 20 and performs image processing on image data transmitted from the camera 10, received by broadcast, or obtained via a network.

  The communication unit 22 includes an antenna and a transmission/reception unit, and communicates with the communication unit 12 of the camera 10. As described above, this communication is performed by proximity wireless communication, infrared communication, or the like. The display/playback unit 23 reproduces and displays images such as television broadcasts received by the broadcast or network unit 26 described later, and images transmitted from the camera 10.

  The display priority unit 24 selects the image to be displayed with priority on the display/playback unit 23. That is, in this embodiment, when a captured image from the camera 10 and an image from the broadcast or network unit 26 compete, it selects which image is to be displayed preferentially. Only one of them may be displayed, or a parent/child screen layout may be used, with the priority image on the parent screen and the non-priority image on the child screen. For example, when an image is transmitted from the camera 10 while a television broadcast is being viewed, the captured image from the camera 10 can be displayed as a sub-image on the child screen without interrupting the television broadcast.
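
  As a rough illustration of the kind of selection the display priority unit 24 might make, the following Python sketch shows one possible priority decision. The names (Source, Layout, choose_layout) and the simple two-source model are assumptions for illustration only and do not appear in the patent.

```python
# Illustrative sketch of a display-priority decision (assumed names and model).
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class Source(Enum):
    CAMERA = auto()      # captured image arriving from the camera 10
    BROADCAST = auto()   # TV broadcast / network image from unit 26

@dataclass
class Layout:
    main: Source             # image shown on the parent (main) screen
    sub: Optional[Source]    # image shown on the child (sub) screen, if any

def choose_layout(priority: Source, camera_active: bool, broadcast_active: bool) -> Layout:
    """Pick parent/child screens when the two image sources compete."""
    if camera_active and broadcast_active:
        other = Source.BROADCAST if priority is Source.CAMERA else Source.CAMERA
        return Layout(main=priority, sub=other)   # picture-in-picture
    if camera_active:
        return Layout(main=Source.CAMERA, sub=None)
    return Layout(main=Source.BROADCAST, sub=None)

# Example: a photo arrives while a broadcast is playing and the broadcast keeps priority.
print(choose_layout(Source.BROADCAST, camera_active=True, broadcast_active=True))
```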

  The operation unit 25 includes operation members such as a remote controller, a keyboard, and a mouse. When the user operates the operation unit 25, the image processing and control unit 21 outputs a control signal to each unit according to the operation. For example, an image or sound designated by operating the operation unit 25 may be played back or stopped, or the sub-image display may be switched to an enlarged display.

  The broadcast or network unit 26 receives images through television broadcasts via a TV antenna or the like, or through a network such as the Internet, and outputs them to the image processing and control unit 21.

  The magnet unit 27 is composed of a permanent magnet disposed near the surface of the external device 20. The polarity of either the S pole or the N pole of the magnet unit 27 is directed to the surface of the external device 20.

  Next, the operations of the camera 10 and the external device 20 in the first embodiment of the present invention will be described with reference to FIGS. 2 and 3. When an image captured and recorded by the camera 10 is to be recorded and reproduced by an external device 20 such as a PC or a television, the image is transmitted from the communication unit 12 by non-contact wireless technology or the like. The image can then be received by the communication unit 22 of the external device 20, reproduced and displayed by the display/playback unit 23, and recorded in the recording unit 28.

  Sending and receiving such images generally involves troublesome settings for the communication destination and the images to be communicated. Especially with wireless or infrared communication, the user cannot judge intuitively where the connection is made and where the communication is taking place. For this reason, many users find communication difficult.

  Therefore, in this embodiment, when the user actually brings the camera 10 close to the external device 20 and performs non-contact proximity wireless communication, the exchange of data is made understandable through the sense of touch, so that it is clear where the connection is being made. Specifically, an attractive or repulsive force is used between the electromagnet 5b built into the camera 10 and the magnet unit 27 in the nearby external device 20.

  When a device such as the camera 10 is held close to the counterpart device (the external device 20) for communication, the tactile feedback to the hand changes as the devices approach, and the state of the data communication can be understood intuitively. The user can therefore perform transmission and reception operations while having fun. Depending on the situation, the posture determination unit 7b in the camera 10 may also determine the posture of the camera 10, and the control may change the tactile feedback accordingly.

  As shown in FIG. 2B, the communication unit 12 including the antenna is provided on the same surface of the camera 10 as the photographing lens 2a. When performing proximity communication with the external device 20 (a PC in the illustrated example), communication is performed with the communication unit 22, provided beside the keyboard of the external device 20, and the photographing lens 2a side of the camera brought into close contact with each other.

  However, if the user tries to transmit the image data while holding the camera 10 in the hand close to the external device 20, the hand may waver, as shown in FIG. 3. Therefore, when the camera 10 detects this situation, the electromagnet 5b generates an attractive force toward the magnet unit 27. As a result, the communication unit 12 of the camera 10 and the communication unit 22 of the external device 20 face each other in parallel, and the communication can be stabilized.

  For example, in FIG. 2A, the keyboard part of the personal computer serving as the external device 20 is horizontal, and the communication unit 22 arranged near the keyboard does not face the communication unit 12 of the camera 10 in parallel. In this situation, the image processing and control unit 1 levels the camera 10 based on the detection results of the tilt sensor of the posture determination unit 7b and the motion determination unit 7 so that proper communication becomes possible. To this end, the image processing and control unit 1 drives and controls the electromagnet 5b based on those detection results so that an attractive force 41 acts on the magnet unit 27. As a result, as shown in FIG. 2B, the communication units 12 and 22 of the camera 10 and the external device 20 face each other, and more stable communication can be performed.

  In addition, the camera 10 may come too close to the external device 20 and be damaged by a collision. Therefore, in the present embodiment, when the camera 10 comes too close to the external device 20, a repulsive force is made to act between the camera 10 and the external device, as shown in FIG. 2C. That is, when they come too close, the electromagnet 5b is driven and controlled so that a repulsive force 42 acts on the magnet unit 27.

  In addition, when wireless communication or the like is performed between the camera 10 and the external device 20, the movement of the data is expressed in a simulated manner so that the user can recognize the communication state. Conventionally, when the camera 10 is brought close to the external device 20 for data communication, the user has little sense of whether the data is actually being transmitted, and in some cases feels uneasy about whether the communication is working. Therefore, in the present embodiment, the user is notified that communication is in progress by vibration from the vibrator 5a.

  For example, in the example shown in FIG. 3, when an image is transmitted, the display unit 8 of the camera 10 displays the image moving, and the vibrator 5a may be vibrated in synchronization with this display so that it feels as if something were moving out of the camera 10.

  In other words, with digital data such as images there is nothing tangible to feel even during communication, whereas when handing over paper on which a photograph is printed, a sheet with actual weight moves, and there is some tactile change as the paper moves. It would be easiest to understand if there were a tactile sensation of the paper actually sliding out; in the present embodiment, this is simulated by vibration.

  Further, the electromagnet 5b may be controlled to attract or repel in order to give the feeling that the camera 10 becomes lighter as it releases a photograph. In this way, in the present embodiment, the exchange of digital data, which cannot otherwise be felt, is given a physical tactile sensation, so that it is accompanied by a real feeling.

  Next, the operation of the system including the camera 10 and the external device 20 according to the first embodiment of the present invention will be described using the flowcharts shown in FIGS. 4 to 8.

  First, the flow of data communication will be described with reference to FIG. 4. In this flow, data such as image data is transmitted from the camera 10 to the external device 20. When this flow is entered, it is first determined whether or not communication with the external device 20 is possible (S1). In this step, it is determined whether communication between the camera 10 and the external device 20 is possible; in other words, whether radio waves from the external device 20 are detected, or whether the initial exchange with the external device 20 has been completed and communication has been established.

  If the result of the determination in step S1 is that communication is not possible, the electromagnet 5b is turned off (S11), the vibrator 5a is turned off (S12), and the process returns to step S1. That is, in steps S11 and S12, since communication is not possible, the electromagnet 5b and the vibrator 5a, which tactilely notify the user that communication is in progress, are turned off.

  If the result of the determination in step S1 is that communication is possible, electromagnet control is performed (S2). That is, as described with reference to FIGS. 2A to 2C, the electromagnet 5b is controlled so that the communication units 12 and 22 of the camera 10 and the external device 20 are attracted into positions facing each other, and so that the two devices repel each other if they come too close. This keeps the communication state optimal. The detailed operation of this electromagnet control will be described later with reference to FIG. 5.

  Subsequently, vibrator control is performed (S3). In this step, the vibrator 5a is vibrated to inform the user that data communication is in progress. The detailed operation of the vibrator control will be described later with reference to FIG. 6. Once the vibrator control has been performed, image communication is carried out (S4). Here, the image data is transmitted from the communication unit 12 of the camera 10 to the communication unit 22 of the external device 20 by non-contact wireless communication or the like.

  Once image communication has been carried out, it is next determined whether or not the image communication has ended (S5). The image communication is performed by packet communication, and the total communication volume is transmitted at the beginning of the communication. When the external device 20 has received all of the image data, it returns a response signal to the camera 10. Therefore, in this step, it is determined whether or not the response signal has been received.

  If the result of the determination in step S5 is that no response signal has been received and the image communication has not ended, it is next determined whether or not the communication is NG (S8); that is, whether a response signal has still not been received after waiting for a predetermined time. If the result of the determination in step S8 is not NG, the process returns to step S1 and the operation described above is executed.

  On the other hand, if the result of the determination in step S8 is NG, NG processing is performed (S9). That is, if no response signal is returned from the external device 20 even after waiting for the predetermined time, the communication is judged to be NG and a warning is given to the user. When this is finished, the process returns to step S1.

  If the result of the determination in step S5 is that the image communication has ended, the vibration of the vibrator 5a is stopped (S6). As described above, the vibrator 5a is vibrated only during data communication to create the feeling of transmitting data, and its vibration is stopped when the data transmission ends.

  Next, the electromagnet 5b is made to repel for a moment (S7). In this step, the data communication has been completed, and a repulsive force 42 is generated by the electromagnet 5b in order to give a feeling of lightness. Once the electromagnet has repelled for a moment, the flow returns to the camera control flow described later (see FIG. 7).
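
  The transmit-side sequence just described can be summarized in code form. The following Python sketch only illustrates the order of steps S1 to S12; the camera object and method names such as can_communicate() or send_image_packets() are assumed placeholders and are not part of the patent.

```python
# Hedged sketch of the transmit-side flow of FIG. 4 (steps S1-S12); assumed interface.
import time

def transmit_image(camera, image, timeout_s=10.0):
    deadline = time.monotonic() + timeout_s
    while True:
        if not camera.can_communicate():            # S1
            camera.electromagnet_off()              # S11
            camera.vibrator_off()                   # S12
            if time.monotonic() > deadline:
                return False
            continue
        camera.control_electromagnet()              # S2 (see FIG. 5)
        camera.control_vibrator()                   # S3 (see FIG. 6)
        camera.send_image_packets(image)            # S4
        if camera.response_received():              # S5
            camera.vibrator_off()                   # S6
            camera.electromagnet_repel_pulse()      # S7: brief repulsion at the end
            return True
        if time.monotonic() > deadline:             # S8
            camera.warn_user("communication NG")    # S9
            return False
```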

  Next, the operation of the electromagnet control in step S2 will be described with reference to FIG. 5. In this flow, as shown in FIGS. 2A and 2B, the tilt of the camera 10 is controlled so that the antennas of the communication units 12 and 22 face each other. In addition, as shown in FIG. 3, the position of the camera 10 is stabilized even when the motion determination unit 7 detects that the camera 10 is wavering. That is, in these cases the electromagnet 5b is energized so that an attractive force pulls the camera 10 toward the external device 20, stabilizing the tilt and position of the camera 10.

  Further, to prevent damage when the distance between the camera 10 and the external device 20 becomes too small, the electromagnet 5b is energized so that a repulsive force separates the camera 10 from the external device 20. The distance between the camera 10 and the external device 20 is determined by measuring the intensity of the radio waves or the like with the intensity determination unit 12c and estimating the distance from that intensity.
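
  The patent does not specify how the intensity determination unit 12c converts radio strength into distance; a common assumption is a log-distance path-loss model, sketched below in Python. The reference value at 1 m and the path-loss exponent are purely illustrative.

```python
# Minimal sketch of estimating camera-to-device distance from received signal
# strength (RSSI).  Model and constants are assumptions for illustration only.
def estimate_distance_m(rssi_dbm, rssi_at_1m_dbm=-40.0, path_loss_exponent=2.0):
    """Invert RSSI = RSSI(1 m) - 10*n*log10(d) to obtain d in metres."""
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

print(estimate_distance_m(-46.0))   # about 2 m with the assumed constants
```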

  When the electromagnet control flow is entered, it is first determined whether or not the camera is tilted or unstable (S21). In this step, the motion determination unit 7 or the posture determination unit 7b determines whether the camera 10 is tilted or unstable.

  If the result of the determination in step S21 is that the camera is tilted or unstable, the electromagnet 5b is energized in the attraction direction (S22). This corrects the tilt of the camera 10 and makes the antennas face each other. The camera 10 is also attracted toward the external device 20, which stabilizes its position.

  If the result of the determination in step S21 is that there is no tilt or instability, or once energization in the attraction direction has been performed in step S22, it is next determined whether or not the radio wave intensity is strong (S23). In this step, the intensity determination unit 12c measures the intensity of the radio waves or infrared rays, and the distance between the camera 10 and the external device 20 is estimated.

  If the result of the determination in step S23 is that the intensity of the radio waves or the like is higher than a predetermined value, the electromagnet 5b is energized in the repulsion direction (S24). This prevents the camera 10 from coming too close to the external device 20 and being damaged. On the other hand, if the result of the determination in step S23 is that the intensity of the radio waves or the like is weaker than the predetermined value, energization of the electromagnet 5b is stopped (S31). This is because the camera 10 and the external device 20 are not too close.

  When the repulsion-direction energization in step S24 or the energization stop in step S31 has been performed, the process returns to the original flow.
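
  A condensed reading of the FIG. 5 decision (steps S21 to S31) might look like the Python sketch below. The threshold value and the precedence given to the proximity guard over posture correction are assumptions; the flowchart itself simply evaluates posture first and radio strength second on every pass.

```python
# Hedged sketch of one pass of the electromagnet control (FIG. 5, S21-S31).
def electromagnet_drive(tilted_or_unstable: bool, rssi_dbm: float,
                        too_close_dbm: float = -35.0) -> str:
    if rssi_dbm > too_close_dbm:     # S23/S24: radio very strong -> devices too close
        return "repel"               # push the devices apart
    if tilted_or_unstable:           # S21/S22: tilt or shake detected
        return "attract"             # pull the camera flat and stable against the device
    return "off"                     # S31: nothing to correct, stop energizing

print(electromagnet_drive(False, -30.0))   # -> "repel"
```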

  Next, the vibrator control in step S3 will be described with reference to FIG. 6. In this flow, the vibrator 5a is vibrated to notify the user so that the optimum distance for non-contact wireless communication or the like is maintained. If the user holds the camera 10 at a position where the vibration is strong, optimal non-contact proximity communication can be performed.

  When the vibrator control flow is entered, a vibrator pattern is first determined (S39). As will be described later, when the external device 20 has received an image, it transmits a response signal (S205), and at that time it also transmits audio information (information about the sound generated in S206). In step S39, the vibrator pattern is determined according to the received audio information. The vibrator pattern is varied by combining the vibration interval, vibration duty, vibration magnitude, frequency, and so on. For example, if the sound has a fast tempo, the vibrator may be set to vibrate quickly, so that the vibration of the vibrator and the sound from the external device 20 are perceptually matched.
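
  The following Python sketch shows one way such a vibrator pattern could be derived from the received audio information. The AudioInfo fields and the tempo-to-interval mapping are illustrative assumptions; the patent only states that interval, duty, magnitude, and frequency are combined so that the vibration roughly matches the sound.

```python
# Hedged sketch of step S39: deriving a vibrator pattern from audio information.
from dataclasses import dataclass

@dataclass
class AudioInfo:
    tempo_bpm: float      # tempo of the notification melody (assumed field)
    loudness: float       # 0.0 .. 1.0 (assumed field)

@dataclass
class VibePattern:
    interval_s: float     # time between vibration bursts
    duty: float           # fraction of each interval spent vibrating
    amplitude: float      # 0.0 .. 1.0 drive strength

def pattern_from_audio(info: AudioInfo) -> VibePattern:
    interval = 60.0 / max(info.tempo_bpm, 1.0)   # one burst per beat
    return VibePattern(interval_s=interval, duty=0.5, amplitude=info.loudness)

print(pattern_from_audio(AudioInfo(tempo_bpm=120.0, loudness=0.8)))
```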

  Once the vibrator pattern has been determined, the detection intensity range is determined next (S40). The camera 10 stores data on how much the radio wave or infrared intensity changes as the camera 10 is brought close to the external device 20, and based on this data it determines the range of detection intensities within which optimal communication can be performed.

  Once the detection intensity range has been determined, it is next determined whether or not the intensity of the radio waves or the like is strong (S41). In this step, it is determined whether the intensity detected by the intensity determination unit 12c is stronger than the detection intensity range read in step S40. If, as a result of the determination, the intensity of the radio waves or the like is too strong, the vibration of the vibrator 5a is weakened (S43). This is because the camera 10 and the external device 20 are too close and could be damaged; by weakening the vibration of the vibrator 5a, the user is prompted by touch to move the camera away from the external device 20, toward the position where the vibration becomes stronger.

  If the result of the determination in step S41 is that the intensity of the radio waves or the like is not stronger than the detection intensity range, it is next determined whether or not the intensity is weaker than the detection intensity range (S42). If, as a result of the determination, the intensity of the radio waves or the like is weak, the process proceeds to step S43 and the vibration is weakened. This lets the user recognize by touch that the current position of the camera 10 is not appropriate and that the camera 10 should be brought closer to the external device 20.

  If the result of the determination in step S42 is that the intensity of the radio waves or the like is not weaker than the detection intensity range, the vibration of the vibrator is strengthened (S44). That is, when the process reaches step S44 via steps S41 and S42, the intensity of the radio waves or the like is within the range where communication can be performed optimally. The strong vibration therefore lets the user recognize that optimal communication is possible.

  When the vibration control of the vibrator in step S43 or step S44 is performed, the flow returns to the original flow.
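
  Steps S40 to S44 amount to a simple range check, sketched below in Python. The RSSI range values and the two-level "strong/weak" output are assumptions for illustration; the patent only states that vibration is strengthened inside the appropriate range and weakened outside it.

```python
# Hedged sketch of the vibrator control of FIG. 6 (steps S40-S44).
def vibrator_strength(rssi_dbm: float,
                      good_range_dbm: tuple = (-55.0, -40.0)) -> str:
    low, high = good_range_dbm          # S40: stored detection-intensity range (assumed values)
    if rssi_dbm > high:                 # S41: too strong -> devices too close
        return "weak"                   # S43
    if rssi_dbm < low:                  # S42: too weak -> devices too far apart
        return "weak"                   # S43
    return "strong"                     # S44: optimal position for communication

for rssi in (-60.0, -47.0, -35.0):
    print(rssi, vibrator_strength(rssi))   # far, optimal, too close
```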

  Thus, in the vibrator control flow, the user can be made aware of the position of the camera 10 at which optimal communication can be performed. That is, when the camera 10 is within the optimal range, the vibration of the vibrator 5a is strong, and when it is outside the optimal range, the vibration is weak. The user therefore only needs to move the camera 10 in the direction in which the vibration becomes stronger and hold it there. Communication becomes easy to understand, and the communication between the camera 10 and the external device 20 can be experienced tactilely through the vibration and the magnetic forces.

  Depending on the external device, there may be designs in which a cushion or the like is provided and the camera is slid over it like a mouse to find the optimum position. In this case, the upper limit of the detection intensity range in step S40 of FIG. 6 can be set large so that the optimum communication position can be found by touch without looking. This is convenient for communication in dark places.

  Next, the camera control flow of the camera 10, which enables the data communication described above, will be described with reference to FIG. 7. This flow is executed by the image processing and control unit 1.

  When the camera control flow is entered, it is first determined whether or not the camera is in shooting mode (S101). The camera 10 operates in either a shooting mode or a playback mode. If the result of this determination is that the shooting mode has been set, live view display is performed (S102). In the live view display, the subject image is displayed on the display unit 8 based on the image data acquired by the imaging unit 2. The photographer observes the subject image and performs framing and the like.

  While live view display is being performed, it is next determined whether or not the release has been operated (S103). In this step, the operation determination unit 6 determines whether the release button has been operated to perform shooting. If the result of this determination is that the release has not been operated, the process returns to step S101 and the operation described above is executed.

  If the result of the determination in step S103 is that the release has been operated, shooting is performed (S104). In the shooting, image data is acquired by the imaging unit 2. In addition to recording the captured image, sound may be collected and recorded.

  Once shooting has been performed, recording is carried out together with the shooting conditions (S105). In this step, the image data acquired by the imaging unit 2 is subjected to image processing such as compression and is then recorded in the recording unit 4. In addition to the image data, shooting conditions such as the shooting date and time are also recorded in the recording unit 4. When the processing of step S105 has been performed, the camera control flow ends and is executed again from step S101.

  If the result of the determination in step S101 is that the camera is not in shooting mode, it is next determined whether or not it is in playback mode (S111). If the result of this determination is that the playback mode has not been set, the process returns to step S101. On the other hand, if the playback mode has been set, the selected image is played back (S112). In this step, for example, an image selected from thumbnail-displayed images is enlarged, reproduced, and displayed on the display unit 8.

  Once selective playback has been carried out, it is next determined whether or not to transmit (S113). If the image selected in step S112 is to be transmitted to an external device 20 such as a television or a PC, a transmission mode could be selected; in the present embodiment, however, simply bringing the camera close to the counterpart device (the external device 20) permits transmission. Whether transmission is possible is determined by the communication units 12 and 22 monitoring the communication state.

  If the result of the determination in step S113 is not transmission, the process returns to step S112 and the next image is selected. On the other hand, if the result of the determination in step S113 is transmission, data transmission is performed (S114). The flow of the data transmission is as described with reference to FIG. 4.

  Once data transmission has been carried out, it is next determined whether or not the data communication has ended (S115). That is, as described above, when the external device 20 has received all of the image data, it returns a response signal, and the determination is made based on this response signal. If the result of this determination is that the communication has ended, the camera control flow ends and is executed again from step S101.

  On the other hand, if the result of the determination in step S115 is that the communication has not ended, it is determined whether or not there has been no response (S116). That is, it is determined whether no response signal has been received for a predetermined time. If the result of this determination is that the predetermined time has not yet elapsed, the process returns to step S112 and waits for the end of the data transmission.

  If the result of the determination in step S116 is that there has been no response for the predetermined time, a warning is given that the data communication has failed (S117). The user then only needs to bring the camera 10 closer to the external device 20 and try the data communication again.
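
  The overall camera control loop of FIG. 7 can be summarized as follows. The Python sketch below uses an assumed camera interface (in_shooting_mode(), transmit(), etc.) purely to show the order of steps S101 to S117; it is not the patent's implementation.

```python
# Hedged sketch of the camera control loop of FIG. 7 (steps S101-S117); assumed interface.
def camera_control_loop(cam):
    while True:
        if cam.in_shooting_mode():                 # S101
            cam.live_view()                        # S102
            if cam.release_pressed():              # S103
                image = cam.shoot()                # S104
                cam.record(image, cam.shooting_conditions())        # S105
        elif cam.in_playback_mode():               # S111
            image = cam.play_selected_image()      # S112
            if cam.near_external_device():         # S113: proximity itself means "send"
                ok = cam.transmit(image)           # S114: the FIG. 4 flow
                if not ok and cam.no_response_timed_out():           # S115/S116
                    cam.warn_user("communication failed")             # S117
```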

  Next, the operation of the external device 20, such as a personal computer (PC) or a television (TV), will be described with reference to the flowchart shown in FIG. 8.

  First, it is determined whether or not a power operation has been performed (S201). Here, in addition to determining whether the power has been turned on by operating the normal power switch (part of the operation unit 25), the power may also be turned on by wireless communication or the like, for example when the camera 10 approaches.

  If the result of the determination in step S201 is that a power operation has been performed, power control is performed next (S202). Here, the power is turned on from a shutdown mode, a standby mode, or the like. Subsequently, it is determined whether or not an instruction has been received from the camera 10 (S203). Since such an instruction is received via the communication unit 22, this step determines whether the instruction has been received from the camera 10.

  If the result of the determination in step S203 is that there is no instruction from the camera 10, normal control is performed (S211). For example, in the case of a PC, instructions from a mouse or a keyboard are accepted; in the case of a television, instructions from a remote controller or the like are accepted. Processing according to these instructions is then performed.

  If the result of the determination in step S203 is that there is an instruction from the camera 10, it is next determined whether or not a photographic image has been received (S204). In this step, it is determined whether or not all of the data transmitted by packet communication has been received. If the result of this determination is that a photographic image has been received, that is, if all of the data has been received, a response signal is transmitted to the camera 10 (S205). When the response signal is transmitted, audio information is also transmitted. That is, in step S206 a sound (melody) notifying that a photographic image has been received is played, and information relating to this sound is transmitted in step S205. Based on this audio information, the vibrator pattern is determined as described above (S39).

  When the response signal has been transmitted, sound generation is performed next (S206). That is, since the communication between the camera 10 and the external device 20 has been completed, the external device 20 notifies the user of this by a sound (melody) or the like. Subsequently, the received image is displayed (S207). Here, the photographic image received from the camera 10 is reproduced and displayed on the display/playback unit 23.

  If the result of the determination in step S204 is that a photographic image has not been received, it is determined whether or not a predetermined time has elapsed (S221). That is, it is determined whether the image has failed to arrive within a predetermined time (for example, about 10 seconds) even though the camera 10 has instructed data transmission. If, as a result of this determination, the predetermined time elapses without an image being received, a no-response signal is transmitted to the camera 10 (S222).

  When step S207, S211, or S222 has been processed, the process returns to step S201 and the operation described above is performed.
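
  For orientation, the external-device side of FIG. 8 can be sketched as below. The Device interface is an assumption, and the 10-second timeout merely follows the "about 10 seconds" example given above; none of the names correspond to an actual API.

```python
# Hedged sketch of the external-device flow of FIG. 8 (steps S201-S222); assumed interface.
def external_device_loop(dev):
    while True:
        if dev.power_operation():                   # S201: power switch or camera approach
            dev.power_on()                          # S202
        if not dev.instruction_from_camera():       # S203
            dev.normal_control()                    # S211: mouse/keyboard/remote handling
            continue
        if dev.photo_received(timeout_s=10.0):      # S204/S221
            dev.send_response_with_audio_info()     # S205: audio info also feeds S39
            dev.play_notification_melody()          # S206
            dev.display_received_image()            # S207
        else:
            dev.send_no_response()                  # S222: nothing arrived within the time
```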

  Next, the structure of the motion determination unit 7 will be described with reference to FIGS. 9 and 10. FIG. 9A shows an acceleration sensor 50 that detects the acceleration applied to the camera 10. The acceleration sensor 50 includes a metal part 52 on the chip surface and a bridging metal part 51.

  When acceleration is applied, the positional relationship between the metal part 51 and the metal part 52 changes, and the acceleration output between the metal parts 51 and 52 changes as shown in FIG. 9B. Integrating the acceleration output gives the speed change shown in FIG. 9C, and integrating it once more gives the amount of movement shown in FIG. 9D. This amount of movement represents how far the camera 10 has moved (movement determination 1).

  FIG. 10A shows the structure of the angular velocity sensor 55. The angular velocity sensor 55 consists of a pair of piezoelectric ceramic elements 56. When the pair of piezoelectric ceramic elements 56 is vibrated and an angular velocity occurs, the resulting Coriolis force deforms the piezoelectric ceramic elements 56 and generates a voltage. Integrating the output voltage gives the rotation amount, as shown in FIG. 10B (movement determination 2).
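
  Numerically, both movement determinations reduce to integration of a sampled sensor output: the acceleration is integrated twice to obtain displacement, and the angular velocity is integrated once to obtain the rotation amount. The Python sketch below illustrates this with rectangular integration and made-up sample data; the sampling rate and values are assumptions.

```python
# Hedged sketch of movement determinations 1 and 2 (FIGS. 9 and 10) by numerical integration.
def integrate(samples, dt):
    total, out = 0.0, []
    for s in samples:
        total += s * dt
        out.append(total)
    return out

dt = 0.01                                  # 100 Hz sampling (assumed)
accel = [0.0, 0.5, 1.0, 0.5, 0.0]          # m/s^2 from the acceleration sensor 50 (assumed data)
velocity = integrate(accel, dt)            # FIG. 9C: speed change
displacement = integrate(velocity, dt)     # FIG. 9D: movement amount (determination 1)

gyro = [0.0, 0.2, 0.2, 0.1]                # rad/s from the angular velocity sensor 55 (assumed data)
rotation = integrate(gyro, dt)             # FIG. 10B: rotation amount (determination 2)
print(displacement[-1], rotation[-1])
```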

  In the camera 10, as shown in FIG. 11, a lateral slide 61 can be detected by movement determination 1, and the forward or backward tilt of the camera 10 can be detected by movement determination 2. It is thus possible to determine how the camera 10 has moved during communication and how the user is handling and moving the camera 10.

  In the above description, the rotation of the camera 10 was taken as an example, but operations such as sliding the camera 10 left or right, shifting it up or down, and tilting it may also be determined.

  If the camera 10 wavers and is unstable, the communication state may deteriorate and communication may not be maintained. In this embodiment, however, the attraction operation of the electromagnet 5b optimizes the posture of the camera 10 and its distance from the external device, so that communication can be maintained.

  Such an unsteady state may be determined from the strength of the radio waves, or it may be determined using the acceleration sensor 50 or the angular velocity sensor 55 of the motion determination unit 7 described above, or using a tilt sensor.

FIGS. 12A and 12B show the structure of the tilt sensor 60. The tilt sensor 60 has a box-shaped exterior as shown in FIG. 12C, and FIGS. 12A and 12B are internal sectional views of it. A light-emitting diode (LED) 61 and photodiodes 62 and 63 are arranged at the corners of the space inside the tilt sensor 60, and a ball 64 can move freely within a moving space 65 inside that space.

  Since the ball 64 moves according to gravity, the position of the ball 64 can be determined by checking whether the light of the LED 61, which illuminates the moving space 65, reaches the photodiode 62 or the photodiode 63; in other words, the tilt can be detected. When the tilt sensor 60 is arranged in the camera 10 as shown in FIG. 12D, the posture of the camera 10 can be determined.
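
  Reading this tilt sensor amounts to checking which photodiode the LED light reaches. The Python sketch below shows the idea; the mapping of photodiodes to tilt directions is an assumption for illustration, since the patent only states that the lit photodiode reveals the ball position and hence the tilt.

```python
# Hedged sketch of interpreting the tilt sensor 60 of FIG. 12 (assumed mapping).
def tilt_from_photodiodes(pd62_lit: bool, pd63_lit: bool) -> str:
    if pd62_lit and not pd63_lit:
        return "tilted toward PD63 side"   # ball 64 shades PD63, light reaches PD62
    if pd63_lit and not pd62_lit:
        return "tilted toward PD62 side"   # ball 64 shades PD62, light reaches PD63
    return "indeterminate"                 # both or neither lit

print(tilt_from_photodiodes(pd62_lit=True, pd63_lit=False))
```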

  As described above, in the first embodiment of the present invention, when data is transmitted from the camera 10 to the external device 20, the electromagnet 5b is controlled so that the communication units are positioned facing each other through the interaction of the electromagnet 5b and the magnet unit 27. The camera is therefore kept at an optimal position and communication can be performed appropriately. Further, since the user holds the camera 10 in the optimal position while feeling the attractive force, the user can enjoy the sense of operation. Furthermore, when communicating in a dark place or in a situation where the user cannot watch the devices, the sense of touch can be relied on, which is convenient.

  In the present embodiment, when the camera 10 comes too close to the external device 20, the camera 10 and the external device 20 are separated by a repulsive force. This prevents the two from approaching too closely and being damaged.

  In the present embodiment, a repulsive force is generated for a moment at the end of communication. For this reason, the user can understand that the communication has ended by touch and can enjoy the operation.

  Further, in the present embodiment, the camera 10 is vibrated during communication, so the user can understand that communication is in progress. In addition, since the vibration stops when the communication is completed, the user can enjoy transmitting the data with the feeling that the data has flowed out of the camera.

  Further, in the present embodiment, the vibration is strengthened at the optimum position for communication and weakened when the camera moves out of the optimum range. The user therefore only needs to adjust the position of the camera 10 so that the vibration is strongest, and can find the optimum position by touch.

  Next, a second embodiment of the present invention will be described with reference to FIG. 13. In the first embodiment of the present invention, the camera 10 is provided with a single electromagnet 5b, which controls attraction and repulsion. In the second embodiment, by contrast, a ring-shaped magnet 13 is disposed in the camera 10 around the communication unit 12, and a plurality of electromagnets 29 are disposed in the external device 20. Since the configuration is otherwise the same as in the first embodiment, detailed description is omitted.

  With such a configuration, finer posture control and more accurate positioning can be performed. Moreover, it is not necessary to place an electromagnet in the portable device such as the camera 10, so battery drain can be avoided.

  To perform accurate alignment, for example when the radio wave condition is not stable, the camera 10 is requested to send a posture signal, and if the camera is tilted, the electromagnets 29 are controlled so as to correct the tilt. In other words, the antennas of the communication units 12 and 22 can be made to face each other correctly by strengthening the electromagnets on the side farther from the camera and driving the electromagnets on the nearer side so that they act in the repulsion direction.
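
  One way the external device 20 might turn the reported tilt into drive currents for the electromagnets 29 is sketched below in Python. The four-magnet layout, the gain, and the sign convention (positive current = attract) are all assumptions for illustration; the patent does not specify the number of electromagnets or a control law.

```python
# Hedged sketch of tilt correction with multiple electromagnets 29 (second embodiment).
def electromagnet_currents(tilt_x: float, tilt_y: float, gain: float = 1.0):
    """Return drive currents for magnets assumed at +x, -x, +y, -y (positive = attract)."""
    return {
        "+x": +gain * tilt_x,   # side farther from the ring magnet 13: attract
        "-x": -gain * tilt_x,   # nearer side: repel, pushing the antenna back to face-on
        "+y": +gain * tilt_y,
        "-y": -gain * tilt_y,
    }

print(electromagnet_currents(tilt_x=0.2, tilt_y=-0.1))
```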

  As described above, in each embodiment of the present invention, the electromagnets 5b and 29 are controlled according to the communication state and the posture state between the camera 10 and the external device 20. It is therefore possible to exchange images reliably and comfortably with a simple operation on the external device, while enjoying the sense of communication and the sense of the data moving.

  In each embodiment of the present invention, the vibration state is changed according to the distance between the camera 10 and the external device 20, so the user can maintain the optimum communication state by checking this vibration state by hand.

  Furthermore, if the distance between the camera 10 and the external device 20 becomes too small, a repulsive force acts on both. This prevents the devices from coming too close and colliding.

  Further, in each embodiment of the present invention, the camera 10 is moved by the vibration unit during communication, so the user can tell that communication is in progress.

  In each embodiment of the present invention, a combination of a permanent magnet and an electromagnet is used, but both may be electromagnets, and their number may be chosen as appropriate. The communication method uses radio waves or infrared rays, but it is not limited to these as long as non-contact proximity communication is used.

  In the embodiments of the present invention, a digital camera is used as the photographing device, but it may be a digital single-lens reflex camera or a compact digital camera, and it may also be a video camera, a movie camera, or, for example, a camera built into a mobile phone or a personal digital assistant (PDA). In any case, the present invention can be applied to any portable device capable of transmitting data such as images to an external device.

  Furthermore, in the embodiments of the present invention, the external device is a television or a personal computer, but the invention is not limited to this and can be applied to any display device that can receive and reproduce data such as images.

  The present invention is not limited to the above-described embodiments as they are; in the implementation stage, the constituent elements can be modified and embodied without departing from the scope of the invention. Various inventions can also be formed by appropriately combining the plurality of constituent elements disclosed in the embodiments. For example, some constituent elements may be deleted from all the constituent elements shown in the embodiments. Furthermore, constituent elements of different embodiments may be combined as appropriate.

FIG. 1 is a block diagram illustrating the circuits of the camera and the external device according to the first embodiment of the present invention.
FIG. 2 is a diagram illustrating the operations of the camera 10 and the external device 20 in the first embodiment of the present invention.
FIG. 3 is a diagram illustrating the operations of the camera 10 and the external device 20 in the first embodiment of the present invention.
FIG. 4 is a flowchart showing the data communication operation of the camera according to the first embodiment of the present invention.
FIG. 5 is a flowchart showing the electromagnet control operation of the camera according to the first embodiment of the present invention.
FIG. 6 is a flowchart showing the vibrator control operation of the camera according to the first embodiment of the present invention.
FIG. 7 is a flowchart showing the camera control operation of the camera according to the first embodiment of the present invention.
FIG. 8 is a flowchart showing the control operation of the external device according to the first embodiment of the present invention.
FIG. 9 is a diagram explaining the acceleration sensor of the camera according to the first embodiment of the present invention.
FIG. 10 is a diagram explaining the angular velocity sensor of the camera according to the first embodiment of the present invention.
FIG. 11 is a diagram explaining the motion determination in the camera according to the first embodiment of the present invention.
FIG. 12 is a diagram explaining the tilt sensor in the camera according to the first embodiment of the present invention.
FIG. 13 is a diagram showing the arrangement relationship of the electromagnets and the magnet in the camera and the external device in the second embodiment of the present invention.

Explanation of symbols

DESCRIPTION OF SYMBOLS 1 ... Image processing and control unit, 2 ... Imaging unit, 2a ... Photographing lens, 3 ... Change determination unit, 4 ... Recording unit, 5a ... Vibrator, 5b ... Electromagnet, 6 ... Operation determination unit, 7 ... Motion determination unit, 7b ... Posture determination unit, 8 ... Display unit, 9 ... Clock unit, 10 ... Camera, 12 ... Communication unit, 12b ... Activation unit, 12c ... Strength determination unit, 13 ... Magnet, 20 ... Television, 21 ... Image processing and control unit, 22 ... Communication unit, 23 ... Display/playback unit, 24 ... Display priority unit, 25 ... Operation unit, 26 ... Broadcast or network unit, 27 ... Magnet unit, 28 ... Recording unit, 29 ... Electromagnet, 41 ... Attractive force, 42 ... Repulsive force, 50 ... Acceleration sensor, 51 ... Metal part, 52 ... Metal part, 55 ... Angular velocity sensor, 56 ... Piezoelectric ceramic element, 60 ... Tilt sensor, 61 ... Light-emitting diode (LED), 62 ... Photodiode (PD), 63 ... Photodiode (PD), 64 ... Ball, 65 ... Moving space

Claims (9)

  1. A portable device that communicates data in close proximity to an external device, the portable device comprising:
    a communication unit for communicating with the external device;
    a detection unit for detecting a posture state of the portable device; and
    an electromagnetic drive unit that controls a distance between the portable device and the external device according to a communication state of the communication unit or the posture state when a captured image is transmitted to the external device.
  2.   The portable device according to claim 1, wherein, when the detection unit determines that the posture is tilted, an attractive force toward the external device is applied by the electromagnetic drive unit.
  3.   The portable device according to claim 1, wherein the communication state is obtained by detecting an intensity of radio waves or infrared rays, and the distance between the portable device and the external device is controlled based on the detected intensity.
  4. A portable device that communicates data in close proximity to an external device, the portable device comprising:
    a communication unit for communicating with the external device;
    a communication state detection unit for detecting a communication state of the communication unit;
    a vibration unit for vibrating the portable device; and
    a vibration control unit that, when a captured image is transmitted to the external device, causes the vibration unit to vibrate at a position where the positional relationship between the communication units of the portable device and the external device is appropriate, based on an output of the communication state detection unit.
  5. The portable device according to claim 4, wherein the communication unit communicates the data by radio waves or infrared rays, the communication state detection unit determines a change in the intensity of the radio waves or the infrared rays, and the vibration control unit determines the positional relationship based on the determination result of the communication state detection unit.
  6.   The portable device according to claim 4, wherein the vibration control unit strengthens the vibration of the vibration unit within a range where the positional relationship between the communication units of the portable device and the external device is appropriate, and weakens the vibration outside that range.
  7.   The portable device according to claim 4, wherein the vibration control unit acquires audio information from the external device and vibrates the vibration unit in a pattern based on the acquired audio information.
  8. A display device that communicates data in close proximity to a portable device, the display device comprising:
    a communication unit for communicating with the portable device;
    an electromagnet that attracts or repels a magnet unit or electromagnet unit disposed in the portable device; and
    an electromagnetic drive unit that controls a distance between the portable device and the external device according to a communication state of the communication unit or a posture state when the data is transmitted from the portable device.
  9. An inter-device communication system that communicates data in close proximity to an external device, the system comprising:
    a communication determination unit that determines the end of the communication of the data;
    an electromagnetic unit that repels the external device in a direction that separates the devices; and
    a control unit that, when the communication determination unit determines the end of the communication, ends the proximity communication after separating the devices by means of the electromagnetic unit.
JP2008232993A 2008-09-11 2008-09-11 Portable device, display, and interdevice communication system Pending JP2010068272A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008232993A JP2010068272A (en) 2008-09-11 2008-09-11 Portable device, display, and interdevice communication system

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2008232993A JP2010068272A (en) 2008-09-11 2008-09-11 Portable device, display, and interdevice communication system
CN201210400770.6A CN102917162B (en) 2008-09-11 2009-09-11 Portable equipment
CN 200910173148 CN101674340B (en) 2008-09-11 2009-09-11 A portable apparatus, a display apparatus and a communication system

Publications (1)

Publication Number Publication Date
JP2010068272A true JP2010068272A (en) 2010-03-25

Family

ID=42021324

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008232993A Pending JP2010068272A (en) 2008-09-11 2008-09-11 Portable device, display, and interdevice communication system

Country Status (2)

Country Link
JP (1) JP2010068272A (en)
CN (2) CN102917162B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011176889A (en) * 2011-06-03 2011-09-08 Toshiba Corp Electronic apparatus and control method
JP2012034112A (en) * 2010-07-29 2012-02-16 Canon Inc Image controller, image control method, and program
JP2014123330A (en) * 2012-12-21 2014-07-03 Nhk Spring Co Ltd Function addition device, and communication system

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08335968A (en) 1995-06-06 1996-12-17 Sony Corp Portable information terminal equipment
JP2005216044A (en) 2004-01-30 2005-08-11 Seiko Precision Inc Non-contact ic card and holder for non-contact ic card
CN101142759A (en) 2004-06-30 2008-03-12 施瑞修德公司 Method and apparatus for configuring a network appliance
JP4127835B2 (en) * 2005-01-28 2008-07-30 株式会社タイトー game system
US8951779B2 (en) * 2005-12-21 2015-02-10 Samsung Electronics Co., Ltd. Bio memory disc and bio memory disc drive apparatus, and assay method using the same
JP5124991B2 (en) 2006-05-30 2013-01-23 ソニー株式会社 Communication system, communication device, communication method, and program
CN101145203A (en) 2006-09-11 2008-03-19 精工爱普生株式会社 Contactless data communication system and contactless IC tag


Also Published As

Publication number Publication date
CN101674340A (en) 2010-03-17
CN102917162B (en) 2016-01-13
CN102917162A (en) 2013-02-06
CN101674340B (en) 2012-12-05
