JP2007121489A - Portable display device - Google Patents


Info

Publication number: JP2007121489A
Application number: JP2005310973A
Authority: JP (Japan)
Prior art keywords: display, unit, movement, distance, housing
Legal status: Granted; Active
Other languages: Japanese (ja)
Other versions: JP4977995B2 (en)
Inventor: Eiji Yamaguchi
Original Assignee: NEC Corp
Application filed by NEC Corp; priority to JP2005310973A
Publication of JP2007121489A; application granted and published as JP4977995B2

Abstract

PROBLEM TO BE SOLVED: To provide a portable display device that does not necessarily require an operation unit separate from the device housing, in which the display contents can be operated accurately by moving the housing, with a natural feeling of operation and without the user being conscious of an interface.

SOLUTION: A movement amount calculation unit 3 uses the measurement result of an acceleration measurement unit 1 and an image captured by an outer camera unit 21 so that they mutually correct each other, calculating the three-dimensional displacement of the device housing more accurately. From this displacement, the movement amount calculation unit 3 calculates the amount and direction of movement of the display position of a map shown on a display unit 5, and a display control unit 4 scrolls the map displayed on the display unit 5 according to that amount and direction to move the display position.

COPYRIGHT: (C)2007, JPO&INPIT

Description

  The present invention relates to a portable display device capable of displaying various information such as a map on a display means.

  Conventionally, when a virtual image that cannot be displayed in its entirety on the screen of a portable display device, such as a large map, is shown one portion at a time, changing the displayed portion requires an operation via operation input means, for example pressing buttons corresponding to up/down/left/right or scrolling with a pointing device.

  However, a portable display device generally has a small display screen and small operation input means, and a mouse on a desk is often unavailable, so operations such as scrolling with small input means risked being cumbersome.

  Here, there is an apparatus in which an acceleration sensor for detecting static acceleration is provided in a housing of a main body, and a cursor on a display screen is moved in accordance with an output value of the acceleration sensor (see, for example, Patent Document 1).

  There is also a device in which an operation unit connected to a video display control device main body is provided with rotation amount detection means for detecting the amount of rotation of the operation unit around each of three-dimensional coordinate axes and movement amount detection means for detecting the amount of movement in two-dimensional directions, and the display of a three-dimensional image is controlled accordingly (see, for example, Patent Document 2).

There is also a device that detects the operator's viewpoint position and performs screen movement processing when the line-of-sight movement speed exceeds a predetermined speed (see, for example, Patent Document 3).
Patent Document 1: JP 2000-56897 A; Patent Document 2: JP 4-330512 A; Patent Document 3: JP 8-22385 A

  However, as in Patent Document 1, when the display contents are controlled only by detecting the acceleration caused by movement of the housing with an acceleration sensor, and the operator operates the device on a train or in a car, acceleration arises from factors other than housing movement, so the operation may not proceed smoothly or may produce results the operator did not intend.

In Patent Document 2, the operator controls the display content by operating an operation unit connected to the video display control device main body that displays an image on a display screen. Providing such operation means separately from the main body and the display screen is undesirable in view of housing size, so application to a portable display device was not considered.
Moreover, since the display content is controlled by the operator operating the operation unit, the operator must be conscious of the operation unit as an interface; pursuing a natural feeling of operation that does not make the interface conscious was not considered.

  Further, Patent Document 3 provides a human interface that moves a cursor by the movement and stopping of the eyeball, which makes the operator conscious of eye movement as an interface. Here again, pursuing a natural feeling of operation without making the interface conscious was not considered.

  The present invention has been made in view of such circumstances. Its object is to provide a portable display device that does not necessarily require an operation unit separate from the device housing, in which the display contents can be operated accurately by moving the housing even in situations where display control by housing movement cannot be performed accurately from acceleration detection alone, and in which the display contents can be operated with a natural feeling without making the user conscious of the interface.

  To achieve this object, a portable display device according to the present invention comprises: acceleration measuring means for measuring the acceleration and direction of movement of the device housing; external information recognition means for recognizing the movement state of entities outside the device; movement amount calculation means for calculating the three-dimensional displacement of the device housing based on the measurement result of the acceleration measuring means and the recognition result of the external information recognition means; and display control means for controlling the display content on the display means based on the three-dimensional displacement calculated by the movement amount calculation means.

  Preferably, the display control means uses, as the display information, information exceeding the range that the display means can show at once, causes the display means to display a part of that information, and moves the displayed portion based on the three-dimensional displacement of the device housing calculated by the movement amount calculation means.

  Preferably, the external information recognition means is distance measuring means for measuring the distance to an entity in a predetermined direction, and the movement amount calculation means calculates the three-dimensional displacement of the device housing based on the measurement result of the acceleration measuring means and the measurement result of the distance measuring means.

  The external information recognition means may be imaging means for capturing an image, and the movement amount calculation means may calculate the three-dimensional displacement of the device housing based on the measurement result of the acceleration measuring means and the image captured by the imaging means.

  The external information recognition means may be distance measuring means for measuring the distance to an entity in a predetermined direction together with imaging means for capturing an image, and the movement amount calculation means may calculate the three-dimensional displacement of the device housing based on the measurement result of the acceleration measuring means, the measurement result of the distance measuring means, and the image captured by the imaging means.

  Preferably, the imaging means includes a display surface direction imaging unit that captures approximately the display surface direction of the display means and an outer direction imaging unit that captures a predetermined direction other than the display surface direction, and the movement amount calculation means calculates the three-dimensional displacement of the device housing based on the measurement result of the acceleration measuring means, the image captured by the display surface direction imaging unit, and the image captured by the outer direction imaging unit.

  Preferably, the device comprises gaze direction recognition means that recognizes a human eyeball in the image captured by the display surface direction imaging unit and calculates the gaze direction, and the display control means moves the displayed portion of the display information only when the gaze direction recognition means recognizes that the gaze is directed at the screen.

  The imaging means may include at least a display surface direction imaging unit that captures approximately the display surface direction of the display means, face recognition means that recognizes a human face in the image captured by the display surface direction imaging unit, and interpersonal distance measuring means that measures the distance from the device housing to the face recognized by the face recognition means; the display control means may then increase the display magnification as the distance measured by the interpersonal distance measuring means becomes shorter and decrease it as the distance becomes longer.

  It is also preferable to provide stop command input means for temporarily stopping the control that moves the displayed portion of the display information; while input from the stop command input means is detected, the display control means temporarily stops that control.

  As described above, according to the present invention, even for a portable display device that does not necessarily require an operation unit separate from the device housing, and even in situations where display control by housing movement cannot be performed accurately from acceleration detection alone, the display contents can be operated accurately by moving the housing, and can be operated with a natural feeling without making the user conscious of the interface.

Next, an embodiment to which a portable display device according to the present invention is applied will be described in detail with reference to the drawings.
First, features common to the embodiments of the present invention will be described.

In the portable display device of each embodiment of the present invention, when the operator moves the device housing while the device is displaying a cut-out portion of a virtual image, the displayed portion changes according to the content of that movement.
As a result, the operator can freely view the entire image by an operation close to one performed in the natural world.

  To realize this function, each embodiment of the present invention obtains additional correction information, such as captured image information from a camera or position information from an ultrasonic sensor, in addition to the movement acceleration information from an acceleration sensor. The movement of the device housing (movement amount and movement direction) can thus be grasped more accurately, and appropriate display control can be performed.
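This correction principle can be illustrated with a minimal sketch, not taken from the patent itself: a displacement obtained by double-integrating accelerometer samples drifts, so it is blended with an independent displacement estimate from an external cue (camera or ultrasonic ranging). All function names and the blending weight here are hypothetical illustrations.

```python
def integrate_acceleration(samples, dt):
    """Double-integrate acceleration samples (m/s^2) into a displacement (m)."""
    velocity = 0.0
    displacement = 0.0
    for a in samples:
        velocity += a * dt
        displacement += velocity * dt
    return displacement

def fuse_displacement(accel_est, external_est, weight=0.5):
    """Blend the accelerometer estimate with an external-cue estimate.

    `weight` is a hypothetical tuning parameter: 0 trusts only the
    accelerometer, 1 trusts only the external cue.
    """
    return (1.0 - weight) * accel_est + weight * external_est

# Example: a constant 1 m/s^2 for 1 s, sampled at 10 Hz, slightly
# overestimates the true 0.5 m displacement; the external cue pulls
# the fused value back toward it.
accel_d = integrate_acceleration([1.0] * 10, 0.1)
fused = fuse_displacement(accel_d, 0.5, weight=0.5)
```

In a real device the weight would depend on how trustworthy each cue is at that moment, which is exactly the role the camera and ultrasonic units play in the embodiments below.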

[First Embodiment]
Next, a portable display device as a first embodiment of the present invention will be described.
As shown in FIG. 1, the portable display device of this embodiment comprises an acceleration measurement unit 1 that measures the acceleration and direction of movement of the device housing as a three-dimensional displacement, an outer camera unit (outer direction imaging unit) 21 that captures images in a direction other than the display surface direction of the display unit 5 (for example, the opposite direction), a movement amount calculation unit 3, a display control unit 4, and a display unit 5.

The acceleration measuring unit 1 includes an acceleration sensor, and measures the magnitude of acceleration, the three-dimensional direction of movement, the rotation angular velocity, the rotation direction, and the like using the acceleration sensor.
When the portable display device is moved, the movement amount calculation unit 3 calculates the three-dimensional displacement of the device housing, such as the movement direction, movement distance, rotation direction, and rotation angle, based on the measurement result of the acceleration measurement unit 1 and the image captured by the outer camera unit 21.
The display control unit 4 controls the display content displayed by the display unit 5 based on the three-dimensional displacement amount thus calculated.

Since the portable display device of this embodiment has the above configuration inside the device housing, the display contents can be operated by moving the device housing; it is thus a "portable display device that does not necessarily require an operation unit separate from the device housing."
Here, "provided inside the device housing" includes, for a device in which a plurality of housings are connected, such as a structure in which two housings are rotatably connected via a hinge, the case where the components are provided inside any one of those housings.

Next, the operation of the portable display device of this embodiment will be described with reference to the flowchart of FIG.
Here, description will be made on the assumption that the operator is going to view the map with the portable display device.

  For example, assume the area around Tokyo Station is displayed on the display unit 5. When the operator wants to view the area toward Ochanomizu, the operator moves the entire device housing to the left (the west side of the map). When the device housing is moved in this way, the acceleration measurement unit 1 detects the acceleration and three-dimensional direction of the movement (step S1).

  Here, in the map display state, the outer camera unit 21 acquires environment information around the device from its captured image, and the movement amount calculation unit 3 calculates the three-dimensional displacement of the device housing, such as the movement direction, movement distance, rotation direction, and rotation angle, based on the measurement result of the acceleration measurement unit 1 and the image captured by the outer camera unit 21 (step S2).

The operation in step S2 will be described in more detail. The image captured by the outer camera unit 21 is not shown on the display unit 5; it is acquired as an internal process for correcting the three-dimensional displacement calculation performed with the acceleration sensor.
Since the outer camera unit 21 faces away from the operator looking at the display unit 5, it captures the environment surrounding the operator of the portable display device.
The movement amount calculation unit 3 takes a predetermined number of sample points in this captured image (for example, five points in total: the four corners and the center) and calculates in which direction each sample point moves. The result serves as auxiliary information for calculating the change in the relative position between the outer camera unit 21 and the surrounding environment, that is, the three-dimensional displacement of the device housing.

  Furthermore, the outer camera unit 21 has an autofocus function; by measuring the distance to each sample point with this function, how the outer camera unit 21 has moved relative to each sample point can be calculated more accurately.

  When the three-dimensional displacement calculation using the image captured by the outer camera unit 21 shows that all sample points behave in the same way (move in the same direction on the image), it can be determined whether the environment has moved (for example, all sample points lay on the side of a bus passing in front of the device) or whether the camera itself has moved within a stationary frame.
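The coherence test on the sample points can be sketched as follows; the function name, the pixel tolerance, and the labels are illustrative assumptions, not terms from the patent:

```python
def classify_flow(displacements, tol=2.0):
    """Classify per-sample-point image displacements (in pixels).

    If all points moved the same way (within `tol`), the motion is one
    coherent shift: either the camera moved in a stationary scene or a
    large object (e.g. a passing bus) filled the view. Otherwise the
    flow field is mixed and unreliable as a correction cue.
    """
    ref_dx, ref_dy = displacements[0]
    for dx, dy in displacements[1:]:
        if abs(dx - ref_dx) > tol or abs(dy - ref_dy) > tol:
            return "mixed"
    return "coherent"

# Four corners plus the center, all shifting roughly 10 px left:
corners_and_center = [(-10, 0), (-10, 1), (-9, 0), (-10, -1), (-11, 0)]
label = classify_flow(corners_and_center)
```

A coherent result still leaves the "environment moved vs. device moved" ambiguity, which is precisely why the patent combines this cue with the acceleration measurement.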

Thus, the movement amount calculation unit 3 uses the measurement result of the acceleration measurement unit 1 and the image captured by the outer camera unit 21 so that they correct each other, calculating the three-dimensional displacement of the device housing more accurately.
From this three-dimensional displacement, the movement amount calculation unit 3 calculates the movement amount and direction of the display position of the map shown on the display unit 5, and the display control unit 4 scrolls the map displayed on the display unit 5 according to that amount and direction to move the display position (step S3).
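The conversion from housing displacement to map scroll in step S3 amounts to shifting the viewport with the housing, like sliding a sheet with a hole over a fixed map. A minimal sketch, where the function name and the pixels-per-metre gain are hypothetical tuning choices:

```python
def viewport_after_move(viewport_xy, housing_disp_m, pixels_per_metre=500.0):
    """Shift the map viewport (map-pixel coordinates) in the same
    direction the housing moved, scaled by a hypothetical gain."""
    x, y = viewport_xy
    dx_m, dy_m = housing_disp_m
    return (x + dx_m * pixels_per_metre, y + dy_m * pixels_per_metre)

# Moving the housing 0.1 m to the left (negative x) slides the
# viewport about 50 px west over the map, e.g. from Tokyo Station
# toward Ochanomizu.
new_vp = viewport_after_move((1000.0, 800.0), (-0.1, 0.0))
```

The gain determines how far the map scrolls per metre of housing motion; too high a value would make the display feel twitchy, too low would force large arm movements.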

As described above, according to the first embodiment of the present invention, the operator can scroll the display contents of the display unit 5 by moving the device housing, according to the amount and direction of that movement.
Thus, when displaying a map for example, the display contents can be operated without being conscious of the interface, with a natural feeling, as if sliding a sheet of paper with a hole in it over the map.

  In this way, even a portable display device with only a relatively small display for portability can have its display contents operated by moving the device housing itself, so the operation can be performed intuitively, mirroring actions the operator performs in the natural world.

[Second Embodiment]
Next, a second embodiment of the present invention will be described.
In the second embodiment, instead of calculating the three-dimensional displacement from the information of the acceleration measurement unit 1 and the outer camera unit 21 as in the first embodiment, the displacement is calculated from the acceleration measurement unit 1 and distance measurement information obtained by transmitting and receiving ultrasonic waves.
Components similar to those in the first embodiment described above are denoted by the same reference numerals, and description thereof is omitted.

  As shown in FIG. 3, the portable display device of the second embodiment comprises an acceleration measurement unit 1, a movement amount calculation unit 3, a display control unit 4, a display unit 5, an ultrasonic transmission unit 61 that transmits ranging ultrasonic waves in a predetermined direction, and an ultrasonic reception unit 62 that receives the transmitted ultrasonic waves after reflection.

  The ultrasonic transmission unit 61 and the ultrasonic reception unit 62 function as distance measuring means that measure the distance to entities in a predetermined direction other than the display surface direction of the display unit 5, for example the distance between the device housing and the walls to its left and right. A known ultrasonic ranging method may be used; as long as a distance in a predetermined direction is measured, various directions and various distance analysis algorithms may be employed.
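The known ranging method the patent alludes to is typically time-of-flight: the echo's round-trip time is halved and multiplied by the speed of sound. A minimal sketch, with illustrative names and the standard speed-of-sound constant:

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C

def echo_distance(round_trip_s, speed=SPEED_OF_SOUND_M_S):
    """Distance to a reflecting surface from an ultrasonic echo's
    round-trip time: the pulse travels out and back, so halve it."""
    return speed * round_trip_s / 2.0

# A round trip of about 5.83 ms corresponds to roughly 1 m.
d = echo_distance(0.00583)
```

Tracking how such wall distances change over time is what lets the movement amount calculation unit 3 observe the housing's lateral motion independently of the accelerometer.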

Next, the operation of the portable display device as the second embodiment will be described.
The operation of the second embodiment differs from that of the first embodiment in that step S2 is performed using not the image captured by the outer camera unit 21 but the distance information measured by the ultrasonic transmission unit 61 and the ultrasonic reception unit 62.

  That is, for example, while a map or the like is displayed, the ultrasonic transmission unit 61 and the ultrasonic reception unit 62 acquire the distances to the device's surroundings as surrounding environment information, and the movement amount calculation unit 3 calculates the three-dimensional displacement of the device housing, such as the movement direction, movement distance, rotation direction, and rotation angle, based on the measurement result of the acceleration measurement unit 1 and the distance measurements by the ultrasonic transmission unit 61 and the ultrasonic reception unit 62.

Three-dimensional displacement calculation using the distance measurements by the ultrasonic transmission unit 61 and the ultrasonic reception unit 62 functions most effectively in an accelerating frame whose surroundings do not change, such as an unmanned bus (a situation where no one stands up, sits down, or walks nearby).
In other words, because the change in the positional relationship to the left and right wall surfaces is used in addition to the acceleration and direction information measured by the acceleration measurement unit 1, the display contents can be controlled according to the operator's intention even in a moving vehicle, without malfunction caused by the acceleration of the vehicle itself.
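The wall-distance check can be sketched as a simple veto on the accelerometer: if the distance to a fixed wall has not changed, the measured acceleration is attributed to the vehicle, not to the operator moving the housing. The function name and tolerance are illustrative assumptions:

```python
def lateral_motion(accel_disp_m, wall_dist_before_m, wall_dist_after_m, tol_m=0.01):
    """Decide the housing's lateral movement toward a wall.

    The accelerometer reports a displacement, but if the distance to a
    fixed wall is unchanged, that acceleration came from the vehicle,
    so the housing is treated as not having moved; otherwise the wall
    distance change itself gives the movement.
    """
    wall_delta = wall_dist_before_m - wall_dist_after_m
    if abs(wall_delta) < tol_m:
        return 0.0          # vehicle accelerated; housing did not move
    return wall_delta       # housing moved toward (+) or away from (-) the wall

# Bus departs: 0.3 m of integrated "displacement", wall distance unchanged.
moved = lateral_motion(0.3, 1.20, 1.20)
```

This is the mechanism by which the map stays put when the bus pulls away, yet still scrolls when the operator deliberately moves the housing toward a wall.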

  As described above, according to the second embodiment of the present invention, as in the first embodiment, the operator can scroll the display contents of the display unit 5 by moving the device housing, according to the amount and direction of movement, without being conscious of the interface.

  Furthermore, since the ultrasonic transmission unit 61 and the ultrasonic reception unit 62 perform ultrasonic ranging, the distance from the device housing to surrounding entities can be recognized even in a dark place. The movement state of entities outside the device can thus be recognized regardless of ambient brightness and independently of the measurement result of the acceleration measurement unit 1, allowing the three-dimensional displacement to be calculated with higher accuracy.

Although the second embodiment has been described as replacing the outer camera unit 21 of the first embodiment with the ultrasonic transmission unit 61 and the ultrasonic reception unit 62, the outer camera unit 21 may also be provided in addition to them, as shown in FIG. 4, and the captured image may be used as a further input to the three-dimensional displacement calculation.
With this configuration, the three-dimensional displacement of the device housing is calculated using the image captured by the outer camera unit 21 and the distance measurements by the ultrasonic transmission unit 61 and the ultrasonic reception unit 62 in addition to the measurement result of the acceleration measurement unit 1, so the display contents can be scrolled more accurately according to the movement of the device housing.

[Third Embodiment]
Next, a third embodiment of the present invention will be described.
In addition to the configurations of the first and second embodiments, the third embodiment further includes an inner camera unit 22 so that display content scrolling according to the movement of the device housing can be performed more accurately and without malfunction.
Components similar to those in the first and second embodiments described above are denoted by the same reference numerals, and description thereof is omitted.

As shown in FIG. 5, the portable display device of the third embodiment comprises an acceleration measurement unit 1, an outer camera unit 21, an inner camera unit (display surface direction imaging unit) 22 that captures approximately the display surface direction of the display unit 5, a movement amount calculation unit 3, a display control unit 4, a display unit 5, and a face recognition unit 7 that recognizes a face in the image captured by the inner camera unit 22.
The inner camera unit 22, facing approximately the display surface direction as described above, functions as an operator direction photographing unit that captures the operator.
A known method may be used as a method for recognizing a face from a captured image.

Next, the operation of the portable display device as the third embodiment will be described.
While the screen display by the display unit 5 is ON, the inner camera unit 22 continuously captures the display surface direction of the display unit 5, and the face recognition unit 7 recognizes a face in the captured image.
When the acceleration sensor detects movement, the three-dimensional displacement of the device housing is calculated using the images captured by the outer camera unit 21 and the inner camera unit 22 in addition to the measurement result of the acceleration measurement unit 1, and the display contents are scrolled according to the movement of the housing.

  Here, when the scene captured by the outer camera unit 21 alone is not a stationary frame, it can be difficult to recognize the camera's movement accurately. The image captured by the inner camera unit 22, however, shows the operator; by tracking how the face recognized by the face recognition unit 7 and the operator's clothing below it move on the image, changes in the relative position between the portable display device and the operator can be recognized more accurately.

In other words, to scroll the display contents accurately according to the movement of the housing, the relative position of the portable display device with respect to the operator must be recognized accurately.
For example, when the operator tries to operate the portable display device while riding a bus and the bus departs, the acceleration measurement unit 1 measures the movement of the portable display device as acceleration. However, since the inner camera unit 22 captures the operator, the movement amount calculation unit 3 recognizes from the captured image that the relative position between the portable display device and the operator has not changed, and the display contents of the display unit 5 are not scrolled.
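The scroll-suppression decision can be sketched as a two-condition gate: scroll only when the accelerometer fires and the operator has shifted in the inner-camera frame. The function name and both thresholds are hypothetical tuning values, not from the patent:

```python
def should_scroll(accel_magnitude, operator_shift_px,
                  accel_thresh=0.2, shift_thresh=3.0):
    """Scroll only when the device moved relative to the operator.

    `operator_shift_px` is how far the recognized face/clothing moved
    in the inner-camera image. When the bus departs, acceleration is
    high but the operator stays put in frame, so scrolling is vetoed.
    """
    return accel_magnitude > accel_thresh and abs(operator_shift_px) > shift_thresh

bus_departs = should_scroll(1.5, 0.0)    # vehicle motion only: no scroll
user_moves = should_scroll(1.5, 25.0)    # housing moved past the operator: scroll
```

The inner camera thus plays the role of an operator-anchored reference frame that the accelerometer alone cannot provide.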

  As described above, according to the third embodiment of the present invention, at least two imaging means are used: peripheral photographing (outer camera unit 21) and operator photographing (inner camera unit 22). The three-dimensional displacement of the device housing is calculated using the images captured by both cameras in addition to the measurement result of the acceleration measurement unit 1, so the display contents can be scrolled more accurately according to the movement of the housing.

Although the third embodiment has been described, as illustrated in FIG. 5, with the outer camera unit 21 and the inner camera unit 22 as the imaging means, the ultrasonic transmission unit 61 and the ultrasonic reception unit 62 of the second embodiment may be provided instead of the outer camera unit 21. In that case, the movement amount calculation unit 3 calculates the three-dimensional displacement of the device housing using the image captured by the inner camera unit 22 and the distance measurements by the ultrasonic transmission unit 61 and the ultrasonic reception unit 62, in addition to the measurement result of the acceleration measurement unit 1.
Alternatively, the ultrasonic transmission unit 61 and the ultrasonic reception unit 62 may be provided together with both the outer camera unit 21 and the inner camera unit 22 as imaging means. In that case, the movement amount calculation unit 3 calculates the three-dimensional displacement of the device housing using the images captured by the outer camera unit 21 and the inner camera unit 22 and the distance measurements by the ultrasonic transmission unit 61 and the ultrasonic reception unit 62, in addition to the measurement result of the acceleration measurement unit 1.
With these configurations, a more accurate three-dimensional displacement can be calculated, and the display contents can be scrolled more accurately according to the movement of the device housing.

[Fourth Embodiment]
Next, a fourth embodiment of the present invention will be described.
In the fourth embodiment, in addition to the configuration of the third embodiment, the display content scrolling of the third embodiment is performed only when it is recognized that the line of sight of the operator, captured by the inner camera unit 22, is directed at the display unit 5.
Components similar to those in the first to third embodiments described above are denoted by the same reference numerals, and description thereof is omitted.

As shown in FIG. 6, the portable display device of the fourth embodiment comprises an acceleration measurement unit 1, an outer camera unit 21, an inner camera unit 22 that captures approximately the display surface direction of the display unit 5, a movement amount calculation unit 3, a display control unit 4, a display unit 5, a face recognition unit 7 that recognizes a face in the image captured by the inner camera unit 22, and a line-of-sight direction recognition unit 8 that recognizes the gaze direction of the eyeballs in the face recognized by the face recognition unit 7.
A known method may be used as a method for face recognition or eyeball recognition from a captured image.

Next, the operation of the portable display device as the fourth embodiment will be described.
While the screen display by the display unit 5 is ON, the inner camera unit 22 continuously captures the direction of the display surface of the display unit 5, and the face recognition unit 7 recognizes the face in the captured image. Based on this face recognition information, the line-of-sight direction recognition unit 8 recognizes the movement of the eyeballs in the operator's face and, from the positional relationship between the inner camera unit 22 and the display unit 5, determines whether the line of sight is directed at the display device. Only while the line-of-sight direction recognition unit 8 recognizes that it is, display content scrolling according to the movement of the apparatus housing is performed in the same manner as in the first and second embodiments described above.
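The gating described here reduces to applying the computed housing displacement to the scroll position only while the recognizer reports that the gaze is on the display. A minimal sketch with hypothetical names:

```python
def gated_scroll(scroll_pos, displacement, gaze_on_display):
    """Return the new scroll position: the housing displacement (dx, dy)
    moves the displayed region only while the line-of-sight direction
    recognizer reports that the operator is looking at the display."""
    if not gaze_on_display:
        return scroll_pos  # housing movement is ignored entirely
    return (scroll_pos[0] + displacement[0], scroll_pos[1] + displacement[1])
```

For instance, the same housing movement scrolls the display when the operator is looking at the screen and leaves it untouched when the operator looks away.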

  As described above, according to the fourth embodiment of the present invention, at least two imaging means are used: one photographing the surroundings (the outer camera unit 21) and one photographing the operator (the inner camera unit 22). By recognizing the movement of the eyeballs based on the image captured by the inner camera unit 22, the change in the relative position between the portable display device and the operator is recognized, and the display content scrolling of the third embodiment described above can be performed only when it is recognized that the operator's line of sight is directed at the display unit 5.

In the fourth embodiment described above, as shown in FIG. 6, when it is recognized that the operator's line of sight is directed at the display unit 5, the three-dimensional displacement amount of the apparatus housing is calculated using the image captured by the outer camera unit 21 in addition to the measurement result from the acceleration measurement unit 1. However, as in the second embodiment described above, the three-dimensional displacement amount may instead be calculated using the distance measurement result from the ultrasonic transmission unit 61 and the ultrasonic reception unit 62 in addition to the measurement result from the acceleration measurement unit 1.
Alternatively, both the outer camera unit 21 and the ultrasonic transmission unit 61 and ultrasonic reception unit 62 may be provided, and the three-dimensional displacement amount of the apparatus housing may be calculated using the image captured by the outer camera unit 21 and the distance measurement result from the ultrasonic transmission unit 61 and the ultrasonic reception unit 62, in addition to the measurement result from the acceleration measurement unit 1.
According to this configuration, a more accurate three-dimensional displacement amount can be calculated, and display contents can be scrolled more accurately according to the movement of the apparatus housing.

[Fifth Embodiment]
Next, a fifth embodiment of the present invention will be described.
In the fifth embodiment, in addition to the functions of the first to fourth embodiments described above, the screen display can be enlarged or reduced by the operator bringing his or her face closer to or farther from the portable display device.
Components common to the first to fourth embodiments described above are denoted by the same reference numerals, and description thereof is omitted.

  As shown in FIG. 7, the portable display device according to the fifth embodiment includes the acceleration measurement unit 1, the outer camera unit 21, the inner camera unit 22, the movement amount calculation unit 3, the display control unit 4, the display unit 5, a face recognition unit 7 that recognizes a face in an image captured by the inner camera unit 22, and an enlargement/reduction button (enlargement/reduction command input means) 91 for performing enlargement/reduction of the screen display based on the result of photographing by the inner camera unit 22.

Next, the operation of the portable display device as the fifth embodiment will be described.
When the enlargement/reduction button 91 is pressed by the operator while the screen display by the display unit 5 is ON, the face recognition unit 7 recognizes the face in the image captured by the inner camera unit 22 in the display surface direction, and the movement amount calculation unit 3 determines the distance from the inner camera unit 22 to the operator's face by focusing the inner camera unit 22 on that face (interpersonal distance measuring means).

In this way, while the enlargement/reduction button 91 is pressed, distance information from the inner camera unit 22 to the operator's face is acquired. When it is recognized that this distance has decreased, the display control unit 4 enlarges the screen display according to the amount of decrease; that is, it performs control to increase the display magnification of the screen.
When it is recognized that the distance from the inner camera unit 22 to the operator's face has increased, the display control unit 4 reduces the screen display according to the amount of increase; that is, it performs control to decrease the display magnification of the screen.

When the operator stops pressing the enlargement/reduction button 91, the display content enlargement/reduction control described above stops, and the screen display continues at the display magnification in effect at the moment the enlargement/reduction button 91 was released.
In other words, the above-described display content enlargement / reduction control is performed only while the enlargement / reduction button 91 is pressed.
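One way to realize this control is to scale the magnification by the ratio of successive face distances while the button is held, and freeze it otherwise. The proportional-ratio rule and the names below are assumptions; the application only specifies that a shorter distance enlarges the display and a longer distance reduces it.

```python
def update_magnification(magnification, prev_distance, new_distance, button_pressed):
    """While the enlarge/reduce button is held, scale the display
    magnification by the ratio of the previous to the new face distance:
    bringing the face closer (shorter distance) enlarges, moving it away
    (longer distance) reduces. While the button is up, keep the value."""
    if not button_pressed or new_distance <= 0:
        return magnification
    return magnification * (prev_distance / new_distance)
```

Halving the face distance with the button held doubles the magnification under this assumed rule, and releasing the button freezes whatever magnification was reached.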

  As described above, according to the fifth embodiment of the present invention, when the operator brings the portable display device closer to the face while pressing the enlargement/reduction button 91, the display magnification of the screen display is increased according to the distance of approach, and when the operator moves the portable display device away from the face while pressing the enlargement/reduction button 91, the display magnification is reduced according to the distance of separation.

  In the fifth embodiment described above, the distance between the portable display device and the operator's face is measured by focusing the inner camera unit 22. However, the measurement method is not limited to this, as long as the distance to the face recognized by the face recognition unit 7 can be measured; for example, a measurement method using a combination of an ultrasonic oscillator and receiver may be used.

[Sixth Embodiment]
Next, a sixth embodiment of the present invention will be described.
In the sixth embodiment, in addition to the functions of the first embodiment described above, a display movement button 92 is provided, giving the device a function for easily moving the screen display contents by a large amount.
Components common to the first to fifth embodiments described above are denoted by the same reference numerals, and description thereof is omitted.

  As shown in FIG. 8, the portable display device as the sixth embodiment includes the acceleration measurement unit 1, the outer camera unit 21, the movement amount calculation unit 3, the display control unit 4, the display unit 5, and a display movement button (stop command input means) 92 that turns off the screen scroll function only while it is pressed.

Next, the operation of the portable display device as the sixth embodiment will be described.
When the operator moves the apparatus housing while the screen display on the display unit 5 is ON, the display contents of the display unit 5 are scrolled according to the movement amount and moving direction, by the same function as in the first embodiment described above.
Here, when the operator presses the display movement button 92, the display control unit 4 stops scrolling the display contents on the display unit 5, but only while the button is pressed.
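The display movement button thus acts like a clutch between housing movement and the scroll position: movement drives the scroll only while the button is up. A minimal sketch of this state (the class and method names are assumptions):

```python
class ScrollClutch:
    """Scroll position driven by housing movement, with a display
    movement button that disengages scrolling while held."""

    def __init__(self):
        self.scroll = [0.0, 0.0]

    def on_housing_moved(self, dx, dy, button_pressed):
        # While the button is held, the device moves but the displayed
        # position stays fixed; releasing it re-engages scrolling.
        if not button_pressed:
            self.scroll[0] += dx
            self.scroll[1] += dy
        return tuple(self.scroll)

clutch = ScrollClutch()
clutch.on_housing_moved(10, 0, button_pressed=False)   # view scrolls right
clutch.on_housing_moved(-10, 0, button_pressed=True)   # device returns, view fixed
clutch.on_housing_moved(10, 0, button_pressed=False)   # view scrolls further
```

Repeating the held-return / released-move cycle ratchets the view across content far larger than the reach of the operator's arm.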

Next, the operation of this embodiment will be described using a specific example.
For example, suppose the operator is displaying a map of the area around Tokyo Station on the portable display device; to next view Hachioji Station, the device would have to be moved a considerable distance to the left. In this case, when the operator presses the display movement button 92 and moves the portable display device to the right, the device itself shifts to the right, but the display contents, that is, the displayed position (here, the area around Tokyo Station), remain as they are. If the operator then releases the display movement button 92 and moves the portable display device to the left, the display position can be moved considerably to the west on the map.

As a result, by repeating this operation, the display position can easily be moved even to a destination beyond the reach of the operator's arm, or one requiring most of the limited range over which the operator can move the device.
In addition, if moving the portable display device to the desired display position has carried the device away from the position in front of the operator, the operator can press the display movement button 92 to fix the display contents (stop the scroll function) and then bring the portable display device, holding the desired display contents, back in front of himself or herself.

As described above, according to the sixth embodiment of the present invention, the display contents can be scrolled by the same function as in the first embodiment described above, and by pressing the display movement button 92, the operator can move the portable display device with an operational feeling just like grabbing a map and moving it.
For this reason, for example, if the operator wants to view the currently displayed content up close, he or she can, as if grabbing that part and bringing it to the eyes, press the display movement button 92 and bring the device in front of himself or herself without the displayed content being scrolled.

  As described above, by combining the display content scrolling of the first embodiment described above with the temporary scroll stop function of the display movement button 92, the operator can operate the device more intuitively, as if touching by hand the virtual image displayed on the display unit 5.

In the sixth embodiment described above, the display movement button 92 is added to the first embodiment described above. However, the present invention is not limited to this configuration; it can similarly be realized as a configuration in which the display movement button 92 is added to any of the first to fifth embodiments.
For example, when the display movement button 92 is added to the fourth embodiment described above, the scroll function is stopped while the display movement button 92 is pressed, regardless of the movement of the operator's eyeballs or line-of-sight direction.

[About each embodiment]
Each embodiment described above is a preferred embodiment of the present invention; the present invention is not limited thereto, and can be implemented with various modifications based on the technical idea of the present invention.
For example, in each of the embodiments described above, the display unit 5 is described as displaying a map or the like. However, the present invention can be applied in the same way to any display information that includes more information than the display range the display unit 5 can show at once, of which a part is displayed on the display unit 5. For example, the display contents may be a Web page, a spreadsheet, a full-size picture, a gravure photograph, or the like.
In this way, such various kinds of two-dimensional information can be scrolled and displayed with a natural operational feeling, without the operator being conscious of the interface. For example, when displaying a full-size picture or a gravure photograph, the device can be used not only to display the entire image at a reduced size but also to view each part in detail.
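Displaying part of information larger than the screen amounts to tracking a viewport over the full content and clamping it at the content edges. The sketch below is a generic illustration with assumed names, not text from the application:

```python
def move_viewport(viewport_xy, delta_xy, content_size, viewport_size):
    """Shift the visible window over large 2D content (a map, Web page,
    spreadsheet, or picture) and clamp so it never leaves the content."""
    return tuple(
        max(0, min(content_size[i] - viewport_size[i], viewport_xy[i] + delta_xy[i]))
        for i in range(2)
    )
```

The clamping keeps the displayed portion valid however far the housing is moved, which is all the scroll control needs regardless of whether the content is a map or a spreadsheet.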

Further, the display information to be displayed is not limited to two-dimensional information; three-dimensional images such as 3D-CG (3-Dimensional Computer Graphics) may also be displayed.
For example, in the case of displaying 3D data of a bag, if the operator walks around the place where the bag virtually exists while viewing the display contents of the portable display device, the portable display device calculates the three-dimensional displacement amount and controls the display contents in the manner described above, so that the display unit shows an image of the bag as seen from the angle at which the portable display device is positioned relative to the place where the bag virtually exists.
As described above, with a configuration that displays a three-dimensional image as a 3D image, the displayed image is controlled to follow the movement of the portable display device, and the object can be handled as if it actually existed there.
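For the 3D case, the accumulated device position can be reduced to a viewing direction around the virtual object. The sketch below computes only a horizontal azimuth, and all names are illustrative assumptions:

```python
import math

def viewing_azimuth(device_xy, object_xy):
    """Angle in degrees (0-360) from the virtual object's location to the
    device, i.e. the direction from which the 3D model would be rendered
    as the operator walks around the place where it virtually exists."""
    dx = device_xy[0] - object_xy[0]
    dy = device_xy[1] - object_xy[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0
```

Feeding this azimuth (plus elevation and distance, omitted here) to the renderer is what makes the object appear fixed in space as the device moves around it.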

  The portable display device described in each of the above embodiments can be realized by applying it to various information devices such as a mobile phone, an information device having a navigation function, a portable game machine, and a personal digital assistant (PDA).

FIG. 1 is a block diagram showing a configuration example of the portable display device as the first embodiment of the present invention.
FIG. 2 is a flowchart showing an operation example of the portable display device.
FIG. 3 is a block diagram showing a configuration example of the portable display device as the second embodiment of the present invention.
FIG. 4 is a block diagram showing another configuration example of the portable display device as the second embodiment of the present invention.
FIG. 5 is a block diagram showing a configuration example of the portable display device as the third embodiment of the present invention.
FIG. 6 is a block diagram showing a configuration example of the portable display device as the fourth embodiment of the present invention.
FIG. 7 is a block diagram showing a configuration example of the portable display device as the fifth embodiment of the present invention.
FIG. 8 is a block diagram showing a configuration example of the portable display device as the sixth embodiment of the present invention.

Explanation of symbols

1 Acceleration measurement unit
21 Outer camera unit
22 Inner camera unit
3 Movement amount calculation unit
4 Display control unit
5 Display unit
61 Ultrasonic transmission unit
62 Ultrasonic reception unit
7 Face recognition unit
8 Line-of-sight direction recognition unit
91 Enlargement/reduction button (an example of enlargement/reduction command input means)
92 Display movement button (an example of stop command input means)

Claims (9)

  1. Acceleration measuring means for measuring acceleration and moving direction in movement of the device housing;
    An external information recognition means for recognizing the movement state of an entity outside the apparatus;
    A movement amount calculation means for calculating a three-dimensional displacement amount of the apparatus housing based on a measurement result by the acceleration measurement means and a recognition result by the external information recognition means;
    A portable display device comprising: display control means for controlling display contents on the display means based on the three-dimensional displacement calculated by the movement amount calculation means.
  2.   The portable display device according to claim 1, wherein the display control means uses, as display information for the display means, display information including more information than the display range that the display means can display at once, causes the display means to display a part of the display information, and moves the displayed portion based on the three-dimensional displacement amount of the device housing calculated by the movement amount calculation means.
  3. The external information recognizing means is a distance measuring means for measuring a distance to an entity in a predetermined direction,
    and the movement amount calculating means calculates the three-dimensional displacement amount of the apparatus housing based on the measurement result by the acceleration measuring means and the measurement result by the distance measuring means. The portable display device according to claim 1 or 2.
  4. The external information recognition means is an imaging means for taking an image,
    and the movement amount calculating means calculates the three-dimensional displacement amount of the apparatus housing based on the measurement result by the acceleration measuring means and the image captured by the imaging means. The portable display device according to claim 1 or 2.
  5. The external information recognizing means is a distance measuring means for measuring a distance to an entity in a predetermined direction, and an imaging means for taking an image,
    The movement amount calculation means calculates a three-dimensional displacement amount of the apparatus housing based on a measurement result by the acceleration measurement means, a measurement result by the distance measurement means, and a photographed image by the imaging means. The portable display device according to claim 1 or 2, characterized in that
  6. The imaging unit includes a display surface direction imaging unit that captures an approximate display surface direction by the display unit, and an outer direction imaging unit that captures a predetermined direction other than the display surface direction.
    The movement amount calculating means calculates the three-dimensional displacement amount of the apparatus housing based on the measurement result by the acceleration measuring means, the image captured by the display surface direction imaging unit, and the image captured by the outer direction imaging unit. The portable display device according to claim 4.
  7. Gaze direction recognition means for recognizing a human eyeball based on a photographed image by the display surface direction imaging unit and calculating a gaze direction;
    The display control means performs control to move the display portion on the display means in the display information only when the line-of-sight recognition means recognizes that the line of sight is facing the display means. The portable display device according to claim 6.
  8. The imaging unit includes at least a display surface direction imaging unit that captures an approximate display surface direction by the display unit,
    Face recognition means for recognizing a human face based on an image captured by the display surface direction imaging unit;
    Interpersonal distance measuring means for measuring the distance from the device housing to the face recognized by the face recognizing means,
    The display control means increases the display magnification as the distance measured by the interpersonal distance measuring means becomes shorter, and decreases the display magnification as the distance becomes longer. The portable display device according to claim 4 or 5.
  9. A stop command input means for temporarily stopping the control of moving the display portion on the display means in the display information;
    While the input from the stop command input means is detected, the display control means temporarily stops the control to move the display portion on the display means in the display information. The portable display device according to any one of claims 1 to 8.
JP2005310973A 2005-10-26 2005-10-26 Portable display device Active JP4977995B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005310973A JP4977995B2 (en) 2005-10-26 2005-10-26 Portable display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2005310973A JP4977995B2 (en) 2005-10-26 2005-10-26 Portable display device

Publications (2)

Publication Number Publication Date
JP2007121489A true JP2007121489A (en) 2007-05-17
JP4977995B2 JP4977995B2 (en) 2012-07-18

Family

ID=38145421

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005310973A Active JP4977995B2 (en) 2005-10-26 2005-10-26 Portable display device

Country Status (1)

Country Link
JP (1) JP4977995B2 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9159133B2 (en) * 2012-11-05 2015-10-13 Qualcomm Incorporated Adaptive scale and/or gravity estimation

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08247796A (en) * 1995-03-10 1996-09-27 Miyota Kk Vibration-compensated display apparatus
JPH10240436A (en) * 1996-12-26 1998-09-11 Nikon Corp Information processor and recording medium
JP2001159951A (en) * 1999-12-02 2001-06-12 Nec Corp Information processor and method for processing information
JP2002007027A (en) * 2000-06-27 2002-01-11 Campus Create Co Ltd Image information display device
JP2003196017A (en) * 2001-12-25 2003-07-11 Gen Tec:Kk Data input method and device
JP2004128712A (en) * 2002-09-30 2004-04-22 Fuji Photo Film Co Ltd Portable terminal device
JP2004317813A (en) * 2003-04-16 2004-11-11 Sony Corp Image display device and image blur preventing method
JP2004343622A (en) * 2003-05-19 2004-12-02 Motorola Inc Image display device
WO2005041167A1 (en) * 2003-10-28 2005-05-06 Matsushita Electric Industrial Co., Ltd. Image display device and image display method
JP2005122100A (en) * 2003-06-02 2005-05-12 Fuji Photo Film Co Ltd Image displaying system, image displaying apparatus, and program
JP2005221907A (en) * 2004-02-09 2005-08-18 Sanyo Electric Co Ltd Display device
WO2006095573A1 (en) * 2005-03-08 2006-09-14 Sharp Kabushiki Kaisha Portable terminal device


Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008210019A (en) * 2007-02-23 2008-09-11 Nintendo Co Ltd Information processing program and information processing apparatus
JP2012509544A (en) * 2008-11-20 2012-04-19 アマゾン テクノロジーズ インコーポレイテッド Motion recognition as an input mechanism
US9304583B2 (en) 2008-11-20 2016-04-05 Amazon Technologies, Inc. Movement recognition as input mechanism
US8788977B2 (en) 2008-11-20 2014-07-22 Amazon Technologies, Inc. Movement recognition as input mechanism
US9122064B2 (en) 2008-11-26 2015-09-01 Nec Corporation Display device, terminal device, and display method
US9880395B2 (en) 2008-11-26 2018-01-30 Nec Corporation Display device, terminal device, and display method
US8878773B1 (en) 2010-05-24 2014-11-04 Amazon Technologies, Inc. Determining relative motion as input
US9557811B1 (en) 2010-05-24 2017-01-31 Amazon Technologies, Inc. Determining relative motion as input
US9778815B2 (en) 2010-08-04 2017-10-03 Apple Inc. Three dimensional user interface effects on a display
US8913056B2 (en) 2010-08-04 2014-12-16 Apple Inc. Three dimensional user interface effects on a display by using properties of motion
US9417763B2 (en) 2010-08-04 2016-08-16 Apple Inc. Three dimensional user interface effects on a display by using properties of motion
US9411413B2 (en) 2010-08-04 2016-08-09 Apple Inc. Three dimensional user interface effects on a display
JP2013537670A (en) * 2010-08-04 2013-10-03 アップル インコーポレイテッド 3D user interface effect on display by using motion characteristics
WO2012111272A1 (en) * 2011-02-14 2012-08-23 パナソニック株式会社 Display control device and display control method
US9164582B2 (en) 2011-02-14 2015-10-20 Panasonic Intellectual Property Management Co., Ltd. Display control device and method detecting eye position of a user
US9123272B1 (en) 2011-05-13 2015-09-01 Amazon Technologies, Inc. Realistic image lighting and shading
US9041734B2 (en) 2011-07-12 2015-05-26 Amazon Technologies, Inc. Simulating three-dimensional features
JP2014529061A (en) * 2011-07-28 2014-10-30 シズベル テクノロジー エス.アール.エル. Method for ensuring continuity of service of personal navigation device and device
US10088924B1 (en) 2011-08-04 2018-10-02 Amazon Technologies, Inc. Overcoming motion effects in gesture recognition
WO2013042530A1 (en) * 2011-09-22 2013-03-28 Necカシオモバイルコミュニケーションズ株式会社 Display device, display control method, and program
US8947351B1 (en) 2011-09-27 2015-02-03 Amazon Technologies, Inc. Point of view determinations for finger tracking
US9223415B1 (en) 2012-01-17 2015-12-29 Amazon Technologies, Inc. Managing resource usage for task performance
US8884928B1 (en) 2012-01-26 2014-11-11 Amazon Technologies, Inc. Correcting for parallax in electronic displays
US10019107B2 (en) 2012-01-26 2018-07-10 Amazon Technologies, Inc. Correcting for parallax in electronic displays
US9471153B1 (en) 2012-03-14 2016-10-18 Amazon Technologies, Inc. Motion detection systems for electronic devices
US9063574B1 (en) 2012-03-14 2015-06-23 Amazon Technologies, Inc. Motion detection systems for electronic devices
US9652083B2 (en) 2012-03-28 2017-05-16 Amazon Technologies, Inc. Integrated near field sensor for display devices
US9285895B1 (en) 2012-03-28 2016-03-15 Amazon Technologies, Inc. Integrated near field sensor for display devices
US9423886B1 (en) 2012-10-02 2016-08-23 Amazon Technologies, Inc. Sensor connectivity approaches
US9035874B1 (en) 2013-03-08 2015-05-19 Amazon Technologies, Inc. Providing user input to a computing device with an eye closure
US9483113B1 (en) 2013-03-08 2016-11-01 Amazon Technologies, Inc. Providing user input to a computing device with an eye closure
US9804671B2 (en) 2013-05-08 2017-10-31 Fujitsu Limited Input device and non-transitory computer-readable recording medium
WO2014181403A1 (en) 2013-05-08 2014-11-13 富士通株式会社 Input device and input program
US9269012B2 (en) 2013-08-22 2016-02-23 Amazon Technologies, Inc. Multi-tracker object tracking
US10055013B2 (en) 2013-09-17 2018-08-21 Amazon Technologies, Inc. Dynamic object tracking for user interfaces
US9367203B1 (en) 2013-10-04 2016-06-14 Amazon Technologies, Inc. User interface techniques for simulating three-dimensional depth

Also Published As

Publication number Publication date
JP4977995B2 (en) 2012-07-18

Similar Documents

Publication Publication Date Title
US10514758B2 (en) Visibility improvement method based on eye tracking, machine-readable storage medium and electronic device
EP2826689B1 (en) Mobile terminal
US10198870B2 (en) Information processing apparatus, information processing system, and information processing method
US10318017B2 (en) Viewing images with tilt control on a hand-held device
JP5962403B2 (en) Information processing apparatus, display control method, and program
US8964091B2 (en) Digital device and method for controlling the same
US9651782B2 (en) Wearable tracking device
US10217288B2 (en) Method for representing points of interest in a view of a real environment on a mobile device and mobile device therefor
US9910505B2 (en) Motion control for managing content
TWI579732B (en) Multi display apparatus and control method thereof
EP2976767B1 (en) Display device and method for controlling the same
US9432661B2 (en) Electronic device, image display method, and image display program
US9798395B2 (en) Electronic control apparatus and method for responsively controlling media content displayed on portable electronic device
US10133342B2 (en) Human-body-gesture-based region and volume selection for HMD
US9373195B2 (en) Display control device, display control method, and program
JP2014167800A (en) Control device, input device, control system, control method, and hand-held device
JP2017058493A (en) Virtual reality space video display method and program
US8619152B2 (en) Mobile terminal and operating method thereof
US9817232B2 (en) Head movement controlled navigation among multiple boards for display in a headset computer
JP5802667B2 (en) Gesture input device and gesture input method
US7535486B2 (en) Display control device, display control method, program, and portable apparatus
EP2927634B1 (en) Single-camera ranging method and system
US6160899A (en) Method of application menu selection and activation using image cognition
WO2014147686A1 (en) Head-mounted device for user interactions in an amplified reality environment
US20140098188A1 (en) Multi display device and method of photographing thereof

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20080919

RD01 Notification of change of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7421

Effective date: 20110919

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110929

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20111025

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20111219

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20120321

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20120403

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20150427

Year of fee payment: 3

R150 Certificate of patent or registration of utility model

Ref document number: 4977995

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

S111 Request for change of ownership or part of ownership

Free format text: JAPANESE INTERMEDIATE CODE: R313113

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250
