WO2015099215A1 - Head-mounted display apparatus and method for operating same - Google Patents


Info

Publication number
WO2015099215A1
Authority
WO
WIPO (PCT)
Prior art keywords
head
user
image
resolution
mounted display
Prior art date
Application number
PCT/KR2013/012138
Other languages
French (fr)
Korean (ko)
Inventor
김형준
손종현
조택일
최은지
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to PCT/KR2013/012138
Publication of WO2015099215A1

Classifications

    • G02B27/0093: Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B27/017: Head-up displays; head mounted
    • G06F1/163: Wearable computers, e.g. on a belt
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012: Head tracking input arrangements
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0482: GUI interaction with lists of selectable items, e.g. menus
    • G06F3/04845: GUI techniques for image manipulation, e.g. dragging, rotation
    • G06F3/165: Management of the audio stream, e.g. setting of volume, audio stream path
    • G06F3/167: Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G02B2027/0141: Head-up displays characterised by the informative content of the display
    • G02B2027/0147: Head-up displays comprising a device modifying the resolution of the displayed image
    • G06F2203/04806: Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Abstract

The present invention relates to a head-mounted display apparatus and a method for operating the same. A method for operating a head-mounted display apparatus according to an embodiment of the present invention comprises the steps of: receiving an image having a first resolution; displaying a partial region of the image having the first resolution in accordance with a second resolution display mode, the second resolution being lower than the first resolution; sensing a head movement of the user; and displaying a different region of the image having the first resolution in accordance with the sensed head movement. Accordingly, user convenience can be improved.

Description

Head-mounted display apparatus and method for operating same

The present invention relates to a head-mounted display apparatus and a method for operating the same, and more particularly to a head-mounted display apparatus and operating method that can improve user convenience.

A head-mounted display apparatus is a device worn on the user's head through which the user can view images. Because it is worn on the head, however, the means available for receiving user input are limited.

Accordingly, various research into user input methods for head-mounted display apparatuses is under way.

An object of the present invention is to provide a head-mounted display apparatus, and a method for operating the same, that can improve user convenience.

A method of operating a head-mounted display apparatus according to an embodiment of the present invention for achieving the above object comprises the steps of: receiving an image having a first resolution; displaying a partial region of the image having the first resolution in accordance with a second resolution display mode, the second resolution being lower than the first resolution; detecting a movement of the user's head; and, in accordance with the detected head movement, displaying a different partial region of the image having the first resolution.
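The claimed sequence of steps can be pictured as a simple control loop. The sketch below is purely illustrative: none of the names (`run_hmd`, `sense_head`, `show`) appear in the disclosure, and the offset arithmetic stands in for whatever region-selection logic an implementation would use.

```python
# Hypothetical sketch of the claimed method (S510-S540); names and
# offset arithmetic are illustrative assumptions, not the patent's.

def run_hmd(frames, sense_head, show):
    offset = (0, 0)                        # current viewing offset
    for frame in frames:                   # S510: receive first-resolution image
        show(frame, offset)                # S520: display a partial region
        dx, dy = sense_head()              # S530: detect the head movement
        # S540: a nonzero movement selects a different partial region
        offset = (offset[0] + dx, offset[1] + dy)

shown = []
run_hmd(frames=["f0", "f1", "f2"],
        sense_head=lambda: (100, 0),       # stub: constant rightward motion
        show=lambda f, off: shown.append((f, off)))
print(shown)  # [('f0', (0, 0)), ('f1', (100, 0)), ('f2', (200, 0))]
```

Each detected movement shifts the region shown for the next frame, which is the behaviour described by steps S530 and S540.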

Meanwhile, a head-mounted display apparatus according to an embodiment of the present invention for achieving the above object comprises: a display for displaying a predetermined image; a lens unit for enlarging the image displayed on the display; a sensor unit for detecting the user's head movement; and a processor that receives an image having a first resolution, controls the display to show a partial region of the image having the first resolution in accordance with a second resolution display mode, the second resolution being lower than the first resolution, and controls the display to show a different partial region of the image having the first resolution based on the detected head movement.

A head-mounted display apparatus according to an embodiment of the present invention may, while displaying a partial region of an image having a first resolution in accordance with a second resolution display mode, display a different partial region of that image in accordance with the user's head movement. In this way, the image can be reproduced and displayed quickly, without scaling down the high-resolution video. Accordingly, user convenience can be increased.

In particular, the displayed image region can be moved in correspondence with upward, downward, leftward, or rightward movement of the user's head, and the displayed image can be zoomed in or out in response to forward or backward movement of the user's head, so that the user can watch the desired image region.

Meanwhile, the menu position can be varied in response to the user's gaze, further increasing user convenience.

Figure 1 is a front view of a head-mounted display apparatus according to an embodiment of the present invention.

Figure 2 is a rear view of the head-mounted display apparatus of Figure 1.

Figure 3 is a cross-sectional view of the head-mounted display apparatus of Figure 1.

Figure 4 is a simplified internal block diagram of the head-mounted display apparatus of Figure 1.

Figure 5 is a flow chart showing an operating method of a head-mounted display apparatus according to an embodiment of the present invention.

Figures 6 to 16 are views for explaining the operating method of Figure 5.

Hereinafter, the present invention will be described in more detail with reference to the drawings.

The suffixes "module" and "unit" used for components in the following description are given merely for ease of writing this specification and do not by themselves carry any particularly important meaning or role. Accordingly, "module" and "unit" may be used interchangeably.

The head-mounted display apparatus described herein is a display apparatus that can be worn on a user's head and can display a desired image. In particular, it comprises a display for displaying an image and a lens for magnifying the displayed image, so that the user perceives an enlarged version of the image shown on the display.

Hereinafter, a head-mounted display apparatus according to an embodiment of the invention is described.

Figure 1 is a front view of a head-mounted display apparatus according to an embodiment of the present invention, Figure 2 is a rear view of the head-mounted display apparatus of Figure 1, and Figure 3 is a cross-sectional view of the head-mounted display apparatus of Figure 1.

Referring to the figures, a user 700 can wear the head-mounted display apparatus 100 on the head. As shown in Figure 1, it can be worn in a form that covers the eyes; to improve immersion in the displayed image, external light is preferably blocked.

The user 700 wearing the head-mounted display apparatus 100 can view the reproduced image while lying down or sitting.

Meanwhile, as shown in Figure 2, the head-mounted display apparatus 100 includes lens units 195L and 195R, and provides the user's eyes with an image enlarged through the lens units 195L and 195R.

Figure 3 shows a top cross-sectional view of the head-mounted display apparatus 100. The lens units 195L and 195R of the head-mounted display apparatus 100 are located a predetermined distance in front of the user's two eyes 710L and 710R. The images displayed on the displays are respectively enlarged through the lens units 195L and 195R and enter the user's eyes 710L and 710R. Accordingly, as shown in the figure, the user 700 perceives the magnified images 300L and 300R. The magnified images 300L and 300R are virtual images.

Figure 4 is a simplified internal block diagram of the head-mounted display apparatus of Figure 1.

Referring to the drawing, the head-mounted display apparatus 100 may include a sensor unit 130, a communication module 135, a memory 140, an audio output unit 160, a processor 170, a display 180, a power supply 190, a drive unit 193, and a lens unit 195.

The sensor unit 130 can detect motion information of the head-mounted display apparatus 100, in particular motion information of the user's head, and generate a corresponding sensing signal.

For example, the sensor unit 130 may include at least one of a motion sensor (not shown), a proximity sensor (not shown), a pressure sensor (not shown), and the like.

The motion sensor (not shown) detects the position or movement of the head-mounted display apparatus 100 by using an acceleration sensor, a gyro sensor, or the like.

The acceleration sensor may comprise acceleration sensors for the X-axis, Y-axis, and Z-axis directions. Meanwhile, the gyro sensor measures angular velocity and can detect the direction of rotation relative to a reference direction.

The proximity sensor (not shown) can detect, without mechanical contact, the presence or absence of an object approaching the head-mounted display apparatus 100 or existing in its vicinity.

The pressure sensor (not shown) can detect whether pressure is applied to the head-mounted display apparatus 100 and the magnitude of that pressure. For example, it can detect pressure applied by the user's hand.

The communication module 135 can provide an interface for communicating with an external device. For this purpose, the communication module 135 may include at least one of a mobile communication module (not shown), a wireless Internet module (not shown), a short-range communication module (not shown), and a GPS module (not shown). For example, WiFi communication can be performed through the wireless Internet module (not shown), and NFC (Near Field Communication) can be performed through the short-range communication module (not shown).

Meanwhile, the communication module 135 can exchange data with a mobile terminal (not shown) or an external server (not shown). Specifically, the communication module 135 may receive various data, such as image content, from the mobile terminal. Meanwhile, it can transmit status information of the head-mounted display apparatus 100 to the mobile terminal.

Meanwhile, the communication module 135 can receive eyesight information of the user 700 from the mobile terminal (not shown) or an external server (not shown).

The memory 140 may store programs for the processing or control performed by the processor 170 of the head-mounted display apparatus 100, and may also perform a function of temporarily storing input or output data.

Meanwhile, the memory 140 may store, for a certain period of time or temporarily, content data received from an external device.

The audio output unit 160 can output an audio signal. For example, it can output the audio signal of the content being reproduced on the display 180. Meanwhile, the audio output unit 160 may include a speaker, or may be provided with an audio output terminal for outputting audio to the outside.

The processor 170 controls the overall operation of each unit in the head-mounted display apparatus 100.

For example, based on user input, the processor 170 may reproduce image content stored in the memory 140 or received through the communication module 135, and output the corresponding video and audio signals to the display 180 and the audio output unit 160, respectively.

As another example, the processor 170 can perform a corresponding operation based on motion information of the head-mounted display apparatus 100, in particular motion information of the user's head, detected by the sensor unit 130.

In relation to an embodiment of the invention, the processor 170 receives an image having a first resolution, controls the display to show a partial region of that image in accordance with a second resolution display mode, the second resolution being lower than the first resolution, and controls the display to show a different partial region of the image having the first resolution in accordance with the detected head movement of the user 700.

For example, the processor 170 can control the display to show a different partial region of the image having the first resolution in correspondence with upward, downward, leftward, or rightward movement of the head of the user 700.

As another example, the processor 170 can control the display to zoom in or zoom out on the displayed partial region of the first-resolution image in correspondence with forward or backward movement of the head of the user 700.

Meanwhile, in a menu mode, the processor 170 may control a menu to be displayed in a region corresponding to the gaze of the user 700.

Meanwhile, the processor 170 may vary at least one of the frequency, amplitude, or phase of the output audio signal according to the detected head movement of the user 700.

Meanwhile, the processor 170 can adjust the focus of the displayed image based on the eyesight information of the user 700 received through the communication module 135.

Meanwhile, the processor 170 can vary the operation corresponding to the movement of the user 700 according to the content type of the video displayed on the display 180.

Meanwhile, the processor 170 may control the distance between the eyes of the user 700 and the lens unit 195 to be varied.

The display 180 can display text, images, and the like. In particular, the display 180 can display a predetermined video.

Meanwhile, in a menu display mode, the display 180 can display a menu in a region corresponding to the gaze of the user 700.

The power supply 190 can supply, under the control of the processor 170, the power required for the operation of each component.

Meanwhile, the drive unit 193 can vary, under the control of the processor 170, the distance between the eyes of the user 700 and the lens unit 195. For this purpose, the drive unit 193 may be provided with a lens unit moving member (not shown), such as a stepping motor. By operation of the motor, the distance between the eyes of the user 700 and the lens unit 195 can be varied.

Figure 5 is a flow chart showing an operating method of a head-mounted display apparatus according to an embodiment of the present invention, and Figures 6 to 16 are views for explaining the operating method of Figure 5.

First, the head-mounted display apparatus 100 receives an image having a first resolution (S510).

The communication module 135 of the head-mounted display apparatus 100 receives an image having a first resolution from an external device and transfers the received first-resolution image to the processor 170.

For example, the image having the first resolution may be a UD (Ultra Definition) image, a panorama image, a wide cinema image having a 21:9 aspect ratio, or the like.

Meanwhile, if the display 180 in the head-mounted display apparatus 100 does not support the resolution of a UD image, a panorama image, or a 21:9 image, the processor 170 receiving such an image would have to perform a scaling operation to down-scale it to the resolution of the display 180. In this case, the downscaling may introduce a playback time delay.

To address this point, in the present invention, the processor 170 outputs a partial region without separate scaling, so that a partial region of the first-resolution image is shown on the display 180.

For example, if the display 180 can display an HD image of 1920x1080, the processor 170, upon receiving a UD image with a resolution of 3840x2160, may output, without scaling the UD image down to an HD image, the region of the UD image corresponding to an HD picture area, or an even smaller partial region of the UD image.
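The saving comes from the fact that extracting a window is a pure crop, with no resampling of the remaining pixels, whereas down-scaling touches every source pixel. A minimal illustration (the `crop` helper and the toy frame are assumptions for this sketch, not part of the disclosure):

```python
# Hedged sketch: serving an HD window out of a UD frame is array
# slicing, not resampling. The nested list stands in for raw pixels.

def crop(frame, left, top, width, height):
    """Cut a width x height window out of `frame` without resampling."""
    return [row[left:left + width] for row in frame[top:top + height]]

# A toy 8x4 "UD" frame; each pixel stores its (x, y) for easy checking.
ud = [[(x, y) for x in range(8)] for y in range(4)]
hd = crop(ud, left=2, top=1, width=4, height=2)   # 4x2 "HD" window
print(hd[0][0], hd[-1][-1])  # (2, 1) (5, 2)
```

The same slicing would serve a 1920x1080 window out of a 3840x2160 frame; only the window's pixels are touched.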

Next, the head-mounted display apparatus 100 displays a partial region of the image having the first resolution in accordance with a second resolution display mode, the second resolution being lower than the first resolution (S520).

The display 180 of the head-mounted display apparatus 100 can display, according to its supported display resolution, only a part of the first-resolution image region output from the processor 170.

As described above, when the processor 170 outputs a portion of the UD image, the display 180 can display only that part of the UD image. Thus, the user views only a portion of the input image area.

Figure 6 illustrates the user 700 wearing the head-mounted display apparatus 100 and viewing an image 620 corresponding to a partial region of the first-resolution image 610. In this case, the horizontal size of the image 620 corresponding to the partial region is S1.

Meanwhile, Figure 6 illustrates that, together with the image 620 corresponding to the partial region of the first-resolution image, an object 625 indicating the partial display mode or the second resolution display mode is displayed on the display 180. As a result, the user watching the image can recognize which part of the entire image region the displayed image region occupies.

Next, the head-mounted display apparatus 100 detects the head movement of the user 700 (S530).

The sensor unit 130 can detect head movements of the user 700. Specifically, it can detect acceleration in the X-axis, Y-axis, and Z-axis directions through the acceleration sensor, and angular velocity through the gyro sensor.

The processor 170 can then determine, based on the sensing signal sensed by the sensor unit 130, whether the head movement of the user 700 is an upward, downward, leftward, or rightward movement, a forward or backward movement, or a rotary movement.
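This decision can be sketched as a simple threshold classifier over the sensed values. The axis conventions (X: right, Y: up, Z: forward) and the thresholds below are illustrative assumptions, not values from the patent:

```python
# Hypothetical head-motion classifier over accelerometer and gyro
# readings; thresholds and axis conventions are assumptions.

def classify_head_motion(accel, gyro, a_thresh=0.5, g_thresh=0.5):
    ax, ay, az = accel                    # linear acceleration per axis
    if max(abs(g) for g in gyro) > g_thresh:
        return "rotate"                   # dominant angular velocity
    if abs(az) > a_thresh:                # motion along the view axis
        return "forward" if az > 0 else "backward"
    if abs(ax) > abs(ay):
        if abs(ax) > a_thresh:
            return "right" if ax > 0 else "left"
    elif abs(ay) > a_thresh:
        return "up" if ay > 0 else "down"
    return "still"

print(classify_head_motion((0.8, 0.1, 0.0), (0, 0, 0)))   # right
print(classify_head_motion((0.0, 0.0, -0.9), (0, 0, 0)))  # backward
print(classify_head_motion((0.0, 0.0, 0.0), (0, 0, 0.7))) # rotate
```

A real implementation would filter and integrate the raw samples first; the classifier only illustrates the branch the processor 170 is described as making.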

Next, the head-mounted display apparatus 100 displays a different partial region of the first-resolution image in accordance with the detected head movement of the user 700 (S540).

The processor 170 performs an operation corresponding to the detected head movement of the user.

In particular, when the user's head motion is detected while a partial region of the first-resolution image is being output to the display 180, the processor 170 can output a different partial region of the first-resolution image to the display 180.

Specifically, the processor 170 can control the display to show a different partial region of the first-resolution image in response to upward, downward, leftward, or rightward movement of the head of the user 700.

Figure 6 illustrates that, when the movement of the head 705 of the user 700 is an upward, downward, leftward, or rightward movement, the image region being displayed on the display 180 moves up, down, left, or right in response. As a result, the user can easily view regions of the first-resolution image that were previously outside the viewing area.
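The panning behaviour of Figure 6 can be sketched as shifting a window over the first-resolution frame, clamped so it never leaves the image. The step size and the `pan` helper are assumptions for illustration:

```python
# Hedged sketch of Figure 6's panning: an up/down/left/right head
# movement shifts the displayed window; the window is clamped to
# the bounds of the first-resolution frame.

def pan(window, direction, step, src_w, src_h):
    """window = (left, top, width, height); returns the shifted window."""
    left, top, w, h = window
    dx = {"left": -step, "right": step}.get(direction, 0)
    dy = {"up": -step, "down": step}.get(direction, 0)
    left = max(0, min(left + dx, src_w - w))
    top = max(0, min(top + dy, src_h - h))
    return (left, top, w, h)

win = (960, 540, 1920, 1080)           # centred HD window in a UD frame
win = pan(win, "right", 600, 3840, 2160)
print(win)  # (1560, 540, 1920, 1080)
win = pan(win, "right", 600, 3840, 2160)
print(win)  # (1920, 540, 1920, 1080) -- clamped at the image edge
```

The clamping means a sustained head movement simply stops the window at the edge of the first-resolution image rather than showing undefined pixels.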

Figure 7 illustrates that, as in Figure 6, an image 670 corresponding to a portion of the first-resolution image 610 is displayed, but its vertical size is the same as that of the first-resolution image while its horizontal size S2 is smaller.

In this state, when the motion of the head 705 of the user 700 is a leftward or rightward movement, the image area being displayed on the display 180 moves to the left or right in response.

Meanwhile, the processor 170 can control the displayed image to move to the left or right in response to the head movement of the user 700.

Specifically, when the motion of the head 705 of the user 700 is an upward or downward movement, it is also possible for the image region displayed on the display 180 to move, just as it moves left or right in the case of a leftward or rightward movement.

Meanwhile, Figure 7 illustrates that, together with the image 670 corresponding to the partial region of the first-resolution image, an object 675 indicating the partial display mode or the second resolution display mode is displayed on the display 180. As a result, the user watching the image can recognize which part of the entire image region the displayed image region occupies.

Meanwhile, the processor 170 can control the display to zoom in or zoom out on the displayed partial region of the first-resolution image in correspondence with forward or backward movement of the head of the user 700.

Meanwhile, although not shown in Figures 6 and 7, if a region of interest to the user 700 exists in an image area other than the one currently being viewed, whether in another partial image area or in the entire image area, the processor 170 may control an indicator to be displayed to guide the user's viewing. Thereby, the user can view the region of interest or significance.

Figure 8 illustrates that, when the motion of the head 705 of the user 700 is a backward movement, the image being displayed on the display 180 is zoomed out.

That is, while the image 620 corresponding to a partial region of the first-resolution image 610 is being displayed, an image 622 corresponding to the entire area of the first-resolution image may be displayed in accordance with the backward movement.

In the drawing, the horizontal size of the image 620 corresponding to the partial region is shown as S1 and the horizontal size of the image 622 corresponding to the entire region as S3, with S3 being larger; alternatively, the two sizes may be the same. That is, the image 622 corresponding to the entire region may be scaled down and displayed.

Thus, the user can view the image 622 of the entire region of the first-resolution image, which improves the user's convenience.

Meanwhile, unlike in the figure, when the motion of the head 705 of the user 700 is a backward movement, it is also possible to zoom in on the image being displayed on the display 180.

Next, Figure 9 illustrates that, when the motion of the head 705 of the user 700 is a forward movement, the image being displayed on the display 180 is zoomed in.

That is, while the image 622 corresponding to the entire area of the first-resolution image 610 is being displayed, the image 620 corresponding to a partial region of the first-resolution image 610 may be displayed in accordance with the forward movement.

Meanwhile, unlike in the figure, when the motion of the head 705 of the user 700 is a forward movement, it is also possible to zoom out on the image being displayed on the display 180.
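The zoom behaviour of Figures 8 and 9 can be sketched as widening or narrowing the displayed window about its centre. The zoom factor and the `zoom` helper are illustrative assumptions, not part of the disclosure:

```python
# Hedged sketch of Figures 8 and 9: forward motion narrows the
# displayed window (zoom in); backward motion widens it, up to the
# whole first-resolution frame (zoom out).

def zoom(window, motion, src_w, src_h, factor=2.0):
    left, top, w, h = window
    cx, cy = left + w / 2, top + h / 2          # keep the window centred
    scale = 1 / factor if motion == "forward" else factor
    w = min(src_w, max(1, int(w * scale)))
    h = min(src_h, max(1, int(h * scale)))
    left = max(0, min(int(cx - w / 2), src_w - w))
    top = max(0, min(int(cy - h / 2), src_h - h))
    return (left, top, w, h)

win = (960, 540, 1920, 1080)                   # partial region (image 620)
print(zoom(win, "backward", 3840, 2160))       # (0, 0, 3840, 2160): whole frame
print(zoom(win, "forward", 3840, 2160))        # (1440, 810, 960, 540): tighter
```

Zooming out to the whole frame corresponds to the image 622 of Figure 8; a forward movement reverses it, as in Figure 9.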

Meanwhile, the processor 170 may vary at least one of the frequency, amplitude, or phase of the output audio signal according to the detected head movement of the user 700.

Figure 10 illustrates that, when the motion of the head 705 of the user 700 is a forward movement, the image being displayed on the display 180 is zoomed in, and the volume of the output audio signal is increased accordingly.

As shown in the figure, when the image is zoomed in, the audio output unit 160 of the head-mounted display apparatus 100 can output a second sound (sound 2) whose magnitude (amplitude) is larger than that of the first sound (sound 1), in order to increase the user's immersion. In addition, the frequency may be increased, or, as with the Doppler effect, the phase difference may be gradually reduced, or a combination of these may be used.
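A minimal sketch of this audio adjustment, assuming a simple linear gain mapping and a mild square-root frequency shift (both mappings, and the limits, are assumptions rather than values from the disclosure):

```python
# Hedged sketch of Figure 10's audio behaviour: zooming in raises
# the output amplitude and (optionally) the frequency; zooming out
# lowers them. Mappings and limits are illustrative assumptions.

def audio_params(zoom_level, base_gain=1.0, base_freq=440.0):
    """zoom_level > 1 means zoomed in; < 1 means zoomed out."""
    gain = max(0.1, min(4.0, base_gain * zoom_level))  # sound 1 -> sound 2
    freq = base_freq * zoom_level ** 0.5               # mild Doppler-like shift
    return gain, freq

print(audio_params(2.0))   # louder, higher-pitched after a zoom-in
print(audio_params(0.5))   # quieter, lower-pitched after a zoom-out
```

The gain limits keep a sustained forward or backward movement from driving the output to extremes.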

Meanwhile, when the motion of the head 705 of the user 700 is a backward movement as in Figure 8, the image being displayed on the display 180 is zoomed out, and accordingly the amplitude of the output audio signal may be decreased, the frequency decreased, the phase difference increased, or a combination of these applied.

Meanwhile, when the motion of the head 705 of the user 700 is an upward, downward, leftward, or rightward movement as in Figures 6 or 7, the processor 170 can vary at least one of the frequency, amplitude, or phase of the audio signal output from the audio output unit 160.

In particular, it is possible to output a directional audio signal. For example, by changing the inter-channel coding in the frequency band of the audio signal or controlling the complex values between channels, the phase between channels can be adjusted and a frequency-dependent gain applied to each channel.
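One conventional way to realise such an inter-channel adjustment is a constant-power pan for the per-channel gains plus a small inter-channel time difference for the phase. The sketch below uses illustrative values that are not taken from the disclosure:

```python
# Hedged sketch of a directional stereo adjustment: constant-power
# panning sets per-channel gains; an inter-channel delay (in samples)
# shifts the phase. All constants are illustrative assumptions.
import math

def direction_to_channels(azimuth_deg, sample_rate=48000, head_width_s=0.0006):
    """azimuth: -90 (full left) .. +90 (full right).
    Returns (left_gain, right_gain, itd_samples); a positive ITD means
    the far (left) channel is delayed by that many samples."""
    theta = (azimuth_deg + 90) / 180 * math.pi / 2     # map to 0..pi/2
    left_gain, right_gain = math.cos(theta), math.sin(theta)
    itd = int(sample_rate * head_width_s * azimuth_deg / 90)
    return left_gain, right_gain, itd

lg, rg, d = direction_to_channels(45)    # source to the user's right
print(round(lg, 3), round(rg, 3), d)     # 0.383 0.924 14
```

Constant-power panning keeps the perceived loudness stable while the source moves, which is why the two gains always satisfy left² + right² = 1.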

Meanwhile, if the user 700 moves the head 705 up, down, left, or right as in Figures 6 or 7, the processor 170 can control an audio signal for an object to be output according to the change of objects in the displayed image.

That is, if object-based audio encoding has been performed on the reproduced video content and the head-mounted display apparatus 100 receives audio data coded per specific object, the processor 170 can reproduce the audio signal per object in response to the motion of the head 705 of the user 700.

Figure 11 illustrates that, when the motion of the head 705 of the user 700 is a leftward movement, the displayed image 620 moves to the left, and an image 1120 including a specific object is displayed.

At this time, an object 1125 indicating the partial-area display mode or the second-resolution display mode may be displayed together. As a result, the user can recognize which partial image region of the entire image region is being watched.

Referring to the figures, it can be seen that the image 620 displayed before the leftward movement includes a partial area of a car, while the image 1120 displayed after the leftward movement includes a guitar player.

When the audio coding for the image content has been object-coded by car and by guitar player, the processor 170 may output a third sound (sound 3) related to the car in correspondence with the image 620 displayed before the leftward movement, and may output a fourth sound (sound 4) related to the guitar player in correspondence with the image 1120 displayed after the leftward movement.

On the other hand, even when object coding has not been performed, if the motion of the head 705 of the user 700 is an up, down, left, or right movement as shown in Fig. 7 or 8, the processor 170 may vary at least one of the frequency, magnitude, or phase of the output audio signal in accordance with the movement of the displayed image.

Figure 12 illustrates an example of the use of a plurality of head-mounted display devices.

When a first user 700a and a second user 700b play the same content — for example, two-player game content — the first user 700a wears a first head-mounted display device 100a and the second user 700b wears a second head-mounted display device 100b, and the two devices can exchange data with each other directly or indirectly.

When the video resolution of the common content is the first resolution, each of the head-mounted display devices 100a and 100b may display an image of a second resolution lower than the first resolution. In particular, each may display the game image that its own user is playing.

As shown in the figure, when the entire two-player game content image 1210 includes an image for the first user and an image for the second user, the first head-mounted display device 100a displays the image 1220 for the first user, and the second head-mounted display device 100b displays the image 1230 for the second user.
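Splitting the full game frame into one region per head-mounted display, so that each device shows only its own user's portion, can be sketched as below. The equal-width horizontal strips and the row-major 2D-list representation are simplifying assumptions for illustration.

```python
def split_for_users(full_image, n_users):
    """Split a full frame (row-major 2D list) into one equal-width
    vertical strip per user, so each head-mounted display shows only
    its user's portion.  A minimal sketch under assumed layout."""
    width = len(full_image[0])
    strip = width // n_users  # assumes width divides evenly
    return [[row[i * strip:(i + 1) * strip] for row in full_image]
            for i in range(n_users)]
```

For a two-player frame, the first device would receive the left half and the second device the right half.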

At this time, the image 1220 for the first user and the image 1230 for the second user may each be displayed together with an object 1225 or 1235 indicating the partial-area display mode or the second-resolution display mode. As a result, it is possible to determine which user of the two-player video each image is for.

On the other hand, while the partial image for each user is displayed as in Figure 12, an up, down, left, or right movement as shown in Fig. 6 or 7 may cause the image for a different user to be displayed. In other words, when the first user 700a moves the head 705 to the right, the first head-mounted display device 100a may replace the image 1220 for the first user with the image 1230 for the second user.

On the other hand, zoom-in and zoom-out are also possible. That is, as shown in Figure 8, when a backward movement is made while the first head-mounted display device 100a is displaying the image 1220 for the first user, the device may zoom out and display the entire image 1210.
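The viewport behavior described above — panning the displayed partial region on left/right head movement and enlarging it toward the full frame on backward movement — can be sketched as a small state update. The step size, motion labels, and doubling-on-zoom-out rule are illustrative assumptions.

```python
def update_viewport(viewport, full_w, full_h, motion):
    """Update the displayed partial region (x, y, w, h) of the full
    first-resolution frame in response to a head motion.  The motion
    labels and step sizes are illustrative, not the patent's values."""
    x, y, w, h = viewport
    step = 16
    if motion == "left":
        x = max(0, x - step)
    elif motion == "right":
        x = min(full_w - w, x + step)
    elif motion == "backward":
        # Zoom out toward the full frame, clamping to its bounds.
        w, h = min(full_w, w * 2), min(full_h, h * 2)
        x, y = min(x, full_w - w), min(y, full_h - h)
    return (x, y, w, h)
```

Repeated backward movements eventually grow the viewport to cover the entire image.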

On the other hand, since the head-mounted display device 100 is a display device worn on the head 705 of the user 700, it is preferable to carry out image display in accordance with the user's vision.

One method of displaying video corresponding to the user's eyesight is, as described in relation to Figure 4, to vary the distance between the user's eyes and the lens unit 195 by moving the position of the lens unit 195.

Alternatively, there is a method of performing signal processing on the display image without moving the position of the lens unit 195.

Both of these methods are preferably performed based on the user's visual information.

Thus, an embodiment of the present invention provides a method of receiving the user's visual information from an external device and performing signal processing on the display image based on the received visual information.

Figure 13 illustrates receiving the user's visual information from an external device 1300 and adjusting the focus of the content according to the received visual information.

As shown in Figure 13(a), when the focus of a partial area 1315 of the displayed image 1310 is not correct, the user 700 will not be able to recognize it.

The head-mounted display device 100 may, upon video content playback, transmit a visual information request to the external device 1300, and may receive the corresponding visual information 1305 of the user from the external device 1300 via the communication module 135.

The external device 1300 storing the user's visual information — for example, diopter information — may be a mobile terminal (not shown) or a server (not shown).

The processor 170, in response to the received visual information 1305, may perform video signal processing on the image to be played.

For example, the processor 170 may, in response to the user's visual information, determine an area to be focused in the image and enhance the sharpness, etc. of the determined focus region, thereby emphasizing the focusing.

When determining the focusing area, the processor 170 may calculate the depth within the image and determine the focusing area according to the calculated depth. For example, since a user with poor vision cannot see distant parts of the video well, an image area with a small depth may be selected and focusing performed on it.

On the other hand, in a menu mode, the processor 170 may control a menu to be displayed in a region corresponding to the gaze of the user 700.

Although not shown in the rear view of the head-mounted display device 100 of Figure 2, a camera (not shown) for capturing the user's line of sight may be further arranged between the left-eye lens unit 195L and the right-eye lens unit 195R.

The processor 170 may receive a captured image of the user's line of sight from the camera and thereby determine the state of the user's eye movements.

Then, the processor 170 may determine the menu display area in accordance with the movement of the user's eyes and control a menu to be displayed in that area, as shown in FIG. 14.

Figure 14(a) illustrates that, when the pupils 701L and 701R of the user's left eye 710L and right eye 710R both move to the right, a menu 1425 is displayed in the right region 1420 of the display image 1410. Accordingly, the user's convenience can be increased.
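Mapping a gaze direction to a menu display region, as in Figure 14(a), can be sketched with a simple threshold on normalized gaze coordinates. The coordinate convention, thresholds, and region labels are illustrative assumptions.

```python
def menu_region(gaze_x, gaze_y):
    """Pick the display region that should hold the menu from
    normalized gaze coordinates in [-1, 1].  When both pupils move
    right (gaze_x near +1) the menu goes to the right region, as in
    Figure 14(a).  Thresholds are illustrative."""
    if gaze_x > 0.5:
        return "right"
    if gaze_x < -0.5:
        return "left"
    if gaze_y > 0.5:
        return "bottom"
    if gaze_y < -0.5:
        return "top"
    return "center"
```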

Then, while the menu is displayed, the processor 170 may select an item based on the movement of the user's head.

As shown in FIG. 14(b), the processor 170 may determine that, when the user 700 nods the head 705, the focused item in the menu is to be selected.

On the other hand, as shown in FIG. 14(c), the processor 170 may determine that, when the user 700 moves the head 705 up or down, the focus in the menu moves up or down. Thereby, the user can easily select an item in the menu.

On the other hand, the processor 170 may vary the operation corresponding to the user's motion in accordance with the type of content being reproduced and displayed.

Figure 15(a) illustrates that an object 1515 representing the content type is displayed together with movie content 1510.

At this time, the processor 170 may control operations such as 'yes' to be performed when the user 700 nods the head 705 back and forth as shown in FIG. 15(b); 'no' when the user 700 shakes the head 705 horizontally as shown in FIG. 15(c); 'move playback to an earlier point in time' upon a leftward movement as shown in FIG. 15(d); and 'move playback to a later point in time' upon a rightward movement as shown in FIG. 15(e).

Figure 16(a) illustrates that an object 1615 representing the content type is displayed together with a broadcast image 1610.

At this time, the processor 170 may control operations such as 'volume down' to be performed when the user 700 moves the head 705 downward as shown in FIG. 16(b); 'volume up' when the user 700 moves the head 705 upward as shown in FIG. 16(c); 'switch to the previous channel' upon a leftward movement as shown in FIG. 16(d); and 'switch to the next channel' upon a rightward movement as shown in FIG. 16(e).
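The content-type-dependent gesture handling of Figures 15 and 16 can be sketched as a lookup table keyed by content type and gesture. The dictionary keys and action strings are illustrative labels, not identifiers from the patent.

```python
# Head gestures mapped to actions per content type, as in Figures 15-16.
# Keys and action strings are illustrative labels.
GESTURE_ACTIONS = {
    "movie": {
        "nod": "yes",
        "shake": "no",
        "left": "seek_backward",
        "right": "seek_forward",
    },
    "broadcast": {
        "down": "volume_down",
        "up": "volume_up",
        "left": "previous_channel",
        "right": "next_channel",
    },
}

def action_for(content_type, gesture):
    """Return the action for a head gesture, varying by content type;
    unknown content types or gestures yield no action."""
    return GESTURE_ACTIONS.get(content_type, {}).get(gesture)
```

Note that the same leftward movement maps to seeking for movie content but to channel switching for a broadcast image, which is exactly the content-type dependence the text describes.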

On the other hand, unlike the drawings and description of Figures 15 and 16, a moving partial area may be displayed according to the resolution.

That is, when the content of Figures 15 and 16 is HD content, operations according to the content type may be performed, in contrast to the partial-area movement display of a UD image as shown in FIG. 5 or 6.

The head-mounted display device and its operation method according to the embodiments of the present invention are not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

On the other hand, the operation method of the head-mounted display device of the present invention can be embodied as processor-readable code on a recording medium that can be read by the processor provided in the head-mounted display device. The processor-readable recording medium includes any type of recording device in which data readable by a processor is stored. Examples of the processor-readable recording medium include ROM, RAM, CD-ROM, magnetic tapes, floppy disks, and optical data storage devices, and also include implementations in the form of carrier waves such as data transmission through the Internet. Further, the processor-readable recording medium may be distributed over network-coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.

In addition, although preferred embodiments of the present invention have been shown and described above, the present invention is not limited to the specific embodiments described; various modifications can be made by those of ordinary skill in the art to which the invention pertains without departing from the subject matter of the invention claimed in the claims, and such modified embodiments should not be understood as separate from the technical spirit or prospect of the present invention.

The present invention is applicable to a head-mounted display device that can be worn on a user's head to view an image and to a method of operating the same, and more particularly, to improving the user's ease of use.

Claims (17)

  1. A method of operating a head-mounted display device wearable on a user's head, the method comprising:
    receiving an image of a first resolution;
    displaying a partial area of the image of the first resolution according to a second-resolution display mode, the second resolution being smaller than the first resolution;
    detecting a movement of the user's head; and
    displaying a different partial area of the image of the first resolution according to the detected movement of the user's head.
  2. The method of claim 1,
    wherein the displaying of the different partial area displays the different partial area of the image of the first resolution in correspondence with an up, down, left, or right movement of the user's head.
  3. The method of claim 1,
    wherein the displaying of the different partial area zooms in on or zooms out of the displayed partial area of the first-resolution image in correspondence with a forward or backward movement of the user's head.
  4. The method of claim 1,
    further comprising displaying, in a menu mode, a menu in an area corresponding to the line of sight of the user.
  5. The method of claim 1,
    wherein at least one of a frequency, magnitude, or phase of an output audio signal is varied according to the detected movement of the user's head.
  6. The method of claim 1, further comprising:
    receiving the user's visual information from the outside; and
    controlling the focusing of the displayed image based on the received visual information.
  7. The method of claim 1,
    wherein an operation corresponding to the user's movement is varied according to a content type of the first-resolution image.
  8. The method of claim 1,
    further comprising displaying, when the partial area is displayed, an indicator for viewing another partial area.
  9. A head-mounted display device wearable on a user's head, comprising:
    a display for displaying a predetermined image;
    a lens unit for enlarging the image displayed on the display;
    a sensor unit for detecting a movement of the user's head; and
    a processor for receiving an image of a first resolution, controlling a partial area of the image of the first resolution to be displayed according to a second-resolution display mode, the second resolution being smaller than the first resolution, and controlling a different partial area of the image of the first resolution to be displayed based on the detected movement of the user's head.
  10. The device of claim 9,
    wherein the processor controls the different partial area of the first-resolution image to be displayed in correspondence with an up, down, left, or right movement of the user's head.
  11. The device of claim 9,
    wherein the processor controls the displayed partial area of the first-resolution image to be zoomed in on or zoomed out of in correspondence with a forward or backward movement of the user's head.
  12. The device of claim 9,
    wherein, in a menu mode, the processor controls a menu to be displayed in an area corresponding to the line of sight of the user.
  13. The device of claim 9,
    wherein the processor varies at least one of a frequency, magnitude, or phase of an output audio signal according to the detected movement of the user's head.
  14. The device of claim 9,
    further comprising a communication module for receiving the user's visual information from an external source,
    wherein the processor controls the focusing of the displayed image based on the received visual information.
  15. The device of claim 9,
    wherein the processor varies an operation corresponding to the user's movement according to a content type of the first-resolution image.
  16. The device of claim 9,
    further comprising a driver for varying the distance between the user's eye and the lens unit.
  17. The device of claim 9,
    wherein, when the partial area is displayed, the processor controls an indicator for viewing another partial area to be displayed.
PCT/KR2013/012138 2013-12-24 2013-12-24 Head-mounted display apparatus and method for operating same WO2015099215A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2013/012138 WO2015099215A1 (en) 2013-12-24 2013-12-24 Head-mounted display apparatus and method for operating same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2013/012138 WO2015099215A1 (en) 2013-12-24 2013-12-24 Head-mounted display apparatus and method for operating same

Publications (1)

Publication Number Publication Date
WO2015099215A1 (en) 2015-07-02

Family

ID=53479047

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2013/012138 WO2015099215A1 (en) 2013-12-24 2013-12-24 Head-mounted display apparatus and method for operating same

Country Status (1)

Country Link
WO (1) WO2015099215A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH099301A (en) * 1995-06-24 1997-01-10 Victor Co Of Japan Ltd Head mount display device
US20100079356A1 (en) * 2008-09-30 2010-04-01 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US20110234584A1 (en) * 2010-03-25 2011-09-29 Fujifilm Corporation Head-mounted display device
US20120274750A1 (en) * 2011-04-26 2012-11-01 Echostar Technologies L.L.C. Apparatus, systems and methods for shared viewing experience using head mounted displays
US20130093789A1 (en) * 2011-06-23 2013-04-18 James Chia-Ming Liu Total field of view classification for head-mounted display


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9298283B1 (en) 2015-09-10 2016-03-29 Connectivity Labs Inc. Sedentary virtual reality method and systems
US9804394B2 (en) 2015-09-10 2017-10-31 Connectivity Labs Inc. Sedentary virtual reality method and systems
US10345588B2 (en) 2015-09-10 2019-07-09 Connectivity Labs Inc. Sedentary virtual reality method and systems
CN107193364A (en) * 2016-03-14 2017-09-22 宏达国际电子股份有限公司 Virtual reality system, control method and non-transient state computer readable medium


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13900133

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13900133

Country of ref document: EP

Kind code of ref document: A1