KR20160144245A - Projector apparatus - Google Patents

Projector apparatus

Info

Publication number
KR20160144245A
Authority
KR
South Korea
Prior art keywords
projector
user
gaze
communication module
information
Prior art date
Application number
KR1020150080812A
Other languages
Korean (ko)
Inventor
추현승
박동한
반키 안드레아
Original Assignee
성균관대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 성균관대학교산학협력단 filed Critical 성균관대학교산학협력단
Priority to KR1020150080812A priority Critical patent/KR20160144245A/en
Publication of KR20160144245A publication Critical patent/KR20160144245A/en

Classifications

    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00 - Projectors or projection-type viewers; Accessories therefor
    • G03B21/14 - Details
    • G03B21/32 - Details specially adapted for motion-picture projection
    • G03B21/43 - Driving mechanisms
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/74 - Projection arrangements for image reproduction, e.g. using eidophor
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 - Details of colour television systems
    • H04N9/12 - Picture reproducers
    • H04N9/31 - Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3129 - Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] scanning a light beam on the display screen
    • H04N9/3135 - Driving therefor

Abstract

The present invention discloses a projector apparatus. A projector apparatus according to an embodiment of the present invention comprises: a projector module for outputting an image; a projector driving unit for adjusting the image output direction of the projector module; a communication module for receiving a user's gaze information from a gaze tracking device that senses the user's gaze information; and a control unit for determining the image output direction of the projector based on the gaze information received through the communication module and outputting a control signal for controlling the projector driving unit according to the determined image output direction.

Description

PROJECTOR APPARATUS

The present invention relates to a projector apparatus.

Projector devices have recently improved in image quality and price, and many are small and portable, but they do not take the screen output direction into account.

The screen output direction of a current projector apparatus is manually adjusted and then fixed by the user. If the user is exposed to the fixed image output direction of the projector apparatus for a long time, the user must maintain a specific posture, which places a burden on the neck, the waist, and so on.

In order to solve this problem, a projector device is required that automatically reduces the user's inconvenience by adjusting the image output direction of the projector according to changes in the user's posture or gaze direction.

Meanwhile, Japanese Laid-Open Patent Publication No. 2014-0143021 (entitled "Display Area Change System of Projector") proposes a method for changing the display area of a projector.

According to the present invention, a sensing signal, a video signal, or gaze information transmitted from a gaze tracking device is received through a communication module, the image output direction of the projector is determined based on the received information, and the projector driving unit is moved according to the determined direction.

It is to be understood, however, that the technical problems to be solved by the present invention are not limited to those described above, and other technical problems may exist.

According to an aspect of the present invention, there is provided a projector apparatus including: a projector module for outputting an image; a projector driving unit for adjusting the image output direction of the projector module; a communication module for receiving a user's gaze information from a gaze tracking device that senses the user's gaze information; and a control unit for determining the image output direction of the projector based on the gaze information received through the communication module and outputting a control signal for controlling the projector driving unit according to the determined image output direction.

According to an embodiment of the present invention, the image output direction moves according to the user's gaze information obtained through gaze tracking. This relieves the uncomfortable posture the user would otherwise have to maintain for a fixed image output direction and increases overall usability, since viewing the screen for a long time no longer imposes a burden.

In addition, according to an embodiment of the present invention, the projector apparatus gives mobility to the projector screen, providing a screen that moves organically with the user's posture and thereby an experience similar to having multiple screens.

In addition, according to an embodiment of the present invention, the projector apparatus can be combined with motion techniques such as those of a surveillance camera to realize various ideas presented in augmented reality.

FIG. 1 is a schematic diagram of a projector system according to an embodiment of the present invention.
FIG. 2 is a configuration diagram of a projector apparatus according to an embodiment of the present invention.
FIGS. 3A to 3D are diagrams illustrating examples of the projector apparatus proposed in the present invention.
FIG. 4 is a view illustrating an example of a projector apparatus that tracks the user's gaze using a gyro sensor mounted on a head-worn device, as proposed in the present invention.
FIG. 5 is a view illustrating an example of a projector apparatus utilizing a camera, as proposed in the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry them out. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In order to clearly describe the present invention, parts not related to the description are omitted, and like parts are denoted by like reference characters throughout the specification.

Throughout the specification, when a part is said to be "connected" to another part, this includes not only being "directly connected" but also being "electrically connected" with another element in between. Also, when a part is said to "include" an element, this means that it may further include other elements rather than excluding them, and does not preclude the presence or addition of one or more other features, integers, steps, operations, components, parts, or combinations thereof.

The following embodiments are intended to aid understanding of the present invention and are not intended to limit its scope. Accordingly, inventions of equivalent scope that perform the same function as the present invention also fall within the scope of the present invention.

FIG. 1 is a schematic diagram of a projector system according to an embodiment of the present invention.

The projector apparatus 110 according to an embodiment of the present invention adjusts its image output direction based on the gaze information transmitted from the gaze tracking apparatus 100. That is, the projector apparatus 110 can freely adjust the image output direction of the projector up, down, left, and right (for example, toward the front wall, the ceiling, or a side wall surface) based on the gaze information transmitted in real time from the gaze tracking apparatus 100.

For example, the gaze tracking device 100 that tracks the changing movement of the user's gaze may be a head-worn wearable device (for example, a headset or an earphone) equipped with a gyro sensor. Specifically, the gaze tracking apparatus 100 may provide the projector apparatus 110 with the sensing signal obtained through the gyro sensor, or with gaze information extracted by processing that sensing signal. The projector apparatus 110 can then adjust the image output direction of the projector based on the sensing signal or the gaze information.

As another example, the gaze tracking device 100 that tracks the movement of the user's eyes may be a face recognition device (for example, a camera) located in front of the user. Specifically, the gaze tracking apparatus 100 including a camera recognizes the user's face and then provides the projector apparatus 110 with the video signal, or with gaze information extracted by processing that video signal. The projector apparatus 110 can then adjust the image output direction of the projector according to a control signal generated based on the video signal or the gaze information.
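As a non-limiting illustration of the kind of data such a gaze tracking device might hand to the projector apparatus, the following Python sketch defines hypothetical record types for processed gaze information and for the raw gyro sensing signal. The field names and units are assumptions for illustration and do not appear in the patent.

```python
from dataclasses import dataclass


@dataclass
class GazeInfo:
    """Hypothetical processed gaze-information record sent to the projector
    (field names are illustrative, not taken from the patent)."""
    yaw_deg: float      # horizontal gaze angle relative to the front direction
    pitch_deg: float    # vertical gaze angle relative to the front direction
    duration_s: float   # how long this gaze direction has been held
    timestamp_s: float  # time at which the sample was produced


@dataclass
class GyroSample:
    """Hypothetical raw gyro sensing signal (angular rates about X, Y, Z)
    that could be sent instead of processed gaze information."""
    wx_dps: float
    wy_dps: float
    wz_dps: float
    timestamp_s: float
```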

Meanwhile, the projector apparatus 110 can adjust the image output direction by controlling a rotatable driving unit according to a control signal generated based on the sensing signal, the video signal, or the gaze information transmitted from the gaze tracking apparatus 100.

Here, the projector driving unit of the projector apparatus 110 may be realized with a remote pan motor, as in a surveillance camera. In the remote pan motor approach, the image output direction can be adjusted to various angles by controlling a remote pan motor attached to the projector module.

In addition, the driving unit 116 of the projector apparatus 110 can be realized with a movable mirror. In the movable mirror approach, a movable mirror that reflects the projector beam is mounted on the projector module, and the movement of the mirror is controlled so that the image output direction can be adjusted to various angles.

Here, the projector apparatus 110 includes a communication module for receiving the sensing signal, the video signal, or the gaze information processed by the gaze tracking apparatus 100, and the communication module may support short-range wireless communication such as Bluetooth or Wi-Fi. That is, the projector apparatus 110 and the gaze tracking apparatus 100 can change the image output direction to reflect the user's gaze information changing in real time through wireless communication between the devices.
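The patent specifies only that short-range wireless communication such as Bluetooth or Wi-Fi is used. The sketch below illustrates one possible exchange over a plain UDP socket on a Wi-Fi network; the address, port, and JSON payload format are assumptions for illustration, not part of the patent.

```python
import json
import socket

PROJECTOR_ADDR = ("192.168.0.20", 9999)  # hypothetical address/port of the projector apparatus


def send_gaze_info(yaw_deg: float, pitch_deg: float, duration_s: float) -> None:
    """Tracker side: push one gaze-information record to the projector over Wi-Fi."""
    payload = json.dumps({"yaw": yaw_deg, "pitch": pitch_deg, "duration": duration_s})
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload.encode("utf-8"), PROJECTOR_ADDR)


def receive_gaze_info(port: int = 9999) -> dict:
    """Projector side: block until one gaze-information record arrives."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", port))
        data, _addr = sock.recvfrom(1024)
        return json.loads(data.decode("utf-8"))
```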

The projector apparatus 110 according to the present invention can therefore output images in any of the various directions toward which the user's gaze may turn, such as the forward direction a seated user looks toward or the ceiling direction a user looks toward when lying down.

FIG. 2 is a configuration diagram of a projector apparatus according to an embodiment of the present invention.

The projector apparatus 110 according to an embodiment of the present invention includes a communication module 112, a control unit 114, a driving unit 116, and a projector module 118.

The communication module 112 receives information from the gaze tracking device 100 that senses the user's gaze information; this may be the gaze information itself, obtained by signal processing, or the raw signals from which it can be extracted.

For example, the communication module 112 may receive, from the gaze tracking device 100 including a gyro sensor, the sensing signal sensed by the gyro sensor, or the gaze information obtained by processing that sensing signal.

As another example, the communication module 112 may receive, from the gaze tracking device 100 including a camera that recognizes the user's face, the video signal recorded through the camera, or the gaze information obtained by processing that video signal.

The communication module 112 may support short-range wireless communication such as Bluetooth or Wi-Fi for exchanging information between the gaze tracking device 100 and the projector apparatus 110.

The control unit 114 determines the image output direction of the projector based on the user's gaze information received through the communication module 112 and outputs a control signal for controlling the projector driving unit 116 according to the determined image output direction.

Here, the gaze information may be the result of signal processing performed in a signal processing unit of the gaze tracking apparatus 100, or in a signal processing unit of the projector apparatus 110; that is, the signal processing unit may be located within the gaze tracking device 100 or within the projector apparatus 110.

Specifically, the signal processing unit determines whether the X-axis, Y-axis, and Z-axis sensing signals measured by the gyro sensor have changed within a predetermined angle set in advance, and when that change is maintained, it can determine that the user's gaze has changed.

The control signal may be generated based on the gaze information transmitted from the gaze tracking apparatus, or based on raw information received from the gaze tracking apparatus (for example, the sensing signal provided by the gyro sensor or the video signal provided by the camera).

For example, when the gaze tracking apparatus is a device including a gyro sensor, the control signal may control the driving unit 116 of the projector so that the projector screen is located at the position indicated by the tilt value of the gyro sensor relative to its reference point.
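A minimal sketch of how such a control signal could be derived, assuming the gaze information has already been reduced to yaw and pitch angles relative to the gyro's reference orientation. The 1:1 mapping and the angle limits are illustrative assumptions, not taken from the patent.

```python
def make_control_signal(yaw_deg: float, pitch_deg: float,
                        ref_yaw_deg: float = 0.0, ref_pitch_deg: float = 0.0,
                        pan_limit_deg: float = 170.0, tilt_limit_deg: float = 90.0) -> dict:
    """Map a gaze direction (relative to the gyro's reference orientation)
    to pan/tilt commands for the projector driving unit.

    The direct 1:1 mapping and the limits are assumptions for illustration;
    the patent only states that the driving unit is steered so the screen
    follows the gaze."""
    pan = max(-pan_limit_deg, min(pan_limit_deg, yaw_deg - ref_yaw_deg))
    tilt = max(-tilt_limit_deg, min(tilt_limit_deg, pitch_deg - ref_pitch_deg))
    return {"pan_deg": pan, "tilt_deg": tilt}


# Example: the user looks 30 degrees to the right and 10 degrees up.
print(make_control_signal(30.0, 10.0))  # {'pan_deg': 30.0, 'tilt_deg': 10.0}
```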

The projector driving unit 116 receives the control signal from the control unit 114 and adjusts the image output direction so that the projector's image is output to the corresponding position.

For example, the projector driving unit 116 may be implemented as a support equipped with a motor or a remote pan motor. The motor or remote pan motor mounted on the projector driving unit 116 can move to various angles according to the control signal, so that the image can be output to the position the user desires.

The projector driving unit 116 may be a support including a motor, and the motor may be controlled so that the projector module 118 can project toward the front, a side, or the ceiling.

For example, the projector driving unit 116 adjusts the image output direction according to the control signal generated based on the user's gaze information received through the communication module 112. The projector driving unit 116 may be implemented by mounting a motor or a remote pan motor on the projector module 118, and it receives the control signal for the image output direction from the control unit 114 and is driven accordingly.

Meanwhile, the projector driving unit 116 may also be implemented by mounting a mirror that reflects the projector beam on the projector module 118; it receives the control signal for the image output direction from the control unit 114 and adjusts the image output direction by moving the mirror.
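For the mirror-based driving unit, a useful property is that rotating a flat mirror by a given angle deflects the reflected beam by twice that angle, so the mirror only needs to move half as far as the desired change in output direction. The sketch below illustrates this under the simplifying assumption that the two mirror axes can be treated independently; a real two-axis mirror mount would need a full 3-D reflection model.

```python
def mirror_angles_for_deflection(pan_deg: float, tilt_deg: float) -> dict:
    """Convert a desired beam deflection into mirror rotation angles.

    A flat mirror rotated by an angle deflects the reflected beam by twice
    that angle, so each mirror axis is driven by half the requested pan/tilt.
    Treating the axes independently is a simplifying assumption."""
    return {"mirror_pan_deg": pan_deg / 2.0, "mirror_tilt_deg": tilt_deg / 2.0}


# Steering the image 40 degrees to the side needs only a 20 degree mirror rotation.
print(mirror_angles_for_deflection(40.0, 0.0))
```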

The projector module 118 outputs the image, projecting it to the position set by the projector driving unit 116.

Hereinafter, examples of the projector apparatus 110 proposed in the present invention will be described. Descriptions of configurations identical to those described above are omitted.

FIGS. 3A to 3D are diagrams illustrating examples of the projector apparatus proposed in the present invention.

The projector apparatus 110 of FIGS. 3A and 3B receives, through its communication module 112, information for determining the image output direction of the projector from the gaze tracking apparatus 100 including a gyro sensor. FIGS. 3A and 3B illustrate examples that differ according to whether the gaze tracking apparatus 100 contains a signal processing unit. Specifically, in FIG. 3A the gaze tracking apparatus 100 transmits the sensing signal sensed by the gyro sensor, while in FIG. 3B it transmits gaze information obtained by processing that sensing signal in the signal processing unit 104.

Referring to FIG. 3A, the gaze tracking apparatus 100 is a wearable device worn on the user's head that includes a gyro sensor 102 and a communication module 106 for transmitting the sensing signal obtained from the gyro sensor 102 to the communication module 112 of the projector apparatus 110.

Here, the gaze tracking device 100 may be a head-worn wearable device such as a headset or an earphone, and it obtains a sensing signal by measuring angular velocity with the gyro sensor 102. The projector apparatus 110 can then track whether the position of the user's gaze has changed through its signal processing unit 113.

The control unit 114 then generates a control signal to drive the driving unit 116, and the projector module 118 outputs the image at the position of the user's gaze.

Referring to FIG. 3B, the gaze tracking apparatus 100 is a wearable device worn on the user's head that includes the gyro sensor 102, a signal processing unit 104 for extracting the user's gaze information from the sensing signal obtained from the gyro sensor, and a communication module 106 for transmitting the gaze information to the communication module 112 of the projector apparatus 110.

Here, the gaze tracking apparatus 100 may be a head-worn wearable device such as a headset or an earphone.

Here, the signal processing unit (whether located in the gaze tracking apparatus or in the projector apparatus) can extract gaze information when the value of the sensing signal obtained from the gyro sensor is maintained within a predetermined range for a predetermined time. The predetermined range may be a range of angles through which the user can rotate the face toward the front, up, down, left, right, or a diagonal direction; if the user's gaze position is maintained within that range for a certain time, the gaze information can be extracted.

For example, the reference range can be set from 0° to 90°, or from 0° to 45°, in the left, right, upward, or diagonal direction. If the user's face rotates within that range and stays there for a certain period of time, the gaze information can be extracted.
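A minimal sketch of such a dwell-time check on the gyro-derived head angle, assuming the sensing signal has already been integrated into a rotation angle from the frontal direction. The sliding-window formulation and the default thresholds are illustrative assumptions; the 45-degree bound and a one-second hold follow the ranges mentioned in the text.

```python
def detect_gaze_change(samples, max_angle_deg: float = 45.0, hold_time_s: float = 1.0):
    """Decide whether the user's gaze has settled on a new direction.

    `samples` is a time-ordered list of (timestamp_s, angle_deg) pairs, where
    angle_deg is the head rotation from the front direction derived from the
    gyro sensing signal. The gaze is treated as changed only if every sample
    in the trailing window of length hold_time_s stays within max_angle_deg.
    Returns the settled angle in degrees, or None if no change is detected."""
    if not samples:
        return None
    t_end = samples[-1][0]
    if t_end - samples[0][0] < hold_time_s:
        return None  # not enough history yet
    window = [(t, a) for t, a in samples if t >= t_end - hold_time_s]
    if all(abs(a) <= max_angle_deg for _, a in window):
        return window[-1][1]  # settled gaze angle
    return None
```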

The projector apparatus 110 of FIGS. 3C and 3D receives, through its communication module 112, information for determining the image output direction of the projector from the gaze tracking apparatus 100 including a camera for face recognition. FIGS. 3C and 3D illustrate examples that differ according to whether the gaze tracking apparatus 100 contains a signal processing unit. In FIG. 3C, the gaze tracking apparatus 100 transmits the video signal recorded through the camera, while in FIG. 3D it processes that video signal in its own signal processing unit and transmits the resulting gaze information.

Referring to FIG. 3C, the gaze tracking apparatus 100 may include a camera 103 for capturing an image of the user's face and a communication module 106 for transmitting the face image to the communication module 112 of the projector apparatus 110. In this case, the projector apparatus 110 includes a signal processing unit that recognizes the user's face in the face image received from the gaze tracking apparatus 100 and extracts the user's gaze information from the rotation of the face in the up, down, left, or right direction.

Here, the camera of the gaze tracking apparatus 100 may be positioned in front of the user, and it transmits image information about the user's face to the projector apparatus 110.

Referring to FIG. 3D, the gaze tracking apparatus 100 includes a camera 103 for capturing an image of the user's face, a signal processing unit 105 that recognizes the user's face in the image received from the camera and extracts the user's gaze information from the rotation of the face in the up, down, left, or right direction, and a communication module 106 for transmitting the gaze information to the communication module 112 of the projector apparatus 110.

Here, the signal processing unit (whether located in the gaze tracking device or in the projector apparatus) can extract the gaze information when the area measured from the video signal obtained from the camera is maintained within a predetermined range for a predetermined time.

The predetermined range may be calculated in consideration of how the area of the face region imaged by the camera changes when the user rotates the face toward the front, up, down, left, right, or a diagonal direction. If the area stays within that range for a predetermined time, it is determined that the gaze position has changed, and the gaze information can be extracted.
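A minimal sketch of the corresponding check on the camera side, assuming a face detector already supplies a bounding-box area per frame. The ratio band and hold time are illustrative assumptions, since the patent only states that the range is derived from how the imaged face area changes with rotation.

```python
def gaze_changed_from_face_area(area_samples, frontal_area_px: float,
                                min_ratio: float = 0.55, max_ratio: float = 0.90,
                                hold_time_s: float = 1.0) -> bool:
    """Decide from face-bounding-box areas whether the user has turned away
    from the frontal direction and held the new pose.

    `area_samples` is a time-ordered list of (timestamp_s, face_area_px) pairs,
    e.g. width * height of a face bounding box from any face detector. The
    apparent face area shrinks as the head rotates, so an area ratio that stays
    inside [min_ratio, max_ratio] of the frontal area for hold_time_s is
    treated as a sustained gaze change."""
    if not area_samples:
        return False
    t_end = area_samples[-1][0]
    if t_end - area_samples[0][0] < hold_time_s:
        return False  # not enough history yet
    window = [a for t, a in area_samples if t >= t_end - hold_time_s]
    return all(min_ratio <= a / frontal_area_px <= max_ratio for a in window)
```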

According to an embodiment of the present invention, the projector apparatus moves the image output direction according to the user's gaze information obtained through gaze tracking. This relieves the uncomfortable posture the user would otherwise have to maintain for a fixed image output direction and increases overall usability, since viewing the screen for a long time no longer imposes a burden.

In addition, according to an embodiment of the present invention, the projector apparatus gives mobility to the projector screen, providing a screen that moves organically with the user's posture and thereby an experience similar to having multiple screens.

In addition, according to an embodiment of the present invention, the projector apparatus can be combined with motion techniques such as those of a surveillance camera to realize various ideas presented in augmented reality.

An exemplary projector apparatus proposed by the present invention will now be described with reference to FIGS. 4 and 5.

FIG. 4 is a view illustrating an example of a projector apparatus that tracks the user's gaze using a gyro sensor mounted on a head-worn device, as proposed in the present invention.

Referring to FIG. 4, the gyro sensor mounted on the head-worn device either transmits to the projector apparatus the sensed values capturing the movement of the user's gaze, or performs separate signal processing on that movement to generate gaze information (for example, gaze direction change, gaze duration, etc.) and sends the generated gaze information to the projector apparatus.

The driving unit 116 of the projector apparatus is then controlled based on the information received from the gyro sensor (the gaze tracking apparatus). The projector driving unit 116 can move the projector module toward wherever the user's gaze has moved, such as the ceiling, the front, or a side surface, so that the image is output there.

FIG. 5 is a view showing an example of a projector apparatus utilizing a camera, as proposed in the present invention.

Referring to FIG. 5, a camera installed in front of the user either transmits to the projector apparatus the image information capturing the movement of the user's gaze, or performs separate signal processing on that movement to generate gaze information (for example, gaze direction change, gaze duration, etc.) and sends the generated gaze information to the projector apparatus.

The driving unit 116 of the projector apparatus is then controlled based on the information received from the camera (the gaze tracking apparatus). The projector driving unit 116 can move the projector module 118 toward wherever the user's gaze has moved, such as the ceiling, the front, or a side surface, so that the image is output there.

It will be understood by those of ordinary skill in the art that the foregoing description of the present invention is illustrative and may be readily modified into other specific forms without departing from the technical spirit or essential features of the present invention. It is therefore to be understood that the embodiments described above are illustrative in all respects and not restrictive. For example, each component described as a single entity may be implemented in a distributed manner, and components described as distributed may likewise be implemented in a combined form.

The present invention is intended to cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.

100: eye tracking device
110: projector device

Claims (8)

A projector apparatus comprising:
a projector module for outputting an image;
a projector driving unit for adjusting an image output direction of the projector module;
a communication module for receiving gaze information of a user from a gaze tracking device that senses the gaze information of the user; and
a control unit for determining the image output direction of the projector based on the gaze information of the user received through the communication module and outputting a control signal for controlling the projector driving unit according to the determined image output direction.
The projector apparatus according to claim 1,
wherein the gaze tracking device is a wearable device worn on the user's head, the wearable device comprising a gyro sensor, a signal processing unit for extracting the user's gaze information from a sensing signal obtained from the gyro sensor, and a communication module for transmitting the gaze information to the communication module of the projector apparatus.
The projector apparatus according to claim 1,
wherein the gaze tracking device is a wearable device worn on the user's head, the wearable device comprising a gyro sensor and a communication module for transmitting a sensing signal obtained from the gyro sensor to the communication module of the projector apparatus,
and wherein the projector apparatus further comprises a signal processing unit for extracting the user's gaze information from the sensing signal received from the gaze tracking device.
The projector apparatus according to claim 2 or 3,
wherein the signal processing unit extracts the gaze information when the value of the sensing signal obtained from the gyro sensor is maintained within a predetermined range for a predetermined period of time.
The projector apparatus according to claim 1,
wherein the gaze tracking device comprises: a camera for capturing a face image of the user; a signal processing unit for recognizing the user's face in the face image received from the camera and extracting the user's gaze information from the rotation of the face in the up, down, left, or right direction; and a communication module for transmitting the gaze information to the communication module of the projector apparatus.
The projector apparatus according to claim 1,
wherein the gaze tracking device comprises a camera for capturing a face image of the user and a communication module for transmitting the face image to the communication module of the projector apparatus,
and wherein the projector apparatus further comprises a signal processing unit for recognizing the user's face in the face image received from the gaze tracking device and extracting the user's gaze information from the rotation of the face in the up, down, left, or right direction.
The projector apparatus according to claim 5 or 6,
wherein the signal processing unit extracts the gaze information when the area measured from the video signal obtained from the camera is maintained within a predetermined range for a predetermined period of time.
The projector apparatus according to claim 1,
wherein the projector driving unit is a support including a motor or a remote pan motor.
KR1020150080812A 2015-06-08 2015-06-08 Projector apparatus KR20160144245A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150080812A KR20160144245A (en) 2015-06-08 2015-06-08 Projector apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150080812A KR20160144245A (en) 2015-06-08 2015-06-08 Projector apparatus

Publications (1)

Publication Number Publication Date
KR20160144245A true KR20160144245A (en) 2016-12-16

Family

ID=57735881

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150080812A KR20160144245A (en) 2015-06-08 2015-06-08 Projector apparatus

Country Status (1)

Country Link
KR (1) KR20160144245A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018171041A1 (en) * 2017-03-23 2018-09-27 广景视睿科技(深圳)有限公司 Moving intelligent projection system and method therefor

Similar Documents

Publication Publication Date Title
AU2019282933B2 (en) Smart glasses, method and device for tracking eyeball trajectory, and storage medium
US10591735B2 (en) Head-mounted display device and image display system
US10686972B2 (en) Gaze assisted field of view control
KR102213725B1 (en) Tracking head movement when wearing mobile device
US9552060B2 (en) Radial selection by vestibulo-ocular reflex fixation
US9213163B2 (en) Aligning inter-pupillary distance in a near-eye display system
JP6378781B2 (en) Head-mounted display device and video display system
CN110275603B (en) Distributed artificial reality system, bracelet device and head-mounted display
US9215362B2 (en) Image capturing system and image capturing method
US20200211512A1 (en) Headset adjustment for optimal viewing
JP2016206374A (en) Display unit, control method of display unit, and program
US20160170482A1 (en) Display apparatus, and control method for display apparatus
KR20220148921A (en) Eyewear that uses muscle sensors to determine facial expressions
US20110135290A1 (en) Camera adjusting system and method
US11619813B2 (en) Coordinating an eye-mounted imager with an external camera
KR101467529B1 (en) Wearable system for providing information
KR20130059827A (en) Glasses type camera using by pupil tracker
US11435601B2 (en) Saccade detection and endpoint prediction for electronic contact lenses, with adjusted operation
KR20190067523A (en) Glass type terminal and operpation method of the same
US20210392318A1 (en) Gaze tracking apparatus and systems
JP2017092628A (en) Display device and display device control method
KR20160144245A (en) Projector apparatus
JP7247371B2 (en) Image processing device, system, image processing method, and image processing program
JP2016090853A (en) Display device, control method of display device and program
JP2016219897A (en) Display device, control method for the same and program

Legal Events

Date Code Title Description
E601 Decision to refuse application
E801 Decision on dismissal of amendment