KR20170044399A - Image projection apparatus, projection area expanding method of thereof and non-transitory computer readable recording medium - Google Patents

Image projection apparatus, projection area expanding method of thereof and non-transitory computer readable recording medium Download PDF

Info

Publication number
KR20170044399A
Authority
KR
South Korea
Prior art keywords
image
projection apparatus
image projection
projecting device
projecting
Prior art date
Application number
KR1020150144047A
Other languages
Korean (ko)
Inventor
송세준
김중형
양성광
이윤기
김희경
박정철
Original Assignee
삼성전자주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자주식회사
Priority to KR1020150144047A priority Critical patent/KR20170044399A/en
Priority to PCT/KR2016/011554 priority patent/WO2017065556A1/en
Publication of KR20170044399A publication Critical patent/KR20170044399A/en

Links

Images

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/142Adjusting of projection optics
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/147Optical correction of image distortions, e.g. keystone
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Geometry (AREA)
  • Optics & Photonics (AREA)

Abstract

An image projection apparatus, a projection area enlarging method thereof, and a non-transitory computer readable recording medium are provided. An image projection apparatus according to an embodiment of the present invention includes an optical module for projecting light in a surface direction extending from a placement surface on which the image projection apparatus is disposed, a sensor module for sensing another image projection apparatus located within a predetermined distance, a communication module for communicating with the other image projection apparatus, and a processor for controlling the optical module to project an image based on the distance to the other image projection apparatus.

Description

FIELD OF THE INVENTION [0001] The present invention relates to an image projection apparatus, a projection area enlargement method thereof, and a non-transitory computer readable recording medium.

The present invention relates to an image projection apparatus, a projection area enlarging method thereof, and a non-transitory computer readable recording medium, and more particularly, to an image projection apparatus which calculates the enlarged projection area formed by a plurality of image projection apparatuses and projects content corresponding to that area, a projection area enlarging method thereof, and a non-transitory computer readable recording medium.

A projector is a type of display device that displays an image by projecting an input image signal on a screen using light emitted from a light source. Such a projector is widely used in a conference room, a theater, a home theater of a home, and the like.

However, existing projectors are large, heavy, and inconvenient because they must be installed in a fixed position. Although pico projectors emphasizing portability have been developed, viewing desired images still requires the troublesome process of installing a tripod and adjusting the focus. In addition, if a person or an object stands between the projector and the projection surface, a shadow is cast, which obstructs viewing of the content.

Conventionally, when a plurality of projectors are used to expand a screen, the projected image is photographed by a separate device other than the projectors to calculate the overlap area, and correction for the overlap area is then performed. Because this image analysis process is indispensable, expanding the screen takes considerable time.

According to an embodiment of the present invention for solving the above-mentioned problems, there are provided an image projection apparatus which calculates the total projection area using the relative positions of a plurality of devices and generates content having a size and shape corresponding to the entire projection area, a projection area enlarging method thereof, and a non-transitory computer readable recording medium.

An image projection apparatus according to an embodiment of the present invention includes an optical module for projecting light in a surface direction extending from a placement surface on which the image projection apparatus is disposed, a sensor module for sensing another image projection apparatus located within a predetermined distance, a communication module for communicating with the other image projection apparatus, and a processor for controlling the optical module to project an image based on the distance to the other image projection apparatus.

The processor may generate, as an image, at least a part of the content according to the distance to the other image projection apparatus, and the image may be combined with the image projected by the other image projection apparatus to form one content.

The processor may divide one content into a plurality of sub images according to the distance to the other image projection apparatus, control the optical module to project one of the plurality of sub images, and transmit another of the sub images to the other image projection apparatus through the communication module.

The sensor module may output a sensing value for sensing at least one of the distance to the other image projection apparatus, the rotation angle of the image projection apparatus, and the angle between the image projection apparatus and the other image projection apparatus; the communication module may receive information of the other image projection apparatus; and the processor may process the image projected through the optical module based on the sensed value and the received information.

The other image projection apparatus may include a sensor module for sensing a rotation angle of the other image projection apparatus, and the information of the other image projection apparatus may be rotation angle information.

The processor may provide a screen for setting which of the first projection area of the image projection apparatus and the second projection area of the other image projection apparatus is to be centered.

In addition, the processor can adjust the transparency of a region of the first projection area of the image projection apparatus that overlaps with the second projection area of the other image projection apparatus.

The processor may provide a guide message for guiding a position at which the other image projection apparatus should be disposed, when an enlargement ratio of the image is set.

The optical module may project light when one side of the image projection apparatus is disposed close to the placement surface, and may stop projecting light when that side is spaced apart from the placement surface.

According to another aspect of the present invention, there is provided a method of extending a projection area of an image projection apparatus, including sensing another image projection apparatus positioned within a predetermined distance from the image projection apparatus, communicating with the other image projection apparatus, and projecting an image, based on the distance to the other image projection apparatus, in a surface direction extending from a placement surface on which the image projection apparatus is disposed.

The projecting step may include generating at least a part of the content as an image according to the distance to the other image projection apparatus, and the image may be combined with the image projected by the other image projection apparatus to form one content.

The projecting step may include dividing one content into a plurality of sub images according to the distance to the other image projection apparatus, projecting one of the plurality of sub images, and transmitting another of the sub images to the other image projection apparatus.

The sensing step may output a sensing value for sensing at least one of the distance to the other image projection apparatus, the rotation angle of the image projection apparatus, and the angle between the image projection apparatus and the other image projection apparatus; the communicating step may receive the information of the other image projection apparatus; and the projecting step may process the projected image based on the sensed value and the received information.

The information of the other image projecting device may be rotation angle information detected by the other image projecting device.

The method may further include the step of providing a screen for setting which of the first projection area of the image projection apparatus and the second projection area of the other image projection apparatus is to be centered.

The method may further include adjusting the transparency of a region of the first projection area of the image projection device that overlaps with the second projection area of the other image projection device to be different from each other.

When the enlargement ratio of the image is set, the method may further include providing a guide message for guiding a position where the other image projection apparatus should be disposed.

In addition, the projecting step may include projecting light when one surface of the image projection apparatus is disposed close to the placement surface, and stopping the light projection when that surface is separated from the placement surface.

According to another aspect of the present invention, there is provided a non-transitory computer readable recording medium including a program for executing a projection area enlarging method of an image projection apparatus, the method including sensing another image projection apparatus positioned within a predetermined distance, communicating with the other image projection apparatus, and projecting an image, based on the distance to the other image projection apparatus, in a surface direction extending from a placement surface on which the image projection apparatus is disposed.

According to various embodiments of the present invention as described above, a user can quickly and easily project an image onto a screen of enlarged size simply by moving a plurality of image projection apparatuses close to each other.

FIGS. 1A and 1B are views for explaining the concept of an image projection apparatus according to an embodiment of the present invention;
FIG. 2 is a schematic block diagram for explaining a configuration of an image projection apparatus according to an embodiment of the present invention;
FIG. 3A is a block diagram for explaining a configuration of an image projection apparatus according to an exemplary embodiment of the present invention;
FIG. 3B is a diagram illustrating an example of modules stored in a memory of an image projection apparatus according to an embodiment of the present invention;
FIG. 3C is a diagram illustrating examples of external devices from which an image projection apparatus according to an exemplary embodiment of the present invention can receive data;
FIG. 4 is a view illustrating an optical module of an image projection apparatus according to an embodiment of the present invention;
FIGS. 5 to 7B are diagrams for explaining determination of a projection area enlarging state of an image projection apparatus according to an embodiment of the present invention;
FIGS. 8A to 8C illustrate various embodiments of locations where a key is placed when activating an area extension function through a key;
FIG. 9 is a diagram illustrating a plurality of image projection apparatuses according to an embodiment of the present invention interlocking to project a single content image;
FIGS. 10A to 11B are diagrams illustrating projection areas of a plurality of image projection apparatuses according to an exemplary embodiment of the present invention;
FIGS. 12A to 12C are diagrams illustrating information necessary for determining a total projection area in an image projection apparatus according to an exemplary embodiment of the present invention;
FIGS. 13A to 13E are diagrams for explaining an overlap area in which projection areas overlap;
FIGS. 14A to 15C are diagrams for explaining a method of transmitting image data according to various embodiments of the present invention;
FIGS. 16A to 16F are diagrams for explaining a method of providing a guide message for guiding the position of an image projection apparatus to create a specific enlarged area; and
FIGS. 17 to 19 are flowcharts for explaining a projection area expansion method of an image projection apparatus according to various embodiments of the present invention.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description of the present invention, a detailed description of known functions and configurations incorporated herein will be omitted when it may make the subject matter of the present invention rather unclear. The following terms are defined in consideration of the functions of the present invention, and may vary depending on users, operators, customs, and the like. Therefore, the definition should be based on the contents throughout this specification.

Terms including ordinals such as first, second, etc. may be used to describe various elements, but the elements are not limited by these terms. These terms are used only for the purpose of distinguishing one component from another. For example, without departing from the scope of the present invention, a first component may be referred to as a second component, and similarly, a second component may also be referred to as a first component. The term "and/or" includes any combination of a plurality of related items or any one of a plurality of related items.

The terminology used herein is for the purpose of describing the embodiments only and is not intended to limit the invention. The singular forms include plural forms unless the context clearly dictates otherwise. In the present application, terms such as "including" and "having" are intended to designate the presence of stated features, numbers, operations, elements, components, or combinations thereof, and do not preclude the presence or addition of one or more other features, numbers, operations, elements, components, or combinations thereof.

FIGS. 1A and 1B are views for explaining the concept of an image projection apparatus 100 according to an embodiment of the present invention. For example, the image projection apparatus 100 may be realized by a projector using a short focal length optical system. Also, the image projection apparatus 100 may be a projector using a projection lens mirror system.

As shown in FIG. 1A, the image projection apparatus 100 according to an exemplary embodiment of the present invention can project an image when it is positioned on a horizontal plane (bottom plane) such as a table. Also, as shown in FIG. 1B, the image projection apparatus 100 can project an image when it is positioned on a vertical plane such as a wall surface. The image projection apparatus 100 according to an embodiment of the present invention differs from a general projector in that the image is projected onto the placement surface itself. In addition, the image projection apparatus 100 according to an embodiment of the present invention differs from a general projector in that it is portable.

In this way, the image projection apparatus 100 can project an image when it is brought close to the surface (projection surface) onto which the image is to be projected; the user can use the image projection function with the simple action of putting the image projection apparatus 100 down, and efficient power management is also possible. That is, light is projected when one surface of the image projection apparatus 100 is disposed close to the placement surface, and light projection can be stopped when that surface is spaced apart from the placement surface.

As shown in FIG. 2, the image projection apparatus 100-1 may include an optical module 160-1, a communication module 120-1, a sensor module 140-1, and a processor 110-1.

The optical module 160-1 can project an image in a surface direction extending from a placement surface on which the image projection apparatus 100-1 is disposed. For example, the optical module 160-1 may include a lens and a mirror. The size of the image projected by the optical module 160-1 can be determined by the distance between the lens and the mirror. Because the size of the projected image is determined in this way, the size and shape of the entire projection area can be grasped from the relative position information between the apparatuses 100-1 and 100-2, even without photographing the images projected from the plurality of image projection apparatuses 100-1 and 100-2 with a separate apparatus. In addition, owing to the characteristics of the ultra-short-focus optical system, the projection distance can be fixed according to the specifications of the optical module 160-1. The configuration of the optical module 160-1 will be described below in detail with reference to FIG. 4.

The sensor module 140-1 may include various kinds of sensors. For example, the sensor module 140-1 may include an ultrasonic sensor 144, a proximity sensor 147, and the like, which are capable of sensing the adjacent image projection apparatus 100-2. The sensor module 140-1 may also include a gyro sensor 142, an acceleration sensor 143, a geomagnetic sensor, and the like, which can detect the rotation angle of the image projection apparatus 100-1. For example, when the image projection apparatus 100-1 is placed horizontally on the floor, the front end direction can be defined as the x-axis direction, the lateral direction as the y-axis direction, and the direction perpendicular to the bottom surface as the z-axis direction. In this case, the rotation angle of the image projection apparatus 100-1 means the degree of rotation about the z-axis. The rotation angle may alternatively be expressed as a yaw angle.
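As a concrete illustration, the rotation angle about the z-axis could be estimated by integrating the z-axis angular rate reported by a gyro sensor. The following minimal Python sketch is not part of the patent; the function name, sampling interval, and units are illustrative assumptions:

```python
# Minimal sketch: estimating the rotation (yaw) angle about the z-axis
# by integrating gyro angular-rate samples. Names, units, and the
# sampling interval are illustrative assumptions, not from the patent.

def integrate_yaw(rate_samples_dps, dt_s, initial_deg=0.0):
    """Integrate z-axis angular rates (degrees/second), sampled every
    dt_s seconds, into a yaw angle wrapped to [0, 360)."""
    yaw = initial_deg
    for rate in rate_samples_dps:
        yaw += rate * dt_s
    return yaw % 360.0

# A device rotating at a constant 10 deg/s for 3 seconds (100 Hz sampling)
samples = [10.0] * 300
print(integrate_yaw(samples, dt_s=0.01))  # ~30.0
```

In practice such an integration drifts over time, which is one reason a geomagnetic sensor is listed alongside the gyro sensor above.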

Using these various sensors, the sensor module 140-1 can output sensing values for at least one of the distance to the other image projection apparatus 100-2, the angle between the apparatuses, and the rotation angle of the image projection apparatus 100-1, which is the information required to calculate the entire projection area.

The communication module 120-1 may perform a function of communicating with the other image projection apparatus 100-2 or an external device 400. For example, the communication module 120-1 may transmit and receive image data, rotation angle information, control commands, and the like with the other image projection apparatus 100-2 using a WiFi mirroring scheme. The communication module 120-1 can also use various communication methods other than the WiFi mirroring method.

The processor 110-1 controls the overall configuration of the image projection apparatus 100-1. When the processor 110-1 communicates with the other image projection apparatus 100-2 through the communication module 120-1, it can control the optical module 160-1 to project an image based on the distance to the other image projection apparatus 100-2.

The processor 110-1 may generate, as an image, at least a part of the content according to the distance to the other image projection apparatus 100-2, and the image projected from the other image projection apparatus 100-2 and the image projected through the optical module 160-1 can be combined to form one content. Here, a part of the content means a spatial part. For example, in the case of moving picture content, a part of the content means a spatial portion of each frame constituting the moving picture.

The processor 110-1 may divide one content into a plurality of sub images according to the distance to the other image projection apparatus 100-2. The processor 110-1 may control the optical module 160-1 to project one of the plurality of sub images. In addition, the processor 110-1 may control the communication module 120-1 to transmit the remaining sub image to the other image projection apparatus 100-2. By combining the images projected from the plurality of image projection apparatuses 100-1 and 100-2, one content can be formed on the projection plane. Each of the plurality of image projection apparatuses 100-1 and 100-2 can share not only the plurality of sub images but also sync information for synchronizing them.
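One way the sub-image division described above could work is to cut the content at a seam derived from the inter-device distance, giving each device its share of the total width plus half of any overlap. The column arithmetic and names in this Python sketch are illustrative assumptions, not the patent's method:

```python
# Minimal sketch: dividing one content frame into two sub-images whose
# column ranges together cover the frame, with a shared overlap region
# so the two projections join on the projection plane. Illustrative only.

def split_columns(total_width_px, overlap_px):
    """Return (left_range, right_range) half-open column ranges for two
    devices. Each device projects half the content plus half the overlap."""
    half = total_width_px // 2
    left = (0, half + overlap_px // 2)                # columns for device 1
    right = (half - overlap_px // 2, total_width_px)  # columns for device 2
    return left, right

left, right = split_columns(total_width_px=1920, overlap_px=120)
print(left, right)  # (0, 1020) (900, 1920)
```

In a real system the overlap width would itself be derived from the sensed distance between the devices, and the overlapping columns would typically be blended (e.g., by transparency adjustment, as described elsewhere in this document).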

For example, the processor 110-1 may determine the size and shape of the entire projection area based on the sensed values (the distance to the other image projection apparatus 100-2, the rotation angle of the image projection apparatus 100-1, and the angle between the image projection apparatus 100-1 and the other image projection apparatus 100-2) and the information of the other image projection apparatus 100-2 received through the communication module 120-1.
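To make this concrete, each device's projection area can be modeled as a rectangle placed by the sensed inter-device distance and rotated by its sensed yaw angle; the corner points of both rectangles then describe the entire projection area. The geometry, coordinate conventions, and names below are assumptions for illustration only:

```python
import math

# Minimal sketch: corner points of two projection rectangles, each placed
# by the sensed distance between devices and rotated by its sensed yaw.
# Rectangle sizes and coordinate conventions are illustrative assumptions.

def rect_corners(cx, cy, w, h, yaw_deg):
    """Corners of a w x h rectangle centered at (cx, cy), rotated by yaw_deg."""
    t = math.radians(yaw_deg)
    cos_t, sin_t = math.cos(t), math.sin(t)
    corners = []
    for dx, dy in ((-w/2, -h/2), (w/2, -h/2), (w/2, h/2), (-w/2, h/2)):
        corners.append((cx + dx*cos_t - dy*sin_t, cy + dx*sin_t + dy*cos_t))
    return corners

def total_area_corners(distance, w, h, yaw1_deg, yaw2_deg):
    """Corner points of both projection areas: device 1 at the origin,
    device 2 offset by the sensed inter-device distance along x."""
    return rect_corners(0, 0, w, h, yaw1_deg) + rect_corners(distance, 0, w, h, yaw2_deg)

pts = total_area_corners(distance=30.0, w=40.0, h=30.0, yaw1_deg=0, yaw2_deg=0)
xs = [x for x, _ in pts]
print(min(xs), max(xs))  # -20.0 50.0
```

The bounding extent of these corner points gives the size and shape of the combined area without any camera-based image analysis, which is the point of using relative position information.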

The other image projection apparatus 100-2 may also include a sensor module 140-2 of its own. The sensor module 140-2 may sense position information including the rotation angle of the other image projection apparatus 100-2. For example, the information of the other image projection apparatus 100-2 used by the processor 110-1 may be this position information.

The processor 110-1 may generate an image having a size and a shape corresponding to the size and shape of the entire projection area. For example, the processor 110-1 may generate the projection image in the largest rectangular shape that can be included within the entire projection area.
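For the simple case where both projection areas are axis-aligned (zero rotation) and overlap horizontally, a largest spanning rectangle can be found by uniting the horizontal extents while intersecting the vertical extents, then comparing against each individual area. This is a hedged simplification of the general problem, with illustrative names:

```python
# Minimal sketch: largest axis-aligned rectangle inside the union of two
# horizontally overlapping, axis-aligned projection rectangles given as
# (x1, y1, x2, y2). This simplified case ignores device rotation.

def area(r):
    return max(0.0, r[2] - r[0]) * max(0.0, r[3] - r[1])

def largest_spanning_rect(r1, r2):
    """Candidates: each area alone, and one spanning both (x-union,
    y-intersection). Return the candidate with the largest area."""
    span = (min(r1[0], r2[0]), max(r1[1], r2[1]),
            max(r1[2], r2[2]), min(r1[3], r2[3]))
    return max((r1, r2, span), key=area)

# Two 40x30 projection areas offset by 30 along x and 5 along y
best = largest_spanning_rect((0, 0, 40, 30), (30, 5, 70, 35))
print(best)  # (0, 5, 70, 30)
```

With rotated areas the inscribed-rectangle computation becomes a genuine geometry problem; the sketch only conveys the idea of fitting the largest rectangle into the combined area.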

According to an embodiment of the present invention, one of the plurality of image projection apparatuses 100 extending the projection area may be determined as the master image projection apparatus responsible for overall control. For example, a method of determining the image projection apparatus 100 that is first paired with the external device 400 transmitting the image data as the master, a method of letting the user select the master, or a method of determining the image projection apparatus 100 occupying the widest projection area as the master may be applied.
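The widest-area master-selection rule mentioned above can be sketched as a simple maximum over device records; the record structure and the id-based tie-break are illustrative assumptions, not specified by the patent:

```python
# Minimal sketch: choosing the master device as the one occupying the
# widest projection area. Device records and the id-based tie-break
# are illustrative assumptions.

def choose_master(devices):
    """devices: list of dicts with 'id' and 'projection_area'.
    Largest area wins; ties are broken by lowest id for determinism."""
    return max(devices, key=lambda d: (d["projection_area"], -d["id"]))["id"]

fleet = [
    {"id": 1, "projection_area": 1200.0},
    {"id": 2, "projection_area": 1750.0},
    {"id": 3, "projection_area": 1200.0},
]
print(choose_master(fleet))  # 2
```

A deterministic tie-break matters here because all devices must agree on the same master without further negotiation.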

FIG. 3A is a block diagram for explaining the configuration of the image projection apparatus 100 according to an embodiment of the present invention in detail. 3A, an image projection apparatus 100 includes at least one processor 110, a communication module 120, a memory 130, a sensor module 140, an input device 150, an optical module 160, An audio module 180, a camera module 191, an indicator 192, a motor 193, a power management module 194, a battery 195, and a wireless charging module 196.

The processor 110 may operate an operating system or an application program to control a plurality of hardware or software components connected to the processor 110, and may perform various data processing and operations including multimedia data. For example, the processor 110 may be implemented in the form of a system on chip (SoC). In addition, the processor 110 may further include a graphics processing unit (GPU) (not shown).

The communication module 120 includes a wireless communication module and performs a wireless communication function with the outside. The communication module 120 may include various modules such as a WiFi module 121, a Bluetooth module 122, and a radio frequency (RF) module 123. When the WiFi module 121 or the BT module 122 is used, various connection information such as an SSID and a session key may be transmitted and received first, and communication data may be transmitted and received after communication connection is established.

Although the WiFi module 121 and the BT module 122 are shown as separate blocks in FIG. 3A, they may be implemented in the form of an IC (integrated chip) or an IC package including various types of wireless communication modules.

The RF module 123 can transmit and receive data through transmission and reception of RF signals. The RF module 123 may include a transceiver, a power amplifier module (PAM), a frequency filter, or a low noise amplifier (LNA). Although the WiFi module 121 and the BT module 122 share one RF module 123 in FIG. 3A, they can transmit and receive RF signals through separate RF modules 123, respectively.

The communication module 120 can perform communication according to various communication standards such as IEEE, ZigBee, 3G (3rd Generation), 3GPP (3rd Generation Partnership Project), LTE (Long Term Evolution), and the like.

The memory 130 may store various data, programs, or applications for driving and controlling the image projection apparatus 100. For example, the memory 130 may store information about the size and shape of a reference image. The memory 130 may include an internal memory 131. For example, the internal memory 131 may include volatile memory such as dynamic RAM (DRAM), static RAM (SRAM), or synchronous dynamic RAM (SDRAM), and non-volatile memory such as one-time programmable ROM (OTPROM), programmable ROM (PROM), electrically erasable and programmable ROM (EEPROM), mask ROM, flash ROM, NAND flash memory, and NOR flash memory.

The sensor module 140 can detect the state of the image projection apparatus 100 and convert the sensed information into an electrical signal. The sensor module 140 may include a gesture sensor 141, a gyro sensor 142, an acceleration sensor 143, an ultrasonic sensor 144, an infrared sensor 145, a hall sensor 146, a proximity sensor 147, and an illuminance sensor 148. The sensor module 140 may additionally include an E-nose sensor (not shown), an EMG sensor (not shown), an EEG sensor (not shown), an electrocardiogram sensor (not shown), an iris sensor (not shown), a fingerprint sensor (not shown), a pressure sensor (not shown), and the like.

The sensor module 140 can sense whether the image projection apparatus 100 is close to the placement surface. For example, the sensor module 140 may determine proximity to the placement surface using an illuminance sensor 148 attached to one side of the image projection apparatus 100. In addition, the sensor module 140 can detect an infrared-labeled identification mark using the infrared sensor 145. Accordingly, the sensor module 140 can sense the size and shape of the projected image.

In addition, the sensor module 140 may determine the degree of rotation of the image projection apparatus 100, that is, sense how much the image projection apparatus 100 has rotated relative to a reference direction. The sensor module 140 may also sense whether another image projection apparatus 100' is adjacent and the distance to the other image projection apparatus 100'.

The input device 150 may receive user input. For example, the input device 150 may include a physical button, an optical key or a keypad.

The optical module 160 can project an image. The optical module 160 may include an illumination module 161 and a projection module 162. The projection module 162 can display an image by projecting light onto the projection surface. The projection module 162 can project light in the direction of the surface extending from the placement surface on which the image projection apparatus 100 is disposed.

The optical module 160 can project light when the image projection apparatus 100 is located on the placement surface or within a predetermined distance from the placement surface. For example, when the image projection apparatus 100 is not close to the placement surface, the power management module 194 may block the power supply to the optical module 160 to prevent the optical module 160 from operating.
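The proximity-gated power behavior described above can be modeled as a threshold rule on the sensed distance to the placement surface; the threshold value and names below are illustrative assumptions, not values from the patent:

```python
# Minimal sketch: gating power to the optical module on proximity to the
# placement surface. The threshold and names are illustrative assumptions.

PROXIMITY_THRESHOLD_MM = 10.0  # assumed "predetermined distance"

def optical_power_enabled(distance_to_surface_mm):
    """Project light only while the device face is within the threshold
    of the placement surface; stop projection once it is lifted away."""
    return distance_to_surface_mm <= PROXIMITY_THRESHOLD_MM

print(optical_power_enabled(2.0), optical_power_enabled(50.0))  # True False
```

A real implementation would likely add hysteresis or debouncing around the threshold so the projection does not flicker when the device sits near the boundary.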

Various methods such as DLP, LCOS, 3LCD, and laser method can be used for the projection module 162 to project light.

The interface 170 may include a high-definition multimedia interface (HDMI) 171 and a universal serial bus (USB) 172. The interface 170 may be connected to an external device by wire to send and receive data.

The audio module 180 can bidirectionally convert voice and electrical signals. The audio module 180 may process audio information input or output through the speaker 181 or the microphone 182 or the like.

The camera module 191 can capture a still image and a moving image. The camera module 191 may include an image sensor or the like, and may analyze the property of the projected image by photographing the projected image.

The indicator 192 may indicate a state of the image projection apparatus 100 or a part of the image projection apparatus 100, for example, a boot state, a message state, or a charged state.

The motor 193 can convert the electrical signal into mechanical vibration.

The power management module 194 manages the power of the image projection apparatus 100. For example, when one surface of the image projection apparatus 100 is disposed close to the placement surface on which the image is to be projected, the power management module 194 can apply power to the optical module 160.

The battery 195 may store or generate electricity, and may supply power to the image projection apparatus 100 using the stored or generated electricity. For example, the battery 195 may include a rechargeable battery or a solar battery.

The wireless charging module 196 may include circuitry capable of wireless charging, such as a coil loop, a resonant circuit, or a rectifier. The wireless charging module 196 can use a magnetic resonance method, a magnetic induction method, an electromagnetic wave method, or the like as a wireless charging method.

Each of the above-described components of the image projection apparatus 100 according to various embodiments of the present invention may be composed of one or more components, and the name of a component may be changed according to the type of the device. The image projection apparatus 100 according to various embodiments of the present invention may include at least one of the above-described components; some of the components may be omitted, and other additional components may be further included. In addition, even when some of the components of the image projection apparatus 100 according to various embodiments of the present invention are combined into a single entity, the functions of the corresponding components before being combined can be performed in the same manner.

FIG. 3B illustrates an example of modules stored in the memory 130 of the image projection apparatus 100 according to an embodiment of the present invention. Referring to FIG. 3B, the memory 130 may include an operating system 132, a signal processing module 133, a device position determination module 134, a projection plane analysis module 135, a projected image expandability determination module 136, and a user input processing module 137.

The operating system 132 controls the overall operation of the image projection apparatus 100.

The signal processing module 133 may perform buffering or signal decoding so that content received through the communication module 120 can be projected through the optical module 160. The signal processing module 133 can process image data received by the image projection apparatus 100. For example, the signal processing module 133 may perform various image processing operations on the video data, such as decoding, scaling, noise filtering, frame rate conversion, and resolution conversion.

The device position determination module 134 may determine whether the image projection apparatus 100 is located on the placement plane. For example, the device position determination module 134 can determine whether the image projection apparatus 100 is located on a placement surface or within a predetermined distance from it.

Alternatively, the device position determination module 134 may determine whether the surface on which the image projection apparatus 100 is positioned is perpendicular or parallel to the gravity direction (for example, whether the surface is horizontal or vertical with respect to the ground).
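The document does not specify how this orientation check is performed; as a rough sketch, it could be derived from a 3-axis accelerometer reading, assuming gravity dominates the measured acceleration (the function name and thresholds below are illustrative, not from the document):

```python
import math

def surface_orientation(ax, ay, az, tolerance_deg=10.0):
    """Classify the placement surface from a 3-axis accelerometer reading.

    On a horizontal surface, gravity falls almost entirely on the device
    z-axis; on a vertical surface (e.g. a wall), it falls in the x-y plane.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        raise ValueError("no gravity component measured")
    # Angle between the device z-axis and the gravity vector
    tilt = math.degrees(math.acos(min(1.0, abs(az) / g)))
    if tilt <= tolerance_deg:
        return "horizontal"
    if tilt >= 90.0 - tolerance_deg:
        return "vertical"
    return "inclined"
```

A reading of (0, 0, 9.8) m/s² would classify as a horizontal surface, while (9.8, 0, 0) would classify as vertical.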

The projection plane analysis module 135 can correct the image by analyzing the projection plane of the projector. For example, the projection plane analysis module 135 may correct the geometry information and color information of the projected image.

The projected image expandability determination module 136 may determine whether at least one other image projection apparatus is present in the vicinity of the image projection apparatus 100 in order to determine whether the projected image can be expanded.

The user input processing module 137 detects a user input on the output projection image and processes the corresponding data.

FIG. 3C is a diagram showing examples of external devices from which the image projection apparatus 100 can receive data. According to an embodiment of the present invention, the image projection apparatus 100 can receive content to be projected from the external device 400 through the communication module 120 or the interface 170.

Examples of the external device 400 include a flexible device 400-1, a clock device 400-2, a tablet PC 400-3, a mobile device 400-4, a display device 400-5, a computer 400-6, a desktop computer 400-7, a wearable device 400-8 such as smart glasses, and the like. The external device 400 is not limited to these; any device capable of transmitting content to the image projection apparatus 100 by wire or wirelessly may be used. The image projection apparatus 100 may store the content received from the external device 400 in the memory 130.

FIG. 4 is a diagram illustrating an example of the optical module 160 according to an embodiment of the present invention. Referring to FIG. 4, the illumination optical system 11 and the projection lens unit 13 may be disposed inside the housing 16. The direction of the light emitted from the illumination optical system 11 can be changed vertically through the prism 12. The projection lens unit 13 is composed of a combination of lenses 14, and the lens combination 14 functions to prevent the image from being distorted even at a small projection distance. In addition, the projection lens unit 13 may include a mirror 15, and may reflect light to the side through the mirror 15. The projection lens unit 13 uses an ultra-short-focus lens combination so that projection can be performed on the side projection surface even at a short projection distance. In another example, the optical module 160 may be configured by horizontally arranging the illumination optical system 11 and the projection lens unit 13 without the prism 12.

Hereinafter, features of the image projection apparatus 100 according to an embodiment of the present invention will be described with reference to the drawings.

The image projection apparatus 100-1 according to an exemplary embodiment of the present invention can determine, without a separate apparatus, whether the image can be expanded, and can determine the extended total projection area using relative position information with respect to the other image projection apparatus 100-2.

Referring to FIG. 5, the processor 110 may detect the proximity of the other image projection apparatus 100-2 using the sensor module 140. When it is detected that the other image projection apparatus 100-2 has approached within a predetermined distance, the processor 110 can determine that the projection area can be expanded. For example, the processor 110 can recognize that the other image projection apparatus 100-2, which can interoperate to extend the projection area, is close by using a magnetic sensor, a proximity sensor, an infrared sensor, a depth camera, a UWB radar, or the like included in the sensor module 140. Since the apparatuses need to approach within a certain distance in order to expand the projection area, the processor 110 can activate the projection area expanding function only when the approach of the other image projection apparatus 100-2 is detected.

As another example, the processor 110 may activate the projection area expanding function only when the approach of the other image projecting device 100-2 is sensed and a communication connection with the sensed image projecting device 100-2 is established.

For example, as shown in FIG. 6, the activation of the projection area expanding function can be indicated by the color of a button. In the left drawing of FIG. 6, since the two image projection apparatuses 100-1 and 100-2 have approached within a certain distance, the processor 110 may display the button in a color indicating activation (for example, green). Conversely, since the two image projection apparatuses 100-1 and 100-2 are not within the certain distance in the right drawing of FIG. 6, the processor 110 may display the button in a color indicating deactivation (for example, red).

As another example, when the processor 110 activates the projection area expanding function, the sensor module 140 can start detecting whether another image projection apparatus 100-2 is present nearby. Alternatively, the processor 110 may detect the presence of the other image projection apparatus 100-2 using the communication module 120.

If the projection area expanding function is activated but there is no other image projecting device 100-2 nearby, the processor 110 may provide a guide message prompting the user to bring the other image projecting device 100-2 closer. For example, the processor 110 may provide the guide message to the user using at least one of a laser, an LED, a mechanical sound, a voice guidance, a vibration, and a guidance image included in the projection image.

Referring to FIG. 7A, a command to activate the projection area expanding function has been input to the image projection apparatus 100-1 by the user, but the other image projection apparatus 100-2 has not approached within the certain distance. In this case, the processor 110 may control the optical module 160 to project a guide message such as 'Keep the two devices close'.

As another example, as shown in FIG. 7B, the processor 110 may control the communication module 120 to transmit a control command to the paired external device 400 so that the external device 400 displays a guide message such as 'Bring the two devices close together'.

According to another embodiment of the present invention, after detecting that the other image projection apparatus 100-2 is located nearby, the processor 110 may further determine whether a communication connection with the other image projection apparatus 100-2 can be established. If the communication connection is not established, the processor 110 may provide a guide message for the communication connection.

In another embodiment of the present invention, when the other image projection apparatus 100-2 is not detected nearby for a preset time, the processor 110 may provide a guide message informing the user that the other image projection apparatus 100-2 is not present nearby.

According to an embodiment of the present invention, a user input may be performed so that a plurality of image projection apparatuses 100 can be connected to the same network. Alternatively, a user input may be performed to activate the projection area expanding function. For example, the processor 110 may receive such a user input in various manners, as shown in FIGS. 8A to 8C.

Referring to FIG. 8A, the external device 400 and the first image projection apparatus 100-1 are paired with each other, so that the screen of the external device 400 is projected by the first image projection apparatus 100-1. The processor 110-1 can receive a user input through an input device 150-1 (e.g., a physical button) provided on the exterior of the image projection apparatus 100-1.

FIG. 8B is a view showing an embodiment in which a button capable of receiving a user input is included in the projection screen. When the sensor module 140 detects a user gesture of pressing the button included in the projection screen, the processor 110 may determine that a network connection input or a projection area expanding function activation input has been received.

As another example, as shown in FIG. 8C, the first image projection apparatus 100-1 may receive a user input through the paired external device 400.

FIG. 9 is a diagram showing that the projection areas of the first image projection apparatus 100-1 and the second image projection apparatus 100-2 are integrated and extended into the entire projection area. It can be confirmed that the 'A' image provided by the external device 400 is projected onto the entire projection area. In FIG. 9, the plurality of image projecting apparatuses 100-1 and 100-2 are arranged horizontally, so the processor 110 can calculate the total projection area using only the distance between the plurality of image projecting apparatuses 100-1 and 100-2. The case where the arrangement of the plurality of image projection apparatuses 100-1 and 100-2 is shifted will be described in detail again below.

FIGS. 10A and 10B are views showing states in which a plurality of image projection apparatuses 100-1 and 100-2 according to an embodiment of the present invention are respectively connected to external apparatuses 400-1 and 400-2.

Referring to FIG. 10A, the first image projection apparatus 100-1 is paired with the first external apparatus 400-1 to project an 'A' image provided by the first external apparatus 400-1. Similarly, the second image projection apparatus 100-2 is paired with the second external apparatus 400-2 to project a 'B' image provided by the second external apparatus 400-2.

When a plurality of image projection apparatuses 100-1 and 100-2 interlock with each other to project one content in the entire projection area, it is necessary to determine which of the image projection apparatuses 100-1 and 100-2 provides the image to be projected.

For example, as shown in FIGS. 10A and 10B, the processor 110 may select the image provided by the image projecting apparatus 100 that requested the projection area expanding function. In FIG. 10A, in response to a user input through a button provided on the first image projection apparatus 100-1, the processor 110 may control the 'A' image to be projected in the extended entire projection area. In contrast, in FIG. 10B, in response to a user input through a button provided on the second image projection apparatus 100-2, the processor 110 can control the 'B' image to be projected in the extended entire projection area.

In another embodiment, the processor 110 may provide a screen for setting which of the first projection area and the second projection area the projection area is extended around. The processor 110 can then determine the entire projection area around the area set through the screen. For example, the processor 110 may provide a screen including a UI for determining the total projection area around the set area. In the cases of FIGS. 9 to 10B, the same total projection area will be determined regardless of which projection area is selected. However, as shown in FIGS. 11A and 11B, when the image projecting directions of the plurality of image projecting apparatuses 100-1 and 100-2 are different from each other, different total projection areas will be determined depending on which projection area is set as the reference.

In addition to the above-described method of providing a screen or a UI, various methods can be applied: a method of extending the projection area around the image projection apparatus 100 to which a user command for activating the projection area expanding function is input, a method of extending the projection area around the image projection apparatus 100 connected to the external device 400, a method of extending the projection area around the image projection apparatus 100 last installed by the user, and a method of extending the projection area around the center of the plurality of image projection apparatuses 100.

Hereinafter, a method of calculating a total projection area according to an embodiment of the present invention will be described with reference to FIGS. 12A to 12C.

The image projection apparatus 100 according to an exemplary embodiment of the present invention has a projection area of a predetermined size and shape. As a result, the entire projection area can be calculated using only the relative arrangement information of the plurality of image projection apparatuses 100. That is, the processor 110 can determine how the plurality of projection areas overlap without performing an image processing process through a separate apparatus.

Referring to FIG. 12A, when the plurality of image projection apparatuses 100-1 and 100-2 are arranged horizontally, the processor 110 can calculate the entire projection area using only the distance information between the apparatuses. This is because the image projection apparatus 100 according to an embodiment of the present invention, which uses an ultra-short-focus optical system, has a fixed projection area. For example, the processor 110 may use the ultrasonic sensor 144 to measure the distance between the apparatuses.
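Because the projection area of each apparatus has a fixed size, the union of two horizontally aligned areas follows from the inter-device distance alone. A minimal sketch of this arithmetic, assuming each device projects a rectangle of fixed width `w` and the measured distance `d` equals the horizontal offset between the two rectangles (an illustrative model, not a formula stated in the document):

```python
def total_projection_width(w: float, d: float) -> float:
    """Width of the combined projection area of two horizontally aligned
    projectors with fixed projection width w and horizontal offset d.

    The rectangles overlap by (w - d) when d < w, so the union spans
    w + d; once d >= w there is no overlap and the union is 2 * w
    (ignoring any gap that would appear between the areas).
    """
    overlap = max(0.0, w - d)
    return 2 * w - overlap
```

For example, two 100 cm-wide projection areas offset by 40 cm would yield a 140 cm-wide total projection area.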

Next, as shown in FIG. 12B, when the projection directions of the plurality of image projecting apparatuses 100-1 and 100-2 are the same (i.e., when the rotation angles are the same), the processor 110 can calculate the total projection area using the distance information and the angle information between the apparatuses. The processor 110 can determine the horizontal and vertical distances to the other image projection apparatus 100-2 using the distance information and the angle information. For example, the processor 110 may determine the horizontal and vertical distances by converting the (r, θ) coordinate values into (x, y) coordinate values.
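The (r, θ)-to-(x, y) conversion mentioned above is the standard polar-to-Cartesian transform. A minimal sketch, assuming θ is measured from the horizontal axis of the placement plane (an assumption for illustration):

```python
import math

def polar_to_cartesian(r: float, theta_deg: float) -> tuple:
    """Convert the sensed distance r and angle theta to the other
    projector into horizontal (x) and vertical (y) offsets."""
    theta = math.radians(theta_deg)
    return (r * math.cos(theta), r * math.sin(theta))
```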

FIG. 12C shows a case where the image projection apparatus 100-1 is rotated about the z-axis (the axis perpendicular to the placement plane), so that its image projection direction does not coincide with that of the image projection apparatus 100-2. In this case, the processor 110 can obtain the rotation angle information of the image projection apparatus 100-1 using the sensor module 140. For example, the sensor module 140 can sense the rotation angle of the image projection apparatus 100-1 through the geomagnetic sensor.

The processor 110 can determine the entire projection area by combining the distance information between the plurality of image projection apparatuses 100-1 and 100-2, the angle information, and the rotation information of the image projection apparatus 100-1. The processor 110 may also control the communication module 120 to receive rotation angle information of the other image projection apparatus 100-2.

13A to 13E are diagrams showing a plurality of image projection apparatuses 100 arranged in various forms to expand the screen.

FIG. 13A shows a case where the plurality of image projectors 100-1 and 100-2 are arranged horizontally, and FIG. 13B shows a case where the plurality of image projectors 100-1 and 100-2 are arranged vertically. The processor 110 determines the total projection area and can determine the overlap area where the first projection area of the first image projection apparatus 100-1 and the second projection area of the second image projection apparatus 100-2 overlap. Since light is superimposed in the overlap area, the processor 110 adjusts the transparency of the images projected onto the overlap area so that the brightness of the entire projection area is constant.

When a larger number of image projection apparatuses 100 are used, as shown in FIGS. 13C and 13D, regions where multiple projection areas overlap are generated. For example, the processor 110 may keep the brightness of the entire projection area constant by adjusting the alpha value for each overlap area.
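The per-region alpha adjustment could be sketched as follows: if n projectors illuminate a region, each contributes with alpha 1/n so the summed brightness stays constant. The rectangle representation and helper names below are illustrative, not from the document:

```python
def coverage_count(rects, x, y):
    """Number of projection rectangles (x0, y0, x1, y1) covering point (x, y)."""
    return sum(1 for (x0, y0, x1, y1) in rects
               if x0 <= x < x1 and y0 <= y < y1)

def alpha_at(rects, x, y):
    """Alpha each projector should apply at (x, y): 1/n in an n-fold
    overlap region, so that n * (1/n) = 1 keeps brightness uniform."""
    n = coverage_count(rects, x, y)
    return 1.0 / n if n else 0.0
```

With two partially overlapping areas, a point covered by both gets alpha 0.5 from each projector, while a point covered by only one keeps alpha 1.0.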

Referring to FIG. 13E, if the plurality of image projectors 100-1 and 100-2 are rotated with respect to the direction of gravity, the processor 110 may perform correction to rotate and reduce the projected image. When the rotation and reduction correction is performed, as shown in the lower drawing of FIG. 13E, the overlap area has a parallelogram shape instead of a rectangular shape.

FIGS. 14A to 14C are diagrams illustrating methods of transmitting image data in the image projection apparatus 100 according to an embodiment of the present invention.

Referring to FIG. 14A, a method of dividing the content into a plurality of sub-images in the external device 400 may be applied. The image projected from the first image projection apparatus 100-1 is displayed on the left side of the entire projection area, and the image projected from the second image projection apparatus 100-2 is displayed on the right side of the entire projection area. The processor 110-1 can receive, through the communication module 120-1, data obtained by dividing the content in advance into a plurality of sub-images (for example, a left A screen and a right A screen) in the external device 400. The processor 110-1 may then control the communication module 120-1 to transmit, to the other image projection apparatus 100-2, the right A screen data, that is, the remaining data excluding the left A screen data to be projected through the optical module 160-1.

FIG. 14B is a diagram showing an embodiment in which all of the devices 400, 100-1, and 100-2 share the entire image data. In this case, the processors 110-1 and 110-2 of the respective image projection apparatuses 100-1 and 100-2 divide the content into a plurality of sub-images, and each determines the sub-image to be projected through its own optical module 160-1 or 160-2.

Referring to FIG. 14C, the processor 110-1 may divide the content (A screen data) received from the external device 400 into a plurality of sub-images. The processor 110-1 projects one of the plurality of sub-images (the left A screen data) onto the projection surface through the optical module 160-1, and transmits another of the plurality of sub-images (the right A screen data) to the other image projection apparatus 100-2.
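The content split described above can be sketched as a simple horizontal crop, assuming a frame represented as a 2-D array of pixel rows (the representation is an assumption chosen for illustration):

```python
def split_horizontally(frame, parts=2):
    """Split a frame (list of pixel rows) into `parts` equal-width
    sub-images, e.g. a left A screen and a right A screen."""
    width = len(frame[0])
    step = width // parts
    return [[row[i * step:(i + 1) * step] for row in frame]
            for i in range(parts)]
```

In practice, the split boundary would be adjusted for the overlap area so that the sub-images share the overlapping pixels.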

FIGS. 15A to 15C are diagrams for explaining procedures of data transmission among a plurality of image projection apparatuses 100 according to various embodiments of the present invention. The data transmitted between the image projection apparatuses 100 may include not only image data but also data on the projection area.

As shown in FIG. 15A, a method of distributing and transmitting data by the image projection apparatus 100 connected to the external device 400 may be applied. That is, this corresponds to an embodiment in which the image projection apparatus 100 connected to the external device 400 performs all functions related to data distribution.

FIG. 15B is a diagram illustrating a manner in which a plurality of image projection apparatuses 100 sequentially transmit data. For example, each image projection apparatus 100 may transmit only the data used by the next image projection apparatus among the entire data, or may transmit the entire data as it is.

As another example, for fast data transmission, a method of transmitting data as shown in FIG. 15C may be employed.

If the other image projection apparatus 100-2 exists within a specific distance, the image projection apparatus 100-1 according to an embodiment of the present invention can calculate the entire projection area and generate content suitable for the other image projection apparatus 100-2. However, when it is desired to enlarge the image to a specific ratio or size, the positions at which the plurality of image projection apparatuses 100-1 and 100-2 should be arranged are determined. In this case, the processor 110 may provide a guide message indicating the position at which the other image projection apparatus 100-2 should be placed. The processor 110 may provide various guide messages as shown in FIGS. 16A to 16F. For example, the guide message may be provided in the form of at least one of an optical signal, a voice signal, a vibration signal, and a video signal.

FIG. 16A is a diagram showing the use of LED colors as an example of a method using a light source. Referring to FIG. 16A, the image projection apparatus 100-1 may include an LED 192-1 as an indicator. For example, the position at which the other image projection apparatus 100-2 should be arranged may be a horizontal position (point P). When the other image projection apparatus 100-2 is not disposed at the corresponding position, the processor 110-1 can control the LED 192-1 to be displayed as a red light source. When the other image projection apparatus 100-2 is disposed at the corresponding position, the processor 110-1 can control the LED 192-1 to be displayed as a green light source.

As another example, the processor 110 may guide the user by varying the blink rate of the LED 192-1 in proportion to the distance from the target point.

As another example, as shown in FIG. 16B, the direction in which the apparatus should be moved may be indicated using the shape of the light source. Referring to FIG. 16B, the indicator 192 may include a plurality of LED lamps to form a shape such as an arrow.

Referring to FIG. 16C, the processor 110-1 may display a position at which the other image projection apparatus 100-2 should be arranged using a laser.

FIGS. 16D and 16E are diagrams illustrating embodiments in which the processor 110-1 provides a guide message guiding the position at which the other image projection apparatus 100-2 should be arranged according to a desired image ratio. When the other image projection apparatus 100-2 is disposed at the point P3 in FIGS. 16D and 16E, the enlarged total projection area has a 4:3 aspect ratio. If the other image projection apparatus 100-2 is disposed at the point P4, the enlarged total projection area has a 16:9 aspect ratio.

When the other image projection apparatus 100-2 is disposed at a position where the entire projection area has a specific image ratio, the processor 110-1 can control the optical module 160 to project a guidance message indicating the image ratio. In the case of FIG. 16D, the processor 110-1 may control the optical module 160 to project a '16:9' guidance message. Similarly, in the case of FIG. 16E, the processor 110-1 may control the optical module 160 to project a '4:3' guidance message.
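Under the fixed-projection-area model sketched earlier, the position guidance can be derived arithmetically: for a per-device projection of size w x h, the combined width is w + d, so the offset d needed for a target ratio follows directly. A hedged sketch (the overlapping-union model is an illustrative assumption, not an explicit formula from the document):

```python
def required_offset(w: float, h: float, ratio_w: float, ratio_h: float) -> float:
    """Horizontal offset d between two projectors so that the combined
    area (width w + d, height h) has the target aspect ratio."""
    d = (ratio_w / ratio_h) * h - w
    # Small negative values can arise from floating-point rounding
    if not -1e-9 <= d <= w:
        raise ValueError("target ratio not reachable while the areas overlap")
    return max(0.0, d)
```

For instance, with a 60 x 45 (4:3) per-device area, reaching a 16:9 total area requires an offset of 20, while 4:3 requires the devices to be fully aligned (offset 0), matching the idea of guiding the device to point P3 or P4.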

FIG. 16F is a diagram showing an embodiment for projecting a guide message guiding the direction in which the other image projection apparatus 100-2 should move. If the other image projection apparatus 100-2 is disposed at a position where the desired extended screen can be formed, a guide message such as 'Placed at the screen extension position' may be provided through the projection screen.

In addition, the processor 110 may indicate the distance between the plurality of image projecting apparatuses 100 by a voice signal or a vibration signal. For example, the processor 110 can control the audio module 180 such that a voice signal is generated as the distance between the plurality of image projecting apparatuses 100 increases. As another example, the processor 110 may control the motor 193 so that the vibration becomes stronger as the distance between the plurality of image projecting apparatuses 100 becomes closer.

According to the various embodiments described above, the user can quickly and easily project an image onto a screen of an expanded size simply by moving the plurality of image projection apparatuses close to each other.

FIGS. 17 to 19 are flowcharts for explaining projection area expanding methods of the image projection apparatus 100 according to various embodiments of the present invention.

Referring to FIG. 17, the image projection apparatus 100 detects whether an adjacent image projection apparatus 100-2 is present (S1710). For example, the image projection apparatus 100 can detect whether the other image projection apparatus 100-2 is located within a predetermined distance range by using an ultrasonic sensor, a proximity sensor, or the like.

Next, the image projection apparatus 100 establishes communication with the sensed image projection apparatus (S1720). Since a process of transmitting area data and image data is required, the image projection apparatus 100 can further determine whether a communication connection is possible.

When communication with the other image projection apparatus 100-2 is established, the image projection apparatus 100 projects the image based on the distance to the sensed image projection apparatus 100-2 (S1730).

FIG. 18 is a flowchart illustrating a projection area expanding method of the image projection apparatus 100 according to an embodiment of the present invention. FIG. 18 shows the interlocking operation with the other image projecting device 100-2, described in step S1730, in more detail.

Steps S1810 and S1820 correspond to steps S1710 and S1720, and a description thereof will be omitted.

When the image projection apparatus 100 senses the other image projection apparatus 100-2 and confirms that communication is established, the first projection area of the image projection apparatus 100 and the second projection area of the other image projection apparatus 100-2 are combined to determine the total projection area (S1830). For example, the image projection apparatus 100 can determine how the first projection area and the second projection area are positioned relative to each other by using the relative distance, angle, and rotation angle information of the image projection apparatus 100 with respect to the other image projection apparatus 100-2. By combining such information, the image projection apparatus 100 can determine the size and shape of the entire projection area.

The image projection apparatus 100 generates content having a size and a shape corresponding to those of the entire projection area (S1840). Then, the image projection apparatus 100 determines the areas occupied by the first projection area and the second projection area within the entire projection area, and determines the content portion to be projected by each of the image projection apparatuses 100 and 100-2. That is, the image projection apparatus 100 may generate at least a portion of the content as its image so that the image projected by the other image projection apparatus 100-2 and the image projected by the image projection apparatus 100 are combined to form one content.

Then, the image projection apparatus 100 can divide the content into a plurality of sub-images (S1850). The image projection apparatus 100 projects, onto the projection surface, the sub-image of the plurality of sub-images to be projected in the first projection area, and transmits the sub-image to be projected in the second projection area to the other image projection apparatus 100-2 (S1860).

FIG. 19 is a flowchart illustrating a process of determining whether the image projection apparatus 100 can execute the projection area expanding function according to an embodiment of the present invention.

First, the image projection apparatus 100 can receive a user command for activating the projection area expanding function (S1910). In response to the user command, the image projection apparatus 100 determines whether the other image projection apparatus 100-2 is located within a certain distance (S1920). The image projection apparatus 100 can determine whether the other image projection apparatus 100-2 exists within the predetermined distance using a proximity sensor, an ultrasonic sensor, or the like. If there is no other image projection apparatus 100-2 within the predetermined distance (S1920-N), the image projection apparatus 100 may provide a guide message prompting a distance movement (S1950). For example, the image projection apparatus 100 may provide a guide message specifying a particular position as well as prompting the distance movement.

If it is determined that the other image projection apparatus 100-2 exists within the certain distance (S1920-Y), the image projection apparatus 100 checks whether the image projection apparatuses are in a communication-connected state (S1930). If it is determined that the communication connection is not established (S1930-N), the image projection apparatus 100 may provide a guide for communication connection (S1960), because the projection area expanding function can be performed only on the premise that data can be transmitted and received.

If it is determined that the communication connection has been established (S1930-Y, S1970-Y), the image projection apparatus 100 can execute the projection area expanding function in conjunction with the other image projection apparatus 100-2.

Descriptions of the various embodiments of the projection area expanding method that overlap the above description of the image projection apparatus 100 will be omitted.

The methods described above may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and constructed for the present invention, or may be those known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The above hardware devices may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.

While the invention has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the scope of the present invention should not be limited to the described embodiments, but should be determined by the appended claims and their equivalents.

100: image projection apparatus 110: processor
120: communication module 140: sensor module
160: optical module

Claims (19)

1. An image projection apparatus comprising:
an optical module configured to project light in a surface direction extending from a placement surface on which the image projection apparatus is disposed;
a sensor module configured to sense another image projection apparatus positioned within a predetermined distance from the image projection apparatus;
a communication module; and
a processor configured to communicate with the sensed other image projection apparatus through the communication module and to control the optical module to project an image based on a distance from the sensed other image projection apparatus.
2. The image projection apparatus of claim 1,
wherein the processor generates at least a part of one content as the image according to the distance from the other image projection apparatus, such that the image and an image projected from the other image projection apparatus combine to form the one content.
3. The image projection apparatus of claim 1,
wherein the processor divides one content into a plurality of sub-images according to the distance from the other image projection apparatus, controls the optical module to project one of the plurality of sub-images, and transmits another of the plurality of sub-images to the other image projection apparatus through the communication module.
4. The image projection apparatus of claim 3,
wherein the sensor module outputs a sensing value indicating at least one of a distance from the other image projection apparatus, a rotation angle of the image projection apparatus, and an angle between the image projection apparatus and the other image projection apparatus,
wherein the communication module receives information of the other image projection apparatus, and
wherein the processor processes the image projected through the optical module based on the sensing value and the received information.
5. The image projection apparatus of claim 4,
wherein the other image projection apparatus includes a sensor module configured to sense a rotation angle of the other image projection apparatus, and
wherein the information of the other image projection apparatus is rotation angle information.
6. The image projection apparatus of claim 2,
wherein the processor provides a screen for setting which of a first projection area of the image projection apparatus and a second projection area of the other image projection apparatus is to serve as the center from which the projection area is expanded.
7. The image projection apparatus of claim 1,
wherein the processor adjusts the transparencies of an area of a first projection area of the image projection apparatus and an area of a second projection area of the other image projection apparatus that overlap each other to be different from each other.
8. The image projection apparatus of claim 1,
wherein, when an enlargement ratio of the image is set, the processor provides a guide message indicating a position at which the other image projection apparatus should be arranged.
9. The image projection apparatus of claim 1,
wherein the optical module projects light when one surface of the image projection apparatus is arranged close to the placement surface, and stops projecting light when the one surface of the image projection apparatus is spaced apart from the placement surface.
10. A method of expanding a projection area of an image projection apparatus, the method comprising:
sensing another image projection apparatus located within a predetermined distance from the image projection apparatus;
communicating with the sensed other image projection apparatus; and
projecting an image, based on a distance from the other image projection apparatus, in a surface direction extending from a placement surface on which the image projection apparatus is disposed.
11. The method of claim 10,
wherein, in the projecting, at least a part of one content is generated as the image according to the distance from the other image projection apparatus, such that the image and an image projected from the other image projection apparatus combine to form the one content.
12. The method of claim 10,
wherein the projecting comprises:
dividing one content into a plurality of sub-images according to the distance from the other image projection apparatus;
projecting one of the plurality of sub-images; and
transmitting another one of the plurality of sub-images to the other image projection apparatus.
13. The method of claim 12,
wherein the sensing outputs a sensing value indicating at least one of a distance from the other image projection apparatus, a rotation angle of the image projection apparatus, and an angle between the image projection apparatus and the other image projection apparatus,
wherein the communicating comprises receiving information of the other image projection apparatus, and
wherein the projecting processes the projected image based on the sensing value and the received information.
14. The method of claim 13,
wherein the information of the other image projection apparatus is rotation angle information sensed by the other image projection apparatus.
15. The method of claim 11,
further comprising providing a screen for setting which of a first projection area of the image projection apparatus and a second projection area of the other image projection apparatus is to serve as the center from which the projection area is expanded.
16. The method of claim 10,
further comprising adjusting the transparencies of an area of a first projection area of the image projection apparatus and an area of a second projection area of the other image projection apparatus that overlap each other to be different from each other.
17. The method of claim 10,
further comprising providing, when an enlargement ratio of the image is set, a guide message indicating a position at which the other image projection apparatus should be arranged.
18. The method of claim 10,
wherein the projecting comprises projecting light when one surface of the image projection apparatus is arranged close to the placement surface, and stopping the light projection when the one surface of the image projection apparatus is spaced apart from the placement surface.
19. A non-transitory computer-readable recording medium storing a program for executing a method of expanding a projection area of an image projection apparatus, the method comprising:
sensing another image projection apparatus located within a predetermined distance from the image projection apparatus;
communicating with the sensed other image projection apparatus; and
projecting an image, based on a distance from the other image projection apparatus, in a surface direction extending from a placement surface on which the image projection apparatus is disposed.
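As a rough illustration of the sub-image division recited in claims 3 and 12, the sketch below splits one content into two column ranges based on the inter-device distance. The split rule is entirely an assumption for illustration (the claims do not specify one): each apparatus takes roughly half of the content, with an overlap region that shrinks as the two apparatuses move apart.

```python
# Hypothetical sketch of dividing one content into two sub-images
# (claims 3 and 12). The overlap model is an assumption, not disclosed:
# overlap falls from half the content (devices touching) to zero
# (devices at the maximum sensed distance).

def split_content(content_width, device_distance, max_distance):
    """Return (left, right) column ranges for the two projectors' sub-images."""
    # Overlap fraction: 0.5 when devices touch, 0.0 at max distance.
    overlap = 0.5 * max(0.0, 1.0 - device_distance / max_distance)
    half = content_width // 2
    extra = int(half * overlap)
    left = (0, half + extra)               # projected by this apparatus
    right = (half - extra, content_width)  # transmitted to the other apparatus
    return left, right
```

For instance, with a 1000-pixel-wide content and the devices at the maximum distance, the two sub-images abut with no overlap; as the devices come closer, the ranges overlap progressively, matching the idea that the combined projection area contracts.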
KR1020150144047A 2015-10-15 2015-10-15 Image projection apparatus, projection area expanding method of thereof and non-transitory computer readable recording medium KR20170044399A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020150144047A KR20170044399A (en) 2015-10-15 2015-10-15 Image projection apparatus, projection area expanding method of thereof and non-transitory computer readable recording medium
PCT/KR2016/011554 WO2017065556A1 (en) 2015-10-15 2016-10-14 Image projection apparatus, method for expanding projection area thereof, and non-temporary computer-readable recording medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150144047A KR20170044399A (en) 2015-10-15 2015-10-15 Image projection apparatus, projection area expanding method of thereof and non-transitory computer readable recording medium

Publications (1)

Publication Number Publication Date
KR20170044399A true KR20170044399A (en) 2017-04-25

Family

ID=58517369

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150144047A KR20170044399A (en) 2015-10-15 2015-10-15 Image projection apparatus, projection area expanding method of thereof and non-transitory computer readable recording medium

Country Status (2)

Country Link
KR (1) KR20170044399A (en)
WO (1) WO2017065556A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022234942A1 (en) * 2021-05-07 2022-11-10 삼성전자주식회사 Electronic apparatus and control method therefor
WO2024014703A1 (en) * 2022-07-14 2024-01-18 삼성전자주식회사 Image output device and control method for same

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN108168070B (en) * 2017-12-26 2023-12-05 珠海格力电器股份有限公司 Equipment installation method and device

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
JP5217630B2 (en) * 2008-05-28 2013-06-19 株式会社ニコン Projector and multi-projection system
KR101526998B1 (en) * 2008-10-16 2015-06-08 엘지전자 주식회사 a mobile telecommunication device and a power saving method thereof
JP2013195498A (en) * 2012-03-16 2013-09-30 Nikon Corp Multi-projector system
JP6427858B2 (en) * 2013-09-19 2018-11-28 セイコーエプソン株式会社 Display system, image display apparatus, and display system control method
JP2015159522A (en) * 2014-01-21 2015-09-03 セイコーエプソン株式会社 Projector, control method of projector, and display device
JP2015161830A (en) * 2014-02-27 2015-09-07 株式会社リコー Image projection system and image projection device

Also Published As

Publication number Publication date
WO2017065556A1 (en) 2017-04-20

Similar Documents

Publication Publication Date Title
US11169600B1 (en) Virtual object display interface between a wearable device and a mobile device
KR102393297B1 (en) A eletronic device and a method
KR102444075B1 (en) Electronic device, peripheral device, and control method thereof
WO2018099013A1 (en) Portable intelligent projection system
US11910178B2 (en) Orientated display method and apparatus for audio device, and audio device
US11689877B2 (en) Immersive augmented reality experiences using spatial audio
KR20160061133A (en) Method for dispalying image and electronic device thereof
KR20210046822A (en) Generation of shockwaves from 3D depth videos and images
WO2021061326A1 (en) Automated video capture and composition system
KR20170044399A (en) Image projection apparatus, projection area expanding method of thereof and non-transitory computer readable recording medium
JP6631014B2 (en) Display system and display control method
US11483569B1 (en) Device with dynamic transcode throttling
US20240082697A1 (en) Context-sensitive remote eyewear controller
TWI676130B (en) Display device
KR20170044383A (en) Image projection apparatus, image projection method of thereof and non-trnasitory computer readable recording medium
CN112381729B (en) Image processing method, device, terminal and storage medium
US11531390B1 (en) Augmented reality with eyewear triggered IoT
KR20170065160A (en) Projector and method for operating thereof
JP2015159460A (en) Projection system, projection device, photographing device, method for generating guide frame, and program
EP4172731A1 (en) Dynamic sensor selection for visual inertial odometry systems
KR20170044387A (en) Image projection apparatus, image compensation method of thereof and non-transitory computer readable recording medium
KR20170023491A (en) Camera and virtual reality system comorising thereof
US20230048968A1 (en) Electronic apparatus and controlling method thereof
US11394954B1 (en) Portable 3D projector with sharing to multiple viewers
US11902534B2 (en) Device with dynamic transcode throttling