KR20170063273A - Smart device group controling apparatus based on position of smart device and method thereof - Google Patents

Smart device group controling apparatus based on position of smart device and method thereof Download PDF

Info

Publication number
KR20170063273A
Authority
KR
South Korea
Prior art keywords
image
smart terminal
unit
information
grid
Prior art date
Application number
KR1020150169455A
Other languages
Korean (ko)
Inventor
김중기
권우현
황세찬
Original Assignee
주식회사 지나인
황세찬
권우현
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 지나인, 황세찬, 권우현 filed Critical 주식회사 지나인
Priority to KR1020150169455A priority Critical patent/KR20170063273A/en
Publication of KR20170063273A publication Critical patent/KR20170063273A/en

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/021Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/023Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • H04W4/206

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Telephonic Communication Services (AREA)

Abstract

The present invention relates to a location-based smart terminal group control apparatus comprising: a position measuring unit for measuring position information of a smart terminal; a location map forming unit for setting a grid within a predetermined setting area and mapping the position information of the smart terminal measured by the position measuring unit to each cell of the grid to form a location map; an image processing unit for detecting image information for each pixel in a pre-stored image signal; and a transmission unit for transmitting the image information detected for each cell by the image processing unit to the smart terminals mapped to the respective cells by the location map forming unit.

Description

TECHNICAL FIELD [0001] The present invention relates to a location-based smart terminal group control apparatus and a method thereof.

More particularly, the present invention relates to a location-based smart terminal group control apparatus and method capable of producing a group performance by measuring the position information of a plurality of smart terminals and outputting, on each smart terminal, image information corresponding to its measured position.

Most performances are staged at night or indoors, where a large audience is gathered, in order to produce efficient and vivid lighting effects. In such settings, audience members typically purchase a light stick or receive one from the performance agency.

However, when audience members purchase light sticks themselves, they bear this cost on top of the price of the performance ticket. On the other hand, when the performance company purchases light sticks in bulk and hands them out to the audience, the company must bear the purchase cost instead.

Recently, to make supporting messages more visible, audience members have begun composing performance messages on their smart terminals and displaying various phrases on the terminals' screens.

However, because each user of a smart terminal enters and displays whatever phrase he or she wishes, this method does not sufficiently reflect the intention or concept of the performance and lacks unity and consistency.

Background Art [0002] The background art of the present invention is disclosed in Korean Patent Registration No. 10-1384127.

SUMMARY OF THE INVENTION The present invention has been made to solve the above problems, and it is an object of the present invention to provide a location-based smart terminal group control apparatus and method capable of producing a group performance using a plurality of smart terminals by measuring the position information of the smart terminals and outputting, on each smart terminal, image information corresponding to its measured position.

Another object of the present invention is to provide a location-based smart terminal group control apparatus and method that can output, through the smart terminals in real time, the various kinds of image information required for a performance, thereby realizing diverse image effects suited to the performance.

A location-based smart terminal group control apparatus according to an aspect of the present invention includes: a position measuring unit for measuring position information of a smart terminal; a location map forming unit for setting a grid within a predetermined setting area and mapping the position information of the smart terminal measured by the position measuring unit to each cell of the grid to form a location map; an image processing unit for detecting image information for each pixel in a pre-stored image signal; and a transmission unit for transmitting the image information detected for each cell by the image processing unit to the smart terminals mapped to the respective cells by the location map forming unit.

The location map forming unit of the present invention may include: a setting area storage unit for storing the setting area; and a position information mapping unit for generating cells by setting the grid in the setting area, comparing the area information of the generated cells with the position information of the smart terminal measured by the position measuring unit, and mapping the position information of the smart terminal to the corresponding cells to form the location map.

The setting area of the present invention is set in advance so as to correspond to the actual viewing area in the theater.

The image processing unit of the present invention may include: an image converting unit for converting the resolution of the image of the image signal to correspond to the resolution of the grid; a matching unit for matching each pixel of the image converted by the image converting unit with each cell of the grid; and an image information detecting unit for detecting image information for each pixel matched by the matching unit.

The image converting unit may extract a predetermined number of images per second from the entire image in the frame per second of the image signal, and convert the resolution of the extracted image.

The image converting unit converts the resolution of the image into a resolution corresponding to the number of x-axis cells and the number of y-axis cells of the grid.

The transmission unit of the present invention is characterized in that it transmits the image information to the smart terminal at the time interval period at which the images extracted by the image converting unit are reproduced.

According to an aspect of the present invention, there is provided a location-based smart terminal group control method comprising: measuring, by a position measuring unit, position information of a smart terminal; setting, by a location map forming unit, a grid within a predetermined setting area and mapping the position information of the smart terminal measured by the position measuring unit to each cell of the grid to form a location map; converting, by an image processing unit, the resolution of the image of a pre-stored image signal to correspond to the resolution of the grid and detecting image information for each pixel by matching each pixel of the converted image with each cell of the grid; transmitting, by a transmission unit, the image information detected for each pixel by the image processing unit to the smart terminals mapped to the respective cells; and outputting, by each smart terminal, the image information received from the transmission unit.

In the step of forming the location map, the cells are generated by setting the grid in the setting area, the area information of the generated cells is compared with the position information of the smart terminal, and the position information of the smart terminal is mapped to the corresponding cells according to the comparison result, thereby forming the location map.

The setting area of the present invention is set in advance so as to correspond to the viewing area in the theater.

In the step of detecting image information for each pixel by matching each pixel of the image with each cell of the grid, a predetermined number of images per second is extracted from the entire image within the frames per second of the image signal, and the resolution of the extracted images is converted.

In the step of detecting image information for each pixel by matching each pixel of the image with each cell of the grid, the resolution of the image is converted into a resolution corresponding to the number of x-axis cells and the number of y-axis cells of the grid.

In the step of transmitting the image information detected for each pixel by the image processing unit to the smart terminals mapped to the respective cells, the transmission unit transmits the image information to the smart terminals at the time interval period at which the images extracted by the image converting unit are reproduced.

According to the present invention, the position information of each smart terminal is measured and image information corresponding to the measured position is output on each smart terminal, so that a group performance using a plurality of smart terminals can be produced and various image effects suited to the performance can be realized.

FIG. 1 is a block diagram of a location-based smart terminal group control apparatus according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating an example of the arrangement of AP devices and smart terminals in a setting area according to an embodiment of the present invention.
FIG. 3 is a conceptual diagram of a location map according to an embodiment of the present invention.
FIG. 4 is a block diagram of a position measuring unit according to an embodiment of the present invention.
FIG. 5 is a block diagram of an image processing unit according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating an example of image information according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating an example of the image information output of each smart terminal according to an embodiment of the present invention.
FIG. 8 is a flowchart of a location-based smart terminal group control method according to an embodiment of the present invention.

Hereinafter, a location-based smart terminal group control apparatus and method according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings. In this process, the thicknesses of the lines and the sizes of the components shown in the drawings may be exaggerated for clarity and convenience of explanation. Further, the terms described below are defined in consideration of the functions of the present invention, which may vary depending on the user, the intention or custom of the operator. Therefore, definitions of these terms should be made based on the contents throughout this specification.

FIG. 1 is a block diagram of a location-based smart terminal group control apparatus according to an embodiment of the present invention, FIG. 2 is a diagram illustrating the arrangement of AP devices and smart terminals in a setting area, FIG. 3 is a conceptual view of a location map, FIG. 4 is a block diagram of a position measuring unit, FIG. 5 is a block diagram of an image processing unit, FIG. 6 is a view illustrating image information, FIG. 7 is a view illustrating an example of the image information output of each smart terminal, and FIG. 8 is a flowchart of a location-based smart terminal group control method according to an embodiment of the present invention.

Referring to FIG. 1, a location-based smart terminal group control apparatus according to an embodiment of the present invention includes a position measuring unit 20, a location map forming unit 30, an image processing unit 40, and a transmission unit 50.

The location-based smart terminal group control apparatus according to the present embodiment outputs, on each smart terminal owned by each audience member at a performance venue or the like, image information corresponding to that viewer's position, thereby realizing various image effects across the smart terminals as a whole.

First, the position measuring unit 20 measures the position information of the smart terminal 10.

The smart terminal 10 is carried by each audience member who has come to the performance venue or the like. An audience member holding the smart terminal 10 sits in his or her seat and watches the performance when a seat in the venue has been designated in advance, or, when no seat is designated, may watch the performance from elsewhere in the venue. The smart terminal 10 may include various devices, such as a smart phone, that have an Internet of Things (IoT) function.

In this case, the position measuring unit 20 may use various methods for measuring the position information of the smart terminal 10.

For example, as shown in FIG. 2, a plurality of access point (AP) devices are installed in the actual viewing area 21 inside the venue, and each AP device outputs a unique beacon signal. Such AP devices can also be installed outdoors, so the present embodiment can be applied to various venues, both indoor and outdoor.

Each smart terminal 10 receives a beacon signal from a plurality of AP devices and measures its own location information using the reception strength of the beacon signals. The smart terminal 10 then transmits the measured location information and identification information to the location measurement unit 20.

Accordingly, the position measuring unit 20 can recognize the position information of each of the smart terminals 10 in the actual viewing area 21 as shown in FIG.

For reference, the above-described embodiment was described as one in which a plurality of AP devices are installed in the actual viewing area 21 and the position measuring unit 20 obtains the position information of each smart terminal 10 by way of those AP devices. However, the technical scope of the present invention is not limited to this embodiment, and various other methods of measuring the location information of the smart terminal 10 may be employed. For example, each smart terminal 10 may detect its own location through GPS (Global Positioning System) information and transmit that location, together with its identification information, to the position measuring unit 20, which then obtains the position information of each smart terminal 10.
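The beacon-based measurement described above can be illustrated with a short sketch. The following Python snippet is only a minimal illustration under assumed values, not an implementation taken from the patent: it converts the received strength of each AP beacon into an estimated distance with a log-distance path-loss model and then solves a small least-squares trilateration for the terminal position. The AP coordinates, reference power, and path-loss exponent are placeholder assumptions.

```python
import numpy as np

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exp=2.0):
    """Estimate distance (m) from RSSI with a log-distance path-loss model.
    tx_power_dbm is the assumed RSSI at 1 m; both constants are illustrative."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def trilaterate(ap_positions, distances):
    """Least-squares position estimate from three or more AP positions and distances.
    Linearises the circle equations against the last AP as a reference."""
    ap = np.asarray(ap_positions, dtype=float)
    d = np.asarray(distances, dtype=float)
    ref, d_ref = ap[-1], d[-1]
    A = 2.0 * (ap[:-1] - ref)
    b = (d_ref ** 2 - d[:-1] ** 2
         + np.sum(ap[:-1] ** 2, axis=1) - np.sum(ref ** 2))
    xy, *_ = np.linalg.lstsq(A, b, rcond=None)
    return xy  # estimated (x, y) of the smart terminal

# Example: three APs at assumed coordinates in the viewing area
aps = [(0.0, 0.0), (20.0, 0.0), (10.0, 15.0)]
rssi = [-68.0, -72.0, -60.0]                      # measured beacon strengths
dists = [rssi_to_distance(r) for r in rssi]
print(trilaterate(aps, dists))
```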

The location map forming unit 30 forms a location map by setting a grid within a predetermined setting area and mapping the position information of the smart terminal 10 measured by the position measuring unit 20 to each cell of the grid.

Referring to FIG. 4, the location map forming unit 30 includes a setting area storing unit 31 and a location information mapping unit 32.

The setting area storage unit 31 stores a setting area. The setting area can be preset in advance corresponding to the actual viewing area 21 in the theater. In this case, the setting area can be variously set according to the shape and size of the viewing area 21, the seating arrangement structure, and the like.

The position information mapping unit 32 generates a plurality of cells by setting a grid in the setting area stored in the setting area storage unit 31, and detects the area information of each generated cell. The position information mapping unit 32 then compares the area information of each cell with the position information of the smart terminals 10 measured by the position measuring unit 20, and maps the position information of each smart terminal 10 to the corresponding cell, thereby forming a location map.

That is, the position information mapping unit 32 compares the position information of each smart terminal 10 with the area information of each cell. For example, when the position information of a smart terminal 10 falls within the area information of any one cell, the mapping unit recognizes that the smart terminal 10 is located in that cell, and it repeats this process for each of the smart terminals 10 to form a location map covering all of the smart terminals 10.
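As a rough illustration of this mapping step, the sketch below divides a rectangular setting area into an x-by-y grid and assigns each terminal's measured coordinates to the cell whose area contains them, producing a location map from cell index to terminal identifiers. The data structures and function name are assumptions for illustration, not part of the patent.

```python
from collections import defaultdict

def build_location_map(terminal_positions, area_origin, area_size, grid_cols, grid_rows):
    """Map each terminal to the grid cell containing its measured position.

    terminal_positions: {terminal_id: (x, y)} in the same coordinates as the area
    area_origin: (x0, y0) corner of the setting area
    area_size:   (width, height) of the setting area
    grid_cols, grid_rows: number of x-axis and y-axis cells of the grid
    """
    x0, y0 = area_origin
    cell_w = area_size[0] / grid_cols
    cell_h = area_size[1] / grid_rows

    location_map = defaultdict(list)   # (col, row) -> [terminal ids]
    for tid, (x, y) in terminal_positions.items():
        col = int((x - x0) // cell_w)
        row = int((y - y0) // cell_h)
        if 0 <= col < grid_cols and 0 <= row < grid_rows:   # ignore positions outside the area
            location_map[(col, row)].append(tid)
    return location_map

# Example: a 4 x 3 grid over a 20 m x 15 m viewing area (values assumed)
terminals = {"phone-a": (3.2, 4.1), "phone-b": (12.7, 9.8)}
print(dict(build_location_map(terminals, (0.0, 0.0), (20.0, 15.0), 4, 3)))
```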

The image processing unit 40 detects image information for each pixel in the pre-stored image signal. Referring to FIG. 5, the image processing unit 40 includes an image converting unit 41, a matching unit 42, and an image information detecting unit 43.

The image converting unit 41 converts the resolution of the image of the image signal to correspond to the resolution of the grid. That is, the image converting unit 41 converts the resolution of the image to the resolution corresponding to the number of x-axis cells and the number of y-axis cells of the grid, thereby matching the resolution of the image with the resolution of the grid of the setting area.

The image converting unit 41 extracts a predetermined number of images per second from the entire image within the frames per second of the image signal, and converts the resolution of the extracted images. The number of image frames per second of a normal video signal may vary. For example, when the video signal has 40 frames per second and the predetermined number of images per second is 10, the image converting unit 41 periodically picks 10 of the 40 frames and converts the resolution of those images.
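The frame-extraction and down-scaling step might look roughly like the sketch below, assuming the video signal is readable with OpenCV. The 40 fps source rate and 10 images per second follow the example above; the function name and defaults are illustrative assumptions.

```python
import cv2

def extract_and_downscale(video_path, images_per_second, grid_cols, grid_rows):
    """Periodically pick frames from the video and resize each to the grid resolution.

    Yields (grid_rows x grid_cols x 3) RGB arrays, one per extracted image,
    so that each pixel corresponds to one grid cell.
    """
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 40.0          # e.g. 40 frames per second
    step = max(1, round(fps / images_per_second))    # e.g. every 4th frame for 10 img/s

    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            small = cv2.resize(frame, (grid_cols, grid_rows),
                               interpolation=cv2.INTER_AREA)
            yield cv2.cvtColor(small, cv2.COLOR_BGR2RGB)
        index += 1
    cap.release()

# Example: 10 images per second mapped onto a 4 x 3 grid (values assumed)
# for grid_image in extract_and_downscale("performance.mp4", 10, 4, 3):
#     ...  # hand each grid-resolution image to the transmission step
```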

The matching unit 42 matches each pixel of the image converted by the image converting unit 41 with each cell of the grid.

The image information detecting unit 43 detects image information, for example a color value, for each pixel matched by the matching unit 42.

The transmission unit 50 transmits the image information detected for each pixel by the image processing unit 40 to the smart terminals 10 mapped to the respective cells by the location map forming unit 30. That is, when the image information of each pixel has been detected by the image processing unit 40, the transmission unit 50 identifies the grid cell matched to each pixel and delivers the image information of that pixel to the smart terminals 10 mapped to that cell.

In this case, since the images are extracted according to the predetermined number of images per second by the image converting unit 41, the transmission unit 50 delivers the image information to the smart terminals 10 at the time interval period at which the extracted images are reproduced.
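A minimal sketch of this transmission step, under assumed data structures, is shown below: for each extracted grid-resolution image, the colour of the pixel matched to a cell is sent to every terminal mapped to that cell, and the next image follows after the reproduction interval. The send_color callable stands in for whatever network transport is actually used and is not specified by the patent.

```python
import time

def broadcast_frames(grid_images, location_map, images_per_second, send_color):
    """Send each cell's pixel colour to the terminals mapped to that cell.

    grid_images:  iterable of (rows x cols x 3) RGB arrays from the extraction step
    location_map: {(col, row): [terminal ids]} from the location-map step
    send_color:   callable(terminal_id, (r, g, b)), placeholder for the real transport
    """
    interval = 1.0 / images_per_second          # reproduction period of extracted images
    for image in grid_images:
        started = time.monotonic()
        for (col, row), terminal_ids in location_map.items():
            r, g, b = image[row, col]           # the pixel matched to this cell
            for tid in terminal_ids:
                send_color(tid, (int(r), int(g), int(b)))
        # keep the pace at which the extracted images are reproduced
        time.sleep(max(0.0, interval - (time.monotonic() - started)))
```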

Referring to FIGS. 6 and 7, each smart terminal 10 receives the image information of the cell matched to its own position information and outputs the received image information, so that the plurality of smart terminals 10 together output a single video image as a group.
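On the terminal side, outputting the received image information can be as simple as filling the whole display with the most recently received colour. The sketch below is purely illustrative: the UDP transport, three-byte R-G-B message format, and port number are assumptions, and a real deployment would use whatever protocol the transmission unit actually employs.

```python
import socket
import tkinter as tk

def run_terminal_display(port=50007):
    """Fill the screen with whatever RGB colour the control apparatus last sent."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    sock.settimeout(0.05)                       # keep the UI loop responsive

    root = tk.Tk()
    root.attributes("-fullscreen", True)
    root.configure(bg="#000000")

    while True:
        try:
            data, _ = sock.recvfrom(3)          # assumed format: one byte each for R, G, B
            if len(data) >= 3:
                r, g, b = data[0], data[1], data[2]
                root.configure(bg=f"#{r:02x}{g:02x}{b:02x}")
        except socket.timeout:
            pass
        root.update()                           # process window events between packets

# run_terminal_display()  # each audience terminal would run this during the performance
```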

Hereinafter, a location-based smart terminal group control method according to an embodiment of the present invention will be described in detail with reference to FIG. 8.

Referring to FIG. 8, the position measuring unit 20 measures the position information of the smart terminal 10 (S10).

In this case, the smart terminal 10 receives the beacon signals output from a plurality of AP devices, measures its own position information using the reception strengths of the beacon signals, and transmits the measured position information together with its identification information to the position measuring unit 20.

Accordingly, the position measuring unit 20 can recognize the position information of each of the smart terminals 10 in the actual viewing area 21 as shown in FIG.

Alternatively, each smart terminal 10 may detect its own location information via GPS information and transmit it together with its identification information to the position measuring unit 20, which thereby measures the position information of each smart terminal 10.

When the position information of the smart terminal 10 has been measured by the position measuring unit 20, the location map forming unit 30 generates a plurality of cells by setting a grid within the pre-stored setting area and detects the area information of each cell. The location map forming unit 30 then compares the area information of each cell with the position information of the smart terminals 10 measured by the position measuring unit 20 and maps at least one smart terminal 10 to each corresponding cell to form a location map (S20). In this case, when the position information of a smart terminal 10 falls within the area information of any one cell, the position information mapping unit 32 recognizes that the smart terminal 10 is located in that cell, and this process is repeated for each of the smart terminals 10, thereby forming a location map covering all of the smart terminals 10.

Next, the image processing unit 40 extracts a predetermined number of images per second from the entire image within the frames per second of the image signal (S30) and converts the resolution of the extracted images into a resolution corresponding to the number of x-axis cells and the number of y-axis cells of the grid (S40), thereby matching the resolution of the images with the resolution of the grid of the setting area.

After converting the resolution of the image into the resolution corresponding to the number of x-axis cells and the number of y-axis cells of the grid, the image processing unit 40 detects image information for each pixel matched by the matching unit 42 (S50).

When image information has been detected for each pixel by the image processing unit 40, the transmission unit 50 transmits the image information detected for each pixel to the smart terminals 10 mapped to the respective cells by the location map forming unit 30 (S60). In this case, the image information is transmitted at the time interval period at which the images extracted by the image converting unit 41 are reproduced.

Accordingly, each smart terminal 10 receives from the transmission unit 50 the image information of the cell matched to its own position information and outputs the received image information (S70), whereby the plurality of smart terminals 10 together output a single video image as a group.

As described above, the present embodiment measures the position information of each of a plurality of smart terminals 10 and outputs image information corresponding to the measured positions on each smart terminal 10, so that a group performance using the plurality of smart terminals 10 can be produced and various image effects suited to the performance can be realized.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those skilled in the art that these embodiments are provided by way of illustration and example only and are not to be taken as limiting. Accordingly, the true technical scope of the present invention should be determined by the following claims.

10: Smart terminal
20: Position measuring unit
21: Actual viewing area
30: Location map forming unit
31: Setting area storage unit
32: Position information mapping unit
40: Image processing unit
41: Image converting unit
42: Matching unit
43: Image information detecting unit
50: Transmission unit

Claims (13)

1. A location-based smart terminal group control apparatus comprising:
a position measuring unit for measuring position information of a smart terminal;
a location map forming unit for setting a grid within a predetermined setting area and mapping the position information of the smart terminal measured by the position measuring unit to each cell of the grid to form a location map;
an image processing unit for detecting image information for each pixel in a pre-stored image signal; and
a transmission unit for transmitting the image information detected for each pixel by the image processing unit to the smart terminals mapped to the respective cells by the location map forming unit.
2. The apparatus of claim 1, wherein the location map forming unit comprises:
a setting area storage unit for storing the setting area; and
a position information mapping unit for generating cells by setting the grid in the setting area, comparing the area information of the generated cells with the position information of the smart terminal measured by the position measuring unit, and mapping the position information of the smart terminal to the corresponding cells to form the location map.
3. The apparatus of claim 1, wherein the setting area is set in advance to correspond to an actual viewing area in a venue.
4. The apparatus of claim 1, wherein the image processing unit comprises:
an image converting unit for converting a resolution of an image of the image signal to correspond to a resolution of the grid;
a matching unit for matching each pixel of the image converted by the image converting unit with each cell of the grid; and
an image information detecting unit for detecting image information for each pixel matched by the matching unit.
5. The apparatus of claim 4, wherein the image converting unit extracts a predetermined number of images per second from the entire image within the frames per second of the image signal, and converts the resolution of the extracted images.
6. The apparatus of claim 4, wherein the image converting unit converts the resolution of the image into a resolution corresponding to the number of x-axis cells and the number of y-axis cells of the grid.
7. The apparatus of claim 4, wherein the transmission unit transmits the image information to the smart terminal at the time interval period at which the images extracted by the image converting unit are reproduced.
8. A location-based smart terminal group control method comprising:
measuring, by a position measuring unit, position information of a smart terminal;
setting, by a location map forming unit, a grid within a predetermined setting area and mapping the position information of the smart terminal measured by the position measuring unit to each cell of the grid to form a location map;
converting, by an image processing unit, the resolution of an image of a pre-stored image signal to correspond to the resolution of the grid, and detecting image information for each pixel by matching each pixel of the converted image with each cell of the grid;
transmitting, by a transmission unit, the image information detected for each pixel by the image processing unit to the smart terminals mapped to the respective cells; and
outputting, by each smart terminal, the image information received from the transmission unit.
9. The method of claim 8, wherein the forming of the location map comprises generating the cells by setting the grid in the setting area, comparing the area information of the generated cells with the position information of the smart terminal, and mapping the position information of the smart terminal to the corresponding cells according to the comparison result to form the location map.
10. The method of claim 8, wherein the setting area is preset to correspond to a viewing area in a venue.
11. The method of claim 8, wherein the detecting of image information for each pixel by matching each pixel of the image with each cell of the grid comprises extracting a predetermined number of images per second from the entire image within the frames per second of the image signal, and converting the resolution of the extracted images.
12. The method of claim 8, wherein the detecting of image information for each pixel by matching each pixel of the image with each cell of the grid comprises converting the resolution of the image into a resolution corresponding to the number of x-axis cells and the number of y-axis cells of the grid.
13. The method of claim 8, wherein, in the transmitting of the image information detected for each pixel by the image processing unit to the smart terminals mapped to the respective cells, the transmission unit delivers the image information to the smart terminals at the time interval period at which the images extracted by the image converting unit are reproduced.
KR1020150169455A 2015-11-30 2015-11-30 Smart device group controling apparatus based on position of smart device and method thereof KR20170063273A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150169455A KR20170063273A (en) 2015-11-30 2015-11-30 Smart device group controling apparatus based on position of smart device and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150169455A KR20170063273A (en) 2015-11-30 2015-11-30 Smart device group controling apparatus based on position of smart device and method thereof

Publications (1)

Publication Number Publication Date
KR20170063273A true KR20170063273A (en) 2017-06-08

Family

ID=59221130

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150169455A KR20170063273A (en) 2015-11-30 2015-11-30 Smart device group controling apparatus based on position of smart device and method thereof

Country Status (1)

Country Link
KR (1) KR20170063273A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101970358B1 (en) * 2017-12-12 2019-04-18 엘지전자 주식회사 Central server and performance system including central server
WO2019117409A1 (en) * 2017-12-12 2019-06-20 엘지전자 주식회사 Central server and dramatic performance system including same
CN111542864A (en) * 2017-12-12 2020-08-14 百赫娱乐有限公司 Central server and performance system including the same
US11452192B2 (en) 2017-12-12 2022-09-20 Hybe Co., Ltd Central server and dramatic performance system including same

Similar Documents

Publication Publication Date Title
US10853992B1 (en) Systems and methods for displaying a virtual reality model
RU2679115C2 (en) External control lighting systems based on third party content
JP2017224282A (en) Method and system for transmitting information
US9766057B1 (en) Characterization of a scene with structured light
JP2018505584A (en) Interactive binocular video display
CN104254869A (en) A method and system for projecting a visible representation of infrared radiation
KR102197704B1 (en) Augmented Reality Based Parking Guidance System in Indoor Parking Lot
ATE553597T1 (en) SELF-REGULATING STEREOSCOPIC CAMERA SYSTEM
US20220277544A1 (en) Homography through satellite image matching
JP6568854B2 (en) Multi-screen screening video generation method, storage medium thereof, and video management apparatus using the same
WO2013008584A1 (en) Object display device, object display method, and object display program
KR101083245B1 (en) Regional information extraction method, regional information output method, and apparatus for the same
CN105306730A (en) System and method for automatically switching contextual model
CN109374002A (en) Air navigation aid and system, computer readable storage medium
JP2015204548A (en) image projection system
KR20170063273A (en) Smart device group controling apparatus based on position of smart device and method thereof
KR20220073684A (en) Method, system and non-transitory computer-readable recording medium for supporting user experience sharing
KR102056727B1 (en) Image advertising intermediation service system using image security apparatus
CN110796706A (en) Visual positioning method and system
US11227440B2 (en) Systems and methods for providing an audio-guided virtual reality tour
KR20220002654A (en) Systems and methods for dynamically loading area-based augmented reality content
WO2018105436A1 (en) Cooperative display system
US20160119614A1 (en) Display apparatus, display control method and computer readable recording medium recording program thereon
CN109214482A (en) A kind of indoor orientation method, device, terminal device and storage medium
JP2017084181A (en) Method for adjusting use of space

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right