KR101369358B1 - Display control system and recording medium thereof - Google Patents

Display control system and recording medium thereof

Info

Publication number
KR101369358B1
Authority
KR
South Korea
Prior art keywords
user
control menu
foot
display
control
Prior art date
Application number
KR1020130093271A
Other languages
Korean (ko)
Inventor
홍주희
Original Assignee
홍주희
(주) 보우테크
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 홍주희 and (주) 보우테크
Priority to KR1020130093271A
Application granted
Publication of KR101369358B1

Classifications

    • G03B21/00: Projectors or projection-type viewers; accessories therefor
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/0334: Foot operated pointing devices
    • G06F3/0346: Pointing devices displaced or positioned by the user with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF (six degrees of freedom) pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a display control system and a recording medium thereof.
According to an aspect of the present invention, a display control system for controlling a screen of a display in a user-interactive manner comprises: a projector for projecting a control menu image so that a control menu is displayed at a predetermined controlled position within a sensing area set on the floor surface in front of the display; three-dimensional sensing means for sensing, by a three-dimensional sensing method, the foot position of a user located in the sensing area; and computing means for controlling the position of the control menu image projected by the projector, receiving sensing information on the user's foot position detected by the three-dimensional sensing means, comparing the floor-surface coordinate values of the projected control menu image with the floor-surface coordinate values of the sensed foot position, and performing display screen control according to a control menu manipulation when the comparison is determined to correspond to a preset control menu manipulation.

Description

Display control system and recording medium thereof

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a display control system and a recording medium thereof. In controlling the screen of a user-interactive display such as a kiosk or digital signage, a control menu is projected onto the floor near the user's feet, a change in the user's foot position or foot motion on the control-menu projection surface is detected by a three-dimensional sensing method, and that change is recognized as a control command, so that the user can control the screen of the display without using a hand.

Various attempts have been made in the past to control user-interactive signage such as kiosks.

For example, in the case of a touch screen display, a control menu is displayed on the touch screen, and the display directly detects the user's touch position, or changes in it, and receives it as a control command.

As another example, when air-touch (hand motion) detection means are employed, the user's hand position, or a change in hand motion in front of the display, is sensed through an optical sensor such as an infrared sensor and imaging means such as a camera, and a control command is received according to the change in the user's hand position or hand motion.

As an example of an air-touch kiosk, a depth camera is provided at the top of a display placed in front of a user, and the kiosk device is configured to recognize the user's input operation as a control command by recognizing the distance difference (z-axis value) between the user's hand and the depth camera that arises from the user's hand movement.

To this end, the depth camera is provided with imaging means such as a CCD camera together with an infrared emitter/receiver; the hand position value measured by the infrared emitter/receiver is compared with the captured image of the user's hand motion to sense the position and/or motion of the user's hand, and the control command input of the kiosk is executed by comparing the result with the menu input operations previously set in the computer device.

However, the conventional hand-based control command detection method has the fundamental problem that a user holding an object, or a user with a hand disability, cannot use it.

In addition, since the user's motion is recognized in three-dimensional space without presenting a separate reference position or coordinates, the probability of malfunction is relatively high, especially when the user is far from the depth camera.

In addition, in the case of the touch screen method, an unspecified number of users touch the screen with potentially contaminated hands, which raises hygiene problems.

Republic of Korea Patent No. 10-1212893 (Registration date December 10, 2012)

An object of the present invention, to solve the problems of the prior art, is to provide a display control system and a recording medium in which, when controlling the screen of a user-interactive display such as a kiosk or digital signage, a control menu is projected onto the floor near the user's feet, a change in the user's foot position or foot motion on the control-menu projection surface is detected by a three-dimensional sensing method, and that change is recognized as a control command, so that the user can control the screen of the display without using a hand.

According to an aspect of the present invention for achieving the above object, there is disclosed a display control system for controlling the screen of a display in a user-interactive manner, comprising: a projector for projecting a control menu image so that a control menu is displayed at a predetermined controlled position within a sensing area set on the floor surface in front of the display; three-dimensional sensing means for sensing, by a three-dimensional sensing method, the foot position of a user located in the sensing area; and computing means for controlling the position of the control menu image projected by the projector, receiving sensing information on the user's foot position detected by the three-dimensional sensing means, comparing the floor-surface coordinate values of the projected control menu image with the floor-surface coordinate values of the sensed foot position, and performing display screen control according to a control menu manipulation when the comparison is determined to correspond to a preset control menu manipulation.

Preferably, the computing means may control the projector to project the control menu image at a position spaced apart by a predetermined interval from the position of the user's foot sensed by the three-dimensional sensing means.

Preferably, the computing means may control the projector to project the control menu image when the foot position of the user sensed by the 3D sensing means does not change for more than a predetermined waiting time.

Preferably, when the foot position of the user sensed by the three-dimensional sensing means does not change for more than a predetermined waiting time over a predetermined position of the control menu image, the computing means may compare the floor-surface coordinate values of the control menu image projected by the projector with the floor-surface coordinate values of the sensed foot position and determine that they correspond to a preset control menu manipulation.

Preferably, when the three-dimensional sensing means detects a plurality of user feet, the computing means may recognize one foot as the foot for menu manipulation input according to a predetermined selection criterion.

Preferably, when the three-dimensional sensing means detects a plurality of user feet, the computing means may select and recognize each of the plurality of feet, up to a predetermined number, as a foot for menu manipulation input, and control the number and positions of the control menu images projected by the projector so that a separate control menu image is provided for each recognized foot.

Preferably, when the user's foot position detected by the three-dimensional sensing means spans the boundary between the display area of the control menu image and another area, or the boundary between two or more control menu areas, the computing means may determine that the user's foot is located in the area in which the larger portion of the foot area lies.

Preferably, the computing means may use, as the floor-surface coordinate value of the user's foot position, the coordinate value of the toe point of the foot closer to the three-dimensional sensing means.

Preferably, when comparing the floor-surface coordinate values of the control menu image projected by the projector with the floor-surface coordinate values of the user's foot position detected by the three-dimensional sensing means, the computing means may also use the change state of the floor-surface coordinate values of the user's foot position as a determination element.

Preferably, when the user's foot position detected by the three-dimensional sensing means is located within the area of the control menu image, the computing means may keep the projection position of the control menu image unchanged until the user's foot position leaves the area of the control menu image.

Preferably, the control menu image may be configured to include a plurality of control menu selection areas to enable the selection of one of a plurality of control menus.

According to another aspect of the present invention, a computer-readable recording medium having recorded thereon a program for executing each function of the computing means of the display control system is disclosed.

According to the present invention, the display screen can be controlled solely by a change in the user's foot position or foot motion, so that a user holding an object in the hand, or a disabled user with a hand impairment, can use user-interactive signage such as a kiosk without difficulty.

In addition, since the computing means recognizes the user's foot position or motion against the control-menu coordinates of the control menu projected on the floor in advance, the probability of control malfunction is lower than in the air-touch (hand motion) method, which operates without separate menu means.

In addition, there is an advantage in terms of user hygiene compared to the touch screen method, in which a large number of unspecified users directly touch the display surface.

FIG. 1 is a schematic diagram showing the overall configuration of a display control system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating the process of comparing the coordinate values of a control menu image with the coordinate values of a user's foot position in a display control system according to an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating the process of selecting one foot as the foot for menu operation input when a plurality of user feet are detected in a display control system according to an exemplary embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating the process of providing a separate control menu image for each of a plurality of feet when a plurality of user feet are detected in a display control system according to an embodiment of the present invention;
FIG. 5 is a schematic diagram explaining the process of using the change state of the coordinate values of a user's foot position in a display control system according to an exemplary embodiment of the present invention.

The present invention may be embodied in many other forms without departing from its spirit or essential characteristics. Accordingly, the embodiments of the present invention are to be considered in all respects as merely illustrative and not restrictive.

The terms first, second, etc. may be used to describe various components, but the components should not be limited by these terms; the terms are used only to distinguish one component from another. For example, without departing from the scope of the present invention, a first component may be termed a second component, and similarly a second component may be termed a first component. The term "and/or" includes any combination of a plurality of related listed items, or any one of a plurality of related listed items.

When a component is referred to as being "connected" or "coupled" to another component, it may be directly connected or coupled to that other component, but it should be understood that other components may be present in between. On the other hand, when a component is referred to as being "directly connected" or "directly coupled" to another component, it should be understood that there are no other components in between.

The terminology used in this application is used only to describe specific embodiments and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In the present application, terms such as "comprises" or "having" are intended to specify the presence of stated features, numbers, steps, operations, components, parts, or combinations thereof, and should not be understood to exclude in advance the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.

Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries should be interpreted as having a meaning consistent with their meaning in the context of the related art, and should not be interpreted in an idealized or overly formal sense unless expressly so defined in the present application.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings; like or corresponding elements are denoted by the same reference numerals, and duplicate descriptions thereof are omitted. In the following description, where a detailed description of related known technology could obscure the gist of the present invention, that description is omitted.

FIG. 1 is a schematic diagram showing the overall configuration of a display control system according to an embodiment of the present invention.

The display control system of the present embodiment controls the screen of the display 40 in a user-interactive manner. It can be applied, for example, to customer-guidance kiosks used in exhibition halls, museums, or event halls, or to digital signage in multi-use facilities.

On the screen of the display 40, for example, guide material for exhibits in an exhibition hall or museum may be implemented in a user-interactive manner. Here, "user interaction" means that the user does not simply watch the content implemented on the screen of the display 40, but, in a form similar to operating a smart device, requests input to view desired content in more detail or requests switching to other content, so that the user can control how the content displayed on the screen is presented.

Content implemented on the screen of the display 40 may be stored in the computing means 10, described later, in the form of document files, image files, video files, voice files, and the like. When a user is detected entering the sensing area (PA1 to PA4), the content may be provided as images and sound through the screen and a speaker according to the content pre-programmed for the screen of the display 40.

In addition, guide content may be provided on the screen of the display 40 so that the user U can generate control inputs of various user-interaction types using the foot F.

The computing means 10 can be understood as a conventional computer system (e.g., a PC) having computing elements such as a central processing unit, a system DB, a system memory, and interfaces, on which a program for reproducing stored content files and a program for controlling the display in a user-interactive manner are installed and run. In functional terms, the computing means 10 may include a content realization module for implementing a content file as video and audio on the display, a projector interworking module for displaying the control menu at a predetermined controlled location in conjunction with the projector, and a three-dimensional-sensing interworking module for three-dimensionally sensing the foot position of a user located in the sensing area in conjunction with the three-dimensional sensing means; a content DB for storing various content files may also be provided. The program implementing each function may take the form of a separate application or of middleware. The computing means 10 may be implemented as a local server or as a client terminal networked with a central administration server. Since the conventional configuration of such computing means is well known in the art, a detailed description thereof is omitted.
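The module structure just described can be pictured with a short sketch. The following Python skeleton is illustrative only; all class and method names are assumptions for exposition, not names from the patent.

```python
# Structural sketch of the computing means (10); names are illustrative.

class ContentRealizationModule:
    """Implements a stored content file as video/audio on the display (40)."""
    def play(self, content_file: str) -> None:
        print(f"playing {content_file} on the display")

class ProjectorInterworkingModule:
    """Asks the projector (20) to draw the control menu at a floor position."""
    def project_menu(self, x: float, z: float) -> None:
        print(f"projecting control menu at floor ({x:.2f}, {z:.2f})")

class SensingInterworkingModule:
    """Receives foot-position samples from the 3D sensing means (30)."""
    def latest_foot_position(self) -> tuple[float, float]:
        return (0.0, 0.0)  # stub: a real system would query the depth camera

class ComputingMeans:
    """Ties the three modules and the content DB together."""
    def __init__(self) -> None:
        self.content = ContentRealizationModule()
        self.projector = ProjectorInterworkingModule()
        self.sensing = SensingInterworkingModule()
        self.content_db = {"intro": "intro.mp4"}  # stand-in for the content DB
```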

The display control system of the present embodiment includes a projector 20 for projecting a control menu image A so that the control menu is displayed at a predetermined controlled position within the sensing area AA set on the floor surface in front of the display 40.

Any type of known projector can be employed as the projector 20 as long as it can function as the control-menu projection means of the computing means 10; for example, a pico projector capable of projecting onto a limited projection area can be used.

Since the position of the projected control menu image A must be controlled in cooperation with the computing means 10, the projector 20 is prepared so that the floor-surface coordinate values of the projected content are set in advance as floor-surface display coordinates by the computing means 10. For example, if PA1 to PA4 of FIG. 1 are regarded as the projectable area of the projector 20 (or the sensing area of the three-dimensional sensing means 30), the computing means 10 calculates or recognizes in real time the floor coordinates of the menu image projected within the PA1 to PA4 area, taking into account physical factors such as the installation position and projection angle of the projector 20. Since configurations in which computing means calculate or recognize the coordinate values of a projected image in real time are well known in the field of computer displays, a detailed description thereof is omitted.
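One conventional way to realize such a pixel-to-floor correspondence is a planar homography fitted from the four corners of the projectable area. The sketch below is an assumption about how this could be implemented; the projector resolution and corner coordinates are invented for illustration.

```python
import numpy as np

# Fit a homography H so that floor ~ H @ pixel from the four corners of the
# projectable area (PA1..PA4). Resolution and corner values are invented.
def fit_homography(px_corners, floor_corners):
    """Standard 4-point DLT: build 8 equations, take the SVD null vector."""
    A = []
    for (u, v), (x, z) in zip(px_corners, floor_corners):
        A.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        A.append([0, 0, 0, u, v, 1, -z * u, -z * v, -z])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 3)

def pixel_to_floor(H, u, v):
    """Map a projector pixel (u, v) to floor coordinates (X, Z)."""
    x, z, w = H @ np.array([u, v, 1.0])
    return x / w, z / w

# Example: a 1280x720 projector covering a 2 m x 2 m floor patch.
H = fit_homography(
    px_corners=[(0, 0), (1280, 0), (1280, 720), (0, 720)],
    floor_corners=[(-1.0, 2.0), (1.0, 2.0), (1.0, 0.0), (-1.0, 0.0)],  # PA1..PA4
)
print(pixel_to_floor(H, 640, 360))  # floor position of the image centre
```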

Meanwhile, although the projector 20 is illustrated in FIG. 1 as being attached to the lower side of the display 40, it may be installed at various positions for proper image projection; for example, on the ceiling, or next to the three-dimensional sensing means 30. In any such installation position, the coordinate values of the projected image and the coordinate values recognized by the computing means 10 should be matched in advance.

The display control system of the present embodiment also includes three-dimensional sensing means 30 for sensing the foot position P of the user located in the sensing area in a three-dimensional sensing manner.

Any type of well-known three-dimensional sensing means may be employed as the three-dimensional sensing means 30 as long as it can three-dimensionally sense the position of the user, in particular the position of the foot; preferably, a Kinect-type depth camera can be used. A depth camera of this type obtains distance information of a three-dimensional scene by analyzing continuously projected infrared structured light; commercial products, such as those based on PrimeSense technology, are available. For example, a commercial product may have a depth resolution of 640 x 480 at 30 fps and an effective distance of about 3.5 m, sensing the depth or position of an object with relatively high accuracy.

In addition to the depth camera described above, any form of known three-dimensional sensing means may be employed, such as a general image-processing sensor that calculates the distance from the sensor by image analysis, or an infrared sensor using an IR emitter/receiver, provided that three-dimensional distance and coordinate computation of the sensed object is possible.

Since the three-dimensional sensing means 30 must deliver sensing information about the user's foot position P to the computing means 10, it is prepared so that the coordinate values of the area to be sensed are set in advance as spatial coordinates by the computing means 10. For example, if PA1 to PA4 of FIG. 1 are the sensing area of the three-dimensional sensing means 30, the computing means 10 calculates or recognizes in real time where within the PA1 to PA4 region the user's foot position P, detected by the three-dimensional sensing means 30, is located. Since configurations in which computing means calculate or recognize the coordinate values of a sensed object are generally known, a detailed description thereof is omitted.
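As a concrete, simplified illustration of how a depth sample could be turned into a floor coordinate, the sketch below back-projects a depth-camera pixel through a pinhole model and assumes the camera looks straight down; the intrinsics and mounting height are invented values, not parameters from the patent.

```python
# Convert a depth-camera sample (pixel u, v and depth d in metres) into
# floor coordinates (X, Z). All constants here are illustrative; a real
# system would use the camera's own calibration values.
FX, FY, CX, CY = 525.0, 525.0, 320.0, 240.0   # assumed depth-camera intrinsics
CAM_HEIGHT = 2.0                               # sensor assumed 2 m above floor

def depth_to_floor(u: int, v: int, d: float) -> tuple[float, float]:
    # Back-project the pixel to a camera-space 3D point (pinhole model).
    x_cam = (u - CX) * d / FX
    y_cam = (v - CY) * d / FY
    # With the camera assumed to look straight down from CAM_HEIGHT,
    # camera x maps to floor X and camera y maps to floor Z.
    return x_cam, y_cam

print(depth_to_floor(400, 300, 1.9))
```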

Meanwhile, although the three-dimensional sensing means 30 is illustrated in FIG. 1 as being attached to the upper side of the display 40, it may be installed at various positions to perform an appropriate sensing function; for example, on the ceiling, or on separate mounting means spaced a predetermined distance from the display 40. In any such installation position, the position or coordinate value of the sensing object (the user's foot) and the position or coordinate value recognized by the computing means 10 should be matched in advance.

Meanwhile, in the above description the three-dimensional sensing means 30 detects the foot position of the user located in the sensing area in a three-dimensional sensing manner and the computing means 10 receives the sensing information in real time; alternatively, the three-dimensional sensing means 30 may serve mainly as an image-capturing function for capturing a foot image, with the calculation of the foot position performed by the computing means 10.

The computing means 10 included in the display control system of the present embodiment controls the position of the control menu image A projected by the projector 20, receives sensing information about the user's foot position P sensed by the three-dimensional sensing means 30, and compares the floor-surface coordinate values (the X-Z coordinates of FIG. 1) of the control menu image A projected by the projector 20 with the floor-surface coordinate values of the user's foot position P detected by the three-dimensional sensing means 30; when the comparison is determined to correspond to a preset control menu manipulation, it performs screen control of the display 40 according to that menu manipulation.

Preferably, the control menu image A includes a plurality of control menu selection areas (the areas 1, 2, 3, 4 divided by the lines connecting points A0 to A5 of FIG. 1) so that one of a plurality of control menus can be selected. As the user selectively steps on each of these areas, the computing means 10 can receive and recognize the control command corresponding to that area through the process described above.
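For a fan-shaped menu such as the one in FIG. 1, deciding which selection area a floor point falls in reduces to a radius-and-angle test. The sketch below assumes an invented menu geometry (centre A0, radii, a 120-degree span, four sectors); none of these dimensions come from the patent.

```python
import math

# Radius-and-angle test for a fan-shaped menu. Centre, radii, span and
# sector count are invented dimensions for illustration.
MENU_CENTER = (0.0, 0.5)        # A0 on the floor (X, Z)
R_INNER, R_OUTER = 0.15, 0.45   # metres
SPAN_DEG, SECTORS = 120.0, 4

def menu_sector(x: float, z: float) -> int | None:
    """Return the sector number 1..4, or None outside the menu."""
    dx, dz = x - MENU_CENTER[0], z - MENU_CENTER[1]
    r = math.hypot(dx, dz)
    if not (R_INNER <= r <= R_OUTER):
        return None
    angle = math.degrees(math.atan2(dz, dx))   # 0 degrees along +X
    start = (180.0 - SPAN_DEG) / 2             # fan centred straight ahead
    if not (start <= angle <= start + SPAN_DEG):
        return None
    idx = int((angle - start) // (SPAN_DEG / SECTORS))
    return min(idx, SECTORS - 1) + 1           # clamp the upper boundary

print(menu_sector(0.1, 0.8))  # sector under a toe point at (0.1, 0.8)
```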

Preferably, the computing means 10 may use, as the floor-surface coordinate value of the user's foot position P, the coordinate value of the toe point of the foot closer to the three-dimensional sensing means 30 (the foot located at the shortest distance in the XYZ coordinates of FIG. 1). Although the three-dimensional sensing means 30 may recognize the user's foot in various ways according to the principle of each sensor, in the present embodiment, for example, the lowest-lying part of the user's body recognized by the sensor in the form of depth values may be recognized as the foot, and the far end of the foot may be set to be recognized as the user's foot position P. In this case, the three-dimensional sensing means 30 recognizes the user's foot position P in the form of a depth value from the sensor, but the computing means 10 is set to convert the position recognized from the depth value into a coordinate value on the floor, so that the foot position can ultimately be treated as a floor-surface coordinate value.
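The foot-extraction rule just described (the lowest body points are the feet; the toe of the nearer foot gives P) might be approximated on a point cloud as follows. The 10 cm height threshold and the median split between the two feet are simplifying assumptions, not values from the patent.

```python
import numpy as np

# Approximate the rule above on a user point cloud: points near the floor
# are the feet; take the nearer foot's forward-most point as P.
def toe_point(points: np.ndarray) -> tuple[float, float]:
    """points: (N, 3) samples as (X, height Y, forward Z) in the floor frame."""
    feet = points[points[:, 1] < 0.10]                # within 10 cm of the floor
    if feet.size == 0:
        raise ValueError("no points near the floor")
    near = feet[feet[:, 2] <= np.median(feet[:, 2])]  # crude nearer-foot split
    toe = near[np.argmin(near[:, 2])]                 # forward-most sample
    return float(toe[0]), float(toe[2])               # floor coordinates (X, Z)
```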

Preferably, the computing means 10 controls the projector 20 to project the control menu image A at a position spaced apart by a predetermined interval (distance D of FIG. 1, for example 30 cm) from the user's foot position P detected by the three-dimensional sensing means 30. With this configuration, the control menu image A can be provided at a position where the user can comfortably operate the menu from where the user stands. In addition, when the user moves, the menu can be moved in the same manner.

Preferably, the computing means 10 controls the projector 20 to project the control menu image A when the user's foot position P detected by the three-dimensional sensing means 30 does not change for more than a predetermined waiting time (for example, 2 seconds). With this configuration, even when the user moves the feet naturally, the computing means 10 recognizes a standby state for menu operation only when the user's foot position P has been stable for the waiting time, and can therefore present the menu reliably.
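This waiting-time behaviour, combined with the D-metre offset of the previous paragraph, can be expressed as a small dwell detector. In the sketch below, the 2-second wait matches the example above; the 3 cm jitter radius and the placement direction are assumptions.

```python
import math

WAIT_S, JITTER_M, D_M = 2.0, 0.03, 0.30   # wait time, jitter radius, offset D

class DwellDetector:
    """Arms once the foot stays within JITTER_M for WAIT_S seconds."""
    def __init__(self) -> None:
        self.anchor: tuple[float, float] | None = None
        self.since = 0.0

    def update(self, x: float, z: float, now: float) -> tuple[float, float] | None:
        """Feed one foot sample; returns a menu position once dwell completes."""
        if self.anchor is None or math.dist(self.anchor, (x, z)) > JITTER_M:
            self.anchor, self.since = (x, z), now   # foot moved: restart timer
            return None
        if now - self.since >= WAIT_S:
            ax, az = self.anchor
            return (ax, az + D_M)   # project the menu D metres ahead of the foot
        return None
```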

Preferably, when the user's foot position P detected by the three-dimensional sensing means 30 is located within the area of the control menu image A, the computing means 10 does not change the projection position of the control menu image A until the user's foot position P leaves that area. With this configuration, the user is continuously provided with the control menu image A at a comfortable operating position until the user intentionally steps out of the area of the control menu image A.

Hereinafter, with reference to the accompanying drawings, a detailed control operation of the computing means 10 provided in the display control system of the present embodiment will be described.

FIG. 2 is a schematic diagram illustrating a process of comparing a coordinate value of a control menu image of a display control system and a coordinate value of a foot position of a user according to an embodiment of the present invention.

Preferably, when the user's foot position P detected by the three-dimensional sensing means 30 does not change for more than a predetermined waiting time (for example, 1 second) over a predetermined position of the control menu image A, the computing means 10 compares the floor-surface coordinate values of the control menu image A projected by the projector 20 with the floor-surface coordinate values of the sensed foot position P and determines that they correspond to a preset control menu manipulation.

For example, when the coordinate value of the toe point P of the foot F of the user U is taken as the floor-surface coordinate value of the user's foot position, then in FIG. 2 the foot at FP1 may select menu 1, the foot at FP2 menu 3, the foot at FP3 menu 3, and the foot at FP4 may yield no selection or an error.

From another viewpoint, when the user's foot position P detected by the three-dimensional sensing means 30 spans the boundary between the display area of the control menu image A and another area, or the boundary between two or more control menu areas, the computing means 10 determines that the user's foot is located in the area in which the larger portion of the foot lies. For example, in the case of FP3 in FIG. 2, if the toe point P is taken as the floor-surface coordinate value, menu 3 is recognized as selected; but if the decision is based on the larger foot area FP3-2, menu 4 may be selected.
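A straightforward way to apply this larger-area rule is to sample the foot's footprint on the floor and let each sample vote for the menu area beneath it. The sketch below is illustrative; it could reuse a point-to-area test such as the menu_sector helper sketched earlier.

```python
from collections import Counter
from typing import Callable, Optional

# Majority vote over the foot's floor footprint: the menu area that covers
# the larger share of the foot wins, as described above.
def majority_area(foot_cells: list[tuple[float, float]],
                  area_of: Callable[[float, float], Optional[int]]) -> Optional[int]:
    """foot_cells: points covering the foot; area_of: point -> area id or None."""
    votes = Counter(area_of(x, z) for x, z in foot_cells)
    votes.pop(None, None)             # ignore samples outside every menu area
    return votes.most_common(1)[0][0] if votes else None
```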

Meanwhile, although the control menu image A is illustrated above as a fan shape, various shapes such as a circle, a rectangle, and a triangle are possible. In addition, the action triggered by each menu selection may be set in various ways, such as selecting content, advancing to the next content, start, end, or confirming additional information.

FIG. 3 is a schematic diagram illustrating the process of selecting one foot as the foot for menu operation input when a plurality of user feet are detected in the display control system according to an exemplary embodiment of the present invention.

Preferably, when the three-dimensional sensing means 30 detects a plurality of user feet, the computing means 10 selects and recognizes one foot as the foot for menu manipulation input according to a predetermined selection criterion. FIG. 3 illustrates the case where the foot FP7 closest to the line connecting PA2-PA3, the front end of the sensing area, is selected and recognized as the foot for menu manipulation input. When a plurality of feet are equally close to the line connecting PA2-PA3, the foot with the longer stationary time may, for example, be selected as the foot for menu operation input. These criteria can be varied to suit the use environment.
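The selection criterion of FIG. 3 can be stated compactly: prefer the foot nearest the PA2-PA3 front line, breaking ties by the longer stationary time. In the sketch below, the Foot record, the front line at z = 0, and the 1 cm tie tolerance are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Foot:
    x: float
    z: float          # distance from the PA2-PA3 front line (z = 0)
    still_s: float    # how long this foot has been stationary

def select_input_foot(feet: list[Foot]) -> Foot:
    # Nearest to the front line first; ties (within 1 cm) go to longer dwell.
    return min(feet, key=lambda f: (round(f.z, 2), -f.still_s))

feet = [Foot(0.2, 0.61, 1.5), Foot(-0.1, 0.61, 3.0)]
print(select_input_foot(feet))  # the stiller of two equally near feet
```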

FIG. 4 is a schematic diagram illustrating the process of providing a separate control menu image for each of a plurality of feet when a plurality of user feet are detected in the display control system according to an exemplary embodiment of the present invention.

Preferably, when the three-dimensional sensing means 30 detects a plurality of user feet, the computing means 10 selects and recognizes each of the plurality of feet, up to a predetermined number, as a foot for menu manipulation input, and controls the number and positions of the control menu images A projected by the projector 20 so that a separate control menu image A is provided for each foot.

In this case, the input control menus may also be configured so that a plurality of feet operate in conjunction with each other (e.g., performing a specific control on the premise of simultaneous operation), or the menu on the side where an operation input occurs first may be recognized first. The number and positions of the control menu images A projected by the projector 20, the assignment of control operations, and so on may be varied to suit the use environment.

FIG. 5 is a schematic diagram for explaining a process of using a state of changing coordinate values of a foot position of a user in a display control system according to an exemplary embodiment of the present invention.

Preferably, when comparing the floor-surface coordinate values of the control menu image A projected by the projector 20 with the floor-surface coordinate values of the user's foot position P sensed by the three-dimensional sensing means 30, the computing means 10 also takes the change state of the floor-surface coordinate values of the foot position P as a determination element.

In the examples described so far, a menu input operation is recognized on the premise that the user's foot remains at one coordinate position for a predetermined time. However, as shown in FIG. 5, the change over time of the floor-surface coordinate value of the user's foot position P may itself be recognized as a control command, like the drag operation of a computer mouse. For example, a foot moving from menu area 1 to 2 and a foot moving from 3 to 2 can be recognized as different control commands.
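One way to realize this drag-style recognition is to collapse the per-frame area trace into a path of distinct menu areas and look ordered transitions up in a command table. The command names in the sketch below are invented examples, not commands defined by the patent.

```python
# Collapse the per-frame area trace into distinct visits, then look up
# ordered transitions in a command table. Command names are invented.
DRAG_COMMANDS = {(1, 2): "next_page", (3, 2): "previous_page"}

def drag_command(area_trace: list[int | None]) -> str | None:
    """area_trace: menu-area id per frame (None = outside the menu)."""
    path: list[int] = []
    for a in area_trace:
        if a is not None and (not path or path[-1] != a):
            path.append(a)                     # record each new area visited
    for i in range(len(path) - 1):
        cmd = DRAG_COMMANDS.get((path[i], path[i + 1]))
        if cmd:
            return cmd
    return None

print(drag_command([1, 1, None, 2, 2]))  # "next_page"
```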

Embodiments of the present invention include computer-readable media containing program instructions for performing various computer-implemented operations. The computer-readable media may include program instructions, data files, data structures, and the like, alone or in combination. The media may be specially designed and constructed for the present invention, or may be known and available to those skilled in computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. The media may also be transmission media, such as optical or metal lines and waveguides, including a carrier wave transmitting signals specifying program instructions, data structures, and the like. Examples of program instructions include machine language code, such as that produced by a compiler, and high-level language code that can be executed by a computer using an interpreter or the like.

10: computing means 20: projector
30: three-dimensional sensing means 40: display
AA: sensing area A: control menu image
P: user's foot position

Claims (12)

A display control system for controlling a screen of a display by a user interaction method, the system comprising:
a projector for projecting a control menu image so that a control menu is displayed at a predetermined controlled position in a sensing area set on the floor surface in front of the display;
three-dimensional sensing means for sensing, by a three-dimensional sensing method, the foot position of a user located in the sensing area; and
computing means for controlling the position of the control menu image projected by the projector, receiving sensing information regarding the foot position of the user detected by the three-dimensional sensing means, comparing the floor-surface coordinate values of the control menu image projected by the projector with the floor-surface coordinate values of the foot position of the user sensed by the three-dimensional sensing means, and performing display screen control according to a control menu manipulation when the comparison is determined to correspond to a preset control menu manipulation.
The display control system of claim 1, wherein the computing means controls the projector to project the control menu image at a position spaced apart by a predetermined interval from the position of the user's foot detected by the three-dimensional sensing means.
The display control system of claim 2, wherein the computing means controls the projector to project the control menu image when the foot position of the user sensed by the three-dimensional sensing means does not change for more than a predetermined waiting time.
The display control system of claim 1, wherein, when the foot position of the user sensed by the three-dimensional sensing means does not change for more than a predetermined waiting time over a predetermined position of the control menu image, the computing means compares the floor-surface coordinate values of the control menu image projected by the projector with the floor-surface coordinate values of the sensed foot position and determines that they correspond to a preset control menu manipulation.
The display control system of claim 1, wherein, when the three-dimensional sensing means detects a plurality of user feet, the computing means recognizes one foot as the foot for menu operation input based on a predetermined selection criterion.
The display control system of claim 1, wherein, when the three-dimensional sensing means detects a plurality of user feet, the computing means selects and recognizes each of the plurality of feet, up to a predetermined number, as a foot for menu manipulation input, and controls the number and positions of the control menu images projected by the projector so that a separate control menu image is provided for each of the plurality of feet.
The display control system of claim 1, wherein, when the position of the user's foot detected by the three-dimensional sensing means spans the boundary between the display area of the control menu image and another area, or the boundary between two or more control menu areas, the computing means determines that the user's foot is located in the area in which the larger portion of the foot area lies.
The display control system of claim 1, wherein the computing means uses, as the floor-surface coordinate value of the user's foot position, the coordinate value of the toe point of the foot closer to the three-dimensional sensing means.
The display control system of claim 1, wherein, when comparing the floor-surface coordinate values of the control menu image projected by the projector with the floor-surface coordinate values of the foot position of the user sensed by the three-dimensional sensing means, the computing means also uses the change state of the floor-surface coordinate values of the user's foot position as a determination element.
The display control system of claim 1, wherein, when the foot position of the user sensed by the three-dimensional sensing means is located within the area of the control menu image, the computing means does not change the projection position of the control menu image until the user's foot position is outside the area of the control menu image.
The display control system of claim 1, wherein the control menu image is configured to include a plurality of control menu selection areas to enable selection of one of a plurality of control menus.
A computer-readable recording medium having recorded thereon a program for executing each function of the computing means of a display control system that controls a screen of a display by a user interaction method and includes a projector for projecting a control menu image so that a control menu is displayed at a predetermined controlled position in a sensing area set on the floor surface in front of the display, three-dimensional sensing means for detecting, by a three-dimensional sensing method, the foot position of a user located in the sensing area, and computing means, the program executing:
a function of controlling the position of the control menu image projected by the projector;
a function of receiving sensing information regarding the foot position of the user sensed by the three-dimensional sensing means; and
a function of comparing the floor-surface coordinate values of the control menu image projected by the projector with the floor-surface coordinate values of the foot position of the user detected by the three-dimensional sensing means, and performing display screen control according to a control menu manipulation when the comparison is determined to correspond to a preset control menu manipulation.
KR1020130093271A 2013-08-06 2013-08-06 Display control system and recording medium thereof KR101369358B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130093271A KR101369358B1 (en) 2013-08-06 2013-08-06 Display control system and recording medium thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130093271A KR101369358B1 (en) 2013-08-06 2013-08-06 Display control system and recording medium thereof

Publications (1)

Publication Number Publication Date
KR101369358B1 true KR101369358B1 (en) 2014-03-04

Family

ID=50647316

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130093271A KR101369358B1 (en) 2013-08-06 2013-08-06 Display control system and recording medium thereof

Country Status (1)

Country Link
KR (1) KR101369358B1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130004357A (en) * 2010-04-01 2013-01-09 퀄컴 인코포레이티드 A computing device interface
KR20130022996A (en) * 2011-08-27 2013-03-07 이경자 Virtual touch screen apparatus and method for generating and manipulating

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107113950A (en) * 2014-12-26 2017-08-29 日立麦克赛尔株式会社 Lighting device
US10620710B2 (en) 2017-06-15 2020-04-14 Microsoft Technology Licensing, Llc Displacement oriented interaction in computer-mediated reality
WO2018231457A1 (en) * 2017-06-15 2018-12-20 Microsoft Technology Licensing, Llc Displacement oriented interaction in computer-mediated reality
CN110753899B (en) * 2017-06-15 2021-12-10 微软技术许可有限责任公司 Displacement orientation interaction in computer-mediated reality
CN110753899A (en) * 2017-06-15 2020-02-04 微软技术许可有限责任公司 Displacement orientation interaction in computer-mediated reality
US11119581B2 (en) * 2017-06-15 2021-09-14 Microsoft Technology Licensing, Llc Displacement oriented interaction in computer-mediated reality
KR20190070568A (en) * 2017-12-13 2019-06-21 주식회사 파코웨어 Apparatus of interacting with user
KR102119863B1 (en) * 2017-12-13 2020-06-16 주식회사 파코웨어 Apparatus of interacting with user
KR102081685B1 (en) * 2017-12-13 2020-02-26 주식회사 파코웨어 Apparatus of interacting with user
KR20190070619A (en) * 2017-12-13 2019-06-21 이인규 Apparatus of interacting with user
CN112130659A (en) * 2019-06-25 2020-12-25 幻景启动股份有限公司 Interactive stereo display device and interactive induction method
KR102090903B1 (en) * 2019-07-30 2020-05-29 (주)우보재난시스템 Apparatus for fine dust broadcastring
KR102623632B1 (en) * 2022-08-30 2024-01-10 주식회사 다즐에듀 Board game projector


Legal Events

A201: Request for examination
A302: Request for accelerated examination
E902: Notification of reason for refusal
E701: Decision to grant or registration of patent right
GRNT: Written decision to grant
FPAY: Annual fee payment (payment date: 20170217; year of fee payment: 4)
FPAY: Annual fee payment (payment date: 20180223; year of fee payment: 5)
FPAY: Annual fee payment (payment date: 20191205; year of fee payment: 7)