WO2016170872A1 - Information display device and information display method - Google Patents

Information display device and information display method

Info

Publication number
WO2016170872A1
WO2016170872A1 (PCT/JP2016/057900)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
gesture
distance
information
operator
Prior art date
Application number
PCT/JP2016/057900
Other languages
English (en)
Japanese (ja)
Inventor
亜紀 高柳
雅志 神谷
雄大 中村
内藤 正博
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to US15/550,313 (published as US20180046254A1)
Priority to DE112016001815.0T (published as DE112016001815T5)
Priority to JP2016547119A (published as JP6062123B1)
Priority to CN201680022591.XA (published as CN107533366B)
Publication of WO2016170872A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/002 Specific input/output arrangements not covered by G06F 3/01 - G06F 3/16
    • G06F 3/005 Input arrangements through a video camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/103 Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G06V 40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2320/00 Control of display operating conditions
    • G09G 2320/02 Improving the quality of display appearance
    • G09G 2320/0261 Improving the quality of display appearance in the context of movement of objects on the screen or movement of the observer relative to the screen

Definitions

  • the present invention relates to an information display device and an information display method in which an operator's instruction is input by a gesture performed by an operator (user).
  • As such an input method, a gesture UI (User Interface) is known.
  • The gesture UI enables the operator to operate the device by gestures such as body movements and body shapes of the operator (for example, hand movements, hand shapes, finger movements, and finger shapes).
  • For example, a gesture UI that uses hand pointing moves a pointer on the screen of the display unit (display) in accordance with the movement of the operator's hand: when the operator performs a gesture pointing at a position with the hand, the pointer on the screen moves to that position.
  • The gesture UI detects (captures) the operator's gesture by using a gesture detection unit (sensor unit) such as an imaging device, for example an RGB (red, green, blue) camera, or a ToF (Time of Flight) sensor.
  • The gesture UI analyzes image data obtained by imaging the operator, identifies (specifies) the gesture, and outputs a signal indicating the instruction content indicated by the gesture.
  • When the operator moves away from the gesture detection unit, the size of the operator in each frame image generated by imaging decreases, and the amount of movement of the body part due to the operator's gesture also decreases.
  • In this case, the gesture UI must identify (specify) the gesture based on a small movement of the body part and output a signal indicating the instruction content indicated by the gesture. For this reason, when the operator is far from the gesture detection unit, there is a problem that the gesture UI identifies (specifies) the gesture erroneously or cannot identify the gesture at all.
  • Patent Document 1 proposes an image processing apparatus that sets an operation target region based on the position and size of a detection target in an image generated by an imaging device serving as a gesture detection unit.
  • In this image processing apparatus, by setting the operator's hand or face as the detection target, the ratio between the set operation target region and the size of the operator's hand or face included in that region is kept constant. This improves operability for the operator.
  • The present invention has been made to solve the above-described problems of the prior art, and an object thereof is to provide an information display device and an information display method in which erroneous identification of the operator's gesture is unlikely to occur even when the operator is located away from the information display device.
  • An information display device according to the present invention includes: a display control unit that displays information on a display unit; a gesture detection unit that generates gesture information based on a gesture executed by an operator; a gesture identification unit that identifies the gesture based on the gesture information generated by the gesture detection unit and outputs a signal based on the identification result; a distance estimation unit that estimates a distance between the operator and the display unit; and an identification function setting unit that stores a predetermined first set distance and sets the gestures identifiable by the gesture identification unit so that the number of gestures identifiable when the distance exceeds the first set distance is smaller than the number of gestures identifiable when the distance is equal to or less than the first set distance.
  • An information display method according to the present invention is an information display method executed by an information display device that displays information on a display unit, and includes: a gesture detection step of generating gesture information based on a gesture executed by an operator; a gesture identification step of identifying the gesture based on the gesture information generated in the gesture detection step and generating a signal based on the identification result; a distance estimation step of estimating a distance between the operator and the display unit; and an identification function setting step of setting the gestures identifiable in the gesture identification step so that the number of gestures identifiable when the distance exceeds a predetermined first set distance is smaller than the number of gestures identifiable when the distance is equal to or less than the first set distance.
  • According to the present invention, the identifiable gestures are set so that the number of identifiable gestures is reduced when the operator is at a position away from the display unit of the information display device.
  • As a result, erroneous identification of the operator's gesture can be made unlikely to occur.
  • FIG. 1 is a schematic diagram showing a usage state of the information display device according to Embodiment 1.
  • FIG. 2 is a block diagram schematically showing a configuration of the information display device according to Embodiment 1.
  • FIG. 3 is a diagram showing an example of an image acquired by the information display device according to Embodiment 1 by imaging an operator.
  • FIG. 4 is a schematic diagram showing an operation state of the information display device according to Embodiment 1.
  • FIG. 5 is a diagram showing an example of an image displayed on the display unit of the information display device according to Embodiment 1.
  • FIGS. 6 to 9 are diagrams showing other examples of images displayed on the display unit of the information display device according to Embodiment 1.
  • Parts (a) and (b) of a further figure show the transition of the image displayed on the display unit of the information display device according to Embodiment 1 when the operator is at a position close to the display unit.
  • Parts (a) and (b) of a further figure show the corresponding transition when the operator is at a position far from the display unit.
  • Parts (a) to (c) of a further figure are schematic diagrams showing usage states of the information display device according to Embodiment 1.
  • Flowcharts show initial setting in the information display device according to Embodiment 1, an operation of the information display device according to Embodiment 1, and a further operation of the information display device.
  • A block diagram schematically shows a configuration of an information display device according to Embodiment 3.
  • A schematic diagram shows a usage state of an information display device according to Embodiment 4 of the present invention.
  • Block diagrams schematically show a configuration of the information display device according to Embodiment 4 and a configuration of an information display device according to Embodiment 5 of the present invention.
  • A block diagram schematically shows a configuration of an information display device according to a modification of Embodiment 5.
  • A flowchart shows an operation of the information display device according to Embodiment 5, and a block diagram schematically shows a configuration of an information display device according to Embodiment 6 of the present invention.
  • A block diagram schematically shows a configuration of an information display device according to Embodiment 7, and a flowchart shows an operation of the information display device according to Embodiment 7.
  • A hardware configuration diagram shows a configuration of a modification of the information display devices according to Embodiments 1 to 7.
  • FIG. 1 is a schematic diagram showing a usage state of an information display apparatus 1 according to Embodiment 1 of the present invention.
  • As shown in FIG. 1, the information display device 1 includes a display unit 30 as a display and a gesture detection unit (sensor unit) 10 such as an imaging unit (camera) or a ToF sensor.
  • the information display device 1 includes a gesture UI for operating the information display device 1 by a gesture performed by an operator (user) 2.
  • the gesture detection unit 10 is attached to the upper part of the main body of the information display device 1.
  • the gesture detection unit 10 may be housed inside the main body of the information display device 1 or may be provided near the main body of the information display device 1.
  • Alternatively, the gesture detection unit 10 may be provided at a position away from the main body of the information display device 1.
  • The operator 2 inputs instruction information to the information display device 1 by a gesture, that is, a body movement or shape of the operator 2 (for example, a predetermined motion or pose of the hand 2A and fingers 2B, or an expression of the face 2C).
  • Examples of the gesture include moving the hand 2A to the right, moving the hand 2A to the left, moving the hand 2A upward, moving the hand 2A downward, pointing with the finger 2B, and forming a peace sign with the hand 2A and fingers 2B.
  • FIG. 2 is a block diagram schematically showing the configuration of the information display device 1 according to the first embodiment.
  • the information display device 1 is a device that can perform the information display method according to the first embodiment.
  • the information display device 1 includes a gesture detection unit 10 that is an imaging unit, and a control unit 20.
  • the information display device 1 may include a display unit 30 as a display and a storage unit 40 such as a memory.
  • the control unit 20 includes a distance estimation unit 21, an identification function setting unit 22, a gesture identification unit (gesture determination unit) 23, and a display control unit 25.
  • the information display device 1 may have a function execution unit 24.
  • the gesture detection unit 10 generates gesture information G1 based on the gesture GE performed by the operator 2. For example, the gesture detection unit 10 captures the gesture GE of the operator 2 and generates gesture information G1 including image data corresponding to the gesture GE.
  • the image data is, for example, still image data of a plurality of frames arranged in time order or moving image data.
  • The gesture detection unit 10 may be any device that can generate gesture information according to the gesture of the operator 2; other devices, such as an operation device provided on a part of the body of the operator 2 (described in Embodiment 4), may be used.
  • the gesture identification unit 23 identifies (specifies) a gesture based on the gesture information G1 generated by the gesture detection unit 10, and outputs a signal G2 based on the identification result.
  • the identification of the gesture is a process of specifying what instruction content the gesture performed by the operator 2 indicates.
  • The gesture identification unit 23 detects a gesture motion pattern from the image data generated by the gesture detection unit 10. At this time, the gesture identification unit 23 detects the movement pattern of the gesture by analyzing the displacement of the detection region including the body part of the operator 2 from the still image data or moving image data of a plurality of frames arranged in time order.
  • Alternatively, the gesture identification unit 23 may detect the movement pattern of the gesture by estimating the three-dimensional position of the detection region including the body part of the operator 2.
  • In this case, the gesture identification unit 23 detects the movement pattern of the gesture based on the distance D0 estimated by the distance estimation unit 21 and the image data of the detection region.
  • the storage unit 40 stores a gesture database (DB) 40a in which a plurality of reference gestures and instruction contents corresponding to each of the plurality of reference gestures are associated with each other.
  • the gesture identifying unit 23 compares the detected motion pattern with each of the plurality of reference gestures stored as the gesture DB 40a, and identifies the one corresponding to the detected motion pattern from the plurality of stored reference gestures.
  • the gesture identification unit 23 determines that the instruction content associated with the identified reference gesture is the instruction content of the operator 2.
  • The gestures and instruction contents stored in the storage unit 40 as the gesture DB 40a are registered in advance. The operator 2 can also define a gesture and the instruction content corresponding to it, and store them in the storage unit 40 as part of the gesture DB 40a.
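A gesture DB of the kind described above can be sketched as a simple mapping from reference gestures to instruction contents, with operator-defined registration. All names and instruction strings here are illustrative assumptions, not values from the patent.

```python
# Hypothetical gesture DB 40a: reference gesture -> instruction content.
GESTURE_DB = {
    "swipe_right": "next_channel",
    "swipe_left":  "previous_channel",
    "swipe_up":    "volume_up",
    "swipe_down":  "volume_down",
}

def identify_instruction(motion_pattern):
    """Compare the detected motion pattern against the stored reference
    gestures; return the associated instruction content, or None when no
    reference gesture corresponds."""
    return GESTURE_DB.get(motion_pattern)

def register_gesture(reference_gesture, instruction):
    """Operator-defined registration of a gesture and its instruction."""
    GESTURE_DB[reference_gesture] = instruction

register_gesture("peace_sign", "capture_screen")
print(identify_instruction("swipe_up"))    # volume_up
print(identify_instruction("peace_sign"))  # capture_screen
```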
  • The distance estimation unit 21 estimates the distance D0 between the operator 2 and the gesture detection unit 10 (or the display unit 30) based on the gesture information G1 output from the gesture detection unit 10, and supplies distance information G3 indicating the distance D0 to the identification function setting unit 22 and the display control unit 25.
  • The distance between the operator 2 and the display unit 30 is equal to the distance between the operator 2 and the gesture detection unit 10, or has a fixed relationship with that distance, so one can be calculated from the other.
  • For example, the distance estimation unit 21 detects a specific part of the body of the operator 2 included in the image data as a detection region, and estimates (calculates) the distance D0 between the operator 2 and the gesture detection unit 10 (or the display unit 30) based on the size of the detection region in each frame image.
  • the distance estimation unit 21 may calculate the distance D0 by another method.
  • FIG. 3 is a diagram illustrating an example of an image 200 acquired by the gesture detection unit 10 of the information display device 1 according to Embodiment 1 capturing an image of the operator 2.
  • The distance estimation unit 21 detects the hand 2A of the operator 2 included in the image 200 as the detection region 201.
  • For example, the distance estimation unit 21 detects the color of the operator's skin from the color information included in the image, and extracts a specific part of the operator's body as the detection region.
  • When the detection region is a face, it is also possible to determine the entire face region after detecting the facial parts (eyes, nose, mouth, etc.).
  • The distance estimation unit 21 may also employ another method, such as sliding a search window over the image and determining whether each window contains a face region with a classifier. Both the face and the hand 2A of the operator 2 may be used as detection regions.
  • the distance estimation unit 21 estimates (calculates) a distance D0 between the operator 2 and the gesture detection unit 10 (or the display unit 30) based on the size of the detection region 201 in the image 200.
  • Alternatively, the distance estimation unit 21 may determine only whether the operator 2 is at a position closer to or farther from the display unit 30 than a reference distance determined in advance.
  • For example, the distance estimation unit 21 stores in advance correlation data between the size of the detection region 201 in the image 200 and the distance D0.
  • Based on the correlation data, the distance estimation unit 21 estimates (calculates) the distance D0 from the size of the detection region 201 in the generated image 200.
  • The size of the detection region 201 can be derived from the area of the detection region 201, or from its vertical length Lv and horizontal length Lh, or both.
  • Alternatively, the distance estimation unit 21 may store in advance, as correlation data, a reference position where the operator 2 is expected to perform gestures and the size of the detection region 201 in the image 200 when the operator is at that reference position. Based on this correlation data, the distance estimation unit 21 determines from the size of the detection region 201 in the generated image 200 whether the operator 2 is closer or farther than the reference position. In this case, the amount of calculation in the control unit 20 is reduced.
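Both variants of the size-to-distance estimation can be sketched under the simplifying assumption that the apparent width of a body part is inversely proportional to its distance from the camera. The calibration constants below stand in for the stored correlation data and are hypothetical.

```python
REF_DISTANCE = 1.0    # reference position, metres (hypothetical calibration)
REF_WIDTH_PX = 120.0  # detection-region width observed at the reference

def estimate_distance(region_width_px):
    """Estimate the distance D0 (metres) from the detection-region width
    in pixels, assuming width is inversely proportional to distance."""
    return REF_DISTANCE * REF_WIDTH_PX / region_width_px

def is_farther_than_reference(region_width_px):
    """The simpler nearer/farther decision described above: a smaller
    region than at the reference position means the operator is farther.
    This avoids computing D0 and reduces work in the control unit."""
    return region_width_px < REF_WIDTH_PX

print(estimate_distance(60.0))          # 2.0: half the width, twice as far
print(is_farther_than_reference(60.0))  # True
```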
  • The identification function setting unit 22 stores a predetermined first set distance D1 and sets the gestures identifiable by the gesture identification unit 23 so that the number of identifiable gestures when the distance D0 exceeds the first set distance D1 is smaller than the number of identifiable gestures when the distance D0 is equal to or less than the first set distance D1.
  • the first set distance D1 is set in advance when the use of the information display device 1 is started.
  • the identification function setting unit 22 does not limit the gestures that can be identified by the gesture identification unit 23 when the distance D0 is equal to or less than the first set distance D1. At this time, the operation state of the information display device 1 is referred to as a full function mode.
  • the identification function setting unit 22 notifies an identifiable gesture to the gesture identification unit 23 and notifies the display control unit 25 of the operation state of the information display device 1.
  • the identification function setting unit 22 restricts gestures that can be identified by the gesture identification unit 23 when the distance D0 is greater than the first set distance D1 and less than or equal to the second set distance D2.
  • The identifiable gestures remaining after this restriction are gestures for functions frequently used by the operator 2.
  • the operation state of the information display device 1 is referred to as a restricted function mode.
  • the identification function setting unit 22 further restricts the gestures that can be identified by the gesture identification unit 23 when the distance D0 is greater than the second set distance D2.
  • The identifiable gestures remaining after this further restriction are gestures for the functions most frequently used by the operator 2.
  • the operation state of the information display device 1 is referred to as a specific function mode.
  • The second set distance D2 used as a reference by the identification function setting unit 22 is a distance beyond which it becomes difficult for the gesture identification unit 23 to recognize gestures.
  • This distance D2 is set based on any or all of: sensor performance such as the resolution and spectral filter characteristics of the camera used in the gesture detection unit 10, lens performance such as MTF (Modulation Transfer Function), illumination environment information such as illumination color and illuminance, and the size and color information of the detection target.
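The three-way mode selection performed by the identification function setting unit 22 can be sketched as a pair of threshold comparisons. The gesture type labels GE1, GE2, and GE3 are introduced later in the description (swipe, pointing, and pose); the numeric distances here are hypothetical, not values from the patent.

```python
D1 = 1.5  # first set distance, metres (hypothetical)
D2 = 3.0  # second set distance, metres (hypothetical)

def operation_state(d0):
    """Return (operation mode, identifiable gesture types) for the
    estimated operator distance d0: the farther the operator, the
    smaller the set of identifiable gestures."""
    if d0 <= D1:
        return "full function mode", ["GE1", "GE2", "GE3"]
    if d0 <= D2:
        return "restricted function mode", ["GE1", "GE2"]
    return "specific function mode", ["GE1"]

print(operation_state(1.0)[0])  # full function mode
print(operation_state(2.0)[1])  # ['GE1', 'GE2']
print(operation_state(4.0)[1])  # ['GE1']
```

Restricting the candidate set this way is what makes far-distance identification robust: a small, easily distinguished set of movement patterns leaves less room for misidentification.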
  • the function execution unit 24 executes a function based on the signal G2 output from the gesture identification unit 23. Further, the function execution unit 24 notifies the display control unit 25 of the result of executing the function.
  • the display control unit 25 controls the display operation of the display unit 30.
  • the display control unit 25 generates an operation menu and causes the display unit 30 to display the operation menu.
  • The display control unit 25 sets the display layout and display contents of the display unit 30 according to whether the operation state set by the identification function setting unit 22 is the full function mode, the restricted function mode, or the specific function mode.
  • The display control unit 25 may adjust the sizes of characters, icons, and pointers displayed on the display unit 30 based on the distance D0 estimated (calculated) by the distance estimation unit 21. For example, when the distance D0 is small, the display control unit 25 reduces the size of the displayed characters, icons, and pointers, and it increases their size as the distance D0 increases. In this way, the visibility of icons, characters, pointers, and the like is maintained.
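The size adjustment described above can be sketched as a distance-proportional scaling with clamping; the base size, reference distance, scaling law, and bounds are all illustrative assumptions, chosen so that layout-breaking sizes are never produced.

```python
BASE_SIZE = 16.0   # size (points) at the reference distance (hypothetical)
REF_DIST = 1.0     # reference distance, metres (hypothetical)
MIN_SIZE, MAX_SIZE = 12.0, 64.0  # layout-safe bounds (hypothetical)

def display_size(d0):
    """Scale character/icon/pointer size linearly with distance D0,
    clamped so the layout does not collapse and elements do not
    protrude from their display areas."""
    return max(MIN_SIZE, min(MAX_SIZE, BASE_SIZE * d0 / REF_DIST))

print(display_size(1.0))   # 16.0
print(display_size(2.5))   # 40.0
print(display_size(10.0))  # 64.0 (clamped at the upper bound)
```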
  • the display control unit 25 causes the display unit 30 to display the result notified from the function execution unit 24.
  • FIG. 4 is a schematic diagram showing an operation state of the information display apparatus 1 according to the first embodiment.
  • FIG. 4 shows the relationship between the distance D0 between the operator 2 and the gesture detection unit 10 (or the display unit 30), the types of gestures that the information display device 1 can identify (that the operator can use) when the operator 2 performs gesture operations on the information display device 1, and the state of the information display device 1.
  • identifiable (usable) gestures are classified into three types of gestures GE1, GE2, and GE3.
  • the three types of gestures differ in the ease of identification by the information display device 1.
  • The first type of gesture GE1 is a swipe, in which the operator 2 waves his or her hand.
  • the second type of gesture GE2 is a pointing in which the operator 2 shows a direction or position with a finger.
  • a third type of gesture GE3 is a pose in which the operator 2 changes the shape of the hand to a specific shape.
  • the first type of gesture GE1 is a gesture that can be determined only by the moving direction of a specific part of the body such as the hand 2A.
  • The third type of gesture GE3 cannot be determined from the moving direction of a specific body part such as the hand 2A alone; it cannot be determined unless the amount of displacement of the specific body part is analyzed.
  • Accordingly, the second type of gesture GE2 and the third type of gesture GE3 are less easily identified than the first type of gesture GE1, with GE3 the least easily identified.
  • When the distance D0 is equal to or less than the first set distance D1 (that is, when the information display device 1 is in the full function mode), the operator 2 can use all types of gestures GE1, GE2, and GE3 when performing gesture operations on the information display device 1.
  • When the distance D0 is greater than the first set distance D1 and equal to or less than the second set distance D2 (that is, when the information display device 1 is in the restricted function mode), the operator 2 can use the first type of gesture GE1 and the second type of gesture GE2, but cannot use the third type of gesture GE3.
  • When the distance D0 is greater than the second set distance D2 (that is, when the information display device 1 is in the specific function mode), the operator 2 can use only the first type of gesture GE1; the second type of gesture GE2 and the third type of gesture GE3 cannot be used.
  • FIG. 5 is a diagram illustrating an example of an image displayed on the screen 31 of the display unit 30 of the information display device 1 according to the first embodiment.
  • FIG. 5 shows an example of the display content when the distance D0 between the operator 2 and the gesture detection unit 10 (or the display unit 30) is equal to or less than the first set distance D1.
  • the viewing program image 301 based on the video content is displayed on the entire screen 31 of the display unit 30.
  • a “MENU” icon 302 is displayed at the lower left of the screen 31.
  • When the information display device 1 identifies a predetermined gesture of the operator 2 (a gesture whose instruction content is a transition to display content including the operation menu), or when the operator 2 selects the “MENU” icon 302, the display shown in FIG. 5 transitions to display content including an operation menu (FIG. 6).
  • FIG. 6 is a diagram illustrating another example of an image displayed on the screen 31 of the display unit 30 of the information display device 1 according to the first embodiment.
  • FIG. 6 shows an example of display contents when the distance D0 between the operator 2 and the gesture detection unit 10 (or the display unit 30) is equal to or less than the first set distance D1.
  • FIG. 6 shows the display content after transition from the display content of FIG.
  • the screen 31 is divided into four areas. In the upper left area of the screen 31, a viewing program image 301 based on the video content is displayed. In the upper right area of the screen 31, content information 303 indicating information about the currently viewed program image 301 is displayed. In addition, content setting information 304 as an operation menu is displayed at the lower left of the screen 31.
  • the content setting information 304 includes operation information for the viewing program image 301, that is, information such as channel switching, volume adjustment, recording and playback, image quality and sound quality change, and communication method.
  • The application menu information 305 shows information including icons for selecting applications that the information display device 1 has.
  • the “return to program” icon 306 is an icon for transitioning from the display content of FIG. 6 to the display content of FIG. 5.
  • When the distance D0 between the operator 2 and the gesture detection unit 10 (or the display unit 30) is equal to or less than the first set distance D1, the information display device 1 is in the full function mode.
  • The content setting information 304 and the application menu information 305, which are the operation menus shown in FIG. 6, are information related to functions that the operator 2 can operate in the full function mode.
  • The icons, characters, and pointers displayed on the screen 31 of the display unit 30 can be configured so that their size decreases when the distance D0 is small and increases as the distance D0 increases. Thereby, the operator 2 can recognize that the information display device 1 grasps the distance between the operator 2 and the information display device 1. In other words, the fact that the information display device 1 recognizes the distance D0 is fed back to the operator 2 through the sizes of icons, characters, pointers, and the like.
  • the optimum sizes of icons, characters, and pointers on the screen 31 are set in advance in the information display device 1. Therefore, even when these sizes are changed, the information display device 1 does not cause changes that impair visibility, such as the layout of the information on the screen 31 collapsing, or icons, characters, and pointers protruding from the areas of the screen 31 in which they are displayed.
  • FIG. 7 is a diagram illustrating another example of an image displayed on the screen 31 of the display unit 30 of the information display device 1 according to the first embodiment.
  • FIG. 7 shows an example of display when the distance D0 is greater than the first set distance D1 and less than or equal to the second set distance D2. In this case, the area of the screen 31 is not divided.
  • the viewing program image 301 based on the video content is shown on the entire screen 31 as a background image.
  • the application menu information 305 is shown on the entire screen 31 with the viewing program image 301 as a background.
  • when the distance D0 is greater than the first set distance D1 and equal to or less than the second set distance D2, the information display device 1 is in the function restriction mode.
  • in the function restriction mode, the functions that the operator 2 can operate by gesture are limited to frequently used functions.
  • the operation information for the viewing program image 301 relates to a function that is less frequently used by the operator 2. Therefore, as shown in FIG. 7, the content setting information 304 shown in FIG. 6 is not displayed in the function restriction mode. Also, the content information 303 shown in FIG. 6 is not necessarily information that is frequently used by the operator 2, and is not displayed in the function restriction mode as shown in FIG.
  • a “movie” icon 305A is an icon for displaying the video content of the movie on the information display device 1.
  • FIG. 7 shows that the “movie” icon 305A is selected by an arrow as a pointer.
  • the content displayed on the display unit 30 in the function restriction mode is not limited to the application menu information 305.
  • the operator 2 may select frequently used items in advance and rank them, and the information display device 1 may determine the content displayed on the display unit 30 based on the ranking information.
  • alternatively, the information display device 1 may count the operation items used by the gestures of the operator 2, store the counts, and determine the content displayed in the function restriction mode based on them.
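  • A usage-frequency ranking of operation items, as described above, could be kept with a simple counter. The class and item names below are illustrative assumptions, not from the patent.

```python
from collections import Counter

class UsageRanking:
    """Count how often each operation item is invoked by gesture and
    return the top-N items to show in the function restriction mode."""

    def __init__(self) -> None:
        self.counts: Counter[str] = Counter()

    def record(self, item: str) -> None:
        """Record one gesture operation on the given item."""
        self.counts[item] += 1

    def top_items(self, n: int) -> list[str]:
        """Return the n most frequently used items, most frequent first."""
        return [item for item, _ in self.counts.most_common(n)]
```
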
  • FIG. 8 is a diagram illustrating another example of an image displayed on the screen 31 of the display unit 30 of the information display device 1 according to the first embodiment.
  • the “movie” icon 305B in FIG. 8 is marked with a thick frame around it instead of being selected by a pointer (arrow).
  • the display of the selected icon can be changed according to the distance D0.
  • the size of the “movie” icon 305A (305B) or other icons and characters included in the application menu information 305 may be changed according to the distance D0.
  • FIG. 9 is a diagram illustrating another example of an image displayed on the screen 31 of the display unit 30 of the information display device 1 according to the first embodiment.
  • FIG. 9 shows an example of an image displayed on the screen 31 of the display unit 30 when the distance D0 exceeds the second set distance D2.
  • a viewing program image 301 based on video content is displayed on the entire screen 31 of the display unit 30.
  • when the distance D0 exceeds the second set distance D2, the information display device 1 is in the specific function mode.
  • in the specific function mode, the functions that the operator 2 can operate by gesture are limited to the most frequently used functions, for example, channel switching and volume adjustment for the video content that the operator 2 is viewing.
  • the “MENU” icon 302 (FIG. 5) for transitioning to the display content including the operation menu is not displayed on the screen 31 of the display unit 30.
  • the functions of the information display device 1 are limited to functions that are frequently used by the operator 2, and information that is not frequently used by the operator 2 is not displayed on the display unit 30.
  • unnecessary information is not displayed, so that it is possible to avoid a situation in which the display content of the display unit 30 is difficult to be visually recognized by the operator.
  • FIGS. 10A and 10B are diagrams showing transition of display contents by the information display device 1 according to the first embodiment.
  • FIGS. 10A and 10B show how the display content of the display unit 30 of the information display device 1 changes when the distance D0 between the operator 2 and the gesture detection unit 10 (or the display unit 30) is equal to or less than the first set distance D1.
  • the display content of the display unit 30 in FIG. 10A is the same as the display content of the display unit 30 shown in FIG. 5.
  • the display content of the display unit 30 shown in FIG. 10B is the same as the display content of the display unit 30 shown in FIG. 6.
  • when the operator 2 performs a gesture and selects the “MENU” icon 302 on the display unit 30 in FIG. 10A, the display content of the display unit 30 transitions from that of FIG. 10A to that of FIG. 10B.
  • similarly, the display content of the display unit 30 transitions from that shown in FIG. 10B back to that shown in FIG. 10A.
  • FIGS. 11A and 11B are diagrams showing transition of display contents by the information display apparatus 1 according to the first embodiment.
  • FIGS. 11A and 11B show how the display content of the display unit 30 of the information display device 1 changes when the distance D0 between the operator 2 and the gesture detection unit 10 (or the display unit 30) is greater than the first set distance D1 and equal to or less than the second set distance D2.
  • the display content of the display unit 30 in FIG. 11A shows the viewing program image 301, and the display content of the display unit 30 shown in FIG. 11B is the same as the display content of the display unit 30 shown in FIG. 7.
  • the display content of the display unit 30 transitions from the display content shown in FIG. 11A to the display content shown in FIG. 11B.
  • when the operator 2 performs a gesture and selects the “return to program” icon 306 shown in FIG. 7, the display content of the display unit 30 transitions back to the display content shown in FIG. 11A.
  • FIG. 12A shows the display content of the display unit 30 of the information display device 1 when the distance D0 between the operator 2 and the gesture detection unit 10 (or the display unit 30) exceeds the second set distance D2.
  • FIGS. 12B and 12C show examples of gestures performed by the operator 2.
  • the display content of the display unit 30 in FIG. 12A is the same as the display content of the display unit 30 in FIG. 9.
  • the operator 2 performs a left/right swipe (moving the hand 2A left or right in the direction of the arrow) when switching channels.
  • when adjusting the volume, the operator 2 performs an up/down swipe (moving the hand 2A up or down in the direction of the arrow).
  • the operator 2 can operate the frequently used channel switching and volume adjustment functions with only gestures, without using the operation menu shown on the display unit 30.
  • the swipe gesture used here is easy for the information display device 1 to identify. Therefore, even when the distance between the operator 2 and the information display device 1 is large, the possibility of misidentification by the information display device 1 can be reduced.
  • the functions that can be operated by the operator 2 when the distance D0 exceeds the second set distance D2 are not limited to channel switching and volume adjustment. However, it is desirable that the gesture that can be used by the operator 2 is not identified by a displacement amount of a specific part such as the hand 2A of the operator 2, but is identified by the moving direction.
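  • A direction-based classifier of the kind described (identifying the moving direction rather than a precise displacement amount) might look like the following sketch; the travel threshold and gesture labels are assumptions.

```python
def classify_swipe(start_xy, end_xy, min_travel=80.0):
    """Classify a gesture by its dominant movement direction rather than
    by the exact displacement amount of a specific part such as the hand,
    which is more robust when the operator is far from the camera."""
    dx = end_xy[0] - start_xy[0]
    dy = end_xy[1] - start_xy[1]
    if max(abs(dx), abs(dy)) < min_travel:
        return None  # too small to be a deliberate swipe
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"  # e.g. channel switching
    return "swipe_down" if dy > 0 else "swipe_up"          # e.g. volume adjustment
```
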
  • FIG. 13 is a flowchart showing an initial setting in the information display device 1 according to the first embodiment.
  • the information display device 1 sets the first set distance D1 (step S1).
  • using the gesture detection unit 10, the information display device 1 images the hand 2A, finger 2B, face 2C, and the like of the operator 2 located at the first set distance D1 from the gesture detection unit 10 (or the display unit 30), and stores the sizes of the hand 2A, finger 2B, face 2C, and the like included in the captured image.
  • for example, as shown by the detection region 201 in FIG. 3, the information display device 1 stores the area and the vertical length of the detection region when the hand 2A, the finger 2B, or the face 2C is set as the detection region.
  • next, the information display device 1 sets the second set distance D2 (step S2). As in step S1, the information display device 1 images the operator 2 located at a position away from the gesture detection unit 10 (or the display unit 30) by the second set distance D2, and stores the area of the detection region and the like.
  • FIG. 14 is a flowchart showing the operation of the information display device 1 according to the first embodiment.
  • the gesture detection unit 10 starts capturing (detecting) a gesture using a predetermined gesture of the operator 2 as a trigger (step S10).
  • the predetermined gesture may be a pose with the hand 2A or a movement with the hand 2A.
  • the gesture detection unit 10 captures the gesture of the operator 2 and generates image data.
  • when the image data includes a plurality of persons, for example, a method of performing gesture recognition for the person whose predetermined body part is detected first, or a method of performing gesture recognition for all persons, is applied.
  • the information display apparatus 1 determines whether the initial setting shown in FIG. 13, that is, the setting of the first set distance D1 and the second set distance D2 (steps S1 and S2), is completed (step S11). If the initial setting has not been made, the information display device 1 performs the initial setting shown in FIG. 13 (step S12).
  • the distance estimation unit 21 estimates the distance D0 between the operator 2 and the gesture detection unit 10 (or display unit 30) (step S13).
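  • The distance estimate in step S13 can rely on the detection-region sizes stored during the initial setting: the apparent area of a body part shrinks roughly with the square of the distance. The following is a hedged sketch of that relation; the calibration values are assumptions, not figures from the patent.

```python
def estimate_distance(observed_area_px: float,
                      calib_distance_mm: float,
                      calib_area_px: float) -> float:
    """Estimate the operator distance D0 from the detection-region area.
    Apparent area scales roughly with 1/distance**2, so
    D0 = D_calib * sqrt(A_calib / A_observed)."""
    return calib_distance_mm * (calib_area_px / observed_area_px) ** 0.5
```

For example, if the face region calibrated at D1 = 1000 mm covered 10000 px², observing 2500 px² would suggest the operator is about twice as far away.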
  • if the distance D0 is equal to or less than the first set distance D1, the information display device 1 proceeds to step S15; if the distance D0 is greater than the first set distance D1 and equal to or less than the second set distance D2, the process proceeds to step S16; and if the distance D0 exceeds the second set distance D2, the process proceeds to step S17 (step S14).
  • the identification function setting unit 22 sets the operation state of the information display device 1 to the all function mode (step S15).
  • the identification function setting unit 22 sets the operation state of the information display device 1 to the function restriction mode (step S16).
  • the identification function setting unit 22 sets the operation state of the information display device 1 to the specific function mode (step S17).
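  • The branching of steps S14 through S17 amounts to a simple threshold comparison. A minimal sketch follows; the mode names are illustrative labels.

```python
def select_mode(d0: float, d1: float, d2: float) -> str:
    """Select the operation state from the estimated distance D0
    (step S14 branching to steps S15, S16, and S17)."""
    if d0 <= d1:
        return "all_function"          # step S15
    if d0 <= d2:
        return "function_restriction"  # step S16
    return "specific_function"         # step S17
```
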
  • the display control unit 25 changes the layout in which the video content and the operation menu are displayed based on the operation state of the information display device 1, and causes the display unit 30 to display it. Further, the display control unit 25 adjusts the sizes of characters, icons, and pointers displayed on the display unit 30 based on the distance D0 (step S18).
  • the gesture identification unit 23 identifies a gesture based on the motion pattern indicated by the gesture of the operator 2 and the identifiable gesture restricted by the identification function setting unit 22 (step S19).
  • the gesture identification unit 23 determines the instruction content of the operator 2 with reference to the gesture DB 40a. At this time, when the time required for identifying the gesture exceeds the time limit, or when the operator 2 operates the information display device 1 with a remote controller or the like, the gesture identifying unit 23 stops the gesture identifying process (step S22).
  • the function execution unit 24 executes a function based on the instruction content (step S20).
  • the information display device 1 determines whether or not the operation by the gesture of the operator 2 is finished (step S21). If there is an operation by the operator 2 (NO in step S21), the gesture identifying unit 23 identifies the gesture of the operator 2.
  • 1-3. Effect
  • in the information display device 1 according to the first embodiment, when the operator 2 is away from the display unit 30 of the information display device 1 (for example, farther than the first set distance D1), the identifiable gestures are set so that their number is reduced. For example, gestures with a small amount of body movement are excluded from the identifiable gestures. By doing so, it is possible to make it difficult to erroneously identify the gesture of the operator 2.
  • in addition, when the operator 2 is at a position away from the display unit 30 of the information display device 1 (for example, a position farther than the first set distance D1), the information display device 1 limits the identifiable gestures to gestures for frequently used functions. Thus, according to the information display device 1 of the first embodiment, the number of identifiable gestures is reduced when the operator 2 is at a position away from the display unit 30, so that erroneous recognition of the gesture of the operator 2 can be prevented.
  • furthermore, since the operation menu is enlarged and displayed even when the operator 2 is located away from the display unit 30, it is possible to suppress a decrease in the visibility of the display content of the display unit 30 and in the ease of operation (operability).
  • the information display device 1 may adopt a method using only one of the first set distance D1 and the second set distance D2, or a method using three or more set distances.
  • Embodiment 2
  • in the first embodiment, when the gesture of the operator 2 is not identified (determined) by the information display device 1, the gesture identification process of the information display device 1 is stopped (step S22 in FIG. 14), and the gesture operation is performed again. In contrast, in the second embodiment, the fact that a gesture is not identified is displayed on the display unit 30.
  • the configuration of the information display device in the second embodiment is the same as the configuration of the information display device 1 according to the first embodiment. Therefore, FIG. 1 is also referred to when describing the second embodiment.
  • FIG. 15 is a flowchart showing the operation of the information display apparatus according to Embodiment 2 of the present invention.
  • the same processing steps as those shown in FIG. 14 are denoted by the same reference numerals as those shown in FIG.
  • the gesture identification unit 23 identifies a gesture based on the motion pattern indicated by the gesture of the operator 2 and the identifiable gesture restricted by the identification function setting unit 22 (step S19).
  • when the gesture of the operator 2 cannot be identified, the gesture identification unit 23 notifies the display control unit 25 via the function execution unit 24 that the gesture cannot be identified.
  • the display control unit 25 displays on the display unit 30 that the gesture cannot be identified (step S23).
  • the display unit 30 displays that the gesture cannot be identified.
  • the display by the display unit 30 allows the operator 2 to grasp that the gesture has not been identified. For this reason, the operator 2 can choose to perform the gesture again or to operate with a device such as a remote controller.
  • the information display apparatus and information display method according to Embodiment 2 are the same as the apparatus and method of Embodiment 1 except for the points described above.
  • Embodiment 3
  • in Embodiments 1 and 2, the gesture detection unit 10 is installed in the main body of the information display device 1 (above the display unit 30). In Embodiment 3, a case where the gesture detection unit is provided at a position away from the information display unit including the display unit (display) 30 will be described.
  • FIG. 16 is a schematic diagram showing a usage state of the information display apparatus 100 according to Embodiment 3 of the present invention.
  • the information display device 100 includes an information display unit 1 a having a display unit 30 and a gesture detection unit 10 a.
  • the operator 2 performs a gesture and operates the information display unit 1a.
  • the gesture detection unit 10a is provided at a position away from the information display unit 1a, for example, on the wall surface of the room.
  • FIG. 17 is a block diagram schematically showing the configuration of the information display apparatus 100 according to the third embodiment. In FIG. 17, components that are the same as or correspond to the components shown in FIG. 2 (Embodiment 1) are assigned the same reference numerals as those in FIG. 2. As shown in FIG. 17, the information display apparatus 100 includes a gesture detection unit 10a and an information display unit 1a.
  • the gesture detection unit 10 a includes an imaging unit 13 and a transmission unit 11.
  • the imaging unit 13 captures the gesture GE of the operator 2 and generates image data.
  • the transmission unit 11 transmits the image data generated by the imaging unit 13 to the reception unit 12 of the information display unit 1a.
  • the reception unit 12 of the information display unit 1a receives the image data transmitted from the transmission unit 11 of the gesture detection unit 10a, and sends the received image data to the control unit 20.
  • the communication method between the transmission unit 11 and the reception unit 12 may be any of wireless communication such as Bluetooth (registered trademark), infrared communication, and Wi-Fi communication, and wired communication.
  • the gesture detection unit 10a is fixed at a position where the gesture of the operator 2 can be detected.
  • the information display unit 1a also stores position information (for example, expressed in XYZ coordinates) indicating the position where the gesture detection unit 10a is fixed, and position information (for example, expressed in XYZ coordinates) indicating the position of the display unit 30 of the information display unit 1a.
  • the distance estimation unit 21 can estimate (calculate) the distance between the operator 2 and the display unit 30 based on the above-described position information and the image data received from the reception unit 12.
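  • Given the stored XYZ positions of the gesture detection unit 10a and the display unit 30, plus an estimate of the operator's position relative to the camera derived from the image data, the operator–display distance follows from plain vector arithmetic. The coordinate handling below is an assumed simplification of that computation.

```python
import math

def operator_to_display_distance(camera_pos, display_pos, operator_offset):
    """Given the fixed camera position and the display position (both in
    the same XYZ frame), and the operator's position relative to the
    camera (estimated from the image data), return the operator-display
    distance."""
    operator_pos = tuple(c + o for c, o in zip(camera_pos, operator_offset))
    return math.dist(operator_pos, display_pos)
```
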
  • the gesture identification unit 23 can detect a movement pattern by analyzing the displacement of the body part (detection region) of the operator 2 included in the image data received from the reception unit 12.
  • even when the gesture detection unit 10a is provided at a position away from the information display unit 1a, the distance between the operator 2 and the display unit 30 can be estimated. Since the display content of the display unit 30 is changed according to the estimated distance, the information display device 100 can maintain good operability when the operator 2 operates the information display unit 1a.
  • by updating the software of the information display unit 1a, an information display device in which the gesture detection unit 10a and the information display unit 1a are combined can be configured.
  • the information display apparatus according to Embodiment 3 can be applied to a conventional information display apparatus.
  • the information display apparatus and information display method according to Embodiment 3 are the same as the apparatus and method of Embodiment 1 or 2 except for the points described above.
  • Embodiment 4
  • in Embodiment 3, the case where the gesture detection unit 10a is fixed at a predetermined position and the detection target (imaging target) of the gesture detection unit 10a is the operator 2 has been described.
  • in Embodiment 4, a case will be described where an operation device serving as a gesture detection unit is attached to the body of the operator 2 who performs a gesture (for example, the operator holds the operation device in a hand), and the detection target (imaging target) is, for example, the display unit of the information display device.
  • FIG. 18 is a schematic diagram showing a usage state of the information display device 100a according to the fourth embodiment of the present invention.
  • the information display device 100 a includes an information display unit 1 b having a display unit 30 and an operation device 10 b as a gesture detection unit.
  • the operator 2 inputs an operation signal to the reception unit 12 (shown in FIG. 19, described later) of the information display unit 1b by performing a gesture while the operation device 10b is attached to the body (for example, held in a hand).
  • for example, the operator 2 inputs an operation signal to the information display unit 1b by a gesture such as moving (swinging) the operation device 10b to the left, right, up, or down, or drawing a character or mark in the air with the operation device 10b.
  • the shape and size of the operating device 10b are not limited to those shown in FIG. 18, and may be other shapes.
  • FIG. 19 is a block diagram schematically showing the configuration of the information display device 100a according to the fourth embodiment.
  • in FIG. 19, components that are the same as or correspond to those shown in FIG. 17 (Embodiment 3) are given the same reference numerals as those shown in FIG. 17.
  • the operating device 10b includes an imaging unit 15 as a recognition sensor, a feature extraction unit 14, and a transmission unit 11.
  • the imaging unit 15 captures an area including the information display unit 1b as a subject while the operator 2 performs a gesture, and generates image data.
  • the imaging unit 15 is, for example, an RGB camera or a ToF (Time of Flight) sensor.
  • the RGB camera may be a camera mounted on a portable information terminal such as a smartphone in which software for realizing a communication function with the information display unit 1b is installed.
  • the feature extraction unit 14 extracts the feature of the gesture performed by the operator 2 from the image data generated by the imaging unit 15.
  • the feature of the gesture is extracted as a motion vector (the trajectory of the movement of the operation device 10b) by analyzing, for a captured object (target object) included in the image data, the amount of change of the target object across a plurality of still image data or moving image data arranged in time order.
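  • Extracting such a motion vector from the time-ordered frames could be sketched as follows. Because the imaging unit 15 rides on the operation device 10b, the device's own movement is the inverse of the target object's apparent movement in the image; the object-tracking step itself is omitted here and assumed to supply per-frame positions.

```python
def motion_vector(track_points):
    """Summarize the trajectory of a tracked target object across frames
    ordered in time as a net motion vector (dx, dy). The camera is on the
    operating device, so the device's own motion is the inverse of the
    target's apparent motion in the image."""
    (x0, y0), (x1, y1) = track_points[0], track_points[-1]
    apparent = (x1 - x0, y1 - y0)
    return (-apparent[0], -apparent[1])
```
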
  • the feature extraction unit 14 can use the display unit 30 of the information display unit 1b as an object included in the image data generated by the imaging unit 15.
  • the feature extraction unit 14 sends the generated image data, gesture features, and information about the area of the display unit 30 to the transmission unit 11.
  • the transmission unit 11 transmits the generated image data, the feature of the gesture, and information about the area of the display unit 30 of the information display unit 1b to the reception unit 12 of the information display unit 1b.
  • even when the operator 2 performs a gesture while holding the operation device 10b, the information display device 100a can estimate the distance between the operator 2 and the display unit 30. Since the display content of the information display device 100a is changed according to the estimated distance, the operability when the operator operates the information display device can be maintained.
  • by updating the software of the information display unit 1b, the information display device according to Embodiment 4, in which the imaging unit 15 and the information display unit 1b are combined, can be configured.
  • the information display apparatus according to Embodiment 4 can be applied to a conventional information display apparatus.
  • Embodiment 5
  • depending on the environment (for example, lighting conditions and natural light conditions) of the place where the information display device 1 or the information display units 1a and 1b are installed, the image data generated by the information display device 1 is affected by disturbances such as shaking, illumination, and natural light (external light). In such a case, the possibility that the information display device 1 erroneously recognizes the content of the gesture operation by the operator 2 increases.
  • the information display device 1c according to the fifth embodiment includes the disturbance detection unit 26, thereby suppressing erroneous recognition due to the influence of the disturbance and improving the operability of the information display device 1c by gesture operation.
  • the information display device 1c according to the fifth embodiment and the information display device 1d according to the modification will be described with reference to FIGS.
  • FIG. 20 is a block diagram schematically showing the configuration of the information display device 1c according to the fifth embodiment of the present invention.
  • in FIG. 20, components that are the same as or correspond to the components shown in FIG. 2 (Embodiment 1) are assigned the same reference numerals as those in FIG. 2.
  • the information display device 1c according to the fifth embodiment differs from the information display device 1 according to the first embodiment in that the control unit 20c has a disturbance detection unit 26 and in the processing content of the distance estimation unit 21c of the control unit 20c.
  • the disturbance detection unit 26 detects movement in a plurality of frames of images (a plurality of pieces of image data) G1 generated by imaging in the gesture detection unit 10. When the detected movement includes regular movement with a predetermined fixed period, the disturbance detection unit 26 determines that the regular movement is a disturbance (for example, vibration, a change in lighting conditions, or a change in natural light conditions), and generates filter information G5 (information used for processing to remove the influence of the disturbance) based on this determination.
  • the filter information G5 generated by the disturbance detection unit 26 is sent to the distance estimation unit 21c.
  • based on the filter information G5, the distance estimation unit 21c corrects the motion in the plurality of frames of images (plurality of image data) G1. More specifically, the distance estimation unit 21c removes the regular motion component caused by the disturbance from the plurality of frames of images (plurality of image data) G1, thereby generating a plurality of frames of images (plurality of image data) G3 in which the influence of the disturbance is eliminated (or reduced).
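  • One simple way to remove a regular motion component with a known fixed period, of the kind the filter information G5 describes, is to estimate the mean waveform per phase of the period and subtract it. This is an assumed illustration of the idea operating on a per-frame motion signal, not the patented filter itself.

```python
def remove_periodic_component(signal, period):
    """Remove a regular motion component with a known fixed period from a
    per-frame motion signal (e.g. frame-to-frame displacement). The mean
    value at each phase of the period is estimated and subtracted,
    leaving the non-periodic (gesture) motion."""
    n = len(signal)
    phase_mean = [0.0] * period
    phase_count = [0] * period
    for i, v in enumerate(signal):
        phase_mean[i % period] += v
        phase_count[i % period] += 1
    for p in range(period):
        if phase_count[p]:
            phase_mean[p] /= phase_count[p]
    return [signal[i] - phase_mean[i % period] for i in range(n)]
```

A purely periodic vibration is cancelled entirely, while motion that does not repeat with the detected period passes through as a residual.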
  • FIG. 21 is a block diagram schematically showing a configuration of an information display device 1d according to a modification of the fifth embodiment.
  • the same or corresponding components as those shown in FIG. 20 are denoted by the same reference numerals as those in FIG.
  • the information display device 1d is different from the information display device 1c according to the fifth embodiment in that it includes a disturbance detection unit 26d as a configuration outside the control unit 20d.
  • the control unit 20d of the information display device 1d performs the same control as the control unit 20c of the information display device 1c.
  • the disturbance detection unit 26d detects movement in a plurality of frames of images (a plurality of pieces of image data) G1 generated by imaging in the gesture detection unit 10. When the detected movement includes regular movement with a predetermined fixed period, the disturbance detection unit 26d determines that the regular movement is a disturbance (for example, acceleration, vibration, tilt, a change in lighting conditions, or a change in natural light conditions), and generates filter information G5 (information used for processing to remove the influence of the disturbance) based on this determination. Because the disturbance detection unit 26d is newly installed outside the control unit 20d, it can detect disturbances such as acceleration, vibration, tilt, lighting conditions, and natural light conditions; for example, an acceleration sensor, a vibration sensor, a tilt sensor, an optical sensor, or the like may be used.
  • the filter information G5 generated by the disturbance detection unit 26d is sent to the distance estimation unit 21d.
  • based on the filter information G5, the distance estimation unit 21d corrects the motion in the plurality of frames of images (plurality of image data) G1. More specifically, the distance estimation unit 21d removes the regular motion component caused by the disturbance from the plurality of frames of images (plurality of image data) G1, thereby generating a plurality of frames of images (plurality of image data) G3 in which the influence of the disturbance is eliminated (or reduced).
  • FIG. 22 is a flowchart showing the operation of the information display devices 1c and 1d according to the fifth embodiment. In FIG. 22, the same processing steps as those shown in FIG. 14 (Embodiment 1) are denoted by the same reference numerals. The process shown in FIG. 22 differs from the process shown in FIG. 14 in that steps S24 and S25 are included. Below, the main part of the operation of the information display devices 1c and 1d will be described with reference to FIG. 22.
  • the gesture detection unit 10 starts imaging (detection) of a gesture using a predetermined gesture of the operator 2 as a trigger (step S10).
  • in step S24, the disturbance detection unit 26 performs a disturbance detection process. When a disturbance is detected, the distance estimation unit 21c corrects the influence of the disturbance by removing the regular motion component caused by the disturbance from the image data G1 (step S25). When no disturbance is detected, the distance estimation unit 21c advances the process to step S11 without performing the process of correcting the influence of the disturbance.
  • the processing after step S11 is the same as the processing shown in FIG. 14 (Embodiment 1).
  • in FIG. 22, disturbance detection (step S24) is performed after gesture detection is started (step S10), but gesture detection (step S10) may be started after disturbance detection (step S24) is performed.
  • the information display device 1c and the information display method according to the fifth embodiment are the same as the devices and methods of the first to fourth embodiments.
  • the information display device 1d and the information display method according to the modification of the fifth embodiment are the same as the devices and methods of the first to fourth embodiments.
  • the influence of disturbance can be reduced, so that the accuracy of gesture recognition can be improved.
  • Embodiment 6
  • in Embodiments 1 to 5, the configuration in which the gesture is detected by the gesture detection unit 10 and the distance is measured without specifying the operator 2 has been described.
  • in Embodiment 6, a case will be described where the operator recognition unit 27 specifies (recognizes) the operator 2 who performs the gesture operation from the image (image data) G1 generated by the gesture detection unit 10, and the distance is measured using size information of a body part of the operator 2 registered in advance.
  • FIG. 23 is a block diagram schematically showing the configuration of the information display device 1e according to Embodiment 6 of the present invention.
  • constituent elements that are the same as or correspond to those shown in FIG. 2 (Embodiment 1) are assigned the same reference numerals as those in FIG.
  • The information display device 1e differs from the information display device 1 according to Embodiment 1 in that the control unit 20e has an operator recognition unit 27 and in the processing content of the distance estimation unit 21e of the control unit 20e.
  • The operator recognition unit 27 stores in advance the initial setting information set in the identification function setting unit 22 (the predetermined first set distance D1 and second set distance D2), the face image (face image data) of the operator 2, and the attribute information of the operator 2, in association (linked) with each other.
  • The attribute information stored in the operator recognition unit 27 includes information for determining attributes of the operator 2, such as adult or child, man or woman, age, and nationality.
  • The attribute information may be estimated from a part of the body that appears in an image generated by photographing by the gesture detection unit 10, such as the face, physique, hand size, and skin color registered at the time of initial setting, or may be selected by the operator 2 at the time of setting.
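The pre-registration described above, which links face image data, attribute information, and body-size information, could be held in a simple keyed store. A hypothetical sketch; the field names (`attributes`, `hand_width_cm`) and the `face-001` key are illustrative, not taken from the patent.

```python
# Registry linking a face identifier to the operator's attribute
# information and a pre-registered body-part size used later for
# distance estimation.
registry = {}

def register_operator(face_id, attributes, hand_width_cm):
    """Store attribute info and body-size info, linked to the face data."""
    registry[face_id] = {"attributes": attributes,
                         "hand_width_cm": hand_width_cm}

def lookup(face_id):
    """Return the stored record, or None for an unregistered (guest) face."""
    return registry.get(face_id)

register_operator("face-001", {"adult": True, "age": 34}, 8.5)
```

A production system would key on a face-recognition embedding rather than a string identifier, but the linkage between face data, attributes, and size information is the point.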
  • FIG. 24 is a flowchart showing the operation of the information display device 1e according to the sixth embodiment. In FIG. 24, the same processing steps as those shown in FIG. 14 are denoted by the same reference numerals as those in FIG. 14. The process shown in FIG. 24 differs from the process shown in FIG. 14 in that it includes steps S26, S27, and S28. Below, the main part of the operation is described with reference to FIG. 24.
  • the gesture detection unit 10 starts imaging (detecting) a gesture using a predetermined gesture of the operator 2 as a trigger (step S10).
  • The information display device 1e determines whether the initial setting shown in FIG. 13, that is, the setting of the first set distance D1 and the second set distance D2 (steps S1 and S2), has been completed (step S11). If the initial setting has not been made, the information display device 1e performs the initial setting shown in FIG. 13 (step S12).
  • After gesture detection is started and the initial settings including operator registration are confirmed (steps S10, S11, and S12), the operator recognition unit 27 performs an operator recognition process (step S26).
  • If the operator 2 is recognized (YES in step S26), the operator recognition unit 27 sets the operator 2 and sends the display setting information G4 based on the setting of the operator 2 and the image data G1 to the distance estimation unit 21e (step S27). If the operator 2 cannot be recognized (NO in step S26), the operator recognition unit 27 sets guest setting information and sends the guest setting information and the image data G1 to the distance estimation unit 21e (step S28).
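Steps S26 to S28 amount to a lookup with a guest fallback: a recognized operator's stored settings go to the distance estimator, and an unrecognized operator gets predefined guest settings. A minimal sketch under the assumption that operators are keyed by a face identifier; `GUEST_SETTINGS` and the record layout are illustrative.

```python
# Default settings used when the operator cannot be recognized (step S28).
GUEST_SETTINGS = {"attributes": None, "hand_width_cm": 8.0}

def settings_for(face_id, registry):
    """Step S26: try to recognize the operator. On success (S27) return
    their registered settings; on failure (S28) fall back to guest."""
    record = registry.get(face_id)
    return record if record is not None else GUEST_SETTINGS
```

The returned record plus the image data G1 would then be what the distance estimation unit 21e receives in either branch.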
  • the distance estimation unit 21e estimates the distance D0 between the operator 2 and the gesture detection unit 10 (or display unit 30) (step S13).
  • the process after step S13 in FIG. 24 is the same as the process after step S13 shown in FIG. 14 (Embodiment 1).
  • The operator recognition unit 27 identifies the operator 2, and the distance estimation unit 21e performs processing based on the attribute information of the operator 2 and the image data G1; the display unit 30 can also display the identification information of the operator 2.
  • In this case, even if there are a plurality of people at the same distance from the information display device 1e, which of them the information display device 1e recognizes as the operator 2 can be communicated to the operator 2. For this reason, the operator 2 can obtain good operability when performing a gesture operation on the information display device 1e.
  • With the information display device 1e and the information display method according to the sixth embodiment, even when a plurality of persons are present, the operator 2 can be identified and the gesture can be recognized, so the accuracy of gesture recognition can be improved.
  • Except for the points described above, the information display device 1e and the information display method according to the sixth embodiment are the same as the devices and methods according to Embodiments 1 to 4.
  • Embodiment 7. In Embodiments 1 to 6, the distance estimation unit 21 calculates the distance D0 between the operator 2 and the gesture detection unit 10 (or the display unit 30) based on the gesture information G1 output from the gesture detection unit 10, without determining whether the operator 2 is on the left side or the right side of the gesture detection unit 10 (or the display unit 30).
  • The information display device 1f according to Embodiment 7 determines not only the distance D0 between the operator 2 and the gesture detection unit 10 (or the display unit 30) but also whether the operator 2 is on the left side or the right side of the front center position of the gesture detection unit 10 (or the display unit 30). That is, when there are two persons (the operator 2 and another person 3), the information display device 1f has a function of distinguishing whether the operator 2 is on the left side or the right side of the front center position of the gesture detection unit 10.
  • FIG. 26 is a block diagram schematically showing the configuration of the information display device 1f according to Embodiment 7 of the present invention.
  • constituent elements that are the same as or correspond to those shown in FIG. 2 (Embodiment 1) are assigned the same reference numerals as those in FIG.
  • The information display device 1f differs from the information display device 1 according to Embodiment 1 in that the distance estimation unit 21f of the control unit 20f has a left/right discrimination unit 21fa and in the processing content of the distance estimation unit 21f.
  • The left/right determination unit 21fa divides the frame image into a left area and a right area and determines in which of the two areas the operator 2 appears. As a result, the left/right determination unit 21fa can determine whether the operator 2 is on the right side or the left side of the frame image.
  • In addition, the left/right determination unit 21fa can measure the distance from the gesture detection unit 10 to the operator 2 and determine whether the operator 2 is on the right side or the left side of the frame image divided into left and right parts.
  • For example, when used in a vehicle interior space, the left/right discrimination unit 21fa can determine whether the operator 2 is the person sitting in the driver's seat (the left side as viewed from the gesture detection unit 10 in the case of a right-hand drive vehicle), the person sitting in the passenger seat (the right side as viewed from the gesture detection unit 10 in the case of a right-hand drive vehicle), or a person sitting in a rear seat (the right side or the left side as viewed from the gesture detection unit 10).
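The left/right discrimination described above reduces to comparing the horizontal centre of the operator's detected region with the midline of the frame. A sketch with illustrative coordinates; in practice the centre would come from a face or body detector running on the frame image.

```python
def left_or_right(frame_width, operator_x_center):
    """Return 'left' or 'right' as seen in the frame image, or None when
    the operator sits exactly on the midline and cannot be discriminated."""
    mid = frame_width / 2
    if operator_x_center < mid:
        return "left"
    if operator_x_center > mid:
        return "right"
    return None
```

The None case corresponds to the situation the flowchart handles by selecting a predetermined process and notifying the display control unit that the side could not be determined.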
  • FIG. 27 is a flowchart showing the operation of the information display apparatus according to Embodiment 7 of the present invention.
  • In FIG. 27, the same processing steps as those shown in FIG. 14 are denoted by the same reference numerals as those in FIG. 14.
  • The process shown in FIG. 27 differs from the process shown in FIG. 14 in that it includes steps S29 and S30 and, depending on the determination result in step S30, performs either the processes in steps S14, S15, S16, and S17 or the processes in steps S14a, S15a, S16a, and S17a. Below, the main part of the operation is described with reference to FIG. 27.
  • the gesture detection unit 10 starts imaging (detection) of a gesture using a predetermined gesture of the operator 2 as a trigger (step S10).
  • The information display device 1f determines whether the initial setting shown in FIG. 13, that is, the setting of the first set distance D1 and the second set distance D2 (steps S1 and S2), has been completed (step S11). If the initial setting has not been made, the information display device 1f performs the initial setting shown in FIG. 13 (step S12).
  • the distance estimation unit 21f estimates the distance D0 between the operator 2 and the gesture detection unit 10 (or the display unit 30) (step S13).
  • Next, from the frame image captured and generated by the gesture detection unit 10a, the left/right determination unit 21fa determines whether the operator 2 is on the left side or the right side as viewed from the display unit 30 (step S29).
  • Based on the determined side (left or right) and the distance D0, the information display device 1f shifts to one of the modes (step S30). If it cannot be determined whether the operator 2 is on the left side or the right side as viewed from the display unit 30, a predetermined process (for example, steps S14 to S17 or steps S14a to S17a) may be selected, the display control unit 25 may be notified that the side could not be determined, and this may be displayed on the display unit 30. Further, the display unit 30 may display whether the operator 2 is on the left side or the right side as viewed from the display unit 30, or that the side of the operator 2 cannot be determined.
  • the distance estimation unit 21f includes a left / right determination unit 21fa.
  • For example, when the information display device 1f is used in a vehicle interior, the display content controlled by the display control unit 25 can be switched by determining whether the operator 2 is the person sitting in the driver's seat or the person sitting in the passenger seat.
  • For example, the information display device 1f displays on the display unit 30 all the operations for which gesture operation is possible; when the operator 2 is a person sitting in the passenger seat, the information display device 1f can cause the display unit 30 to display only the functions that are not related to safety among the operations that allow gesture operations (restricted functions).
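The seat-dependent switching of display content described above can be sketched as a selection between a full function set and a restricted, non-safety-related subset. The function names and which seat receives the restricted set are assumptions for illustration; the patent leaves that assignment to the application.

```python
# Full set of operations for which gesture operation is possible, and a
# restricted subset containing only functions not related to safety.
FULL_SET = ["navigate", "call", "volume", "media"]
RESTRICTED_SET = ["volume", "media"]

def displayed_functions(seat, restricted_seats=frozenset({"passenger"})):
    """Return the function list the display control unit should show for
    the discriminated seat."""
    return RESTRICTED_SET if seat in restricted_seats else FULL_SET
```

Swapping `restricted_seats` to `{"driver"}` would restrict the driver instead, which is the more common safety policy in vehicle HMIs.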
  • the distance estimation unit 21f estimates the distance D0 between the operator 2 and the gesture detection unit 10 (or the display unit 30) (step S13).
  • If the operator 2 is determined to be on the left side, the information display device 1f advances the process to step S15 if the distance D0 is equal to or less than the first set distance D1; to step S16 if the distance D0 is greater than the first set distance D1 and equal to or less than the second set distance D2; and to step S17 if the distance D0 exceeds the second set distance D2 (step S14).
  • If the operator 2 is determined to be on the right side, the information display device 1f advances the process to step S15a if the distance D0 is equal to or less than the first set distance D1; to step S16a if the distance D0 is greater than the first set distance D1 and equal to or less than the second set distance D2; and to step S17a if the distance D0 exceeds the second set distance D2 (step S14a). Steps S15a, S16a, and S17a are the same processes as steps S15, S16, and S17, except that the operator 2 is on the right side.
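Steps S14 and S14a combine the distance band (D0 against the set distances D1 and D2) with the discriminated side to pick a branch. The sketch below maps these inputs to the flowchart's step labels; it illustrates the branching only, not the patented processing performed in each step.

```python
def select_step(d0, d1, d2, side):
    """Map (distance, side) to the flowchart step names S15-S17 for the
    left side and S15a-S17a for the right side."""
    if d0 <= d1:
        band = "S15"        # near band: distance within the first set distance
    elif d0 <= d2:
        band = "S16"        # middle band: between the two set distances
    else:
        band = "S17"        # far band: beyond the second set distance
    return band + ("a" if side == "right" else "")
```
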
  • With the information display device 1f and the information display method according to the seventh embodiment, it is possible to determine in which direction the operator 2 is located as seen from the information display device 1f, and the operation content (the display content on the display unit 30) can be switched according to the determined direction (for example, the left side or the right side).
  • Except for the points described above, the information display device 1f and the information display method according to the seventh embodiment are the same as the devices and methods of Embodiments 1 to 4.
  • FIG. 28 is a hardware configuration diagram showing a configuration of a modified example of the information display device according to the first to seventh embodiments.
  • The information display devices according to the first to seventh embodiments can be realized by, for example, a computer that includes a memory 91 as a storage device storing a program as software and a processor 92 as an information processing unit that executes the program stored in the memory 91.
  • In this case, the storage unit 40 shown in FIGS. 2, 17, and 19 corresponds to the memory 91 in FIG. 28, and the control units 20, 20c, 20d, 20e, and 20f shown in FIGS. 2, 17, and 19 correspond to the processor 92 in FIG. 28 that executes the program.
  • A part of the control unit 20 shown in FIGS. 2, 17, and 19 may be realized by the memory 91 and the processor 92 in FIG. 28.
  • The present invention can be modified without departing from its spirit.
  • For example, the disturbance suppressing function of the fifth embodiment, the function of obtaining the distance using the operator recognition result of the sixth embodiment, and the function of the seventh embodiment of switching the display content according to the direction of the operator (the left side or the right side as viewed from the apparatus) can be combined arbitrarily.
  • The present invention is applicable to various information display devices, such as televisions, PCs, car navigation systems, rear seat entertainment systems (RSE), and digital signage systems installed in stations, airports, and the like.
  • 1, 1c, 1d, 1e, 1f Information display device 1a, 1b information display unit, 2 operators, 10, 10a gesture detection unit, 10b operation device, 11 transmission unit, 12 reception unit, 14 feature extraction unit, 20, 20c, 20d, 20e, 20f control unit, 21, 21c, 21d, 21e, 21f distance estimation unit, 21fa left / right discrimination unit, 22 identification function setting unit, 23 gesture identification unit, 24 function execution unit, 25 display control unit, 26 Disturbance detection unit, 27 operator recognition unit, 30 display unit, 31 screen.

Abstract

The invention relates to an information display device comprising: a display control unit (25); a gesture detection unit (10); a gesture identification unit (23) that identifies a gesture of the operator on the basis of gesture information and outputs a signal based on the identification result; a distance estimation unit (21) that estimates the distance between an operator (2) and a display unit (30); and an identification function setting unit (22) that sets the gestures identifiable by the gesture identification unit (23) such that the number of gestures identifiable by the gesture identification unit (23) when the estimated distance exceeds a first set distance is smaller than the number of gestures identifiable by the gesture identification unit (23) when the estimated distance is equal to or less than the first set distance.
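The behaviour claimed in the abstract, fewer identifiable gestures once the estimated distance exceeds the first set distance, can be sketched as a distance-gated gesture list. The gesture names and distances here are illustrative assumptions, not taken from the patent.

```python
# Gestures the identification unit can recognize at close range, and the
# reduced set enabled beyond the first set distance.
ALL_GESTURES = ["point", "swipe", "pinch", "rotate"]
FAR_GESTURES = ["point", "swipe"]

def enabled_gestures(d0, d1):
    """Identification function setting: fewer gestures are identifiable
    when the estimated distance d0 exceeds the first set distance d1."""
    return FAR_GESTURES if d0 > d1 else ALL_GESTURES
```

Restricting the far set to large, coarse gestures reflects the motivation of the patent: fine gestures are harder to image reliably at a distance, so disabling them reduces misrecognition.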
PCT/JP2016/057900 2015-04-20 2016-03-14 Dispositif d'affichage d'informations et procédé d'affichage d'informations WO2016170872A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US15/550,313 US20180046254A1 (en) 2015-04-20 2016-03-14 Information display device and information display method
DE112016001815.0T DE112016001815T5 (de) 2015-04-20 2016-03-14 Informationsanzeigevorrichtung und Informationsanzeigeverfahren
JP2016547119A JP6062123B1 (ja) 2015-04-20 2016-03-14 情報表示装置及び情報表示方法
CN201680022591.XA CN107533366B (zh) 2015-04-20 2016-03-14 信息显示装置和信息显示方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015085523 2015-04-20
JP2015-085523 2015-04-20

Publications (1)

Publication Number Publication Date
WO2016170872A1 true WO2016170872A1 (fr) 2016-10-27

Family

ID=57144095

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/057900 WO2016170872A1 (fr) 2015-04-20 2016-03-14 Dispositif d'affichage d'informations et procédé d'affichage d'informations

Country Status (5)

Country Link
US (1) US20180046254A1 (fr)
JP (1) JP6062123B1 (fr)
CN (1) CN107533366B (fr)
DE (1) DE112016001815T5 (fr)
WO (1) WO2016170872A1 (fr)


Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170025400A (ko) * 2015-08-28 2017-03-08 삼성전자주식회사 디스플레이장치 및 그 제어방법
WO2017104272A1 (fr) * 2015-12-18 2017-06-22 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme
CN109491496A (zh) * 2017-09-12 2019-03-19 精工爱普生株式会社 头部佩戴型显示装置和头部佩戴型显示装置的控制方法
CN109920309B (zh) * 2019-01-16 2023-02-03 深圳壹账通智能科技有限公司 手语转换方法、装置、存储介质和终端
CN111736693B (zh) * 2020-06-09 2024-03-22 海尔优家智能科技(北京)有限公司 智能设备的手势控制方法及装置
KR102462140B1 (ko) * 2021-10-12 2022-11-03 (주)알피바이오 분배 장치 및 이를 구비하는 소형캡슐이 충진된 경질캡슐 제조 시스템
CN114898409B (zh) * 2022-07-14 2022-09-30 深圳市海清视讯科技有限公司 数据处理方法和设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120124525A1 (en) * 2010-11-12 2012-05-17 Kang Mingoo Method for providing display image in multimedia device and thereof
WO2012147960A1 (fr) * 2011-04-28 2012-11-01 Necシステムテクノロジー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement
JP2013047918A (ja) * 2011-08-29 2013-03-07 Fujitsu Ltd 電子装置、生体画像認証装置、生体画像認証プログラムおよび生体画像認証方法
JP2014016912A (ja) * 2012-07-11 2014-01-30 Institute Of Computing Technology Of The Chinese Acadamy Of Sciences 画像処理装置、画像処理方法、および、画像処理プログラム
JP2014206866A (ja) * 2013-04-12 2014-10-30 任天堂株式会社 情報処理プログラム、情報処理システム、情報処理装置、および、情報処理の実行方法

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5563996A (en) * 1992-04-13 1996-10-08 Apple Computer, Inc. Computer note pad including gesture based note division tools and method
WO2011155192A1 (fr) * 2010-06-08 2011-12-15 パナソニック株式会社 Dispositif de génération d'images vidéo, procédé et circuit intégré
US8897491B2 (en) * 2011-06-06 2014-11-25 Microsoft Corporation System for finger recognition and tracking
CN202133990U (zh) * 2011-06-25 2012-02-01 北京播思软件技术有限公司 一种可准确定位光标的移动终端
CN104641410A (zh) * 2012-11-30 2015-05-20 日立麦克赛尔株式会社 影像显示装置,及其设定变更方法,设定变更程序
JP2014137627A (ja) * 2013-01-15 2014-07-28 Sony Corp 入力装置、出力装置および記憶媒体
US9830073B2 (en) * 2014-12-12 2017-11-28 Alpine Electronics, Inc. Gesture assistive zoomable selector for screen


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019159936A (ja) * 2018-03-14 2019-09-19 沖電気工業株式会社 第1の情報処理システム、第2の情報処理システムおよび第3の情報処理システム
JP2019164440A (ja) * 2018-03-19 2019-09-26 株式会社リコー 情報処理装置及び情報処理方法
CN111603072A (zh) * 2019-02-22 2020-09-01 合盈光电(深圳)有限公司 以手势驱动的调理机
JP2020149152A (ja) * 2019-03-11 2020-09-17 株式会社デンソーテン 制御装置および制御方法
JP2020160875A (ja) * 2019-03-27 2020-10-01 株式会社Subaru 車両の非接触操作装置、および車両
JP7304184B2 (ja) 2019-03-27 2023-07-06 株式会社Subaru 車両の非接触操作装置、および車両

Also Published As

Publication number Publication date
US20180046254A1 (en) 2018-02-15
JPWO2016170872A1 (ja) 2017-04-27
DE112016001815T5 (de) 2017-12-28
CN107533366A (zh) 2018-01-02
CN107533366B (zh) 2020-07-03
JP6062123B1 (ja) 2017-01-18

Similar Documents

Publication Publication Date Title
JP6062123B1 (ja) 情報表示装置及び情報表示方法
US9746931B2 (en) Image processing device and image display device
KR101459441B1 (ko) 차량 내 손가락 사이점을 이용한 사용자 인터페이스 조작 시스템 및 방법
KR101811909B1 (ko) 제스처 인식을 위한 장치 및 방법
US9405373B2 (en) Recognition apparatus
KR101490908B1 (ko) 차량 내 손모양 궤적 인식을 이용한 사용자 인터페이스 조작 시스템 및 방법
KR20150076627A (ko) 차량 운전 학습 시스템 및 방법
CN106066537B (zh) 头戴式显示器和头戴式显示器的控制方法
US20170021770A1 (en) Image processing device, method for controlling image processing device, non-transitory computer readable medium recording program, and display device
EP2904470A1 (fr) Dispositif de traitement d'informations, procédé de commande d'affichage, et programme de modification de défilement de contenu à défilement automatique
EP2985993B1 (fr) Dispositif et procédé de commande pour dispositif
KR20150054825A (ko) 헤드 마운트 디스플레이를 위한 사용자 인터페이스 제공 장치 및 방법
US20180316911A1 (en) Information processing apparatus
KR101438615B1 (ko) 2차원 카메라를 이용한 사용자 인터페이스 조작 시스템 및 방법
US20170010797A1 (en) Gesture device, operation method for same, and vehicle comprising same
JP2016038621A (ja) 空間入力システム
US10928919B2 (en) Information processing device and information processing method for virtual objects operability
US10296101B2 (en) Information processing system, information processing apparatus, control method, and program
WO2018061413A1 (fr) Dispositif de détection de geste
WO2014103217A1 (fr) Dispositif d'opération et procédé de détection d'opération
KR20120048190A (ko) 동작인식을 이용한 차량 기능 제어 시스템
JP2016126687A (ja) ヘッドマウントディスプレイ、操作受付方法および操作受付プログラム
US20230169939A1 (en) Head mounted display and setting method
JP6452658B2 (ja) 情報処理装置、およびその制御方法ならびにプログラム
JP2014048775A (ja) 注視位置特定装置、および注視位置特定プログラム

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2016547119

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16782894

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15550313

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 112016001815

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16782894

Country of ref document: EP

Kind code of ref document: A1