WO2012149713A1 - Method and apparatus for human-computer interaction - Google Patents

Method and apparatus for human-computer interaction

Info

Publication number
WO2012149713A1
Authority
WO
WIPO (PCT)
Prior art keywords
human eye
parameter
display screen
terminal display
relative
Prior art date
Application number
PCT/CN2011/078867
Other languages
English (en)
Chinese (zh)
Inventor
蔡畅
王德锁
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司
Publication of WO2012149713A1

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones

Definitions

  • The present invention relates to the field of electronic technologies, and in particular to a method and apparatus for human-computer interaction.
  • The invention provides a method and device for human-computer interaction that achieve human-computer interaction through visual effects.
  • the present invention adopts the following technical solutions:
  • A method of human-computer interaction, including:
  • obtaining at least one of a human eye feature parameter and a position parameter of the human eye relative to the terminal display screen, specifically: obtaining a picture in front of the terminal display screen through a camera; and
  • obtaining at least one of a human eye feature parameter and a position parameter of the human eye relative to the terminal display screen from the picture.
  • Obtaining the parameters from the picture is specifically: obtaining, from the picture, a mapped value of at least one of the human eye feature parameter and the position parameter of the human eye relative to the terminal display screen, and then, according to a preset mapping relationship between these parameters and their mapped values, obtaining the corresponding human eye feature parameter and/or position parameter of the human eye relative to the terminal display screen.
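As an illustration of the extraction step above, the following Python sketch assumes a hypothetical eye detector has already returned bounding boxes (x, y, w, h) in picture coordinates; the specification does not name a particular detector, and using eye width as the feature parameter is only one possible choice.

```python
from dataclasses import dataclass

@dataclass
class EyeParams:
    size: float  # eye width in pixels -- a simple "feature parameter"
    cx: float    # eye centre x in the picture
    cy: float    # eye centre y in the picture

def extract_eye_params(eye_boxes):
    """Turn detector bounding boxes (x, y, w, h) into an eye feature
    parameter (size) and an in-picture position. Returns an empty list
    when no eye was found in the picture."""
    return [EyeParams(size=float(w), cx=x + w / 2.0, cy=y + h / 2.0)
            for (x, y, w, h) in eye_boxes]
```

In practice the boxes would come from a face/eye detector running on frames captured by the front camera.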
  • Performing the preset adjustment on the display image of the terminal display screen according to the human eye feature parameter and/or the position parameter of the human eye relative to the terminal display screen includes: directly adjusting the display image according to the human eye feature parameter and/or the position parameter of the human eye relative to the terminal display screen at the current moment; or
  • performing the preset adjustment according to the change in the human eye feature parameter and/or the position parameter of the human eye relative to the terminal display screen between the current moment and the previous moment.
  • The human eye feature parameter includes one or more feature parameters such as the size and shape of the human eye.
  • a human-machine interaction device including a terminal display screen, further includes a parameter acquisition module and an adjustment module, wherein
  • the parameter acquisition module is configured to acquire at least one of a human eye feature parameter and a position parameter of the human eye relative to the terminal display screen;
  • the adjusting module is configured to perform preset adjustment on the display image of the terminal display screen according to the human eye characteristic parameter and/or the position parameter of the human eye relative to the terminal display screen.
  • The camera is further configured to acquire a picture in front of the terminal display screen, and the parameter acquisition module is configured to determine whether there is a human eye in the picture and, if so, to obtain from the picture at least one of a human eye feature parameter and a position parameter of the human eye relative to the terminal display screen.
  • The parameter acquisition module is specifically configured to obtain, from the picture, a mapped value of at least one of the human eye feature parameter and the position parameter of the human eye relative to the terminal display screen, and, according to a preset mapping relationship between these parameters and their mapped values, to obtain the corresponding human eye feature parameter and/or position parameter of the human eye relative to the terminal display screen.
  • The adjustment module is specifically configured to directly adjust the display image of the terminal display screen according to the human eye feature parameter and/or the position parameter of the human eye relative to the terminal display screen at the current moment, or to perform the preset adjustment according to the change in those parameters between the current moment and the previous moment.
  • The method and device for human-computer interaction acquire at least one of a human eye feature parameter and a position parameter of the human eye relative to the terminal display screen, and perform a preset adjustment on the display image of the terminal display screen according to the acquired parameters. Through this technical solution, the invention achieves human-computer interaction through visual effects, so that the picture on the terminal display screen is adjusted as the eye changes, greatly improving the human-computer interaction experience.
  • FIG. 1 is a flowchart of a human-computer interaction method according to an embodiment of the present invention
  • FIG. 2 is a block diagram of a human-machine interaction device according to an embodiment of the present invention.

Detailed Description
  • The embodiment of the invention provides a method for human-computer interaction, including: acquiring at least one of a human eye feature parameter and a position parameter of the human eye relative to the terminal display screen; and performing a preset adjustment on the display image of the terminal display screen according to the acquired human eye feature parameter and/or position parameter of the human eye relative to the terminal display screen.
  • FIG. 1 is a flowchart of a human-computer interaction method according to an embodiment of the present invention. As shown in FIG. 1, the method includes:
  • The front camera starts running, and video in front of the mobile phone display screen is obtained through it.
  • A mapping table is set in the database. The mapping table lists a number of possible positions, sizes, or shapes of the human eye in the picture and correspondingly lists, for each position of the human eye in the picture, the position parameter of the human eye relative to the terminal display screen, and, for each size or shape of the human eye in the picture, the corresponding human eye feature parameter. Comparing the position, size, or shape of the human eye in the picture at the current moment against the table yields the human eye feature parameter and the position parameter of the human eye relative to the terminal display screen at the current moment.
  • The human eye feature parameter includes one or more feature parameters such as the size and shape of the human eye; the position parameters of the human eye relative to the terminal display screen include parameters such as the direction, angle, and distance of the human eye relative to the display screen, which can be represented by the coordinates of the human eye relative to the display screen.
  • S16: Perform a preset adjustment on the display image of the terminal display screen according to the human eye feature parameter and/or the position parameter of the human eye relative to the terminal display screen at the current moment.
  • Alternatively, the display image of the terminal display screen may be adjusted in a preset manner according to the change in the human eye feature parameter and/or the position parameter of the human eye relative to the terminal display screen between the current moment and the previous moment.
  • The preset adjustment covers various schemes. For example, when the position of the human eye relative to the terminal display screen changes from left to right, right to left, top to bottom, or bottom to top compared with the previous moment, the display image of the terminal display screen can be adjusted to change correspondingly, so that the adjusted image always faces the human eye.
  • For example, a small animal can be designed on the UI (User Interface) so that the animal displayed on the mobile phone screen always looks in the direction of the user's eyes, and so that, when the user's eyes change sharply relative to the coordinates of the mobile phone, the displayed animal performs a corresponding body movement. For instance, if the feature parameter at the current moment changes compared with the previous moment, such as the eyes opening wide or closing, the animal displayed on the screen can be adjusted to follow the change and open or close its eyes accordingly. If the position of the human eye relative to the terminal display screen approaches or recedes rapidly, the displayed animal can keep looking in the direction of the user's eyes while making body movements similar to being frightened or delighted.
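One way to trigger such reactions is to threshold the rate of change of the in-picture eye size, which grows as the user approaches the screen. Both the proxy and the threshold below are assumptions for illustration only; the specification does not prescribe how "rapid" change is measured.

```python
def animal_reaction(prev_size, curr_size, dt, rapid=80.0):
    """Pick a reaction for the UI animal from the rate of change of the
    in-picture eye size (pixels per second). A fast increase means the
    user is rushing toward the screen; a fast decrease means the user is
    pulling away quickly."""
    rate = (curr_size - prev_size) / dt
    if rate > rapid:
        return "startled"
    if rate < -rapid:
        return "happy"
    return "idle"
```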
  • The human-machine interaction device includes a terminal display screen 21, a parameter acquisition module 22, and an adjustment module 23. The parameter acquisition module 22 is configured to acquire at least one of a human eye feature parameter and a position parameter of the human eye relative to the terminal display screen 21; the adjustment module 23 is configured to perform a preset adjustment on the display image of the terminal display screen 21 according to the acquired human eye feature parameter and/or position parameter of the human eye relative to the terminal display screen 21.
  • The human-machine interaction device further includes a camera 24. The camera 24 is configured to acquire a picture in front of the terminal display screen 21; the parameter acquisition module 22 is configured to determine whether there is a human eye in the picture and, if so, to obtain from the picture a mapped value of at least one of the human eye feature parameter and the position parameter of the human eye relative to the terminal display screen 21, and then, according to a preset mapping relationship between these parameters and their mapped values, to obtain the corresponding human eye feature parameter and/or position parameter of the human eye relative to the terminal display screen 21.
  • The adjustment module 23 is specifically configured to directly adjust the display image of the terminal display screen 21 according to the human eye feature parameter and/or the position parameter of the human eye relative to the terminal display screen 21 at the current moment, or to perform the preset adjustment according to the change in those parameters between the current moment and the previous moment.
  • Taking a 3D mobile phone as an example, video is acquired by the front camera of the mobile phone, a picture is obtained from the video, and at least one of the human eye feature parameter and the position parameter of the human eye relative to the terminal display screen is obtained from the picture. The display image of the terminal display screen is then adjusted in a preset manner according to the acquired parameters, so that the adjusted display image can stay aligned with the human eye, or change correspondingly, achieving human-computer interaction through visual effects and greatly improving the human-computer interaction experience. It is to be understood that the invention is not limited to the specific embodiments described. It will be apparent to those skilled in the art that various modifications may be made without departing from the spirit and scope of the invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method and apparatus for human-computer interaction. The method consists of obtaining at least one of a feature parameter of a human eye and a position parameter of a human eye relative to a terminal display screen, and performing a preset adjustment on the display image of the terminal display screen according to the feature parameter of the human eye and/or the position parameter of the human eye relative to the terminal display screen. Through the above technical solution, the invention solves the technical problem of implementing human-computer interaction through visual effects.
PCT/CN2011/078867 2011-05-04 2011-08-24 Method and apparatus for human-computer interaction WO2012149713A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2011101142862A CN102207822A (zh) 2011-05-04 2011-05-04 Method and apparatus for human-computer interaction
CN201110114286.2 2011-05-04

Publications (1)

Publication Number Publication Date
WO2012149713A1 (fr) 2012-11-08

Family

ID=44696681

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2011/078867 WO2012149713A1 (fr) 2011-05-04 2011-08-24 Method and apparatus for human-computer interaction

Country Status (2)

Country Link
CN (1) CN102207822A (fr)
WO (1) WO2012149713A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103024146A (zh) * 2012-11-20 2013-04-03 广东欧珀移动通信有限公司 Method for controlling a display screen, and mobile intelligent terminal

Families Citing this family (7)

Publication number Priority date Publication date Assignee Title
CN103377643B (zh) * 2012-04-26 2017-02-15 富泰华工业(深圳)有限公司 Font adjustment system and method
CN102880289B (zh) * 2012-08-20 2016-03-30 广东步步高电子工业有限公司 Control system and method for playing and pausing video by detecting the eyeball gaze point
CN102945077B (zh) * 2012-10-24 2015-12-16 广东欧珀移动通信有限公司 Picture viewing method and apparatus, and intelligent terminal
CN103941984A (zh) * 2013-01-18 2014-07-23 维沃移动通信有限公司 Method and system for intelligent scrolling of a browser web page interface in a mobile handheld device
CN104423546A (zh) * 2013-08-28 2015-03-18 联芯科技有限公司 Direction sensing implementation method and terminal device
CN103713728B (zh) * 2014-01-14 2016-09-21 东南大学 Method for testing the usability of a complex system's human-machine interface
CN106873853A (zh) * 2017-01-18 2017-06-20 上海木爷机器人技术有限公司 Screen display method and apparatus

Citations (2)

Publication number Priority date Publication date Assignee Title
CN101751209A (zh) * 2008-11-28 2010-06-23 联想(北京)有限公司 Method for adjusting screen presentation elements, and computer
CN101893934A (zh) * 2010-06-25 2010-11-24 宇龙计算机通信科技(深圳)有限公司 Method and apparatus for intelligently adjusting screen display

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CN101454742A (zh) * 2006-05-31 2009-06-10 皇家飞利浦电子股份有限公司 Controlling viewing parameters



Also Published As

Publication number Publication date
CN102207822A (zh) 2011-10-05

Similar Documents

Publication Publication Date Title
WO2012149713A1 (fr) Method and apparatus for human-computer interaction
US10627902B2 (en) Devices, methods, and graphical user interfaces for a wearable electronic ring computing device
US10013083B2 (en) Utilizing real world objects for user input
US9886086B2 (en) Gesture-based reorientation and navigation of a virtual reality (VR) interface
US11170580B2 (en) Information processing device, information processing method, and recording medium
KR102494698B1 (ko) Method and apparatus for changing the focus of a camera
JP2021536059A (ja) User interfaces for simulated depth effects
US10521648B2 (en) Body information analysis apparatus and method of auxiliary comparison of eyebrow shapes thereof
KR101038323B1 (ko) Screen frame control apparatus using image recognition techniques
CN105573485B (zh) Display content adjustment method and terminal
WO2012142869A1 (fr) Method and apparatus for automatically adjusting terminal interface display
US20220262080A1 (en) Interfaces for presenting avatars in three-dimensional environments
WO2013067776A1 (fr) Method for controlling a terminal display interface, and terminal
WO2021036420A1 (fr) Video communication method, terminal, and storage medium
CN105677040A (zh) Terminal control method and apparatus, and wearable device
US10444831B2 (en) User-input apparatus, method and program for user-input
WO2020080107A1 (fr) Information processing device, information processing method, and program
US9392223B2 (en) Method for controlling visual light source, terminal, and video conference system
US20240028177A1 (en) Devices, methods, and graphical user interfaces for interacting with media and three-dimensional environments
CN111988522B (zh) Shooting control method and apparatus, electronic device, and storage medium
US20230343049A1 (en) Obstructed objects in a three-dimensional environment
US20230316674A1 (en) Devices, methods, and graphical user interfaces for modifying avatars in three-dimensional environments
US10503278B2 (en) Information processing apparatus and information processing method that controls position of displayed object corresponding to a pointing object based on positional relationship between a user and a display region
JP2015052895A (ja) Information processing apparatus and information processing method
WO2019176218A1 (fr) Information processing device, information processing method, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11864849

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11864849

Country of ref document: EP

Kind code of ref document: A1