WO2017185608A1 - Multi-interface interaction method and electronic device - Google Patents

Multi-interface interaction method and electronic device

Info

Publication number
WO2017185608A1
Authority
WO
WIPO (PCT)
Prior art keywords
threshold
rotation
interactive control
interface
axis
Prior art date
Application number
PCT/CN2016/099949
Other languages
English (en)
French (fr)
Inventor
王子涵
高国威
Original Assignee
乐视控股(北京)有限公司
乐视致新电子科技(天津)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 乐视控股(北京)有限公司 and 乐视致新电子科技(天津)有限公司
Publication of WO2017185608A1

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01 - Indexing scheme relating to G06F3/01
    • G06F 2203/012 - Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Definitions

  • the embodiments of the present invention relate to the field of virtual reality technologies, and in particular, to a multi-interface interaction method and an electronic device.
  • Virtual reality (VR) technology combines simulation technology, computer graphics, human-machine interface technology, multimedia technology, sensing technology, network technology, and other technologies; it is a challenging interdisciplinary frontier of research. VR mainly involves a simulated environment, perception, natural skills, and sensing devices.
  • The simulated environment is a computer-generated, real-time, dynamic, three-dimensional, realistic image.
  • Perception means that an ideal VR system should provide every kind of perception a person has, such as the visual perception generated by computer graphics technology.
  • Natural skills refer to head rotation, eye movements, gestures, and other human actions; the computer processes data matched to the participant's actions, responds to the user's input in real time, and feeds the results back to the user's senses.
  • The virtual reality devices currently on the market are mainly head-mounted devices such as virtual reality glasses and virtual reality helmets. After putting on such a device, the user can watch theater-like 3D video images or manage and control applications.
  • A head-mounted virtual reality device uses a display worn on the head to shut out the user's sight and hearing of the outside world, guiding the user into the sensation of being inside a virtual environment.
  • Its display principle is that separate images for the left and right eyes are shown on the left and right screens; the eyes pick up this disparity and the brain produces a sense of depth.
  • As virtual reality display devices, head-mounted devices are compact and highly immersive, and are widely used in military training, virtual driving, and virtual city applications.
  • The inventors of the present invention found that, in the prior art, the interface must be switched through a peripheral device, such as a touchpad or an operating handle, in order to switch the current interaction interface to the settings interface or search interface to be retrieved.
  • Because the prior art can achieve fast interaction among multiple interfaces only through peripherals, the ease of operating a virtual reality device is greatly reduced and the user experience suffers; the user cannot retrieve another required interface in real time with a simple head rotation.
  • The object of the present invention is to provide a multi-interface interaction method and an electronic device that overcome this drawback of the prior art, in which fast interaction among multiple interfaces can be achieved only through peripherals and the ease of operating a virtual reality device is therefore greatly reduced.
  • A multi-interface interaction method includes: detecting the rotation direction and rotation position of an interaction control part, comparing the rotation direction with a preset first threshold, and comparing the rotation position with a preset second threshold; and performing corresponding multi-interface interaction processing according to a first comparison result of the rotation direction against the first threshold and a second comparison result of the rotation position against the second threshold.
  • Before the rotation direction and rotation position of the interaction control part are detected, the method may further include: presetting the first threshold for the rotation direction of the interaction control part and the second threshold for the rotation position of the interaction control part, where the first threshold and the second threshold are used, respectively, to determine whether the rotation direction and the rotation position of the interaction control part permit multi-interface interaction processing.
  • Detecting the rotation direction and rotation position of the interaction control part and comparing them with the preset first and second thresholds may include: establishing a three-dimensional coordinate system in which the X axis is the direction in which the interaction control part rotates left and right, the Z axis is the direction in which it rotates back and forth, and the Y axis is the direction in which it rotates up and down; transforming the first threshold into a negative direction threshold on the Y axis of the coordinate system and the second threshold into a coordinate threshold on the Y axis; detecting the rotation direction and rotation position of the interaction control part and transforming the rotated direction into a direction value on the Y axis and the rotated position into a coordinate value on the Y axis; and comparing the direction value with the direction threshold and the coordinate value with the coordinate threshold.
  • Performing the corresponding multi-interface interaction processing according to the first and second comparison results may include: when the rotation direction reaches the first threshold and the rotation position reaches the second threshold, retrieving another interface that interacts with the current interface; and when the rotation direction does not reach the first threshold and/or the rotation position does not reach the second threshold, keeping the current interface unchanged.
  • The method may further include: when the interaction control part returns to the direction and position it had before the rotation, switching the current interaction interface back to the interaction interface used before the multi-interface interaction processing.
  • The virtual reality multi-interface interaction method of the embodiments of the present invention detects changes in the rotation direction and rotation position of the interaction control part and compares them with the preset thresholds. When the changes in rotation direction and rotation position each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.
  • A multi-interface interaction apparatus includes: a rotation detecting module configured to detect the rotation direction and rotation position of an interaction control part, compare the rotation direction with a preset first threshold, and compare the rotation position with a preset second threshold; and a comparison processing module configured to perform corresponding multi-interface interaction processing according to a first comparison result of the rotation direction against the first threshold and a second comparison result of the rotation position against the second threshold.
  • The apparatus may further include a threshold setting module configured to preset the first threshold for the rotation direction of the interaction control part and the second threshold for its rotation position, the first and second thresholds being used, respectively, to determine whether the rotation direction and rotation position of the interaction control part permit multi-interface interaction processing.
  • The rotation detecting module may specifically include: a coordinate establishing submodule configured to establish a three-dimensional coordinate system in which the X axis is the direction in which the interaction control part rotates left and right, the Z axis is the direction in which it rotates back and forth, and the Y axis is the direction in which it rotates up and down; a threshold conversion submodule configured to transform the first threshold into a negative direction threshold on the Y axis and the second threshold into a coordinate threshold on the Y axis; a rotation transformation submodule configured to detect the rotation direction and rotation position of the interaction control part and transform the rotated direction into a direction value on the Y axis and the rotated position into a coordinate value on the Y axis; and a threshold comparison submodule configured to compare the direction value with the direction threshold and the coordinate value with the coordinate threshold.
  • The comparison processing module may specifically include: a first comparison submodule configured to retrieve another interface that interacts with the current interface when the rotation direction reaches the first threshold and the rotation position reaches the second threshold; and a second comparison submodule configured to keep the current interface unchanged when the rotation direction does not reach the first threshold and/or the rotation position does not reach the second threshold.
  • The apparatus may further include an interface switching module configured to switch the current interaction interface back to the interaction interface used before the multi-interface interaction processing when the interaction control part returns to the direction and position it had before the rotation.
  • Compared with the prior art, the virtual reality multi-interface interaction apparatus of the embodiments of the present invention detects changes in the rotation direction and rotation position of the interaction control part and compares them with the preset thresholds; when the changes each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction is avoided.
  • An embodiment of the present invention further provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to cause the at least one processor to perform any of the possible implementations of the above method.
  • The virtual reality multi-interface interaction electronic device of the embodiments of the present invention detects changes in the rotation direction and rotation position of the interaction control part and compares them with the preset thresholds; when the changes each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.
  • Embodiments of the present invention further provide a non-transitory computer-readable storage medium storing computer-executable instructions, the computer-executable instructions being used to perform any of the possible implementations of the above methods.
  • The non-transitory computer-readable storage medium of the embodiments of the present invention detects changes in the rotation direction and rotation position of the interaction control part and compares them with the preset thresholds; when the changes each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.
  • An embodiment of the present invention further provides a computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program including program instructions that, when executed by a computer, cause the computer to perform any of the possible implementations of the above methods.
  • The computer program product of the embodiments of the present invention detects changes in the rotation direction and rotation position of the interaction control part and compares them with the preset thresholds; when the changes each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.
  • FIG. 1 is a schematic flowchart of a multi-interface interaction method according to an embodiment of the present invention.
  • FIG. 2 is a schematic flowchart of step S101 in a multi-interface interaction method according to an embodiment of the present invention.
  • FIG. 3 is a schematic structural diagram of a multi-interface interaction apparatus according to an embodiment of the present invention.
  • FIG. 4 is a schematic structural diagram of hardware of an electronic device for performing a multi-interface interaction method according to an embodiment of the present invention.
  • The multi-interface interaction method and electronic device of the embodiments of the present invention detect the coordinate and direction change of the interaction control part on the Y axis of a three-dimensional coordinate system and, when the degree of change reaches a preset condition, retrieve the other interface the user needs so that it can interact with the current interface.
  • As shown in FIG. 1, a multi-interface interaction method according to an embodiment of the present invention includes the following steps.
  • Step S101: detect the rotation direction and rotation position of the interaction control part, compare the rotation direction with a preset first threshold, and compare the rotation position with a preset second threshold.
  • In the method of the embodiment of the present invention, the executing body may be an electronic device, and the electronic device may be a virtual reality device. If the virtual reality device is a wearable device (such as VR glasses or a VR helmet), the interaction control part may be, but is not limited to, the user's head.
  • When the user activates the virtual reality device for the first time or reactivates it, the interaction interface is initialized: the current direction and position of the interaction control part (that is, its direction and position at initialization) are taken as the direction and position of the user's level gaze, and the relative position between the interaction interface and the head is set according to that gaze direction; in other words, the interaction interface is placed directly in front of the user's level line of sight.
  • After the initial position of the interaction interface is determined, when the interaction control part rotates, the corresponding other interaction interface can be retrieved according to the rotation direction and rotation position.
  • Step S102: perform corresponding multi-interface interaction processing according to a first comparison result of the rotation direction against the first threshold and a second comparison result of the rotation position against the second threshold.
  • Specifically, step S102 can be implemented as follows. When the rotation direction reaches the first threshold and the rotation position reaches the second threshold, the other interaction interface corresponding to the current interaction interface is retrieved; when either value does not reach its threshold, the current interface is kept unchanged.
  • For example, when the current interface is the application icon interface, a search page can be retrieved; when the current interface is the video playback page, a settings page can be retrieved.
  • The specific page to be retrieved can be set according to actual needs and is not limited by the embodiments of the present invention. In the method of this embodiment, for example, the other interaction interface may be retrieved when the interaction control part is lowered to a preset degree.
  • The method may further include step S103: when the interaction control part returns to the direction and position it had before the rotation, switch the current interaction interface back to the interaction interface used before the multi-interface interaction processing.
  • For example, when the user raises the head back to the direction and position the interaction control part had at initialization, the current interaction interface is exited and the interaction interface displayed at initialization is restored.
  • In one possible implementation, before step S101 the method may further include the following step.
  • Step S100: preset a first threshold for the rotation direction of the interaction control part and a second threshold for the rotation position of the interaction control part, where the first threshold and the second threshold are used, respectively, to determine whether the rotation direction and the rotation position of the interaction control part permit multi-interface interaction processing.
  • The first threshold is the relative direction of the head after rotation with respect to the head direction at initialization, and the second threshold is the relative position of the head after rotation with respect to the head position at initialization.
  • While using the virtual reality device, the user inevitably makes slight movements, for example small rotations of the interaction control part, and such rotations should not be treated as operations intended to control the interaction interface of the virtual reality device. Therefore, in the embodiments of the present invention, to filter out these small rotations, a reasonably strict condition for retrieving the other interaction interface must be set, so that the user's actions while using the virtual reality device can be detected in real time and handled accordingly.
  • The multi-interface interaction method of the embodiments of the present invention detects changes in the rotation direction and rotation position of the interaction control part and compares them with the preset thresholds; when the changes each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.
  • As shown in FIG. 2, step S101 of the first embodiment is described in detail and includes the following steps.
  • Step S201: establish a three-dimensional coordinate system in which the X axis is the direction in which the interaction control part rotates left and right, the Z axis is the direction in which it rotates back and forth, and the Y axis is the direction in which it rotates up and down.
  • Step S202: transform the first threshold into a negative direction threshold on the Y axis of the three-dimensional coordinate system, and transform the second threshold into a coordinate threshold on the Y axis of the three-dimensional coordinate system.
  • Step S203: detect the rotation direction and rotation position of the interaction control part, transform the rotated direction of the interaction control part into a direction value on the Y axis of the three-dimensional coordinate system, and transform the rotated position of the interaction control part into a coordinate value on the Y axis of the three-dimensional coordinate system.
  • Step S204: compare the direction value with the direction threshold, and compare the coordinate value with the coordinate threshold.
  • This embodiment details step S101 of the first embodiment. The multi-interface interaction method of the embodiments of the present invention detects changes in the rotation direction and rotation position of the interaction control part and compares them with the preset thresholds; when the changes each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.
  • As shown in FIG. 3, a multi-interface interaction apparatus according to an embodiment of the present invention includes: a rotation detecting module 31 configured to detect the rotation direction and rotation position of the interaction control part, compare the rotation direction with a preset first threshold, and compare the rotation position with a preset second threshold; and a comparison processing module 32 configured to perform corresponding multi-interface interaction processing according to a first comparison result of the rotation direction against the first threshold and a second comparison result of the rotation position against the second threshold.
  • In one possible implementation, the apparatus further includes a threshold setting module 33 configured to preset the first threshold for the rotation direction of the interaction control part and the second threshold for its rotation position, the first and second thresholds being used, respectively, to determine whether the rotation direction and rotation position of the interaction control part permit multi-interface interaction processing.
  • In one possible implementation, the rotation detecting module 31 specifically includes: a coordinate establishing submodule 311 configured to establish a three-dimensional coordinate system in which the X axis is the direction in which the interaction control part rotates left and right, the Z axis is the direction in which it rotates back and forth, and the Y axis is the direction in which it rotates up and down; a threshold conversion submodule 312 configured to transform the first threshold into a negative direction threshold on the Y axis and the second threshold into a coordinate threshold on the Y axis; a rotation transformation submodule 313 configured to detect the rotation direction and rotation position of the interaction control part and transform the rotated direction into a direction value on the Y axis and the rotated position into a coordinate value on the Y axis; and a threshold comparison submodule 314 configured to compare the direction value with the direction threshold and the coordinate value with the coordinate threshold.
  • In one possible implementation, the comparison processing module 32 specifically includes: a first comparison submodule 321 configured to retrieve another interface that interacts with the current interface when the rotation direction reaches the first threshold and the rotation position reaches the second threshold; and a second comparison submodule 322 configured to keep the current interface unchanged when the rotation direction does not reach the first threshold and/or the rotation position does not reach the second threshold.
  • In one possible implementation, the apparatus further includes an interface switching module 34 configured to switch the current interaction interface back to the interaction interface used before the multi-interface interaction processing when the interaction control part returns to the direction and position it had before the rotation.
  • Embodiments of the present invention provide a non-transitory (non-volatile) computer storage medium storing computer-executable instructions that can perform the methods of any of the foregoing method embodiments.
  • In the non-transitory (non-volatile) computer storage medium of the embodiments of the present invention, the computer-executable instructions detect changes in the rotation direction and rotation position of the interaction control part and compare them with the preset thresholds; when the changes each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.
  • FIG. 4 is a schematic diagram of a hardware structure of an electronic device for performing a multi-interface interaction method according to an embodiment of the present invention.
  • the device includes one or more processors 610 and a memory 620.
  • One processor 610 is taken as an example in FIG. 4.
  • the device may also include an input device 630 and an output device 640.
  • The processor 610, the memory 620, the input device 630, and the output device 640 may be connected by a bus or by other means; connection by a bus is taken as an example in FIG. 4.
  • The memory 620, as a non-transitory computer-readable storage medium, can be used to store non-transitory software programs, non-transitory computer-executable programs, and modules.
  • The processor 610 executes the various functional applications and data processing of the electronic device by running the non-transitory software programs, instructions, and modules stored in the memory 620, thereby implementing the processing methods of the above method embodiments.
  • the memory 620 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application required for at least one function; the storage data area may store data or the like.
  • memory 620 can include high speed random access memory, and can also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device.
  • memory 620 can optionally include memory remotely located relative to processor 610, which can be connected to the processing device over a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
  • The input device 630 can receive input numeric or character information and generate signal inputs.
  • the output device 640 can include a display device such as a display screen.
  • The one or more modules are stored in the memory 620 and, when executed by the one or more processors 610, perform the following: detecting the rotation direction and rotation position of the interaction control part, comparing the rotation direction with a preset first threshold, and comparing the rotation position with a preset second threshold; and performing corresponding multi-interface interaction processing according to a first comparison result of the rotation direction against the first threshold and a second comparison result of the rotation position against the second threshold.
  • In one possible implementation, before the rotation direction and rotation position of the interaction control part are detected, the method further includes: presetting the first threshold for the rotation direction of the interaction control part and the second threshold for the rotation position of the interaction control part, where the first threshold and the second threshold are used, respectively, to determine whether the rotation direction and the rotation position of the interaction control part permit multi-interface interaction processing.
  • In one possible implementation, detecting the rotation direction and rotation position of the interaction control part, comparing the rotation direction with the preset first threshold, and comparing the rotation position with the preset second threshold includes: establishing a three-dimensional coordinate system in which the X axis is the direction in which the interaction control part rotates left and right, the Z axis is the direction in which it rotates back and forth, and the Y axis is the direction in which it rotates up and down; transforming the first threshold into a negative direction threshold on the Y axis of the coordinate system and the second threshold into a coordinate threshold on the Y axis; detecting the rotation direction and rotation position of the interaction control part and transforming the rotated direction into a direction value on the Y axis and the rotated position into a coordinate value on the Y axis; and comparing the direction value with the direction threshold and the coordinate value with the coordinate threshold.
  • In one possible implementation, performing the corresponding multi-interface interaction processing according to the first and second comparison results includes: when the rotation direction reaches the first threshold and the rotation position reaches the second threshold, retrieving another interface that interacts with the current interface; and when the rotation direction does not reach the first threshold and/or the rotation position does not reach the second threshold, keeping the current interface unchanged.
  • In one possible implementation, the method further includes: when the interaction control part returns to the direction and position it had before the rotation, switching the current interaction interface back to the interaction interface used before the multi-interface interaction processing.
  • The above products can perform the methods provided by the embodiments of the present invention and have the corresponding functional modules and beneficial effects. For technical details not exhaustively described in this embodiment, reference may be made to the methods provided by the embodiments of the present invention.
  • The electronic device of the embodiments of the present invention exists in various forms, including but not limited to:
  • Mobile communication devices: these devices have mobile communication functions and are mainly aimed at providing voice and data communication. Such terminals include smartphones (such as the iPhone), multimedia phones, feature phones, and low-end phones.
  • Ultra-mobile personal computer devices: these devices belong to the category of personal computers, have computing and processing functions, and generally also have mobile Internet access. Such terminals include PDAs, MIDs, and UMPC devices, such as the iPad.
  • Portable entertainment devices: these devices can display and play multimedia content. They include audio and video players (such as the iPod), handheld game consoles, e-book readers, smart toys, and portable car navigation devices.
  • Servers: devices that provide computing services. A server consists of a processor, hard disk, memory, system bus, and so on; its architecture is similar to that of a general-purpose computer, but because highly reliable services must be provided, the requirements on processing power, stability, reliability, security, scalability, and manageability are higher.
  • Other electronic apparatuses with data interaction functions.
  • The device embodiments described above are merely illustrative; the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • The multi-interface interaction method and electronic device provided by the embodiments of the present invention detect the rotation direction and rotation position of the interaction control part, compare the rotation direction with a preset first threshold and the rotation position with a preset second threshold, and perform corresponding multi-interface interaction processing according to the first comparison result of the rotation direction against the first threshold and the second comparison result of the rotation position against the second threshold. Another interface required by the user can thus be retrieved, multiple interfaces can be interacted with quickly, and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.

Abstract

The embodiments of the present invention disclose a multi-interface interaction method and an electronic device, including: detecting the rotation direction and rotation position of an interaction control part, comparing the rotation direction with a preset first threshold, and comparing the rotation position with a preset second threshold; and performing corresponding multi-interface interaction processing according to a first comparison result of the rotation direction against the first threshold and a second comparison result of the rotation position against the second threshold. The embodiments of the present invention can retrieve another interface required by the user, such as a settings interface or a search interface, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.

Description

Multi-interface interaction method and electronic device
Cross-reference
The present invention claims priority to the Chinese patent application filed with the Chinese Patent Office on April 25, 2016, with application number 201610262143.9 and entitled "Multi-interface interaction method and apparatus", the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of the present invention relate to the field of virtual reality technologies, and in particular to a multi-interface interaction method and an electronic device.
Background
Virtual reality (VR) technology is a combination of simulation technology, computer graphics, human-machine interface technology, multimedia technology, sensing technology, network technology, and other technologies; it is a challenging interdisciplinary frontier discipline and research field. VR mainly involves a simulated environment, perception, natural skills, and sensing devices. The simulated environment is a computer-generated, real-time, dynamic, three-dimensional, realistic image. Perception means that an ideal VR system should provide every kind of perception a person has, such as the visual perception generated by computer graphics technology. Natural skills refer to head rotation, eye movements, gestures, and other human actions; the computer processes data matched to the participant's actions, responds to the user's input in real time, and feeds the results back to the user's senses.
At present, the virtual reality devices on the market are mainly head-mounted devices such as virtual reality glasses and virtual reality helmets. After putting on such a device, the user can watch theater-like 3D video images or manage and control applications.
A head-mounted virtual reality device uses a display worn on the head to shut out the user's sight and hearing of the outside world, guiding the user into the sensation of being inside a virtual environment. Its display principle is that separate images for the left and right eyes are shown on the left and right screens; the eyes pick up this disparity and the brain produces a sense of depth. As virtual reality display devices, head-mounted devices are compact and highly immersive, and are widely used in military training, virtual driving, virtual cities, and similar applications.
The inventors of the present invention found that, in the prior art, the interface must be switched through a peripheral device, such as a touchpad or an operating handle, to switch the current interaction interface to the settings interface or search interface to be retrieved. Because the prior art can achieve fast interaction among multiple interfaces only through peripherals, the ease of operating a virtual reality device is greatly reduced, the user experience suffers, and the user cannot retrieve another required interface in real time with a simple head rotation.
The information disclosed in this Background section is intended only to improve understanding of the general background of the present invention and should not be taken as an acknowledgement or any form of suggestion that the information constitutes prior art already known to a person of ordinary skill in the art.
Summary of the Invention
The object of the present invention is to provide a multi-interface interaction method and an electronic device that overcome the drawback of the prior art in which fast interaction among multiple interfaces can be achieved only through peripherals, so that the ease of operating a virtual reality device is greatly reduced.
In a first aspect, an embodiment of the present invention provides a multi-interface interaction method, including: detecting a rotation direction and a rotation position of an interaction control part, comparing the rotation direction with a preset first threshold, and comparing the rotation position with a preset second threshold; and performing corresponding multi-interface interaction processing according to a first comparison result of the rotation direction against the first threshold and a second comparison result of the rotation position against the second threshold.
In a possible implementation of the above technical solution, before the detecting of the rotation direction and rotation position of the interaction control part, the method further includes: presetting a first threshold for the rotation direction of the interaction control part and a second threshold for the rotation position of the interaction control part, where the first threshold and the second threshold are used, respectively, to determine whether the rotation direction and the rotation position of the interaction control part permit multi-interface interaction processing.
In a possible implementation of the above technical solution, the detecting of the rotation direction and rotation position of the interaction control part, the comparing of the rotation direction with the preset first threshold, and the comparing of the rotation position with the preset second threshold include: establishing a three-dimensional coordinate system in which the X axis is the direction in which the interaction control part rotates left and right, the Z axis is the direction in which it rotates back and forth, and the Y axis is the direction in which it rotates up and down; transforming the first threshold into a negative direction threshold on the Y axis of the three-dimensional coordinate system and the second threshold into a coordinate threshold on the Y axis of the three-dimensional coordinate system; detecting the rotation direction and rotation position of the interaction control part, transforming the rotated direction of the interaction control part into a direction value on the Y axis of the three-dimensional coordinate system, and transforming the rotated position of the interaction control part into a coordinate value on the Y axis of the three-dimensional coordinate system; and comparing the direction value with the direction threshold and the coordinate value with the coordinate threshold.
In a possible implementation of the above technical solution, the performing of the corresponding multi-interface interaction processing according to the first comparison result of the rotation direction against the first threshold and the second comparison result of the rotation position against the second threshold includes: when the rotation direction reaches the first threshold and the rotation position reaches the second threshold, retrieving another interface that interacts with the current interface; and when the rotation direction does not reach the first threshold and/or the rotation position does not reach the second threshold, keeping the current interface unchanged.
In a possible implementation of the above technical solution, the method further includes: when the interaction control part returns to the direction and position it had before the rotation, switching the current interaction interface back to the interaction interface used before the multi-interface interaction processing.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects. The virtual reality multi-interface interaction method of the embodiments of the present invention detects changes in the rotation direction and rotation position of the interaction control part and compares them with the preset thresholds; when the changes in rotation direction and rotation position each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.
In a second aspect, an embodiment of the present invention provides a multi-interface interaction apparatus, including: a rotation detecting module configured to detect a rotation direction and a rotation position of an interaction control part, compare the rotation direction with a preset first threshold, and compare the rotation position with a preset second threshold; and a comparison processing module configured to perform corresponding multi-interface interaction processing according to a first comparison result of the rotation direction against the first threshold and a second comparison result of the rotation position against the second threshold.
In a possible implementation of the above technical solution, the apparatus further includes a threshold setting module configured to preset a first threshold for the rotation direction of the interaction control part and a second threshold for the rotation position of the interaction control part, where the first threshold and the second threshold are used, respectively, to determine whether the rotation direction and the rotation position of the interaction control part permit multi-interface interaction processing.
In a possible implementation of the above technical solution, the rotation detecting module specifically includes: a coordinate establishing submodule configured to establish a three-dimensional coordinate system in which the X axis is the direction in which the interaction control part rotates left and right, the Z axis is the direction in which it rotates back and forth, and the Y axis is the direction in which it rotates up and down; a threshold conversion submodule configured to transform the first threshold into a negative direction threshold on the Y axis of the three-dimensional coordinate system and the second threshold into a coordinate threshold on the Y axis of the three-dimensional coordinate system; a rotation transformation submodule configured to detect the rotation direction and rotation position of the interaction control part, transform the rotated direction of the interaction control part into a direction value on the Y axis of the three-dimensional coordinate system, and transform the rotated position of the interaction control part into a coordinate value on the Y axis of the three-dimensional coordinate system; and a threshold comparison submodule configured to compare the direction value with the direction threshold and the coordinate value with the coordinate threshold.
In a possible implementation of the above technical solution, the comparison processing module specifically includes: a first comparison submodule configured to retrieve another interface that interacts with the current interface when the rotation direction reaches the first threshold and the rotation position reaches the second threshold; and a second comparison submodule configured to keep the current interface unchanged when the rotation direction does not reach the first threshold and/or the rotation position does not reach the second threshold.
In a possible implementation of the above technical solution, the apparatus further includes an interface switching module configured to switch the current interaction interface back to the interaction interface used before the multi-interface interaction processing when the interaction control part returns to the direction and position it had before the rotation.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects. The virtual reality multi-interface interaction apparatus of the embodiments of the present invention detects changes in the rotation direction and rotation position of the interaction control part and compares them with the preset thresholds; when the changes each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.
To achieve the above object, in another aspect, an embodiment of the present invention provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to cause the at least one processor to perform any of the possible implementations of the above method.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects. The virtual reality multi-interface interaction electronic device of the embodiments of the present invention detects changes in the rotation direction and rotation position of the interaction control part and compares them with the preset thresholds; when the changes each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.
To achieve the above object, in another aspect, an embodiment of the present invention provides a non-transitory computer-readable storage medium, the non-transitory computer-readable storage medium storing computer-executable instructions, the computer-executable instructions being used to perform any of the possible implementations of the above methods.
The non-transitory computer-readable storage medium of the embodiments of the present invention detects changes in the rotation direction and rotation position of the interaction control part and compares them with the preset thresholds; when the changes each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.
To achieve the above object, in another aspect, an embodiment of the present invention provides a computer program product, the computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program including program instructions that, when executed by a computer, cause the computer to perform any of the possible implementations of the above methods.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects. The computer program product of the embodiments of the present invention detects changes in the rotation direction and rotation position of the interaction control part and compares them with the preset thresholds; when the changes each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.
Brief Description of the Drawings
One or more embodiments are illustrated by the figures in the corresponding drawings. These illustrations do not limit the embodiments; elements with the same reference numerals in the drawings denote similar elements, and unless otherwise stated, the figures are not drawn to scale.
FIG. 1 is a schematic flowchart of a multi-interface interaction method according to an embodiment of the present invention.
FIG. 2 is a schematic flowchart of step S101 in a multi-interface interaction method according to an embodiment of the present invention.
FIG. 3 is a schematic structural diagram of a multi-interface interaction apparatus according to an embodiment of the present invention.
FIG. 4 is a schematic diagram of the hardware structure of an electronic device for performing a multi-interface interaction method according to an embodiment of the present invention.
Detailed Description
Preferred embodiments of the present invention are described below with reference to the drawings. It should be understood that the preferred embodiments described here are only intended to illustrate and explain the present invention and are not intended to limit it.
Specific implementations of the embodiments of the present invention are described in detail below with reference to the drawings, but it should be understood that the protection scope of the present invention is not limited by these specific implementations.
Unless otherwise expressly stated, throughout the specification and claims the term "comprise" and its variants such as "include" or "contain" are to be understood as covering the stated elements or components without excluding other elements or components.
To solve the technical problem in the prior art that fast interaction among multiple interfaces can be achieved only through peripherals, which greatly reduces the ease of operating a virtual reality device, the embodiments of the present invention provide a virtual reality multi-interface interaction method and an electronic device. The multi-interface interaction method and electronic device of the embodiments of the present invention detect the coordinate and direction change of the interaction control part on the Y axis of a three-dimensional coordinate system and, when the degree of change reaches a preset condition, retrieve the other interface the user needs so that it can interact with the current interface.
Embodiment 1
As shown in FIG. 1, a multi-interface interaction method according to an embodiment of the present invention includes the following steps.
Step S101: detect the rotation direction and rotation position of the interaction control part, compare the rotation direction with a preset first threshold, and compare the rotation position with a preset second threshold.
In the method of the embodiment of the present invention, the executing body of the method may be an electronic device, and the electronic device may be a virtual reality device. If the virtual reality device is a wearable device (such as VR glasses or a VR helmet), the interaction control part may be, but is not limited to, the user's head.
In the method of the embodiment of the present invention, only when the rotation direction of the interaction control part reaches the first threshold and the rotation position of the interaction control part reaches the second threshold is the extent of the rotation considered to satisfy the condition for retrieving another interaction interface; in every other case the condition for retrieving another interaction interface is considered not to be satisfied.
When the user activates the virtual reality device for the first time or reactivates it, the interaction interface is initialized: the current direction and position of the interaction control part (that is, its direction and position at initialization) are taken as the direction and position of the user's level gaze, and the relative position between the interaction interface and the head is set according to that gaze direction; in other words, the interaction interface is placed directly in front of the user's level line of sight. Once the initial position of the interaction interface is determined, when the interaction control part rotates, the corresponding other interaction interface can be retrieved according to the rotation direction and rotation position.
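The initialization just described can be pictured with a short sketch. It is a minimal illustration only: the HeadPose type, the use of pitch in degrees (0 for a level gaze, negative when looking down), and the idea of a headset sensor supplying it are assumptions made here, not the patent's reference implementation.

```kotlin
// Minimal sketch of the initialization described above; HeadPose and the pitch
// convention (0 = level gaze, negative = looking down) are assumptions.
data class HeadPose(val pitchDegrees: Float)

class InteractionSession(initialPose: HeadPose) {
    // Direction/position of the interaction control part at initialization; the
    // interaction interface is anchored straight ahead of this level gaze.
    val baselinePitch: Float = initialPose.pitchDegrees

    // Later rotations are expressed relative to this baseline, matching the
    // description's "relative direction/position with respect to initialization".
    fun relativePitch(current: HeadPose): Float = current.pitchDegrees - baselinePitch
}
```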
Step S102: perform corresponding multi-interface interaction processing according to a first comparison result of the rotation direction against the first threshold and a second comparison result of the rotation position against the second threshold.
Specifically, step S102 can be implemented as the following steps.
(1) When the rotation direction reaches the first threshold and the rotation position reaches the second threshold, retrieve another interface that interacts with the current interface; that is, multi-interface interaction processing can be performed only when the rotation direction and the rotation position each reach their preset thresholds.
(2) When the rotation direction does not reach the first threshold and/or the rotation position does not reach the second threshold, keep the current interface unchanged. If either the rotation direction or the rotation position, or both, fails to reach its preset threshold, multi-interface interaction processing is not triggered.
When the rotation direction of the head reaches the first threshold and the rotation position of the head reaches the second threshold, the other interaction interface corresponding to the current interaction interface is retrieved. For example, when the current interface is the application icon interface, a search page can be retrieved; when the current interface is the video playback page, a settings page can be retrieved. The specific page to be retrieved can be set according to actual needs and is not limited by the embodiments of the present invention. In the method of this embodiment, for example, the other interaction interface may be retrieved when the interaction control part is lowered to a preset degree.
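A rough sketch of this decision logic, using the example mapping above, follows. The interface names, the use of a sign for the rotation direction, and the 30-degree figure are illustrative assumptions, not values fixed by the patent.

```kotlin
// Illustrative interface identifiers and mapping; the patent leaves the concrete
// pages configurable (e.g. app-icon interface -> search page, video page -> settings).
enum class Screen { APP_ICONS, VIDEO_PLAYBACK, SEARCH, SETTINGS }

val retrievedScreenFor = mapOf(
    Screen.APP_ICONS to Screen.SEARCH,
    Screen.VIDEO_PLAYBACK to Screen.SETTINGS
)

// Assumed thresholds: a downward rotation direction and a rotation position at
// least 30 degrees below the initialization pose.
const val DIRECTION_THRESHOLD = -1     // required sign of the rotation direction
const val POSITION_THRESHOLD = -30.0f  // degrees relative to the baseline

fun processRotation(current: Screen, directionSign: Int, relativePitch: Float): Screen {
    val directionReached = directionSign == DIRECTION_THRESHOLD
    val positionReached = relativePitch <= POSITION_THRESHOLD
    return if (directionReached && positionReached) {
        // Both thresholds reached: retrieve the other interface for the current one.
        retrievedScreenFor[current] ?: current
    } else {
        // Otherwise: keep the current interface unchanged.
        current
    }
}
```

Step S103 below would then simply restore the interface that was current before this switch, once the interaction control part returns to its initialization pose.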
In the method of the embodiment of the present invention, the method may further include step S103: when the interaction control part returns to the direction and position it had before the rotation, switch the current interaction interface back to the interaction interface used before the multi-interface interaction processing.
For example, when the user raises the head back to the direction and position the interaction control part had at initialization, the current interaction interface is exited and the interaction interface displayed at initialization is restored.
In the method of the embodiment of the present invention, in a possible implementation, before step S101 the method may further include the following step.
Step S100: preset a first threshold for the rotation direction of the interaction control part and a second threshold for the rotation position of the interaction control part, where the first threshold and the second threshold are used, respectively, to determine whether the rotation direction and the rotation position of the interaction control part permit multi-interface interaction processing.
The first threshold is the relative direction of the head after rotation with respect to the head direction at initialization, and the second threshold is the relative position of the head after rotation with respect to the head position at initialization.
While using the virtual reality device, the user inevitably makes slight movements, for example small rotations of the interaction control part, and such rotations should not be treated as operations intended to control the interaction interface of the virtual reality device. Therefore, in the embodiments of the present invention, to filter out these small rotations, a reasonably strict condition for retrieving the other interaction interface must be set, so that the user's actions while using the virtual reality device can be detected in real time and handled accordingly.
The multi-interface interaction method of the embodiments of the present invention detects changes in the rotation direction and rotation position of the interaction control part and compares them with the preset thresholds; when the changes each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.
Embodiment 2
As shown in FIG. 2, the specific flow of step S101 in Embodiment 1 is described in detail and includes the following steps.
Step S201: establish a three-dimensional coordinate system, in which the X axis is the direction in which the interaction control part rotates left and right, the Z axis is the direction in which the interaction control part rotates back and forth, and the Y axis is the direction in which the interaction control part rotates up and down.
Step S202: transform the first threshold into a negative direction threshold on the Y axis of the three-dimensional coordinate system, and transform the second threshold into a coordinate threshold on the Y axis of the three-dimensional coordinate system.
Step S203: detect the rotation direction and rotation position of the interaction control part, transform the rotated direction of the interaction control part into a direction value on the Y axis of the three-dimensional coordinate system, and transform the rotated position of the interaction control part into a coordinate value on the Y axis of the three-dimensional coordinate system.
Step S204: compare the direction value with the direction threshold, and compare the coordinate value with the coordinate threshold.
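The four steps above can be condensed into one small routine. Mapping a head pitch onto a Y-axis direction value (its sign) and coordinate value (the vertical component of a unit gaze vector), as well as the concrete threshold values, are assumptions made purely for illustration; the patent does not prescribe units.

```kotlin
import kotlin.math.sin

// Sketch of steps S201-S204, assuming the interaction control part is the head and its
// up/down rotation (pitch) is reported in degrees relative to the initialization pose.
data class YAxisSample(val directionValue: Int, val coordinateValue: Float)

// S203: transform the rotated direction/position into a direction value and a coordinate
// value on the Y axis (sign of the pitch change, vertical offset of a unit gaze vector).
fun toYAxis(relativePitchDegrees: Float): YAxisSample {
    val direction = when {
        relativePitchDegrees < 0f -> -1  // looking further down: negative Y direction
        relativePitchDegrees > 0f -> 1
        else -> 0
    }
    val yCoordinate = sin(Math.toRadians(relativePitchDegrees.toDouble())).toFloat()
    return YAxisSample(direction, yCoordinate)
}

// S202: the two thresholds expressed on the Y axis (assumed values), and
// S204: the comparison against them.
const val Y_DIRECTION_THRESHOLD = -1      // negative direction on the Y axis
const val Y_COORDINATE_THRESHOLD = -0.5f  // roughly 30 degrees below the level gaze

fun thresholdsReached(sample: YAxisSample): Boolean =
    sample.directionValue == Y_DIRECTION_THRESHOLD &&
        sample.coordinateValue <= Y_COORDINATE_THRESHOLD
```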
This embodiment details step S101 of Embodiment 1. The multi-interface interaction method of the embodiments of the present invention detects changes in the rotation direction and rotation position of the interaction control part and compares them with the preset thresholds; when the changes each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.
Embodiment 3
As shown in FIG. 3, a multi-interface interaction apparatus according to an embodiment of the present invention includes: a rotation detecting module 31 configured to detect the rotation direction and rotation position of the interaction control part, compare the rotation direction with a preset first threshold, and compare the rotation position with a preset second threshold; and a comparison processing module 32 configured to perform corresponding multi-interface interaction processing according to a first comparison result of the rotation direction against the first threshold and a second comparison result of the rotation position against the second threshold.
In a possible implementation of the above technical solution, the apparatus further includes a threshold setting module 33 configured to preset a first threshold for the rotation direction of the interaction control part and a second threshold for the rotation position of the interaction control part, where the first threshold and the second threshold are used, respectively, to determine whether the rotation direction and the rotation position of the interaction control part permit multi-interface interaction processing.
In a possible implementation of the above technical solution, the rotation detecting module 31 specifically includes: a coordinate establishing submodule 311 configured to establish a three-dimensional coordinate system in which the X axis is the direction in which the interaction control part rotates left and right, the Z axis is the direction in which it rotates back and forth, and the Y axis is the direction in which it rotates up and down; a threshold conversion submodule 312 configured to transform the first threshold into a negative direction threshold on the Y axis of the three-dimensional coordinate system and the second threshold into a coordinate threshold on the Y axis of the three-dimensional coordinate system; a rotation transformation submodule 313 configured to detect the rotation direction and rotation position of the interaction control part, transform the rotated direction into a direction value on the Y axis of the three-dimensional coordinate system, and transform the rotated position into a coordinate value on the Y axis of the three-dimensional coordinate system; and a threshold comparison submodule 314 configured to compare the direction value with the direction threshold and the coordinate value with the coordinate threshold.
In a possible implementation of the above technical solution, the comparison processing module 32 specifically includes: a first comparison submodule 321 configured to retrieve another interface that interacts with the current interface when the rotation direction reaches the first threshold and the rotation position reaches the second threshold; and a second comparison submodule 322 configured to keep the current interface unchanged when the rotation direction does not reach the first threshold and/or the rotation position does not reach the second threshold.
In a possible implementation of the above technical solution, the apparatus further includes an interface switching module 34 configured to switch the current interaction interface back to the interaction interface used before the multi-interface interaction processing when the interaction control part returns to the direction and position it had before the rotation.
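One way to read the module split of Embodiment 3 is as a set of small interfaces. The sketch below only mirrors the decomposition described above (module 31 with submodules 311 to 314, module 32 with submodules 321 and 322, and modules 33 and 34); the names and signatures are hypothetical and are not an implementation taken from the patent.

```kotlin
// Hypothetical interfaces mirroring the reference numerals of Embodiment 3.
interface ThresholdSettingModule33 {               // presets the two thresholds
    fun presetThresholds(first: Float, second: Float)
}

interface RotationDetectingModule31 {              // detects rotation, compares with thresholds
    fun establishCoordinateSystem()                // 311: X left/right, Z back/forth, Y up/down
    fun convertThresholdsToYAxis()                 // 312: thresholds -> direction / coordinate thresholds
    fun convertRotationToYAxis(): Pair<Int, Float> // 313: rotation -> direction value, coordinate value
    fun compareWithThresholds(direction: Int, coordinate: Float): Boolean // 314
}

interface ComparisonProcessingModule32 {           // performs the multi-interface interaction
    fun retrieveOtherInterface()                   // 321: both thresholds reached
    fun keepCurrentInterface()                     // 322: otherwise
}

interface InterfaceSwitchingModule34 {             // restores the previous interface when the head returns
    fun restorePreviousInterface()
}
```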
Embodiment 4
An embodiment of the present invention provides a non-transitory (non-volatile) computer storage medium, the computer storage medium storing computer-executable instructions that can perform the method of any of the above method embodiments.
In the non-transitory (non-volatile) computer storage medium of the embodiment of the present invention, the computer-executable instructions detect changes in the rotation direction and rotation position of the interaction control part and compare them with the preset thresholds; when the changes each reach the preset thresholds, another interface required by the user, such as a settings interface or a search interface, can be retrieved, so that multiple interfaces can be interacted with quickly and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.
Embodiment 5
FIG. 4 is a schematic diagram of the hardware structure of an electronic device for performing a multi-interface interaction method according to an embodiment of the present invention. As shown in FIG. 4, the device includes one or more processors 610 and a memory 620; one processor 610 is taken as an example in FIG. 4. The device may further include an input device 630 and an output device 640.
The processor 610, the memory 620, the input device 630, and the output device 640 may be connected by a bus or by other means; connection by a bus is taken as an example in FIG. 4.
The memory 620, as a non-transitory computer-readable storage medium, can be used to store non-transitory software programs, non-transitory computer-executable programs, and modules. The processor 610 executes the various functional applications and data processing of the electronic device by running the non-transitory software programs, instructions, and modules stored in the memory 620, thereby implementing the processing methods of the above method embodiments.
The memory 620 may include a program storage area and a data storage area, where the program storage area may store an operating system and an application required for at least one function, and the data storage area may store data and the like. In addition, the memory 620 may include high-speed random access memory and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, the memory 620 may optionally include memory located remotely from the processor 610, and such remote memory may be connected to the processing device over a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 630 can receive input numeric or character information and generate signal inputs. The output device 640 may include a display device such as a display screen.
The one or more modules are stored in the memory 620 and, when executed by the one or more processors 610, perform the following: detecting the rotation direction and rotation position of the interaction control part, comparing the rotation direction with a preset first threshold, and comparing the rotation position with a preset second threshold; and performing corresponding multi-interface interaction processing according to a first comparison result of the rotation direction against the first threshold and a second comparison result of the rotation position against the second threshold.
In a possible implementation of the above technical solution, before the detecting of the rotation direction and rotation position of the interaction control part, the method further includes: presetting a first threshold for the rotation direction of the interaction control part and a second threshold for the rotation position of the interaction control part, where the first threshold and the second threshold are used, respectively, to determine whether the rotation direction and the rotation position of the interaction control part permit multi-interface interaction processing.
In a possible implementation of the above technical solution, the detecting of the rotation direction and rotation position of the interaction control part, the comparing of the rotation direction with the preset first threshold, and the comparing of the rotation position with the preset second threshold include: establishing a three-dimensional coordinate system in which the X axis is the direction in which the interaction control part rotates left and right, the Z axis is the direction in which it rotates back and forth, and the Y axis is the direction in which it rotates up and down; transforming the first threshold into a negative direction threshold on the Y axis of the three-dimensional coordinate system and the second threshold into a coordinate threshold on the Y axis of the three-dimensional coordinate system; detecting the rotation direction and rotation position of the interaction control part, transforming the rotated direction into a direction value on the Y axis of the three-dimensional coordinate system, and transforming the rotated position into a coordinate value on the Y axis of the three-dimensional coordinate system; and comparing the direction value with the direction threshold and the coordinate value with the coordinate threshold.
In a possible implementation of the above technical solution, the performing of the corresponding multi-interface interaction processing according to the first comparison result of the rotation direction against the first threshold and the second comparison result of the rotation position against the second threshold includes: when the rotation direction reaches the first threshold and the rotation position reaches the second threshold, retrieving another interface that interacts with the current interface; and when the rotation direction does not reach the first threshold and/or the rotation position does not reach the second threshold, keeping the current interface unchanged.
In a possible implementation of the above technical solution, the method further includes: when the interaction control part returns to the direction and position it had before the rotation, switching the current interaction interface back to the interaction interface used before the multi-interface interaction processing.
The above products can perform the methods provided by the embodiments of the present invention and have the corresponding functional modules and beneficial effects. For technical details not exhaustively described in this embodiment, reference may be made to the methods provided by the embodiments of the present invention.
The electronic device of the embodiments of the present invention exists in various forms, including but not limited to:
Mobile communication devices: these devices have mobile communication functions and are mainly aimed at providing voice and data communication. Such terminals include smartphones (such as the iPhone), multimedia phones, feature phones, and low-end phones.
Ultra-mobile personal computer devices: these devices belong to the category of personal computers, have computing and processing functions, and generally also have mobile Internet access. Such terminals include PDAs, MIDs, and UMPC devices, such as the iPad.
Portable entertainment devices: these devices can display and play multimedia content. They include audio and video players (such as the iPod), handheld game consoles, e-book readers, smart toys, and portable car navigation devices.
Servers: devices that provide computing services. A server consists of a processor, hard disk, memory, system bus, and so on; its architecture is similar to that of a general-purpose computer, but because highly reliable services must be provided, the requirements on processing power, stability, reliability, security, scalability, and manageability are higher.
Other electronic apparatuses with data interaction functions.
The device embodiments described above are merely illustrative; the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
From the description of the above implementations, those skilled in the art can clearly understand that the implementations can be realized by means of software plus a general-purpose hardware platform, and of course also by hardware. Based on this understanding, the above technical solutions, in essence or in the part contributing to the related art, can be embodied in the form of a software product. The computer software product can be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disc, and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform the methods described in the embodiments or in certain parts of the embodiments.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention and not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions recorded in the foregoing embodiments or make equivalent substitutions for some of the technical features therein, and that such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.
Industrial Applicability
The multi-interface interaction method and electronic device provided by the embodiments of the present invention detect the rotation direction and rotation position of the interaction control part, compare the rotation direction with a preset first threshold, and compare the rotation position with a preset second threshold; corresponding multi-interface interaction processing is then performed according to the first comparison result of the rotation direction against the first threshold and the second comparison result of the rotation position against the second threshold. Another interface required by the user can thus be retrieved, multiple interfaces can be interacted with quickly, and the user is spared the cumbersome operation of manually using a touchpad or an operating handle to trigger multi-interface interaction.

Claims (21)

  1. A multi-interface interaction method, performed by an electronic device, the method comprising:
    detecting a rotation direction and a rotation position of an interaction control part, comparing the rotation direction with a preset first threshold, and comparing the rotation position with a preset second threshold;
    performing corresponding multi-interface interaction processing according to a first comparison result of the rotation direction against the first threshold and a second comparison result of the rotation position against the second threshold.
  2. The method according to claim 1, wherein before the detecting of the rotation direction and rotation position of the interaction control part, the method further comprises:
    presetting a first threshold for the rotation direction of the interaction control part and a second threshold for the rotation position of the interaction control part, wherein the first threshold and the second threshold are used, respectively, to determine whether the rotation direction of the interaction control part and the rotation position of the interaction control part permit multi-interface interaction processing.
  3. The method according to claim 1, wherein the detecting of the rotation direction and rotation position of the interaction control part, the comparing of the rotation direction with the preset first threshold, and the comparing of the rotation position with the preset second threshold comprise:
    establishing a three-dimensional coordinate system, wherein the X axis is the direction in which the interaction control part rotates left and right, the Z axis is the direction in which the interaction control part rotates back and forth, and the Y axis is the direction in which the interaction control part rotates up and down;
    transforming the first threshold into a negative direction threshold on the Y axis of the three-dimensional coordinate system, and transforming the second threshold into a coordinate threshold on the Y axis of the three-dimensional coordinate system;
    detecting the rotation direction and rotation position of the interaction control part, transforming the rotated direction of the interaction control part into a direction value on the Y axis of the three-dimensional coordinate system, and transforming the rotated position of the interaction control part into a coordinate value on the Y axis of the three-dimensional coordinate system;
    comparing the direction value with the direction threshold, and comparing the coordinate value with the coordinate threshold.
  4. The method according to claim 1, wherein the performing of the corresponding multi-interface interaction processing according to the first comparison result of the rotation direction against the first threshold and the second comparison result of the rotation position against the second threshold comprises:
    when the rotation direction reaches the first threshold and the rotation position reaches the second threshold, retrieving another interface that interacts with the current interface;
    when the rotation direction does not reach the first threshold and/or the rotation position does not reach the second threshold, keeping the current interface unchanged.
  5. The method according to any one of claims 1 to 4, further comprising:
    when the interaction control part returns to the direction and position it had before the rotation, switching the current interaction interface back to the interaction interface used before the multi-interface interaction processing.
  6. A multi-interface interaction apparatus, comprising:
    a rotation detecting module, configured to detect a rotation direction and a rotation position of an interaction control part, compare the rotation direction with a preset first threshold, and compare the rotation position with a preset second threshold;
    a comparison processing module, configured to perform corresponding multi-interface interaction processing according to a first comparison result of the rotation direction against the first threshold and a second comparison result of the rotation position against the second threshold.
  7. The apparatus according to claim 6, further comprising:
    a threshold setting module, configured to preset a first threshold for the rotation direction of the interaction control part and a second threshold for the rotation position of the interaction control part, wherein the first threshold and the second threshold are used, respectively, to determine whether the rotation direction of the interaction control part and the rotation position of the interaction control part permit multi-interface interaction processing.
  8. The apparatus according to claim 6, wherein the rotation detecting module specifically comprises:
    a coordinate establishing submodule, configured to establish a three-dimensional coordinate system, wherein the X axis is the direction in which the interaction control part rotates left and right, the Z axis is the direction in which the interaction control part rotates back and forth, and the Y axis is the direction in which the interaction control part rotates up and down;
    a threshold conversion submodule, configured to transform the first threshold into a negative direction threshold on the Y axis of the three-dimensional coordinate system, and transform the second threshold into a coordinate threshold on the Y axis of the three-dimensional coordinate system;
    a rotation transformation submodule, configured to detect the rotation direction and rotation position of the interaction control part, transform the rotated direction of the interaction control part into a direction value on the Y axis of the three-dimensional coordinate system, and transform the rotated position of the interaction control part into a coordinate value on the Y axis of the three-dimensional coordinate system;
    a threshold comparison submodule, configured to compare the direction value with the direction threshold, and compare the coordinate value with the coordinate threshold.
  9. The apparatus according to claim 6, wherein the comparison processing module specifically comprises:
    a first comparison submodule, configured to retrieve another interface that interacts with the current interface when the rotation direction reaches the first threshold and the rotation position reaches the second threshold;
    a second comparison submodule, configured to keep the current interface unchanged when the rotation direction does not reach the first threshold and/or the rotation position does not reach the second threshold.
  10. The apparatus according to any one of claims 6 to 9, further comprising:
    an interface switching module, configured to switch the current interaction interface back to the interaction interface used before the multi-interface interaction processing when the interaction control part returns to the direction and position it had before the rotation.
  11. An electronic device, comprising:
    at least one processor; and
    a memory communicatively coupled to the at least one processor; wherein
    the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to:
    detect a rotation direction and a rotation position of an interaction control part, compare the rotation direction with a preset first threshold, and compare the rotation position with a preset second threshold;
    perform corresponding multi-interface interaction processing according to a first comparison result of the rotation direction against the first threshold and a second comparison result of the rotation position against the second threshold.
  12. The electronic device according to claim 11, wherein the processor is further enabled to: before the detecting of the rotation direction and rotation position of the interaction control part, preset a first threshold for the rotation direction of the interaction control part and a second threshold for the rotation position of the interaction control part, wherein the first threshold and the second threshold are used, respectively, to determine whether the rotation direction of the interaction control part and the rotation position of the interaction control part permit multi-interface interaction processing.
  13. The electronic device according to claim 11, wherein the detecting of the rotation direction and rotation position of the interaction control part, the comparing of the rotation direction with the preset first threshold, and the comparing of the rotation position with the preset second threshold comprise:
    establishing a three-dimensional coordinate system, wherein the X axis is the direction in which the interaction control part rotates left and right, the Z axis is the direction in which the interaction control part rotates back and forth, and the Y axis is the direction in which the interaction control part rotates up and down;
    transforming the first threshold into a negative direction threshold on the Y axis of the three-dimensional coordinate system, and transforming the second threshold into a coordinate threshold on the Y axis of the three-dimensional coordinate system;
    detecting the rotation direction and rotation position of the interaction control part, transforming the rotated direction of the interaction control part into a direction value on the Y axis of the three-dimensional coordinate system, and transforming the rotated position of the interaction control part into a coordinate value on the Y axis of the three-dimensional coordinate system;
    comparing the direction value with the direction threshold, and comparing the coordinate value with the coordinate threshold.
  14. The electronic device according to claim 11, wherein the performing of the corresponding multi-interface interaction processing according to the first comparison result of the rotation direction against the first threshold and the second comparison result of the rotation position against the second threshold comprises:
    when the rotation direction reaches the first threshold and the rotation position reaches the second threshold, retrieving another interface that interacts with the current interface;
    when the rotation direction does not reach the first threshold and/or the rotation position does not reach the second threshold, keeping the current interface unchanged.
  15. The electronic device according to any one of claims 11 to 14, wherein the processor is further enabled to:
    when the interaction control part returns to the direction and position it had before the rotation, switch the current interaction interface back to the interaction interface used before the multi-interface interaction processing.
  16. A non-transitory computer-readable storage medium, wherein the non-transitory computer-readable storage medium stores computer-executable instructions, the computer-executable instructions being used to:
    detect a rotation direction and a rotation position of an interaction control part, compare the rotation direction with a preset first threshold, and compare the rotation position with a preset second threshold;
    perform corresponding multi-interface interaction processing according to a first comparison result of the rotation direction against the first threshold and a second comparison result of the rotation position against the second threshold.
  17. The non-transitory computer-readable storage medium according to claim 16, wherein the computer-executable instructions are further used to: before the detecting of the rotation direction and rotation position of the interaction control part, preset a first threshold for the rotation direction of the interaction control part and a second threshold for the rotation position of the interaction control part, wherein the first threshold and the second threshold are used, respectively, to determine whether the rotation direction of the interaction control part and the rotation position of the interaction control part permit multi-interface interaction processing.
  18. The non-transitory computer-readable storage medium according to claim 16, wherein the detecting of the rotation direction and rotation position of the interaction control part, the comparing of the rotation direction with the preset first threshold, and the comparing of the rotation position with the preset second threshold comprise:
    establishing a three-dimensional coordinate system, wherein the X axis is the direction in which the interaction control part rotates left and right, the Z axis is the direction in which the interaction control part rotates back and forth, and the Y axis is the direction in which the interaction control part rotates up and down;
    transforming the first threshold into a negative direction threshold on the Y axis of the three-dimensional coordinate system, and transforming the second threshold into a coordinate threshold on the Y axis of the three-dimensional coordinate system;
    detecting the rotation direction and rotation position of the interaction control part, transforming the rotated direction of the interaction control part into a direction value on the Y axis of the three-dimensional coordinate system, and transforming the rotated position of the interaction control part into a coordinate value on the Y axis of the three-dimensional coordinate system;
    comparing the direction value with the direction threshold, and comparing the coordinate value with the coordinate threshold.
  19. The non-transitory computer-readable storage medium according to claim 16, wherein the performing of the corresponding multi-interface interaction processing according to the first comparison result of the rotation direction against the first threshold and the second comparison result of the rotation position against the second threshold comprises:
    when the rotation direction reaches the first threshold and the rotation position reaches the second threshold, retrieving another interface that interacts with the current interface;
    when the rotation direction does not reach the first threshold and/or the rotation position does not reach the second threshold, keeping the current interface unchanged.
  20. The non-transitory computer-readable storage medium according to any one of claims 16 to 19, wherein the computer-executable instructions are further used to:
    when the interaction control part returns to the direction and position it had before the rotation, switch the current interaction interface back to the interaction interface used before the multi-interface interaction processing.
  21. A computer program product, comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions that, when executed by a computer, cause the computer to perform the method according to any one of claims 1 to 5.
PCT/CN2016/099949 2016-04-25 2016-09-23 一种多界面交互方法和电子设备 WO2017185608A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610262143.9A CN105975057A (zh) 2016-04-25 2016-04-25 一种多界面交互方法和装置
CN201610262143.9 2016-04-25

Publications (1)

Publication Number Publication Date
WO2017185608A1 true WO2017185608A1 (zh) 2017-11-02

Family

ID=56993173

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/099949 WO2017185608A1 (zh) 2016-04-25 2016-09-23 一种多界面交互方法和电子设备

Country Status (2)

Country Link
CN (1) CN105975057A (zh)
WO (1) WO2017185608A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107562201B (zh) * 2017-09-08 2020-07-07 网易(杭州)网络有限公司 定向交互方法、装置、电子设备及存储介质
CN112148120A (zh) * 2020-08-18 2020-12-29 华为技术有限公司 一种显示虚拟界面的方法、设备以及存储介质
WO2022041110A1 (zh) * 2020-08-28 2022-03-03 深圳晶泰科技有限公司 Vr头盔、晶体交互系统及方法
CN112068696A (zh) * 2020-08-28 2020-12-11 深圳晶泰科技有限公司 Vr头盔、晶体交互系统及方法


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8457353B2 (en) * 2010-05-18 2013-06-04 Microsoft Corporation Gestures and gesture modifiers for manipulating a user-interface
US20130249793A1 (en) * 2012-03-22 2013-09-26 Ingeonix Corporation Touch free user input recognition
US20140152558A1 (en) * 2012-11-30 2014-06-05 Tom Salter Direct hologram manipulation using imu
CN103878636B (zh) * 2012-12-19 2018-05-04 鸿准精密模具(昆山)有限公司 机床控制系统

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103593044A (zh) * 2012-08-13 2014-02-19 鸿富锦精密工业(深圳)有限公司 电子装置校正系统及方法
CN103092349A (zh) * 2013-01-23 2013-05-08 宁凯 基于Kinect体感设备的全景体验方法
CN103116403A (zh) * 2013-02-16 2013-05-22 广东欧珀移动通信有限公司 一种屏幕切换方法及移动智能终端
CN103927171A (zh) * 2014-04-14 2014-07-16 广州市久邦数码科技有限公司 一种立体桌面多屏预览的实现方法及系统
CN104298354A (zh) * 2014-10-11 2015-01-21 河海大学 一种人机交互的手势识别方法

Also Published As

Publication number Publication date
CN105975057A (zh) 2016-09-28


Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16900117

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 16900117

Country of ref document: EP

Kind code of ref document: A1