WO2015188691A1 - Method, apparatus and system for controlling a browser using a somatosensory remote control device - Google Patents

Method, apparatus and system for controlling a browser using a somatosensory remote control device Download PDF

Info

Publication number
WO2015188691A1
WO2015188691A1 (PCT/CN2015/079687, CN2015079687W)
Authority
WO
WIPO (PCT)
Prior art keywords
somatosensory
user
browser
motion
feature
Prior art date
Application number
PCT/CN2015/079687
Other languages
English (en)
French (fr)
Inventor
王永
许科
Original Assignee
阿里巴巴集团控股有限公司
王永
许科
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 阿里巴巴集团控股有限公司, 王永, 许科 filed Critical 阿里巴巴集团控股有限公司
Publication of WO2015188691A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • the present application relates to the field of intelligent display terminal technologies, and in particular, to a method for controlling a browser by using a somatosensory remote control device.
  • the present application also relates to a device for controlling a browser using a somatosensory remote control device and a system for controlling a browser using a somatosensory remote control device.
  • the intelligent display terminal represented by smart TV is such a terminal device that helps people obtain external information.
  • Smart TVs can get the content that users need most from various channels such as the Internet, storage media and computers, and display the content of the program clearly on the screen through an easy-to-use interface.
  • the smart TV can realize various application services such as network search, network television, video on demand, digital music, network news, network video telephone, and the like.
  • smart TV terminals are equipped with an operating system (such as Android system) like a smart phone, and support the installation and uninstallation of applications. Users can install and uninstall various applications to expand their own functions.
  • at present, the remote control of a smart TV achieves control through key operations, and many remote controls also have a mouse function, controlling the smart TV by simulating mouse operations. When a remote control is used to browse webpages on a smart TV, there are two approaches.
  • One is to control the browser by moving the focus on the webpage through the buttons of the remote controller; the second is to control the browser through an external mouse to browse the webpage on the smart television terminal.
  • the method for controlling a browser to browse a webpage on the smart television terminal of the above prior art has obvious drawbacks.
  • the method of controlling the browser by moving the focus on the webpage with the buttons of the remote controller is very inconvenient; especially when the currently browsed webpage contains a lot of content, browsing by buttons is very cumbersome and time consuming. If the webpage is browsed on the smart TV through a specially provided external mouse, then when the smart TV screen is large or placed at a high position, the user needs to look up for a long time and bend over to operate the mouse, causing physical fatigue; moreover, this solution requires a special mouse to be configured for the smart TV, and a mouse is generally not a standard accessory of a smart TV, which increases hardware cost and causes waste.
  • the present application provides a method for controlling a browser by using a somatosensory remote control device to solve the inconvenience and cumbersome and time consuming problems existing in the prior art.
  • the present application further provides an apparatus for controlling a browser using a somatosensory remote control device and a system for controlling the browser using the somatosensory remote control device.
  • the present invention provides a method for controlling a browser using a somatosensory remote control device, including:
  • the somatosensory action data corresponding to the user's somatosensory action collected by the somatosensory remote control device is read by the smart display terminal;
  • the browser is notified to perform the browser control action and displayed on the smart display terminal.
  • the calculating the speed characteristics of the user's somatosensory motion based on the somatosensory motion data includes:
  • the speed feature includes: the feature direction angle between the speed direction of the user's somatosensory motion and the feature direction, and the characteristic speed value and characteristic acceleration value of the user's somatosensory motion in the feature direction.
  • the determining whether the speed feature is in a preset threshold interval comprises:
  • if the feature direction angle, the characteristic speed value, and the characteristic acceleration value are all within the corresponding threshold intervals, the user's somatosensory motion corresponding to these values is determined to be an effective operation.
  • the browser control action includes:
  • Scroll up to display the webpage, scroll down to display the webpage, return to the previous-level webpage, enter the next-level webpage, zoom in on the webpage and/or zoom out of the webpage.
  • the user somatosensory action comprises:
  • an upward swing motion, a downward swing motion, a leftward swing motion, a rightward swing motion, a forward swing motion and/or a backward swing motion;
  • the directions of the upward swing motion and the downward swing motion correspond to the direction of the y-axis in three-dimensional space, the directions of the leftward swing motion and the rightward swing motion correspond to the direction of the x-axis in three-dimensional space, and the directions of the forward swing motion and the backward swing motion correspond to the direction of the z-axis in three-dimensional space.
  • before reading the somatosensory action data corresponding to the user's somatosensory motion collected by the somatosensory remote control device, the method includes:
  • displaying an operation guide for the user's somatosensory actions on the screen of the smart display terminal.
  • the somatosensory remote control device includes:
  • A somatosensory remote controller, a somatosensory handle, a smart mobile terminal and/or a wearable device.
  • the smart display terminal includes: a smart TV terminal and/or a PC.
  • the invention also provides an apparatus for controlling a browser by using a somatosensory remote control device, comprising:
  • a data reading unit configured to read the somatosensory motion data corresponding to the user's somatosensory motion collected by the somatosensory remote control device;
  • a calculating unit configured to calculate a speed feature of the user's somatosensory motion based on the somatosensory motion data
  • a determining unit configured to determine whether the speed feature is in a preset threshold interval
  • the browser control action obtaining unit is configured to consult the preset user somatosensory action-browser control action correspondence data, and obtain a browser control action corresponding to the user somatosensory action;
  • an execution unit configured to notify the browser to execute the browser control action, and display the same on the smart display terminal.
  • the calculating unit comprises:
  • a direction angle calculation subunit configured to calculate the angles between the speed direction of the user's somatosensory motion and the x-axis, y-axis and z-axis directions;
  • a feature acquisition subunit configured to use the direction of the axis with the smallest direction angle as the feature direction of the user's somatosensory action, and acquire the speed feature of the user's somatosensory action in the feature direction;
  • the speed feature includes: the feature direction angle between the speed direction of the user's somatosensory motion and the feature direction, and the characteristic speed value and/or characteristic acceleration value of the user's somatosensory motion in the feature direction.
  • the device comprises:
  • the operation guide display unit is configured to display an operation guide of the user's somatosensory action based on the screen of the smart display terminal.
  • the system comprises:
  • An intelligent display terminal comprising the apparatus for controlling a browser using a somatosensory remote control device according to any one of claims 9 to 11;
  • the smart display terminal and the somatosensory remote control device communicate based on a wireless connection
  • the somatosensory remote control device includes:
  • the somatosensory motion data acquisition unit is configured to collect somatosensory motion data corresponding to the user's somatosensory motion based on the motion sensor;
  • the somatosensory motion data transmitting unit is configured to transmit the somatosensory motion data to the smart display terminal based on the wireless connection.
  • the motion sensor comprises:
  • Acceleration sensors, angular velocity sensors and/or gyroscopes.
  • the wireless connection comprises: an infrared connection, a WIFI connection and/or a Bluetooth connection.
  • the operation provided by the present application is simple, convenient and fast, and does not require additional equipment to be deployed, saving resources.
  • the method for controlling a browser to browse a webpage on a smart television terminal provided by the prior art is inconvenient, cumbersome and time consuming, and causes waste of resources.
  • the method for controlling a browser by using a somatosensory remote control device includes: reading, by an intelligent display terminal, somatosensory action data corresponding to a user's somatosensory action collected by a somatosensory remote control device; calculating a speed feature of the user's somatosensory action based on the somatosensory action data; determining whether the speed feature is in a preset threshold interval; if the speed feature is in the preset threshold interval, determining that the user's somatosensory action corresponding to the speed feature is an effective operation; consulting the preset user somatosensory action-browser control action correspondence data to acquire the browser control action corresponding to the user's somatosensory action; and notifying the browser to execute the browser control action and display the result on the smart display terminal.
  • in the method for controlling a browser by using the somatosensory remote control device provided by the present application, the user's somatosensory actions are mapped to browser control actions, so the operation is simple, convenient and fast, avoiding the inconvenient, cumbersome and time-consuming operation of browsing webpages through button control in the prior art; furthermore, somatosensory devices are already widely used with smart display terminals, eliminating the need to configure additional devices and saving resources.
  • FIG. 1 is a flowchart of a method for controlling a browser by using a somatosensory remote control device according to a first embodiment of the present application.
  • FIG. 2 is a flowchart of a process of step S104 in a method for controlling a browser by using a somatosensory remote control device according to a first embodiment of the present application.
  • FIG. 3 is a schematic diagram of an apparatus for controlling a browser using a somatosensory remote control device according to a second embodiment of the present application.
  • FIG. 4 is a schematic diagram of a system for controlling a browser using a somatosensory remote control device according to a third embodiment of the present application.
  • the present application provides a method for controlling a browser by using a somatosensory remote control device to solve the inconvenience and cumbersome and time consuming problems existing in the prior art.
  • the present application further provides an apparatus for controlling a browser using a somatosensory remote control device and a system for controlling the browser using the somatosensory remote control device.
  • the first embodiment of the present application provides a method for controlling a browser using a somatosensory remote control device.
  • FIG. 1 a flowchart of a method for controlling a browser using a somatosensory remote control device according to a first embodiment of the present application is shown.
  • S101 displaying an operation guide of the user's somatosensory action based on the screen of the smart display terminal.
  • the operation guide for displaying a user's somatosensory action based on the screen of the smart display terminal refers to a user guide interface for displaying a user operation guide on a screen of the smart display terminal, where the user guide interface is composed of characters and/or images;
  • the default state of the user guidance interface is on, that is, the user guidance interface is displayed on the screen of the smart display terminal; the user can disable display of the user guidance interface on the screen of the smart display terminal through settings, or choose to skip the display of the guidance interface this time.
  • the operation guide for displaying the user's somatosensory action based on the screen of the smart display terminal may also be implemented by other methods than the embodiment;
  • for example, an operation guide prompt may be issued to the user by playing a sound, or a pre-recorded user operation guide video may be played on the screen of the smart display terminal; either approach can implement the display of the operation guide for the user's somatosensory actions described in this step.
  • after the operation guide prompt is issued, the somatosensory motion data corresponding to the user's somatosensory motion, uploaded by the somatosensory remote control device and received by the smart display terminal, is read.
  • the somatosensory motion data corresponding to the user's somatosensory motion collected by the somatosensory remote control device is read based on the smart display terminal.
  • the somatosensory remote control device includes a somatosensory remote controller, a somatosensory handle, a smart mobile terminal, and/or a wearable device; the user's somatosensory action refers to a human body movement or expression completed by the user with the aid of a somatosensory device (eg, a somatosensory remote controller, a somatosensory handle, a smart mobile terminal, and/or a wearable device).
  • the user's somatosensory actions performed by the user through the somatosensory remote control device include: an upward swing motion, a downward swing motion, a left swing motion, a right swing motion, a forward swing motion, and/or a backward swing motion;
  • the "upper and lower” as used herein is relative to the space, that is, the direction of the upward swinging motion and the downward swinging motion correspond to the direction of the y-axis in the three-dimensional space; similarly, the leftward waving motion and The direction of the rightward swing motion corresponds to the direction of the x-axis in the three-dimensional space, and the direction of the forward swing motion and the backward swing motion corresponds to the direction of the z-axis in the three-dimensional space.
  • the user issues a somatosensory action by means of the somatosensory remote control device;
  • the somatosensory remote control device collects motion state data based on its built-in sensors (such as an acceleration sensor, an angular velocity sensor, and/or a gyroscope), that is, collects somatosensory motion data corresponding to the user's somatosensory motion;
  • the somatosensory remote control device uploads the somatosensory action data to the smart display terminal based on a wireless connection (eg, an infrared connection, a WIFI connection, and/or a Bluetooth connection).
  • the somatosensory motion data corresponding to the user's somatosensory motion uploaded by the somatosensory remote control device received by the smart display terminal is read.
  • after the somatosensory motion data corresponding to the user's somatosensory motion uploaded by the somatosensory remote control device and received by the smart display terminal is read in step S102, the speed feature of the user's somatosensory motion is calculated based on the somatosensory motion data; the speed feature includes: the feature direction angle between the speed direction of the user's somatosensory motion and the feature direction, and the characteristic speed value and characteristic acceleration value of the user's somatosensory motion in the feature direction, and the speed feature is used as the criterion for measuring and judging the user's somatosensory action.
  • the speed feature calculation process is divided into the following two steps:
  • the speed direction refers to a moving direction of the user's somatosensory motion.
  • in this embodiment, the speed direction of the user's somatosensory motion is taken as the moving direction of the user's somatosensory motion.
  • after the moving direction is determined, the angles between the speed direction of the user's somatosensory motion and the positive or negative directions of the x-axis, the y-axis, and the z-axis are calculated.
  • the direction of the axis with the smallest angle is selected as the feature direction of the user's somatosensory action.
  • for example, if the angle between the speed direction of the user's somatosensory motion and the positive direction of the x-axis is the smallest, it is determined that the user's somatosensory motion is a rightward motion, that is, a rightward swing motion.
  • after the feature direction is determined, the direction angle, speed, and acceleration corresponding to the feature direction are used as the speed feature of the user's somatosensory action;
  • the angle between the speed direction of the user's somatosensory motion and the feature direction is the feature direction angle;
  • the magnitude of the speed of the user's somatosensory motion in the feature direction is the characteristic speed value;
  • the magnitude of the acceleration of the user's somatosensory motion in the feature direction is the characteristic acceleration value.
  • in addition, other features may also be used as criteria for measuring and judging the user's somatosensory motion, such as the displacement characteristics (displacement magnitude and displacement direction) of the user's somatosensory motion, which is not limited herein.
  • the purpose of determining whether the speed feature is in a preset threshold interval in this step is to judge the validity of the user's somatosensory action and filter out invalid user operations; when the user uses the somatosensory remote control device to issue a somatosensory action, there is no uniform standard for the action.
  • some unconscious somatosensory motions of the user may be detected; for example, the action of picking up the somatosensory remote control device may carry no intention of controlling the browser, yet it may be interpreted as control of the browser, resulting in a misoperation.
  • the present embodiment determines a unified criterion, that is, a threshold interval.
  • the threshold interval filters out invalid user somatosensory actions and prevents a large number of invalid user somatosensory actions from taking effect; in addition, for each user somatosensory action, an instruction is sent to the browser in the form of inter-process communication.
  • without the judgment and filtering of the threshold interval, receiving the speed features corresponding to a large number of user somatosensory actions would generate a large amount of process data, causing the process to stall or even crash; the establishment of the threshold interval ensures the stability and rationality of the system.
  • FIG. 2 it is a flowchart of the process of step S104 in the method for controlling a browser using a somatosensory remote control device according to the first embodiment of the present application.
  • determining whether the user's somatosensory action corresponding speed feature is in a preset threshold interval includes the following three steps:
  • Step S201 determining whether the angle of the feature direction is in a preset direction threshold interval
  • the direction threshold interval is an angle interval of a positive and negative range around the feature direction; taking the positive direction of the X axis as an example, the direction threshold interval may be an angle range of plus or minus 15° relative to the positive direction of the X axis, and if this range is exceeded, the issuer of the somatosensory action is most likely not intending to control the browser;
  • if the feature direction angle is within the direction threshold interval, step S202 is performed to further determine whether the characteristic speed value in the feature direction is within the range of the preset speed threshold interval;
  • otherwise, the user's somatosensory action is determined to be an invalid operation in the feature direction, and step S204 is executed without any further processing.
  • Step S202 determining whether the feature speed value of the feature direction is in a preset speed threshold interval
  • the speed threshold interval described in this step includes a minimum threshold and a maximum threshold; if the issuer of the somatosensory action intends to perform a browser operation, the speed will be within a reasonable range, neither too slow nor too fast; therefore, the preset speed threshold interval may be used to determine whether the somatosensory action is a control operation for the browser; the speed may be the average speed of the entire operation process;
  • if the characteristic speed value is within the speed threshold interval, step S203 is performed to further determine whether the characteristic acceleration value in the feature direction is within the range of the preset acceleration threshold interval;
  • otherwise, the user's somatosensory action is determined to be an invalid operation in the feature direction, and step S204 is executed without any further processing.
  • Step S203 determining whether the characteristic acceleration value of the feature direction is in a preset acceleration threshold interval
  • the acceleration threshold interval described in this step likewise includes a minimum threshold and a maximum threshold; similarly, if the issuer of the somatosensory action intends to perform a browser operation, its speed change will be within a reasonable range, and a speed change that is too fast or too slow is unreasonable; therefore, the preset acceleration threshold interval can be used to determine whether the somatosensory action is a control operation for the browser; the acceleration can be the average acceleration of the entire somatosensory action;
  • if the characteristic acceleration value is within the acceleration threshold interval, step S105 is performed to consult the preset user somatosensory action-browser control action correspondence data and obtain the browser control action corresponding to the user's somatosensory action;
  • otherwise, the user's somatosensory action is determined to be an invalid operation in the feature direction, and step S204 is executed without any further processing.
  • only after the judgments of steps S201, S202 and S203, with the feature direction angle, characteristic speed value and characteristic acceleration value all within the corresponding threshold intervals, can the user's somatosensory action be determined to be a valid operation in the feature direction; the preset user somatosensory action-browser control action correspondence data may then be consulted to acquire the browser control action corresponding to the somatosensory action.
  • in this embodiment, the determination order of the speed features corresponding to the user's somatosensory motion is to first determine whether the feature direction angle is in the preset direction threshold interval, then determine whether the characteristic speed value in the feature direction is in the preset speed threshold interval, and finally determine whether the characteristic acceleration value in the feature direction is in the preset acceleration threshold interval;
  • the order in which the feature direction angle, the characteristic speed value, and the characteristic acceleration value are checked against the corresponding threshold intervals is not fixed, and may be adjusted according to different systems or scenarios; for example, whether the characteristic speed value in the feature direction is in the preset speed threshold interval, or whether the characteristic acceleration value in the feature direction is in the preset acceleration threshold interval, may be determined first; this embodiment is not limited herein.
  • S105 Refer to the preset user somatosensory action-browser control action correspondence data, and obtain a browser control action corresponding to the user somatosensory action.
  • the browser control action refers to an instruction action capable of controlling a browser to operate thereon.
  • the browser control action includes: scrolling up to display a webpage, scrolling down to display a webpage, returning to a previous webpage, entering a next-level webpage, zooming in on a webpage, and/or reducing a webpage.
  • after the user's somatosensory action is determined to be a valid operation, this step acquires the browser control action corresponding to the user's somatosensory action by referring to the preset user somatosensory action-browser control action correspondence data, that is, the valid user somatosensory action is mapped to the corresponding browser control action.
  • the preset user somatosensory action-browser control action correspondence data is as follows:
  • the upward swinging action in the user somatosensory action corresponds to a browser control action of scrolling up the displayed webpage
  • the downward swinging action in the user somatosensory action corresponds to a browser control action of scrolling down to display a webpage
  • the leftward waving action in the user somatosensory action corresponds to a browser control action of returning to the previous level webpage
  • the rightward waving action in the user somatosensory action corresponds to a browser control action of entering a next-level webpage
  • the forward swinging motion in the user's somatosensory action corresponds to the browser control action of zooming in on the displayed webpage;
  • the backward swinging motion in the user's somatosensory action corresponds to the browser control action of zooming out of the displayed webpage.
  • S106 The browser is notified to execute the browser control action and displayed on the smart display terminal.
  • the execution command of the browser control action corresponding to the user's somatosensory action is sent to the browser, and the browser is notified to execute the browser control action corresponding to the user's somatosensory action;
  • after the browser receives the execution command of the browser control action corresponding to the user's somatosensory action, the command is executed, and the result after the command is executed (for example, the webpage after jumping back to the previous level or the webpage after entering the next level) is displayed on the screen of the smart display terminal.
  • the second embodiment of the present application provides an apparatus for controlling a browser using a somatosensory remote control device.
  • FIG. 3 a schematic diagram of an apparatus for controlling a browser using a somatosensory remote control device according to a second embodiment of the present application is shown.
  • the data reading unit 301 is configured to read the somatosensory action data corresponding to the somatosensory action of the user acquired by the somatosensory remote control device;
  • the calculating unit 302 is configured to calculate a speed feature of the user's somatosensory motion based on the somatosensory motion data;
  • the determining unit 303 is configured to determine whether the speed feature is in a preset threshold interval
  • the browser control action obtaining unit 304 is configured to consult a preset user somatosensory action-browser control action correspondence data, and obtain a browser control action corresponding to the user somatosensory action;
  • the executing unit 305 is configured to notify the browser to perform the browser control action and display the result on the smart display terminal.
  • the calculating unit 302 includes:
  • a direction angle calculation sub-unit 302-1 configured to calculate the angles between the speed direction of the user's somatosensory motion and the x-axis, y-axis and z-axis directions;
  • a feature acquisition sub-unit 302-2 configured to use the direction of the axis with the smallest direction angle as the feature direction of the user's somatosensory action, and acquire the speed feature of the user's somatosensory action in the feature direction;
  • the speed feature includes: the feature direction angle between the speed direction of the user's somatosensory motion and the feature direction, and the characteristic speed value and/or characteristic acceleration value of the user's somatosensory motion in the feature direction.
  • the device for controlling a browser by using a somatosensory remote control device includes:
  • the operation guide display unit 306 is configured to display an operation guide of the user's somatosensory action based on the smart display terminal screen.
  • a third embodiment of the present application provides a system for controlling a browser using a somatosensory remote control device.
  • FIG. 4 a schematic diagram of a system for controlling a browser using a somatosensory remote control device according to a third embodiment of the present application is shown.
  • the system for controlling a browser by using a somatosensory remote control device includes:
  • the smart display terminal 410 includes:
  • the apparatus 411 for controlling a browser using a somatosensory remote control device according to the above embodiment
  • the somatosensory remote control device 420 includes:
  • the somatosensory motion data collecting unit 421 is configured to collect somatosensory motion data corresponding to the user's somatosensory motion based on the motion sensor;
  • the somatosensory motion data transmitting unit 422 is configured to send the somatosensory motion data collected by the somatosensory motion data collecting unit 421 to the smart display terminal 410 through a wireless connection;
  • the motion sensor refers to a device for collecting motion data built into the somatosensory remote control device 420, such as an acceleration sensor or angular velocity sensor built into a smart bracelet, or a gyroscope built into a somatosensory handle or a smart mobile terminal;
  • the wireless connection refers to the communication manner between the somatosensory remote control device 420 and the smart display terminal 410; for example, a somatosensory remote controller communicates with an intelligent display terminal (eg a smart television terminal) over an infrared connection,
  • somatosensory handles communicate with intelligent display terminals (eg smart TV terminals, PCs) via a Bluetooth connection, and smart mobile terminals (eg smartphones, tablets) and wearable devices (eg smart bracelets, smart rings) communicate with the intelligent display terminal via a Bluetooth or WIFI connection.
  • a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • the memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in a computer readable medium, such as read only memory (ROM) or flash memory.
  • Memory is an example of a computer readable medium.
  • Computer readable media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology.
  • the information can be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by a computing device.
  • As defined herein, computer readable media do not include transitory computer readable media, such as modulated data signals and carrier waves.
  • embodiments of the present application can be provided as a method, system, or computer program product.
  • the present application can take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment in combination of software and hardware.
  • the application can take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) including computer usable program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method for controlling a browser using a somatosensory remote control device, comprising: reading, by a smart display terminal, somatosensory action data corresponding to a user's somatosensory action collected by a somatosensory remote control device (S102); calculating a speed feature of the user's somatosensory action based on the somatosensory action data (S103); determining whether the speed feature is within a preset threshold interval (S104); if the speed feature is within the preset threshold interval, determining that the user's somatosensory action corresponding to the speed feature is a valid operation; consulting preset user somatosensory action-browser control action correspondence data, and acquiring the browser control action corresponding to the user's somatosensory action (S105); and notifying the browser to execute the browser control action and displaying the result on the smart display terminal (S106). The provided method for controlling a browser using a somatosensory remote control device avoids the inconvenient, cumbersome and time-consuming operation of browsing webpages through button control in the prior art, and requires no additional equipment, saving resources.

Description

Method, apparatus and system for controlling a browser using a somatosensory remote control device

Technical Field
The present application relates to the technical field of intelligent display terminals, and in particular to a method for controlling a browser using a somatosensory remote control device. The present application also relates to an apparatus for controlling a browser using a somatosensory remote control device and a system for controlling a browser using a somatosensory remote control device.
Background
With the continuous development of science and technology, people connect to the Internet through a variety of intelligent terminals to obtain rich information, and the intelligent display terminal represented by the smart TV is exactly such a terminal device that helps people obtain external information. A smart TV can obtain the program content that users need most from various channels such as the Internet, storage media and computers, and clearly present the program content on the screen through a simple and easy-to-use operation interface. Compared with the application platform of a traditional TV, a smart TV can realize various application services such as network search, network television, video on demand, digital music, network news and network video telephony. In addition, like a smartphone, a smart TV terminal is equipped with an operating system (for example, the Android system) and supports the installation and uninstallation of applications, so users can install and uninstall various applications to expand its functions.
At present, the remote controls of smart TVs achieve control of the smart TV through key operations, and many remote controls also have a mouse function, controlling the smart TV by simulating mouse operations. When a remote control is used to browse webpages on a smart TV, one approach is to control the browser by moving the focus on the webpage with the buttons of the remote control; the other is to control the browser through an external mouse so as to browse webpages on the smart TV terminal.
The above prior-art methods of controlling a browser to browse webpages on a smart TV terminal have obvious drawbacks.
In the prior art, the above method of controlling the browser by moving the focus on the webpage with the buttons of the remote control is very inconvenient to operate; especially when the currently browsed webpage contains a lot of content, browsing by buttons is very cumbersome and time-consuming. If webpages are browsed on the smart TV through a specially provided external mouse, then when the smart TV screen is large or placed at a high position, the user needs to look up for a long time and bend over to operate the mouse, causing physical fatigue; moreover, this solution requires a special mouse to be configured for the smart TV, and a mouse is generally not a standard accessory of a smart TV, which increases hardware cost and causes waste.
Summary
The present application provides a method for controlling a browser using a somatosensory remote control device, to solve the inconvenient, cumbersome and time-consuming operation problems existing in the prior art. The present application additionally provides an apparatus for controlling a browser using a somatosensory remote control device and a system for controlling a browser using a somatosensory remote control device.
The present invention provides a method for controlling a browser using a somatosensory remote control device, comprising:
reading, by a smart display terminal, somatosensory action data corresponding to a user's somatosensory action collected by a somatosensory remote control device;
calculating a speed feature of the user's somatosensory action based on the somatosensory action data;
determining whether the speed feature is within a preset threshold interval;
if the speed feature is within the preset threshold interval, determining that the user's somatosensory action corresponding to the speed feature is a valid operation;
consulting preset user somatosensory action-browser control action correspondence data, and acquiring the browser control action corresponding to the user's somatosensory action;
notifying the browser to execute the browser control action and displaying the result on the smart display terminal.
Optionally, calculating the speed feature of the user's somatosensory action based on the somatosensory action data comprises:
calculating the direction angles between the speed direction of the user's somatosensory action and the x-axis, y-axis and z-axis directions;
taking the direction of the axis with the smallest direction angle as the feature direction of the user's somatosensory action, and acquiring the speed feature of the user's somatosensory action in that feature direction;
wherein the speed feature comprises: the feature direction angle between the speed direction of the user's somatosensory action and the feature direction, and the characteristic speed value and characteristic acceleration value of the user's somatosensory action in the feature direction.
Optionally, determining whether the speed feature is within a preset threshold interval comprises:
determining whether the feature direction angle is within a preset direction threshold interval, whether the characteristic speed value in the feature direction is within a preset speed threshold interval, and whether the characteristic acceleration value in the feature direction is within a preset acceleration threshold interval;
if the feature direction angle, the characteristic speed value and the characteristic acceleration value are all within the corresponding threshold intervals, determining that the user's somatosensory action corresponding to these values is a valid operation.
Optionally, the browser control action comprises:
scrolling up the displayed webpage, scrolling down the displayed webpage, returning to the previous-level webpage, entering the next-level webpage, zooming in on the displayed webpage and/or zooming out of the displayed webpage.
Optionally, the user's somatosensory action comprises:
an upward swing action, a downward swing action, a leftward swing action, a rightward swing action, a forward swing action and/or a backward swing action;
wherein the directions of the upward swing action and the downward swing action correspond to the y-axis direction in three-dimensional space, the directions of the leftward swing action and the rightward swing action correspond to the x-axis direction in three-dimensional space, and the directions of the forward swing action and the backward swing action correspond to the z-axis direction in three-dimensional space.
Optionally, before reading the somatosensory action data corresponding to the user's somatosensory action collected by the somatosensory remote control device, the method comprises:
displaying an operation guide for the user's somatosensory actions on the screen of the smart display terminal.
Optionally, the somatosensory remote control device comprises:
a somatosensory remote controller, a somatosensory handle, a smart mobile terminal and/or a wearable device.
Optionally, the smart display terminal comprises:
a smart TV terminal and/or a PC.
The present invention also provides an apparatus for controlling a browser using a somatosensory remote control device, comprising:
a data reading unit configured to read somatosensory action data corresponding to a user's somatosensory action collected by a somatosensory remote control device;
a calculating unit configured to calculate a speed feature of the user's somatosensory action based on the somatosensory action data;
a determining unit configured to determine whether the speed feature is within a preset threshold interval;
wherein if the speed feature is within the preset threshold interval, the user's somatosensory action corresponding to the speed feature is determined to be a valid operation;
a browser control action acquiring unit configured to consult preset user somatosensory action-browser control action correspondence data and acquire the browser control action corresponding to the user's somatosensory action;
an executing unit configured to notify the browser to execute the browser control action and display the result on the smart display terminal.
Preferably, the calculating unit comprises:
a direction angle calculating subunit configured to calculate the direction angles between the speed direction of the user's somatosensory action and the x-axis, y-axis and z-axis directions;
a feature acquiring subunit configured to take the direction of the axis with the smallest direction angle as the feature direction of the user's somatosensory action, and acquire the speed feature of the user's somatosensory action in that feature direction;
wherein the speed feature comprises: the feature direction angle between the speed direction of the user's somatosensory action and the feature direction, and the characteristic speed value and/or characteristic acceleration value of the user's somatosensory action in the feature direction.
Preferably, the apparatus comprises:
an operation guide display unit configured to display an operation guide for the user's somatosensory actions on the screen of the smart display terminal.
Preferably, a system for controlling a browser using a somatosensory remote control device is provided, comprising:
a smart display terminal, comprising the apparatus for controlling a browser using a somatosensory remote control device according to any one of claims 9 to 11; and
a somatosensory remote control device;
wherein the smart display terminal and the somatosensory remote control device communicate based on a wireless connection;
the somatosensory remote control device comprises:
a somatosensory action data collecting unit configured to collect somatosensory action data corresponding to the user's somatosensory action based on a motion sensor;
a somatosensory action data sending unit configured to send the somatosensory action data to the smart display terminal based on the wireless connection.
Preferably, the motion sensor comprises:
an acceleration sensor, an angular velocity sensor and/or a gyroscope.
Preferably, the wireless connection comprises:
an infrared connection, a WIFI connection and/or a Bluetooth connection. Compared with the prior art, the present application has the following advantages:
The operation provided by the present application is simple, convenient and fast, requires no additional equipment, and saves resources.
The prior-art method of controlling a browser to browse webpages on a smart TV terminal is inconvenient, cumbersome and time-consuming, and causes a waste of resources.
The method for controlling a browser using a somatosensory remote control device provided by the present application comprises: reading, by a smart display terminal, somatosensory action data corresponding to a user's somatosensory action collected by a somatosensory remote control device; calculating a speed feature of the user's somatosensory action based on the somatosensory action data; determining whether the speed feature is within a preset threshold interval; if the speed feature is within the preset threshold interval, determining that the user's somatosensory action corresponding to the speed feature is a valid operation; consulting preset user somatosensory action-browser control action correspondence data, and acquiring the browser control action corresponding to the user's somatosensory action; and notifying the browser to execute the browser control action and displaying the result on the smart display terminal.
In the above method provided by the present application, the user's somatosensory actions are mapped to browser control actions of the browser, so the operation is simple, convenient and fast, avoiding the inconvenient, cumbersome and time-consuming operation of browsing webpages through button control in the prior art; moreover, somatosensory devices are already widely used with smart display terminals, so no additional equipment needs to be configured, which saves resources.
Brief Description of the Drawings
FIG. 1 is a processing flowchart of a method for controlling a browser using a somatosensory remote control device according to a first embodiment of the present application.
FIG. 2 is a processing flowchart of step S104 in the method for controlling a browser using a somatosensory remote control device according to the first embodiment of the present application.
FIG. 3 is a schematic diagram of an apparatus for controlling a browser using a somatosensory remote control device according to a second embodiment of the present application.
FIG. 4 is a schematic diagram of a system for controlling a browser using a somatosensory remote control device according to a third embodiment of the present application.
Detailed Description
Many specific details are set forth in the following description to facilitate a full understanding of the present application. However, the present application can be implemented in many ways other than those described herein, and those skilled in the art can make similar extensions without departing from the spirit of the present application; therefore, the present application is not limited by the specific implementations disclosed below.
The present application provides a method for controlling a browser using a somatosensory remote control device to solve the inconvenient, cumbersome and time-consuming operation problems existing in the prior art. The present application additionally provides an apparatus for controlling a browser using a somatosensory remote control device and a system for controlling a browser using a somatosensory remote control device.
Embodiment One
The first embodiment of the present application provides a method for controlling a browser using a somatosensory remote control device.
Referring to FIG. 1, a processing flowchart of a method for controlling a browser using a somatosensory remote control device according to the first embodiment of the present application is shown.
S101: displaying an operation guide for the user's somatosensory actions on the screen of the smart display terminal.
In this embodiment, displaying an operation guide for the user's somatosensory actions on the screen of the smart display terminal means displaying a user guidance interface containing the user operation guide on the screen of the smart display terminal, the user guidance interface being composed of text and/or images. The default state of the user guidance interface is on, that is, the user guidance interface is displayed on the screen of the smart display terminal; the user can disable display of the user guidance interface on the screen of the smart display terminal through settings, or choose to skip the display of the guidance interface for the current session.
In addition, the operation guide for the user's somatosensory actions displayed on the screen of the smart display terminal may also be implemented by methods other than those in this embodiment;
for example, an operation guide prompt may be issued to the user by playing a sound, or a pre-recorded user operation guide video may be played on the screen of the smart display terminal; either approach can implement the display of the operation guide for the user's somatosensory actions described in this step.
After the operation guide prompt has been issued to the user, the process proceeds to reading the somatosensory action data corresponding to the user's somatosensory action uploaded by the somatosensory remote control device and received by the smart display terminal.
S102: reading, by the smart display terminal, the somatosensory action data corresponding to the user's somatosensory action collected by the somatosensory remote control device.
The somatosensory remote control device includes a somatosensory remote controller, a somatosensory handle, a smart mobile terminal and/or a wearable device; a user's somatosensory action refers to a human body movement or expression completed by the user with the aid of a somatosensory device (for example, a somatosensory remote controller, a somatosensory handle, a smart mobile terminal and/or a wearable device).
In this embodiment, the user's somatosensory actions completed through the somatosensory remote control device include: an upward swing action, a downward swing action, a leftward swing action, a rightward swing action, a forward swing action and/or a backward swing action. The "up" and "down" referred to here are relative to space, that is, the directions of the upward swing action and the downward swing action correspond to the y-axis direction in three-dimensional space; similarly, the directions of the leftward swing action and the rightward swing action correspond to the x-axis direction in three-dimensional space, and the directions of the forward swing action and the backward swing action correspond to the z-axis direction in three-dimensional space.
Before the somatosensory action data corresponding to the user's somatosensory action collected by the somatosensory remote control device is read, the following steps are also performed:
first, the user issues a somatosensory action with the aid of the somatosensory remote control device;
second, the somatosensory remote control device collects motion state data based on its built-in sensors (for example, an acceleration sensor, an angular velocity sensor and/or a gyroscope), that is, it collects the somatosensory action data corresponding to the user's somatosensory action;
finally, after the somatosensory action data corresponding to the user's somatosensory action has been collected, the somatosensory remote control device uploads the somatosensory action data to the smart display terminal over a wireless connection (for example, an infrared connection, a WIFI connection and/or a Bluetooth connection).
After the above three preparatory steps are completed, the somatosensory action data corresponding to the user's somatosensory action uploaded by the somatosensory remote control device and received by the smart display terminal is read.
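As a concrete illustration of what such an upload might contain, the sketch below (Python) defines a minimal motion-data sample and a parser on the smart display terminal side. The field names and the JSON framing are assumptions made only for illustration; the embodiment does not specify a wire format.

```python
import json
from dataclasses import dataclass

@dataclass
class MotionSample:
    """One somatosensory motion sample uploaded by the remote control device (assumed layout)."""
    timestamp_ms: int   # time the sample was captured on the device
    ax: float           # acceleration along x (m/s^2)
    ay: float           # acceleration along y (m/s^2)
    az: float           # acceleration along z (m/s^2)

def parse_motion_payload(raw: bytes) -> list[MotionSample]:
    """Decode a JSON-framed batch of motion samples received over the wireless link."""
    records = json.loads(raw.decode("utf-8"))
    return [MotionSample(r["t"], r["ax"], r["ay"], r["az"]) for r in records]

# Example payload as the terminal might receive it:
raw = b'[{"t": 0, "ax": 0.1, "ay": 0.0, "az": 0.0}, {"t": 20, "ax": 2.4, "ay": 0.3, "az": 0.1}]'
samples = parse_motion_payload(raw)
```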
S103: calculating the speed feature of the user's somatosensory action based on the somatosensory action data.
After the somatosensory action data corresponding to the user's somatosensory action uploaded by the somatosensory remote control device and received by the smart display terminal has been read in step S102, a calculation is performed based on the somatosensory action data to obtain the speed feature of the user's somatosensory action. The speed feature includes: the feature direction angle between the speed direction of the user's somatosensory action and the feature direction, and the characteristic speed value and characteristic acceleration value of the user's somatosensory action in the feature direction; this speed feature is used as the criterion for measuring and judging the user's somatosensory action.
In this embodiment, the calculation of the speed feature is divided into the following two steps:
1) calculating the direction angles between the speed direction of the user's somatosensory action and the x-axis, y-axis and z-axis directions;
The speed direction refers to the moving direction of the user's somatosensory action; in this embodiment, the speed direction of the user's somatosensory action is taken as its moving direction. After the moving direction of the user's somatosensory action is determined, the angles between the speed direction of the user's somatosensory action and the positive or negative directions of the x-axis, y-axis and z-axis are calculated.
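For reference (the embodiment describes this computation but does not write out a formula), the direction angle between the gesture's velocity vector $\mathbf{v}$ and an axis with unit direction vector $\mathbf{e}$ follows the standard dot-product relation

$$\theta = \arccos\!\left(\frac{\mathbf{v}\cdot\mathbf{e}}{\lVert \mathbf{v}\rVert\,\lVert \mathbf{e}\rVert}\right) = \arccos\!\left(\frac{\mathbf{v}\cdot\mathbf{e}}{\lVert \mathbf{v}\rVert}\right), \qquad \lVert \mathbf{e}\rVert = 1,$$

and the positive or negative axis direction giving the smallest $\theta$ is the candidate for the feature direction selected in step 2).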
2) taking the direction of the axis with the smallest direction angle as the feature direction of the user's somatosensory action, and acquiring the speed feature of the user's somatosensory action in that feature direction;
After the angles between the speed direction of the user's somatosensory action and the positive or negative directions of the x-axis, y-axis and z-axis are calculated in the above step, the direction of the axis with the smallest angle is selected as the feature direction of the user's somatosensory action.
For example, if the angle between the speed direction of the user's somatosensory action and the positive direction of the x-axis is the smallest, the user's somatosensory action is determined to be a rightward action, that is, a rightward swing action.
After the feature direction is determined, the direction angle, speed and acceleration corresponding to the feature direction are used as the speed feature of the user's somatosensory action;
wherein the angle between the speed direction of the user's somatosensory action and the feature direction is the feature direction angle;
the magnitude of the speed of the user's somatosensory action in the feature direction is the characteristic speed value;
the magnitude of the acceleration of the user's somatosensory action in the feature direction is the characteristic acceleration value.
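The two-step computation above can be sketched as follows, assuming the velocity and acceleration vectors of the gesture have already been estimated from the uploaded sensor samples (for example, by integrating and averaging the acceleration data); the helper names and return format are illustrative, not part of the embodiment.

```python
import math

# The six candidate axis directions (positive and negative x, y, z).
AXIS_DIRECTIONS = {
    "+x": (1, 0, 0), "-x": (-1, 0, 0),
    "+y": (0, 1, 0), "-y": (0, -1, 0),
    "+z": (0, 0, 1), "-z": (0, 0, -1),
}

def angle_between(v, axis):
    """Angle in degrees between velocity vector v and a unit axis direction."""
    dot = sum(a * b for a, b in zip(v, axis))
    norm = math.sqrt(sum(a * a for a in v))
    return math.degrees(math.acos(dot / norm)) if norm else 90.0  # degenerate zero vector

def speed_feature(velocity, acceleration):
    """Step 1: angles to each axis; step 2: pick the axis with the smallest angle."""
    angles = {name: angle_between(velocity, axis) for name, axis in AXIS_DIRECTIONS.items()}
    feature_dir = min(angles, key=angles.get)          # feature direction
    axis = AXIS_DIRECTIONS[feature_dir]
    return {
        "direction": feature_dir,
        "angle": angles[feature_dir],                                    # feature direction angle
        "speed": sum(v * a for v, a in zip(velocity, axis)),             # characteristic speed value
        "acceleration": sum(g * a for g, a in zip(acceleration, axis)),  # characteristic acceleration value
    }

# A roughly rightward swing resolves to the "+x" feature direction:
print(speed_feature(velocity=(1.8, 0.2, -0.1), acceleration=(3.0, 0.4, 0.0)))
```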
In addition, other features may also be used as the criteria for measuring and judging the user's somatosensory action, such as the displacement features (displacement magnitude and displacement direction) of the user's somatosensory action; this is not limited herein.
S104: determining whether the speed feature is within a preset threshold interval.
The purpose of determining whether the speed feature is within a preset threshold interval in this step is to judge the validity of the user's somatosensory action and to filter out invalid user operations. When the user issues a somatosensory action with the somatosensory remote control device, there is no uniform standard for the action, and some unconscious somatosensory actions of the user may be detected; for example, the action of picking up the somatosensory remote control device may carry no intention of controlling the browser, yet it may be interpreted as browser control, resulting in a misoperation.
To address the above problem, this embodiment establishes a uniform judgment criterion, namely a threshold interval.
The threshold interval filters out invalid user somatosensory actions and prevents a large number of invalid user somatosensory actions from taking effect. In addition, for each user somatosensory action, an instruction is sent to the browser in the form of inter-process communication; without the filtering of the threshold interval, receiving the speed features corresponding to a large number of user somatosensory actions would generate a large amount of process data, causing the process to stall or even crash. Establishing the threshold interval therefore ensures the stability and rationality of the system.
Referring to FIG. 2, a processing flowchart of step S104 in the method for controlling a browser using a somatosensory remote control device according to the first embodiment of the present application is shown.
In this embodiment, determining whether the speed feature corresponding to the user's somatosensory action is within the preset threshold interval includes the following three steps:
Step S201: determining whether the feature direction angle is within a preset direction threshold interval;
In this embodiment, the direction threshold interval is an angle interval of a positive and negative range around the feature direction. Taking the positive direction of the x-axis as an example, the direction threshold interval may be an angle range of plus or minus 15° relative to the positive direction of the x-axis; if this range is exceeded, the issuer of the somatosensory action is most likely not intending to control the browser;
if the feature direction angle is within the direction threshold interval, step S202 is performed to further determine whether the characteristic speed value in the feature direction is within a preset speed threshold interval;
if the feature direction angle exceeds the direction threshold interval defined by the minimum and maximum thresholds, the user's somatosensory action corresponding to this feature direction angle is determined to be an invalid operation in the feature direction, and step S204 is performed without any further processing.
Step S202: determining whether the characteristic speed value in the feature direction is within a preset speed threshold interval;
The speed threshold interval described in this step includes a minimum threshold and a maximum threshold. If the issuer of the somatosensory action intends to perform a browser operation, the speed will be within a reasonable range, neither too slow nor too fast; therefore, the preset speed threshold interval can be used to determine whether the somatosensory action is a control operation for the browser. The speed may be the average speed of the entire operation process;
if the characteristic speed value is within the speed threshold interval defined by the minimum and maximum thresholds, step S203 is performed to further determine whether the characteristic acceleration value in the feature direction is within a preset acceleration threshold interval;
if the characteristic speed value exceeds the speed threshold interval defined by the minimum and maximum thresholds, the user's somatosensory action corresponding to this characteristic speed value is determined to be an invalid operation in the feature direction, and step S204 is performed without any further processing.
Step S203: determining whether the characteristic acceleration value in the feature direction is within a preset acceleration threshold interval;
Similar to the direction threshold interval and the speed threshold interval above, the acceleration threshold interval described in this step includes a minimum threshold and a maximum threshold. Likewise, if the issuer of the somatosensory action intends to perform a browser operation, the change in speed will be within a reasonable range; a speed change that is too fast or too slow is unreasonable. Therefore, the preset acceleration threshold interval can be used to determine whether the somatosensory action is a control operation for the browser. The acceleration may be the average acceleration of the entire somatosensory action;
if the characteristic acceleration value is within the acceleration threshold interval defined by the minimum and maximum thresholds, step S105 is performed: consulting the preset user somatosensory action-browser control action correspondence data and acquiring the browser control action corresponding to the user's somatosensory action;
if the characteristic acceleration value exceeds the acceleration threshold interval defined by the minimum and maximum thresholds, the user's somatosensory action corresponding to this characteristic acceleration value is determined to be an invalid operation in the feature direction, and step S204 is performed without any further processing.
Only after the judgments of step S201, step S202 and step S203, with the feature direction angle, characteristic speed value and characteristic acceleration value corresponding to the user's somatosensory action all within the corresponding threshold intervals described above, can the user's somatosensory action be determined to be a valid operation in the feature direction; the preset user somatosensory action-browser control action correspondence data can then be consulted to acquire the browser control action corresponding to the user's somatosensory action.
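A minimal sketch of the three checks of steps S201 to S203 is given below. Only the ±15° direction example comes from this embodiment; the speed and acceleration bounds are placeholders chosen for illustration and would in practice be tuned per device and scenario.

```python
# Threshold intervals as (min, max); the speed/acceleration values are illustrative placeholders.
DIRECTION_INTERVAL = (0.0, 15.0)      # degrees from the feature direction (step S201)
SPEED_INTERVAL = (0.5, 5.0)           # m/s, average speed of the gesture (step S202)
ACCELERATION_INTERVAL = (1.0, 30.0)   # m/s^2, average acceleration of the gesture (step S203)

def within(value, interval):
    lo, hi = interval
    return lo <= value <= hi

def is_valid_operation(feature):
    """Return True only if angle, speed and acceleration all fall in their threshold intervals."""
    return (within(feature["angle"], DIRECTION_INTERVAL)                       # step S201
            and within(abs(feature["speed"]), SPEED_INTERVAL)                  # step S202
            and within(abs(feature["acceleration"]), ACCELERATION_INTERVAL))   # step S203
```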
It should be noted that, in this embodiment, the speed features corresponding to the user's somatosensory action are judged in the following order: first, whether the feature direction angle is within the preset direction threshold interval; second, whether the characteristic speed value in the feature direction is within the preset speed threshold interval; and finally, whether the characteristic acceleration value in the feature direction is within the preset acceleration threshold interval;
In addition, the order in which the feature direction angle, the characteristic speed value and the characteristic acceleration value are checked against their corresponding threshold intervals is not fixed, and may be adjusted according to different systems or scenarios; for example, whether the characteristic speed value in the feature direction is within the preset speed threshold interval, or whether the characteristic acceleration value in the feature direction is within the preset acceleration threshold interval, may be determined first. This embodiment is not limited in this respect.
S105: consulting the preset user somatosensory action-browser control action correspondence data, and acquiring the browser control action corresponding to the user's somatosensory action.
The browser control action refers to an instruction action capable of controlling the browser so as to operate it.
In this embodiment, the browser control actions include: scrolling up the displayed webpage, scrolling down the displayed webpage, returning to the previous-level webpage, entering the next-level webpage, zooming in on the displayed webpage and/or zooming out of the displayed webpage.
After the user's somatosensory action is determined to be a valid operation in step S104, this step acquires the browser control action corresponding to the user's somatosensory action by consulting the preset user somatosensory action-browser control action correspondence data, that is, the valid user somatosensory action is mapped to the corresponding browser control action.
In this embodiment, the preset user somatosensory action-browser control action correspondence data is as follows:
the upward swing action among the user's somatosensory actions corresponds to the browser control action of scrolling up the displayed webpage;
the downward swing action among the user's somatosensory actions corresponds to the browser control action of scrolling down the displayed webpage;
the leftward swing action among the user's somatosensory actions corresponds to the browser control action of returning to the previous-level webpage;
the rightward swing action among the user's somatosensory actions corresponds to the browser control action of entering the next-level webpage;
the forward swing action among the user's somatosensory actions corresponds to the browser control action of zooming in on the displayed webpage;
the backward swing action among the user's somatosensory actions corresponds to the browser control action of zooming out of the displayed webpage.
In addition, correspondences other than those in this embodiment may also be adopted between the user's somatosensory actions and the browser control actions; this is not limited herein.
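One way to hold the correspondence data is a simple lookup table keyed by the feature direction, as sketched below. The axis-sign assignments for the up/down and forward/backward swings and the action names are illustrative assumptions; the embodiment only pins the rightward swing to the positive x-axis in its example.

```python
# Preset correspondence data: feature direction -> browser control action (illustrative names).
ACTION_TO_CONTROL = {
    "+y": "scroll_up",      # upward swing
    "-y": "scroll_down",    # downward swing
    "-x": "go_back",        # leftward swing: return to the previous-level webpage
    "+x": "go_forward",     # rightward swing: enter the next-level webpage
    "+z": "zoom_in",        # forward swing
    "-z": "zoom_out",       # backward swing
}

def control_action_for(feature):
    """Look up the browser control action for a validated somatosensory action."""
    return ACTION_TO_CONTROL.get(feature["direction"])
```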
S106: notifying the browser to execute the browser control action, and displaying the result on the smart display terminal.
After the browser control action corresponding to the user's somatosensory action is acquired in step S105, in this step an execution command for the browser control action corresponding to the user's somatosensory action is sent to the browser, notifying the browser to execute that browser control action. After the browser receives the execution command for the browser control action corresponding to the user's somatosensory action, it executes the command and displays the result of executing the command (for example, the webpage after jumping back to the previous level, or the webpage after entering the next level) on the screen of the smart display terminal.
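The notification in this step is described above as inter-process communication; a minimal sketch of such a notification over a local socket is given below, where the port number and message format are assumptions made for illustration only.

```python
import json
import socket

BROWSER_IPC_PORT = 18765  # assumed local port on which a browser-side listener runs

def notify_browser(control_action: str) -> None:
    """Send an execution command for the browser control action via a local IPC socket."""
    command = json.dumps({"type": "browser_control", "action": control_action}).encode("utf-8")
    with socket.create_connection(("127.0.0.1", BROWSER_IPC_PORT), timeout=1.0) as conn:
        conn.sendall(command)

# e.g. notify_browser("scroll_up") after a validated upward swing
```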
Embodiment Two
The second embodiment of the present application provides an apparatus for controlling a browser using a somatosensory remote control device.
Referring to FIG. 3, a schematic diagram of an apparatus for controlling a browser using a somatosensory remote control device according to the second embodiment of the present application is shown.
Since the apparatus embodiment is substantially similar to the method embodiment, its description is relatively simple; for the relevant parts, refer to the corresponding description of the method embodiment. The apparatus embodiment described below is merely illustrative.
The apparatus for controlling a browser using a somatosensory remote control device described in this embodiment comprises:
a data reading unit 301 configured to read somatosensory action data corresponding to a user's somatosensory action collected by a somatosensory remote control device;
a calculating unit 302 configured to calculate a speed feature of the user's somatosensory action based on the somatosensory action data;
a determining unit 303 configured to determine whether the speed feature is within a preset threshold interval;
wherein if the speed feature is within the preset threshold interval, the user's somatosensory action corresponding to the speed feature is determined to be a valid operation;
a browser control action acquiring unit 304 configured to consult preset user somatosensory action-browser control action correspondence data and acquire the browser control action corresponding to the user's somatosensory action;
an executing unit 305 configured to notify the browser to execute the browser control action and display the result on the smart display terminal.
Optionally, the calculating unit 302 comprises:
a direction angle calculating subunit 302-1 configured to calculate the direction angles between the speed direction of the user's somatosensory action and the x-axis, y-axis and z-axis directions;
a feature acquiring subunit 302-2 configured to take the direction of the axis with the smallest direction angle as the feature direction of the user's somatosensory action, and acquire the speed feature of the user's somatosensory action in that feature direction;
wherein the speed feature comprises: the feature direction angle between the speed direction of the user's somatosensory action and the feature direction, and the characteristic speed value and/or characteristic acceleration value of the user's somatosensory action in the feature direction.
Optionally, the apparatus for controlling a browser using a somatosensory remote control device comprises:
an operation guide display unit 306 configured to display an operation guide for the user's somatosensory actions on the screen of the smart display terminal.
Embodiment Three
The third embodiment of the present application provides a system for controlling a browser using a somatosensory remote control device.
Referring to FIG. 4, a schematic diagram of a system for controlling a browser using a somatosensory remote control device according to the third embodiment of the present application is shown.
The system for controlling a browser using a somatosensory remote control device comprises:
a smart display terminal 410 and a somatosensory remote control device 420;
wherein the smart display terminal 410 comprises:
the apparatus 411 for controlling a browser using a somatosensory remote control device described in the above embodiment;
For a description of the apparatus for controlling a browser using a somatosensory remote control device, refer to the above second embodiment; it is not repeated in this embodiment.
The somatosensory remote control device 420 comprises:
a somatosensory action data collecting unit 421 and a somatosensory action data sending unit 422;
the somatosensory action data collecting unit 421 is configured to collect somatosensory action data corresponding to the user's somatosensory action based on a motion sensor;
the somatosensory action data sending unit 422 is configured to send the somatosensory action data collected by the somatosensory action data collecting unit 421 to the smart display terminal 410 over a wireless connection;
It should be noted that the motion sensor refers to a device built into the somatosensory remote control device 420 for collecting motion data, for example, an acceleration sensor or angular velocity sensor built into a smart bracelet, or a gyroscope built into a somatosensory handle or a smart mobile terminal. In addition, the wireless connection refers to the communication manner used for data transmission between the somatosensory remote control device 420 and the smart display terminal 410; for example, a somatosensory remote controller communicates with the intelligent display terminal (for example, a smart TV terminal) over an infrared connection, a somatosensory handle communicates with the intelligent display terminal (for example, a smart TV terminal or a PC) over a Bluetooth connection, and smart mobile terminals (for example, smartphones and tablets) and wearable devices (for example, smart bracelets and smart rings) communicate with the intelligent display terminal over a Bluetooth or WIFI connection.
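For completeness, a sketch of the device-side units 421 and 422 is given below: motion samples are collected from a placeholder sensor API and sent to the terminal, with a plain TCP socket standing in for the Bluetooth/WIFI link. All names, the sampling rate and the framing are illustrative assumptions, not details taken from the embodiment.

```python
import json
import socket
import time

def read_accelerometer():
    """Placeholder for the device's real sensor API; returns (ax, ay, az) in m/s^2."""
    return (0.0, 0.0, 9.8)

def stream_motion_data(terminal_host: str, terminal_port: int, duration_s: float = 1.0):
    """Collect samples from the motion sensor (unit 421) and send them to the smart
    display terminal (unit 422) over a TCP socket standing in for the wireless link."""
    samples, start = [], time.time()
    while time.time() - start < duration_s:
        ax, ay, az = read_accelerometer()
        samples.append({"t": int((time.time() - start) * 1000), "ax": ax, "ay": ay, "az": az})
        time.sleep(0.02)  # roughly 50 Hz sampling
    with socket.create_connection((terminal_host, terminal_port)) as conn:
        conn.sendall(json.dumps(samples).encode("utf-8"))
```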
Although the present application is disclosed above with preferred embodiments, these are not intended to limit the present application. Any person skilled in the art may make possible changes and modifications without departing from the spirit and scope of the present application; therefore, the scope of protection of the present application shall be subject to the scope defined by the claims of the present application.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces and memory.
The memory may include non-persistent memory, random access memory (RAM) and/or non-volatile memory in computer readable media, such as read-only memory (ROM) or flash RAM. Memory is an example of a computer readable medium.
1. Computer readable media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer readable instructions, data structures, program modules or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by a computing device. As defined herein, computer readable media do not include transitory media, such as modulated data signals and carrier waves.
2. Those skilled in the art should understand that the embodiments of the present application may be provided as a method, a system or a computer program product. Therefore, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Moreover, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.

Claims (14)

  1. A method for controlling a browser using a somatosensory remote control device, characterized by comprising:
    reading, by a smart display terminal, somatosensory action data corresponding to a user's somatosensory action collected by a somatosensory remote control device;
    calculating a speed feature of the user's somatosensory action based on the somatosensory action data;
    determining whether the speed feature is within a preset threshold interval;
    if the speed feature is within the preset threshold interval, determining that the user's somatosensory action corresponding to the speed feature is a valid operation;
    consulting preset user somatosensory action-browser control action correspondence data, and acquiring the browser control action corresponding to the user's somatosensory action;
    notifying the browser to execute the browser control action and displaying the result on the smart display terminal.
  2. The method for controlling a browser using a somatosensory remote control device according to claim 1, characterized in that calculating the speed feature of the user's somatosensory action based on the somatosensory action data comprises:
    calculating the direction angles between the speed direction of the user's somatosensory action and the x-axis, y-axis and z-axis directions;
    taking the direction of the axis with the smallest direction angle as the feature direction of the user's somatosensory action, and acquiring the speed feature of the user's somatosensory action in that feature direction;
    wherein the speed feature comprises: the feature direction angle between the speed direction of the user's somatosensory action and the feature direction, and the characteristic speed value and characteristic acceleration value of the user's somatosensory action in the feature direction.
  3. The method for controlling a browser using a somatosensory remote control device according to claim 2, characterized in that determining whether the speed feature is within a preset threshold interval comprises:
    determining whether the feature direction angle is within a preset direction threshold interval, whether the characteristic speed value in the feature direction is within a preset speed threshold interval, and whether the characteristic acceleration value in the feature direction is within a preset acceleration threshold interval;
    if the feature direction angle, the characteristic speed value and the characteristic acceleration value are all within the corresponding threshold intervals, determining that the user's somatosensory action corresponding to these values is a valid operation.
  4. The method for controlling a browser using a somatosensory remote control device according to claim 1, characterized in that the browser control action comprises:
    scrolling up the displayed webpage, scrolling down the displayed webpage, returning to the previous-level webpage, entering the next-level webpage, zooming in on the displayed webpage and/or zooming out of the displayed webpage.
  5. The method for controlling a browser using a somatosensory remote control device according to claim 1, characterized in that the user's somatosensory action comprises:
    an upward swing action, a downward swing action, a leftward swing action, a rightward swing action, a forward swing action and/or a backward swing action;
    wherein the directions of the upward swing action and the downward swing action correspond to the y-axis direction in three-dimensional space, the directions of the leftward swing action and the rightward swing action correspond to the x-axis direction in three-dimensional space, and the directions of the forward swing action and the backward swing action correspond to the z-axis direction in three-dimensional space.
  6. The method for controlling a browser using a somatosensory remote control device according to claim 1, characterized in that before reading the somatosensory action data corresponding to the user's somatosensory action collected by the somatosensory remote control device, the method comprises:
    displaying an operation guide for the user's somatosensory actions on the screen of the smart display terminal.
  7. The method for controlling a browser using a somatosensory remote control device according to claim 1, characterized in that the somatosensory remote control device comprises:
    a somatosensory remote controller, a somatosensory handle, a smart mobile terminal and/or a wearable device.
  8. The method for controlling a browser using a somatosensory remote control device according to claim 1, characterized in that the smart display terminal comprises:
    a smart TV terminal and/or a PC.
  9. An apparatus for controlling a browser using a somatosensory remote control device, characterized by comprising:
    a data reading unit configured to read somatosensory action data corresponding to a user's somatosensory action collected by a somatosensory remote control device;
    a calculating unit configured to calculate a speed feature of the user's somatosensory action based on the somatosensory action data;
    a determining unit configured to determine whether the speed feature is within a preset threshold interval;
    wherein if the speed feature is within the preset threshold interval, the user's somatosensory action corresponding to the speed feature is determined to be a valid operation;
    a browser control action acquiring unit configured to consult preset user somatosensory action-browser control action correspondence data and acquire the browser control action corresponding to the user's somatosensory action;
    an executing unit configured to notify the browser to execute the browser control action and display the result on the smart display terminal.
  10. The apparatus for controlling a browser using a somatosensory remote control device according to claim 9, characterized in that the calculating unit comprises:
    a direction angle calculating subunit configured to calculate the direction angles between the speed direction of the user's somatosensory action and the x-axis, y-axis and z-axis directions;
    a feature acquiring subunit configured to take the direction of the axis with the smallest direction angle as the feature direction of the user's somatosensory action, and acquire the speed feature of the user's somatosensory action in that feature direction;
    wherein the speed feature comprises: the feature direction angle between the speed direction of the user's somatosensory action and the feature direction, and the characteristic speed value and/or characteristic acceleration value of the user's somatosensory action in the feature direction.
  11. The apparatus for controlling a browser using a somatosensory remote control device according to claim 9, characterized by comprising:
    an operation guide display unit configured to display an operation guide for the user's somatosensory actions on the screen of the smart display terminal.
  12. A system for controlling a browser using a somatosensory remote control device, characterized by comprising:
    a smart display terminal, comprising the apparatus for controlling a browser using a somatosensory remote control device according to any one of claims 9 to 11; and
    a somatosensory remote control device;
    wherein the smart display terminal and the somatosensory remote control device communicate based on a wireless connection;
    the somatosensory remote control device comprises:
    a somatosensory action data collecting unit configured to collect somatosensory action data corresponding to the user's somatosensory action based on a motion sensor;
    a somatosensory action data sending unit configured to send the somatosensory action data to the smart display terminal based on the wireless connection.
  13. The system for controlling a browser using a somatosensory remote control device according to claim 12, characterized in that the motion sensor comprises:
    an acceleration sensor, an angular velocity sensor and/or a gyroscope.
  14. The system for controlling a browser using a somatosensory remote control device according to claim 12, characterized in that the wireless connection comprises:
    an infrared connection, a WIFI connection and/or a Bluetooth connection.
PCT/CN2015/079687 2014-06-11 2015-05-25 Method, apparatus and system for controlling a browser using a somatosensory remote control device WO2015188691A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410258048.2 2014-06-11
CN201410258048.2A CN105204606A (zh) 2014-06-11 2014-06-11 Method, apparatus and system for controlling a browser using a somatosensory remote control device

Publications (1)

Publication Number Publication Date
WO2015188691A1 true WO2015188691A1 (zh) 2015-12-17

Family

ID=54832881

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/079687 WO2015188691A1 (zh) 2014-06-11 2015-05-25 Method, apparatus and system for controlling a browser using a somatosensory remote control device

Country Status (3)

Country Link
CN (1) CN105204606A (zh)
HK (1) HK1215736A1 (zh)
WO (1) WO2015188691A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105657505A (zh) * 2016-01-05 2016-06-08 天脉聚源(北京)传媒科技有限公司 一种视频遥控播放的方法及装置
CN105681915A (zh) * 2016-03-24 2016-06-15 深圳Tcl数字技术有限公司 快速截取智能电视播放内容的方法和系统

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105760433B (zh) * 2016-02-01 2019-07-02 四川长虹电器股份有限公司 一种快速选中智能电视网页元素的方法
CN105955634A (zh) * 2016-04-20 2016-09-21 上海斐讯数据通信技术有限公司 移动智能终端的截图方法和截图系统
CN105975081A (zh) * 2016-05-24 2016-09-28 深圳市敢为软件技术有限公司 体感控制方法和装置
CN108348084A (zh) * 2016-09-30 2018-07-31 深圳市赛亿科技开发有限公司 烹饪系统及方法
CN108961711B (zh) * 2018-04-28 2020-06-02 深圳市牛鼎丰科技有限公司 遥控移动装置的控制方法、装置、计算机设备和存储介质
CN109740092B (zh) * 2018-12-26 2022-02-01 深圳市网心科技有限公司 浏览器系统、消息处理方法、电子设备、装置及存储介质
CN110427106B (zh) * 2019-07-19 2022-07-12 武汉恒新动力科技有限公司 体感动作数据处理方法、设备及计算机可读存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110248915A1 (en) * 2009-07-14 2011-10-13 Cywee Group Ltd. Method and apparatus for providing motion library
CN102937839A (zh) * 2012-11-21 2013-02-20 合一网络技术(北京)有限公司 一种对终端设备进行体感遥控的遥控装置及方法
CN103399635A (zh) * 2013-07-24 2013-11-20 佳都新太科技股份有限公司 基于智能手机的体感遥控电脑方案
CN103686278A (zh) * 2013-12-05 2014-03-26 海信集团有限公司 电视的遥控方法、手机、电视机和系统


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105657505A (zh) * 2016-01-05 2016-06-08 天脉聚源(北京)传媒科技有限公司 一种视频遥控播放的方法及装置
CN105681915A (zh) * 2016-03-24 2016-06-15 深圳Tcl数字技术有限公司 快速截取智能电视播放内容的方法和系统
CN105681915B (zh) * 2016-03-24 2019-05-14 深圳Tcl数字技术有限公司 快速截取智能电视播放内容的方法和系统

Also Published As

Publication number Publication date
HK1215736A1 (zh) 2016-09-09
CN105204606A (zh) 2015-12-30

Similar Documents

Publication Publication Date Title
WO2015188691A1 (zh) 一种利用体感遥控设备控制浏览器的方法、装置及系统
JP6816858B2 (ja) 携帯端末に対する動作関連入力によって複数個のオブジェクトの表示を制御する方法及び携帯端末
KR102571369B1 (ko) 디스플레이 제어 방법, 저장 매체 및 전자 장치
US20190339740A1 (en) User terminal device and displaying method thereof
EP2743795B1 (en) Electronic device and method for driving camera module in sleep mode
CA2892143C (en) Using clamping to modify scrolling
TWI706312B (zh) 調整界面操作圖示分佈範圍的裝置、方法及觸控螢幕設備
KR102037465B1 (ko) 사용자 단말 장치 및 이의 디스플레이 방법
KR102139110B1 (ko) 전자 디바이스 및 전자 디바이스에서 그립 센싱을 이용한 제어 방법
EP2752754A2 (en) Remote mouse function method and terminals
CN107407945A (zh) 从锁屏捕获图像的系统和方法
CN104914995B (zh) 一种拍照方法及终端
CN108920069B (zh) 一种触控操作方法、装置、移动终端和存储介质
CN104182126B (zh) 一种移动终端拨号方法及装置
WO2017113711A1 (zh) 页面浏览方法及装置
US20150199021A1 (en) Display apparatus and method for controlling display apparatus thereof
US20180283873A1 (en) Calibration method based on dead reckoning technology and portable electronic device
WO2015078300A1 (zh) 一种电视控制方法及电视遥控器
WO2015131630A1 (zh) 桌面图标的置换方法及装置
JP3172865U (ja) メモリカードが挿入可能な無線通信カード
WO2014032504A1 (zh) 终端自定义手势的方法及其终端
CN107852425A (zh) 图像显示装置及其操作方法
CN109033100A (zh) 提供页面内容的方法及装置
CN103716543A (zh) 一种移动终端及其摄像装置控制方法
US20190212834A1 (en) Software gyroscope apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15807007; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 Ep: pct application non-entry in european phase (Ref document number: 15807007; Country of ref document: EP; Kind code of ref document: A1)