CN105808123A - Intelligent terminal interaction system and method based on remote control device - Google Patents

Intelligent terminal interaction system and method based on remote control device

Info

Publication number
CN105808123A
CN105808123A (application CN201610148634.0A; granted as CN105808123B)
Authority
CN
China
Prior art keywords
interface
interface zone
interactive elements
zone
control unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610148634.0A
Other languages
Chinese (zh)
Other versions
CN105808123B (en)
Inventor
Inventor not disclosed (不公告发明人)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Min Jinfang
Original Assignee
Min Jinfang
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Min Jinfang
Priority to CN201610148634.0A
Publication of CN105808123A
Application granted
Publication of CN105808123B
Legal status: Expired - Fee Related (current)
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485: Scrolling or panning

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an intelligent terminal interaction system and method based on a remote control device. The system comprises the remote control device, an intelligent terminal, and an interface system running on the intelligent terminal. The method comprises the following steps: dividing the main window of the intelligent terminal's interface system into a plurality of interface regions by layer-by-layer recursion; and arranging an interface mapping panel and a control panel on the remote control device, wherein the interface mapping panel contains a plurality of interface mapping units that correspond one-to-one to the interface regions generated by each layer of division of the interface system, and the control panel comprises a plurality of control units. By operating the interface mapping units on the remote control device, the user selects interface regions layer by layer until the target interaction element finally obtains the focus. In addition, the user can use the various control units to control the positioning of the interface region or of the focus.

Description

An intelligent terminal interaction system and method based on a remote control device
Technical field
The present invention relates to the field of intelligent terminals, and in particular to an intelligent terminal interaction system and method based on a remote control device.
Background art
With the development of smart technology, smart TVs (smart set-top boxes), video walls and various other large-screen intelligent terminal products have entered the consumer market. Current large-screen intelligent terminals are generally equipped with independent operating systems, carry a rich variety of applications and have complex user interfaces, greatly improving the user's entertainment experience.
In mobile intelligent terminal scenarios, touch-screen technology allows the user to interact smoothly with the device through touch events and gestures. However, in the scenarios of smart TVs, smart set-top boxes and other large-screen intelligent terminals, a certain physical distance exists between the user and the terminal, so touch-screen technology is no longer applicable.
Currently, most such terminal products still adopt the traditional remote-controller interaction mode; some use an interaction mode in which remote-control commands and screen pixel data are transmitted bidirectionally over WLAN or Bluetooth; a small number of products adopt more advanced interaction modes, such as camera-based motion sensing, ultrasound or eye tracking. The advantages of traditional remote-control technology are one-way command propagation, very accurate recognition of interaction events, high real-time performance and low manufacturing cost; but under the complex interface layouts of intelligent terminals its shortcomings are also obvious: operation is cumbersome, efficiency is low and the user experience is poor. The advantage of remote-control technology based on WLAN or Bluetooth is two-way propagation of commands and data, which enables touch events and gesture interaction; its shortcomings are dependence on an external network, higher manufacturing cost, and the need for the user to shift the line of sight to the touch screen of the remote control device during operation, which disrupts the user experience. Motion-sensing interaction is intuitive and user-friendly, but the technology is not yet mature enough, is easily disturbed, and falls short in the accuracy of event recognition, real-time performance and practicality; its manufacturing cost is also relatively high, and no mature, large-scale products using such technology have yet appeared on the market.
Summary of the invention
The problem to be solved by the present invention is to provide an intelligent terminal interaction system and method based on a remote control device, so that the user can interact with the intelligent terminal more intuitively, simply and effectively under the complex interface layouts of intelligent terminals. The invention has the following advantages:
(1) It adopts a unique "layer-by-layer recursive division and selection of interface regions" approach to positioning the interface focus, which is faster than the positioning of a traditional remote controller and remains fully compatible with the traditional remote-control mode;
(2) Positioning is based on the focus position, which is simpler than other remote-control technologies based on pixel positioning;
(3) Control commands propagate one way, with no data return channel and no dependence on an external network;
(4) The interaction is simple and intuitive; with a complex interface layout, the number of operations on a traditional remote controller grows roughly linearly, whereas the number of operations with the present invention stays within a very small range, greatly reducing the number of user operations (for example, when browsing a web page in the conventional way, navigating to a given page element may take dozens of operations, whereas with the present invention the number of operations can generally be kept to two to four and rarely exceeds four);
(5) The user does not need to shift the line of sight to the remote control device during operation, so the fluency of the user experience is not affected;
(6) A prediction mechanism is adopted, further reducing the number of remote-control operations.
According to one aspect of the invention, an intelligent terminal interaction system based on a remote control device is provided, comprising:
A remote control device, including an interface mapping panel and a control panel, for accepting user operations and producing corresponding control signals. The interface mapping panel on the remote control device comprises a plurality of interface mapping units, which are used to map the interface regions produced by each layer of division of the interface system. The control panel on the remote control device includes control units representing "confirm", "return", "back" and the four directions (up, down, left, right). The interface mapping units and control units may be, but are not limited to, physical buttons; other implementations include touch, sensing or software simulation.
An intelligent terminal, for receiving the control signals, converting them into interaction events, and passing them to the interface system running in the intelligent terminal's software system for processing.
An interface system, for receiving and responding to interaction events, displaying the interactive interface on the terminal screen, and recursively dividing the interface into interface regions layer by layer. The interface system includes all two-dimensional or three-dimensional windows on the screen and the interaction elements within the windows; the windows and interaction elements can obtain the interface focus and respond to user operations. The interaction elements include various two-dimensional or three-dimensional interface components that can obtain the interface focus and respond to user operations, including but not limited to: pictures, video areas, buttons, menus, submenus, radio buttons, check boxes, lists, text boxes, drop-down list boxes, forms, control bars, sliders, irregular interaction areas, hyperlinks, and other interactive components.
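For concreteness, the window and interaction-element model implied above can be sketched as follows. This is a minimal illustration in Python; the class names, fields and the choice of language are assumptions made here, not part of the patent.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class InteractionElement:
    """A 2D interface component (button, list, hyperlink, ...) that can hold the interface focus."""
    rect: Tuple[float, float, float, float]   # (x, y, width, height) in screen coordinates
    kind: str = "button"
    has_focus: bool = False

@dataclass
class Window:
    """A top-level window; the visible, active one becomes the main window (top-layer region)."""
    rect: Tuple[float, float, float, float]
    elements: List[InteractionElement] = field(default_factory=list)
    active: bool = False
```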
According to another aspect of the present invention, an intelligent terminal interaction method based on a remote control device is provided. The method comprises the following steps:
Step 1: according to the selected splitting scheme, recursively divide the main window of the intelligent terminal's interface system into interface regions, layer by layer;
Step 2: by repeatedly clicking the interface mapping units on the remote control device, the user selects interface regions layer by layer and finally positions the interface focus on the target interaction element;
Step 3: by clicking the "confirm" control unit, the user quickly confirms the interface focus;
Step 4: by clicking the "return" control unit, the user switches the "selected" state to the interface region one layer up; by long-pressing the "return" control unit, the user returns the "selected" state to the top-layer interface region;
Step 5: by clicking a direction control unit, the user transfers the "selected" state between adjacent interface regions, or moves the interface focus between adjacent interaction elements;
Step 6: the user may choose to enable zoom mode; in zoom mode, while the user selects interface regions layer by layer, the region in the "selected" state is scaled by a default factor and returns to its normal size once it loses the "selected" state.
In an embodiment of the present invention, the main window of the interface system refers to the top-level window that is visible on the screen and in the active state; if the interface system creates multiple top-level windows while running, the top-level window in the active state becomes the main window. The main window becomes the top-layer interface region by default.
In an embodiment of the present invention, the selected splitting scheme may take many forms. A typical scheme is the quadrant (2×2, 田-shaped) splitting scheme: at each layer of division it divides the current interface region into four lower-layer interface regions of equal size and identical shape. Optional splitting schemes also include, but are not limited to, the nine-grid (3×3) splitting scheme.
In an embodiment of the present invention, dividing interface regions recursively layer by layer refers to the process of successively dividing each interface region with the selected splitting scheme, starting from the top-layer interface region and continuing down to a set number of layers.
In an embodiment of the present invention, the number of interface mapping units on the remote control device equals the number of lower-layer interface regions produced by each layer of division under the selected splitting scheme, and the lower-layer interface regions are mapped one-to-one to the interface mapping units, as sketched below.
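The division and mapping described above can be illustrated with a short sketch (Python is used here for illustration only; the rectangle representation and function names are assumptions, not the patent's API):

```python
# Minimal sketch of layer-by-layer recursive region division and unit mapping.
# A region is represented here as (x, y, width, height) in screen coordinates.

def split_region(region, rows, cols):
    """Divide a region into rows x cols equal sub-regions.
    rows = cols = 2 gives the quadrant scheme; rows = cols = 3 gives the nine-grid scheme."""
    x, y, w, h = region
    return [(x + c * w / cols, y + r * h / rows, w / cols, h / rows)
            for r in range(rows) for c in range(cols)]

def map_units_to_regions(selected_region, rows=2, cols=2):
    """Map the lower-layer regions of the currently 'selected' region one-to-one
    onto the interface mapping units (unit index -> sub-region)."""
    return dict(enumerate(split_region(selected_region, rows, cols)))

# Example: a 1920x1080 main window is the top-layer region.
top_layer = (0, 0, 1920, 1080)
mapping = map_units_to_regions(top_layer)          # 4 entries under the quadrant scheme
upper_left = mapping[0]                            # region chosen when unit 0 is clicked
next_mapping = map_units_to_regions(upper_left)    # re-mapped after the selection deepens
```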
In an embodiment of the present invention, the step of selecting interface regions layer by layer and finally positioning the interface focus on the target interaction element includes the following sub-steps (a condensed code sketch follows the list):
Step 1: in the initial state, the top-layer interface region becomes the interface region in the "selected" state;
Step 2: the lower-layer interface regions of the "selected" region are mapped one-to-one to the interface mapping units on the remote control device;
Step 3: the interface system waits for and receives interaction events from the remote control device;
Step 4: if the interaction event is a click on an interface mapping unit, the interface region mapped to the clicked unit obtains the "selected" state; jump to step 10;
Step 5: if the interaction event is a click on the "confirm" control unit, and the interface has a region in the "selected" state that contains multiple interaction elements, the focus is automatically positioned on the predicted target interaction element according to the prediction algorithm; the focus-positioning step is complete; jump to step 9;
Step 6: if the interaction event is a click on the "return" control unit, and the interface has a non-top-layer region in the "selected" state, the "selected" state of that region is transferred to the interface region one layer up; jump to step 10;
Step 7: if the interaction event is a long press on the "return" control unit, and the interface has a non-top-layer region in the "selected" state, the "selected" state of that region is transferred to the top-layer interface region; jump to step 2;
Step 8: if the interaction event is a click on a direction control unit, and the interface has a region in the "selected" state that is a non-top-layer region at the edge of the interface in that direction and the interface content can scroll, the interface content scrolls in the opposite direction and new page content enters the "selected" region, then jump to step 10; otherwise, the "selected" state of that region is transferred to the adjacent same-level region in that direction; jump to step 10;
Step 9: if an instruction to stop running is received, exit; otherwise jump to step 3;
Step 10: if the region in the "selected" state contains only one interaction element, the focus moves directly to that element, the focus-positioning step is complete, and execution jumps to step 9; otherwise, if the region in the "selected" state contains multiple interaction elements, the selected prediction algorithm predicts which interaction element in the region is the target, a visual cue is given to the user on the interface, and execution jumps to step 2.
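A condensed sketch of this selection loop follows. The region tree, event names and the prediction callback are illustrative assumptions; direction-key handling and scrolling from step 8 are omitted here for brevity and sketched separately after the Fig. 10 sub-flow.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Region:
    rect: Tuple[float, float, float, float]            # (x, y, w, h)
    parent: Optional["Region"] = None
    elements: List[Tuple[float, float, float, float]] = field(default_factory=list)

def inside(rect, e):
    """True when element rectangle e lies entirely within rect."""
    x, y, w, h = rect
    ex, ey, ew, eh = e
    return ex >= x and ey >= y and ex + ew <= x + w and ey + eh <= y + h

def split(region, n=2):
    """Quadrant (n=2) or nine-grid (n=3) split; each element is assigned to the
    sub-region that fully contains it (partial overlap is handled separately)."""
    x, y, w, h = region.rect
    subs = []
    for r in range(n):
        for c in range(n):
            rect = (x + c * w / n, y + r * h / n, w / n, h / n)
            subs.append(Region(rect, region, [e for e in region.elements if inside(rect, e)]))
    return subs

def position_focus(top_layer, events, predict):
    """events: iterable of ('unit', i), ('confirm',), ('return',) or ('return_long',);
    predict(region) returns the predicted target element. Returns the focused element."""
    selected = top_layer                               # step 1: top layer starts 'selected'
    for event in events:                               # step 3: receive interaction events
        if event[0] == 'unit':
            selected = split(selected)[event[1]]       # step 4: clicked unit -> its mapped region
        elif event[0] == 'confirm' and selected.elements:
            return predict(selected)                   # step 5: confirm the predicted element
        elif event[0] == 'return' and selected.parent is not None:
            selected = selected.parent                 # step 6: back up one layer
        elif event[0] == 'return_long':
            while selected.parent is not None:         # step 7: long press, back to the top
                selected = selected.parent
        if len(selected.elements) == 1:
            return selected.elements[0]                # step 10: single element, focus lands on it
    return None
```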
In an embodiment of the present invention, the method for determining whether an interaction element is contained in an interface region is: if the entire area of the interaction element falls within the interface region, the element is judged to be contained in the region; if only part of the element's area falls within the region, the judgment is made according to a preset area-ratio threshold.
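A sketch of this containment judgment, assuming axis-aligned rectangles and an illustrative threshold value of 0.5 (the patent leaves the preset value open):

```python
def element_in_region(element, region, ratio_threshold=0.5):
    """An element entirely inside the region is contained; a partially overlapping
    element is contained when its overlapping area fraction reaches the preset threshold."""
    ex, ey, ew, eh = element
    rx, ry, rw, rh = region
    overlap_w = max(0.0, min(ex + ew, rx + rw) - max(ex, rx))
    overlap_h = max(0.0, min(ey + eh, ry + rh) - max(ey, ry))
    element_area = ew * eh
    if element_area == 0:
        return False
    return (overlap_w * overlap_h) / element_area >= ratio_threshold   # 1.0 when fully inside
```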
In an embodiment of the present invention, the prediction algorithm may be, but is not limited to, the "maximum-area interaction element first" algorithm or the "centered interaction element first" algorithm (a sketch of both follows):
The "maximum-area interaction element first" algorithm predicts the target interaction element as follows: the interaction element occupying the largest area within the interface region is predicted to be the target;
The "centered interaction element first" algorithm predicts the target interaction element as follows: the interaction element positioned closest to the center of the interface region is predicted to be the target.
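Both example algorithms can be sketched in a few lines. The rectangle representation and the use of center distance to measure "most centered" are assumptions made for illustration.

```python
import math

def predict_largest_area(elements, region):
    """'Maximum-area interaction element first': the element occupying the largest area wins."""
    return max(elements, key=lambda e: e[2] * e[3])

def predict_most_centered(elements, region):
    """'Centered interaction element first': the element whose center lies closest
    to the center of the interface region wins."""
    rx, ry, rw, rh = region
    cx, cy = rx + rw / 2, ry + rh / 2
    return min(elements, key=lambda e: math.hypot(e[0] + e[2] / 2 - cx, e[1] + e[3] / 2 - cy))
```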
Brief description of the drawings
Fig. 1 is a schematic diagram of a remote control device adopting the quadrant (2×2) splitting scheme in one embodiment of the invention.
Fig. 2 is a schematic diagram of a remote control device adopting the nine-grid (3×3) splitting scheme in one embodiment of the invention.
Fig. 3 is a schematic diagram of recursive interface-region division using the quadrant (2×2) splitting scheme in one embodiment of the invention.
Fig. 4 is a schematic diagram of recursive interface-region division using the nine-grid (3×3) splitting scheme in one embodiment of the invention.
Fig. 5 is a schematic diagram of the first-layer selection in recursive interface-region selection using the quadrant (2×2) splitting scheme in one embodiment of the invention.
Fig. 6 is a schematic diagram of the second-layer selection in recursive interface-region selection using the quadrant (2×2) splitting scheme in one embodiment of the invention.
Fig. 7 is a schematic diagram of the third-layer selection in recursive interface-region selection using the quadrant (2×2) splitting scheme in one embodiment of the invention.
Fig. 8 is a diagram of the interface system main-window transition using the quadrant (2×2) splitting scheme in one embodiment of the invention.
Fig. 9 is the main flowchart of the interface system processing interaction events in one embodiment of the invention.
Fig. 10 is a sub-flowchart of the interface system processing interaction events in one embodiment of the invention.
Detailed description of the invention
To further explain the technical means adopted by the present invention to achieve its intended goals and the effects obtained, the detailed embodiments, methods, steps and effects of the intelligent terminal interaction system and method based on a remote control device according to the present invention are described below in conjunction with the accompanying drawings and preferred embodiments.
It should be appreciated that the specific embodiments described herein are only intended to explain the invention and are not intended to limit it.
It should be emphasized that the remote control device described herein is an embodiment implemented with physical buttons; those skilled in the art will understand that embodiments implemented in other ways (such as touch screen, sensing or joystick) do not depart from the scope of the technical solution below.
It should be emphasized that the embodiments and control flows described herein are only preferred embodiments; those skilled in the art will understand that any embodiment built around the inventive concept does not depart from the scope of the technical solution below.
According to one embodiment of the present invention, an intelligent terminal remote-control interaction system is provided; the system includes a remote control device, an intelligent terminal and an interface system.
As shown in Fig. 1, S101 is a remote control device adopting the quadrant (2×2) splitting scheme; S102 is the panel of other controls on the remote control device, S103 is the interface mapping panel, and S104 is the control panel. S105 is one of the interface mapping units in the interface mapping panel. S106 is the "confirm" control unit in the control panel, S107 is the "return" control unit, and S108 is one of the direction control units.
As shown in Fig. 2, S109 is a remote control device adopting the nine-grid (3×3) splitting scheme. It is essentially identical to the quadrant-scheme remote control device; the only difference lies in the interface mapping panel: the quadrant-scheme device has four interface mapping units in a 2×2 layout, while the nine-grid device has nine interface mapping units in a 3×3 layout. S110 is the interface mapping panel on the remote control device.
As shown in Fig. 3, S201 is the screen of the intelligent terminal and S202 is the main window of the interface system running on the intelligent terminal; the interface system recursively divides interface regions layer by layer using the quadrant (2×2) splitting scheme. S203 is one of the 4 lower-layer interface regions produced by dividing the top-layer interface region, S204 is one of the 4 lower-layer regions produced by dividing a second-layer region, S205 is one of the 4 regions produced by dividing a third-layer region, and S206 is one of the 4 regions produced by dividing a fourth-layer region.
Since this embodiment adopts the quadrant (2×2) splitting scheme, the interface mapping panel on the remote control device in Fig. 1 contains 4 interface mapping units; correspondingly, each layer of division in the interface system in Fig. 3 produces 4 interface regions, and these 4 regions are mapped one-to-one to the 4 interface mapping units in Fig. 1.
As shown in Fig. 4, in another embodiment the interface system recursively divides interface regions layer by layer using the nine-grid (3×3) splitting scheme. S207 is one of the 9 lower-layer regions produced by dividing the top-layer interface region, S208 is one of the 9 lower-layer regions produced by dividing a second-layer region, and S209 is one of the 9 lower-layer regions produced by dividing a third-layer region. Correspondingly, as shown in Fig. 2, S110 is the interface mapping panel on the remote control device, which contains 9 interface mapping units; each layer of division in the interface system in Fig. 4 produces 9 interface regions, and these 9 regions are mapped one-to-one to the 9 interface mapping units in Fig. 2.
According to one embodiment of the present invention, the user selects interface regions layer by layer by repeatedly clicking the interface mapping units on the remote control device, finally positioning the interface focus on the target interaction element. The selection process is as follows:
As shown in Fig. 5, S301 is an interaction element on an interface system adopting the quadrant (2×2) splitting scheme, S302 is the interaction element that currently holds the interface focus, and S303 is the target interaction element; that is, the user wants to move the interface focus from S302 to S303.
In the initial state, the main window of the interface system becomes the top-layer interface region by default, and the top-layer interface region is in the "selected" state by default. The lower-layer interface regions of the "selected" region are mapped one-to-one to the interface mapping units on the remote control device.
When the user clicks the interface mapping unit in the upper-left corner of the interface mapping panel in Fig. 1, the intelligent terminal receives the remote-control signal, converts it into an interaction event and passes it to the interface system. The interface system parses the interaction event, sets the upper-left region S304 among the four lower-layer regions of the top-layer interface region to the "selected" state and highlights it; at the same time, according to the prediction algorithm, S305 is predicted to be the target interaction element and is displayed with emphasis.
As shown in Fig. 6, when the user clicks the upper-left interface mapping unit again, the upper-left region S306 among the four sub-regions of S304 obtains the "selected" state and is highlighted; at the same time, according to the prediction algorithm, S307 is predicted to be the target interaction element and is displayed with emphasis.
As shown in Fig. 7, when the user clicks the upper-left interface mapping unit once more, the upper-left region S308 among the four sub-regions of S306 obtains the "selected" state and is highlighted; at the same time, according to the prediction algorithm, S303 is predicted to be the target interaction element and is displayed with emphasis.
At this point, interface region S306 contains two interface elements, of which the target interaction element S303 is positioned more centrally and also occupies the larger area, so it is hit by the prediction algorithm. If the user now clicks the "confirm" control unit, the interface focus is transferred directly from interaction element S302 to interaction element S303, and the focus-positioning step is complete.
It should be noted that each time the user selects an interface region at a given layer, the prediction algorithm of the interface system provides a predicted interaction element and prompts the user on the interface. If the target interaction element is hit by the prediction algorithm, the user only needs to click the "confirm" control unit to quickly position the interface focus on the target. If the user ignores the prediction and continues to select interface regions layer by layer, the interface focus is positioned on the target interaction element only once the selected region contains exactly one interaction element. It can be seen that the prediction algorithm reduces the number of selection layers and therefore the number of user operations.
In an embodiment of the present invention, the main window of the interactive interface refers to the top-level window that is visible on the intelligent terminal's screen, is in the active state and contains the other interaction elements; if the interactive interface creates multiple windows while running, the window in the active state becomes the main window.
As shown in Fig. 8, S401 is the original top-level window of the interface system and is the main window by default. When a user interaction causes a new window S402 to pop up, the original window enters the "inactive" state and the new window becomes active, so the new window replaces the original one as the main window. The division and selection of interface regions now act on the new main window. As shown in Fig. 8, S403 to S406 are interface regions at each layer of the new window. Once the user exits the new window, the original window becomes the main window again, and the division and selection of interface regions act on the original main window once more.
In an embodiment of the present invention, the main flow of the interface system processing user interaction events is shown in Fig. 9 and Fig. 10:
S501: the interface system is in its initial state;
S502: in the initial state, the top-layer interface region becomes the interface region in the "selected" state;
S503: the lower-layer interface regions of the "selected" region are mapped one-to-one to the interface mapping units on the remote control device;
S504: wait for and receive interaction events from the remote control device;
S505: has an interface mapping unit been clicked? If yes, continue; otherwise jump to S512;
S506: the interface region mapped to the clicked interface mapping unit obtains the "selected" state;
S507: does the "selected" interface region contain exactly one interaction element? If yes, continue; otherwise jump to S510;
S508: the interface focus is switched to that interaction element;
S509: this interaction event has been handled; if an exit signal has been received, exit; otherwise jump to S504;
S510: does the "selected" interface region contain multiple interaction elements? If yes, continue; otherwise jump to S509;
S511: predict the target interaction element and give a visual cue, then jump to S503;
S512: has the "confirm" control unit been clicked? If yes, continue; otherwise jump to S515;
S513: does the interface have a region that is in the "selected" state and contains interaction elements? If yes, continue; otherwise jump to S509;
S514: the focus is positioned on the predicted interaction element in the "selected" interface region, then jump to S509;
S515: process the other-interaction-events sub-flow.
In an embodiment of the present invention, the sub-flow in which the interface system processes other interaction events is shown in Fig. 10 (a code sketch of the direction-key branch follows the list):
S516: has the "return" control unit been clicked? If yes, continue; otherwise jump to S521;
S517: does the interface have a non-top-layer region in the "selected" state? If yes, continue; otherwise jump to S520;
S518: the "selected" state of that region is transferred to the interface region one layer up;
S519: jump to S507;
S520: jump to S509;
S521: has the "return" control unit been long-pressed? If yes, continue; otherwise jump to S525;
S522: does the interface have a non-top-layer region in the "selected" state? If yes, continue; otherwise jump to S520;
S523: the "selected" state of that region is transferred to the top-layer interface region;
S524: jump to S503;
S525: has a direction control unit been clicked? If yes, continue; otherwise jump to S520;
S526: does the interface have a region in the "selected" state? If yes, continue; otherwise jump to S520;
S527: is the region a non-top-layer region at the edge of the interface in that direction, and can the interface content scroll? If yes, continue; otherwise jump to S529;
S528: the interface content scrolls in the opposite direction, and new page content enters the "selected" region;
S529: the "selected" state of that region is transferred to the adjacent same-level region in that direction, then jump to S519.
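The direction-key branch (S525 to S529) can be sketched as follows. The region fields, the edge test, the sibling lookup and the scroll callback are illustrative assumptions, not the patent's prescribed interfaces.

```python
# Sketch of direction-key handling: scroll when the 'selected' region is a non-top-layer
# region at the interface edge in that direction and the content can scroll; otherwise
# transfer the 'selected' state to the adjacent same-level region in that direction.

OPPOSITE = {"up": "down", "down": "up", "left": "right", "right": "left"}

def handle_direction(selected, direction, window, scroll_content):
    """selected: the region in the 'selected' state (with a .parent attribute);
    window: assumed helper with at_edge(region, direction), can_scroll() and
    sibling(region, direction); scroll_content(direction): scrolls the page content."""
    at_top_layer = selected.parent is None
    if not at_top_layer and window.at_edge(selected, direction) and window.can_scroll():
        scroll_content(OPPOSITE[direction])      # S528: new content enters the selected region
        return selected
    neighbour = window.sibling(selected, direction)
    return neighbour if neighbour is not None else selected   # S529: move the 'selected' state
```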
In an embodiment of the present invention, the user may enable zoom mode for interface regions. In zoom mode, while the user selects interface regions layer by layer, the region in the "selected" state is automatically enlarged by a certain ratio; after it loses the "selected" state, the region returns to its normal size. The top-layer interface region is not enlarged.
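A minimal sketch of the zoom behaviour, assuming an illustrative default factor of 1.2 and the rectangle representation used above:

```python
def zoom_region(rect, factor=1.2):
    """Enlarge a region about its center; the top-layer region is never enlarged."""
    x, y, w, h = rect
    dw, dh = w * (factor - 1), h * (factor - 1)
    return (x - dw / 2, y - dh / 2, w + dw, h + dh)

selected_rect = (480, 270, 480, 270)
display_rect = zoom_region(selected_rect)   # drawn enlarged while in the 'selected' state
# When the region loses the 'selected' state it is simply drawn at selected_rect again.
```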
The above is only a preferred embodiment of the present invention and does not restrict the present invention in any form. Although the present invention has been disclosed above by way of preferred embodiments, they are not intended to limit it; any person of ordinary skill in the art may, without departing from the scope of the technical solution of the present invention, use the technical content disclosed above to make certain changes or modify it into equivalent embodiments. Any simple modification, equivalent change or adaptation made to the above embodiments according to the technical spirit of the present invention, provided it does not depart from the content of the technical solution, falls within the scope of the technical solution of the present invention.

Claims (8)

1. An intelligent terminal interaction system based on a remote control device, comprising:
a remote control device, including an interface mapping panel and a control panel, for accepting user operations and producing corresponding wireless control signals;
an intelligent terminal, for receiving the control signals, converting them into interaction events, and passing them to the interface system running in the intelligent terminal's software system for processing;
an interface system, for receiving and responding to interaction events, displaying the interactive interface on the terminal screen, and recursively dividing the interface into interface regions layer by layer.
2. The system according to claim 1, characterized in that:
the interface mapping panel on the remote control device comprises a plurality of interface mapping units, which are used to map the interface regions produced by each layer of division of the interface system;
the control panel on the remote control device includes control units representing "confirm", "return" and the four directions (up, down, left, right);
the interface mapping units and control units may be, but are not limited to, physical buttons; other implementations include touch, sensing or software simulation.
3. The system according to claim 1, characterized in that:
the interface system includes all two-dimensional or three-dimensional windows on the screen and the interaction elements within the windows; the windows and interaction elements can obtain the interface focus and respond to user operations.
4. An intelligent terminal interaction method based on a remote control device, characterized in that the method comprises the steps of:
Step 1: according to the selected splitting scheme, recursively dividing the main window of the intelligent terminal's interface system into interface regions layer by layer;
Step 2: the user selecting interface regions layer by layer by repeatedly clicking the interface mapping units on the remote control device, finally positioning the interface focus on the target interaction element;
Step 3: the user quickly confirming the interface focus by clicking the "confirm" control unit;
Step 4: the user switching the "selected" state to the interface region one layer up by clicking the "return" control unit, and returning the "selected" state to the top-layer interface region by long-pressing the "return" control unit;
Step 5: the user transferring the "selected" state between adjacent interface regions, or moving the interface focus between adjacent interaction elements, by clicking a direction control unit;
Step 6: the user optionally enabling zoom mode; in zoom mode, while the user selects interface regions layer by layer, the region in the "selected" state is scaled by a default factor and returns to its normal size after losing the "selected" state.
5. The method according to claim 4, characterized in that:
the main window of the interface system refers to the top-level window that is visible on the screen and in the active state; if the interface system creates multiple top-level windows while running, the top-level window in the active state becomes the main window; the main window becomes the top-layer interface region by default;
the selected splitting scheme may take many forms; a typical scheme is the quadrant (2×2, 田-shaped) splitting scheme, which at each layer of division divides the current interface region into four lower-layer interface regions of equal size and identical shape; optional splitting schemes also include, but are not limited to, the nine-grid (3×3) splitting scheme;
dividing interface regions recursively layer by layer refers to the process of successively dividing each interface region with the selected splitting scheme, starting from the top-layer interface region and continuing down to a set number of layers;
the number of interface mapping units on the remote control device equals the number of lower-layer interface regions produced by each layer of division under the selected splitting scheme, and the lower-layer interface regions are mapped one-to-one to the interface mapping units.
6. The method according to claim 4, characterized in that the step of selecting interface regions layer by layer and finally positioning the interface focus on the target interaction element includes:
Step 1: in the initial state, the top-layer interface region becomes the interface region in the "selected" state;
Step 2: the lower-layer interface regions of the "selected" region are mapped one-to-one to the interface mapping units on the remote control device;
Step 3: the interface system waits for and receives interaction events from the remote control device;
Step 4: if the interaction event is a click on an interface mapping unit, the interface region mapped to the clicked unit obtains the "selected" state; jump to step 10;
Step 5: if the interaction event is a click on the "confirm" control unit, and the interface has a region in the "selected" state that contains multiple interaction elements, the focus is automatically positioned on the predicted target interaction element according to the prediction algorithm; the focus-positioning step is complete; jump to step 9;
Step 6: if the interaction event is a click on the "return" control unit, and the interface has a non-top-layer region in the "selected" state, the "selected" state of that region is transferred to the interface region one layer up; jump to step 10;
Step 7: if the interaction event is a long press on the "return" control unit, and the interface has a non-top-layer region in the "selected" state, the "selected" state of that region is transferred to the top-layer interface region; jump to step 2;
Step 8: if the interaction event is a click on a direction control unit, and the interface has a region in the "selected" state that is a non-top-layer region at the edge of the interface in that direction and the interface content can scroll, the interface content scrolls in the opposite direction and new page content enters the "selected" region, then jump to step 10; otherwise, the "selected" state of that region is transferred to the adjacent same-level region in that direction; jump to step 10;
Step 9: if an instruction to stop running is received, exit; otherwise jump to step 3;
Step 10: if the region in the "selected" state contains only one interaction element, the focus moves directly to that element, the focus-positioning step is complete, and execution jumps to step 9; otherwise, if the region in the "selected" state contains multiple interaction elements, the selected prediction algorithm predicts which interaction element in the region is the target, a visual cue is given to the user on the interface, and execution jumps to step 2.
7. The method according to claim 6, characterized in that:
the method for determining whether an interaction element is contained in an interface region is: if the entire area of the interaction element falls within the interface region, the element is judged to be contained in the region; if only part of the element's area falls within the region, the judgment is made according to a preset area-ratio threshold.
8. The method according to claim 6, characterized in that:
the prediction algorithm may be, but is not limited to, the "maximum-area interaction element first" algorithm or the "centered interaction element first" algorithm:
the "maximum-area interaction element first" algorithm predicts the target interaction element as follows: the interaction element occupying the largest area within the interface region is predicted to be the target;
the "centered interaction element first" algorithm predicts the target interaction element as follows: the interaction element positioned closest to the center of the interface region is predicted to be the target.
CN201610148634.0A 2016-03-16 2016-03-16 Intelligent terminal interaction system and method based on remote control device Expired - Fee Related CN105808123B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610148634.0A CN105808123B (en) 2016-03-16 2016-03-16 Intelligent terminal interaction system and method based on remote control device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610148634.0A CN105808123B (en) 2016-03-16 2016-03-16 Intelligent terminal interaction system and method based on remote control device

Publications (2)

Publication Number Publication Date
CN105808123A 2016-07-27
CN105808123B CN105808123B (en) 2019-06-07

Family

ID=56468564

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610148634.0A Expired - Fee Related CN105808123B (en) 2016-03-16 2016-03-16 Intelligent terminal interaction system and method based on remote control device

Country Status (1)

Country Link
CN (1) CN105808123B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106844606A (en) * 2017-01-16 2017-06-13 青岛海信宽带多媒体技术有限公司 The focus processing method and processing device of webpage
CN109782985A (en) * 2018-12-07 2019-05-21 广州市诚毅科技软件开发有限公司 A kind of visual intelligent key control method, system and storage medium
CN114063993A (en) * 2022-01-12 2022-02-18 北京智象信息技术有限公司 Focus movement processing method and system and computer readable storage medium
CN114712849A (en) * 2022-05-16 2022-07-08 北京视游互动科技有限公司 Cross-platform application operation method and device, electronic equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050134578A1 (en) * 2001-07-13 2005-06-23 Universal Electronics Inc. System and methods for interacting with a control environment
CN102880416A (en) * 2012-08-31 2013-01-16 广东欧珀移动通信有限公司 Remote control unlocking method for mobile equipment in remote control protocol (RCP) communication process
CN103096155A (en) * 2012-12-24 2013-05-08 四川长虹电器股份有限公司 Method of quick positioning of mouse function achieved by remote controller

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050134578A1 (en) * 2001-07-13 2005-06-23 Universal Electronics Inc. System and methods for interacting with a control environment
CN102880416A (en) * 2012-08-31 2013-01-16 广东欧珀移动通信有限公司 Remote control unlocking method for mobile equipment in remote control protocol (RCP) communication process
CN103096155A (en) * 2012-12-24 2013-05-08 四川长虹电器股份有限公司 Method of quick positioning of mouse function achieved by remote controller

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106844606A (en) * 2017-01-16 2017-06-13 青岛海信宽带多媒体技术有限公司 The focus processing method and processing device of webpage
CN109782985A (en) * 2018-12-07 2019-05-21 广州市诚毅科技软件开发有限公司 A kind of visual intelligent key control method, system and storage medium
CN114063993A (en) * 2022-01-12 2022-02-18 北京智象信息技术有限公司 Focus movement processing method and system and computer readable storage medium
CN114063993B (en) * 2022-01-12 2022-07-26 北京智象信息技术有限公司 Focus movement processing method and system and computer readable storage medium
CN114712849A (en) * 2022-05-16 2022-07-08 北京视游互动科技有限公司 Cross-platform application operation method and device, electronic equipment and storage medium
CN114712849B (en) * 2022-05-16 2022-10-21 北京视游互动科技有限公司 Cross-platform application operation method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN105808123B (en) 2019-06-07

Similar Documents

Publication Publication Date Title
CN105808123A (en) Intelligent terminal interaction system and method based on remote control device
CN101667058B (en) Interactive method for switching focuses among multiple systems
KR102071575B1 (en) Moving robot, user terminal apparatus, and control method thereof
EP2896205B1 (en) Apparatus and method of providing user interface on head mounted display and head mounted display thereof
CN103309555B (en) The method and device of focus based on multiwindow switching
CN103562839B (en) Multi-application environment
US9467729B2 (en) Method for remotely controlling smart television
CN103914258A (en) Mobile terminal and method for operating same
US20090113478A1 (en) System and method for interacting with a program guide displayed on a portable electronic device
CN103546818A (en) Method and device for focus control of list display interface of smart television
CN102622868B (en) A kind of method for remotely controlling, display control unit, telepilot and system
CN103941973A (en) Batch selection method and device and touch screen terminal
US20120315607A1 (en) Apparatus and method for providing an interface in a device with touch screen
CN104778001A (en) Picture control method and picture control system
CN103699316B (en) A kind of method and terminal for transplanting directionkeys by afloat contact
CN103648045A (en) Method and device for switching display interface through side navigation bar
CN104063117A (en) Household appliance as well as control device and method thereof
CN109194815A (en) Operating method, device and computer readable storage medium based on multi-screen terminal
CN103377235A (en) Display method and system for terminal webpage window and touch terminal
CN109189301A (en) Screen capture method and device
CN108205402A (en) Mobile terminal and its background process processing method
CN104038829B (en) A kind of application switching method and device, electronic equipment
KR101384493B1 (en) System for interworking and controlling devices and user device used in the same
CN103024568A (en) Control method and remote control system of smart TV through air mouse
US20230370713A1 (en) Focusing method and apparatus, electronic device, and medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20190607

CF01 Termination of patent right due to non-payment of annual fee