CN106201284A - User interface synchronization system and method - Google Patents
- Publication number: CN106201284A
- Application number: CN201510340993.1A
- Authority: CN (China)
- Prior art keywords: electronic device, eye, user, instruction, gaze
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Landscapes
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
A user interface synchronization system comprises an image output module, an instruction conversion module, an image module, and an eye-movement instruction analysis module. The image output module and the instruction conversion module are coupled to an electronic device; the image module and the eye-movement instruction analysis module are coupled to a wearable device. The image output module accesses the image data of the electronic device and transmits it to the wearable device via a wireless network. The image module displays the image data on the output unit of the wearable device for the user to view. The eye-movement instruction analysis module analyzes the captured eye movements to obtain eye-movement instructions and transmits them to the electronic device via the wireless network. Upon receiving an eye-movement instruction, the instruction conversion module converts it and outputs it as an action instruction executable by the electronic device.
Description
Technical field
The present invention relates to a user interface synchronization system and method, and in particular to a system and method for synchronizing a user interface between an electronic device and a wearable device.
Background art

Eye tracking refers to tracing eye movement by analyzing images of the user's eyes or the motion pattern of the eyeballs relative to the head. An eye tracker is a device that measures eyeball position and eye-movement information; it is widely used in research on the visual system, psychology, and cognitive linguistics. Several eye-tracking methods exist today. Typical techniques include the dual Purkinje image method (Dual-Purkinje-Image, DPI), the infrared video system method (Infra-Red Video System, IRVS), and infrared oculography (Infra-Red Oculography, IROG), all of which capture images of the user's eyes to determine the user's gaze direction.
Recently, some mobile terminal brands have integrated eye-tracking technology into their devices: by tracking the user's gaze direction, the mobile terminal can generate a corresponding operation instruction and issue it to the terminal. This lets the user operate the display screen even when neither hand is free to operate the mobile terminal, since the corresponding instruction can still be entered by tracking the user's eyeballs. Another application is movie playback: from the direction the user is gazing, the terminal judges whether the user is watching the display screen, and when the user is not, it temporarily pauses playback so that the user does not miss any highlight of the film.
However, in the existing technology, the user's eyes are photographed through the front lens of the mobile terminal, and the acquired image is easily affected by environmental factors (such as strong highlights or low light), so tracking eye movement may be difficult. When accurate eye-movement tracking is required (such as gaze-direction detection), the distances between the mobile terminal's front lens, the display screen, and the user's eyes are not fixed, so the user's gaze direction cannot be obtained by triangulation; from the image alone, one can only roughly judge whether the user is gazing left or right.
Summary of the invention

The main object of the present invention is to solve the problem in the prior art that an electronic device cannot be operated accurately through eye-tracking technology.
To solve the above problems, the present invention provides a user interface synchronization system for pairing an electronic device with a wearable device, the user interface synchronization system comprising:

an image output module, coupled to the electronic device, for accessing the image data of the electronic device and sending the image data of the electronic device to the wearable device via a wireless network;

an image module, coupled to the wearable device, for displaying the image data on an output unit of the wearable device for the user to gaze at and operate;

an eye-movement instruction analysis module, coupled to the wearable device, for analyzing the eye-movement instructions captured by the wearable device; and

an instruction conversion module, coupled to the electronic device or the wearable device, for converting a received eye-movement instruction and outputting it as an action instruction executable by the electronic device.
Further, the image module establishes, on the output unit of the wearable device, a user interface window displaying the image data, and scales the user interface window up or down in equal proportion according to the length and width of the electronic device's display screen.
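Equal-proportion scaling of this kind can be sketched as follows. This is an illustrative sketch only; the function name and parameters are assumptions for illustration, not part of the disclosed embodiment:

```python
def fit_window(screen_w, screen_h, out_w, out_h):
    """Scale a window to the wearable device's output unit while keeping
    the aspect ratio of the electronic device's display screen."""
    # Equal-proportion factor: the largest scale that fits both dimensions.
    scale = min(out_w / screen_w, out_h / screen_h)
    return round(screen_w * scale), round(screen_h * scale)

# e.g. a 1080x1920 phone screen shown on a 1280x720 output unit
print(fit_window(1080, 1920, 1280, 720))  # → (405, 720)
```

The window is letterboxed rather than stretched, so a gaze coordinate on the window can be mapped back to the phone screen by dividing by the same scale factor.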
Further, the eye-movement instruction analysis module analyzes the user's gaze direction from the captured eye images, and forms, on the user interface window, a cursor that moves with the gaze direction.
Further, when the user's gaze direction rests on the user interface window, the eye-movement instruction analysis module records the coordinate position on the electronic device's display screen to which the gaze direction corresponds, and when the gaze direction stays substantially on the same graphical interface for longer than a set threshold time, it sends a trigger instruction via the wireless network to the instruction conversion module to start one or more programs corresponding to that graphical interface.
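The dwell-time trigger described above can be sketched as a small state machine. The class name, the one-second threshold, and the element abstraction are assumptions for illustration; the patent only specifies "a set threshold time":

```python
import time

class DwellDetector:
    """Fire a trigger when the gaze rests on the same interface element
    for longer than a threshold time (illustrative sketch)."""

    def __init__(self, threshold=1.0):
        self.threshold = threshold  # seconds; assumed value
        self.element = None         # element currently gazed at
        self.since = None           # when the current dwell began

    def update(self, element, now=None):
        now = time.monotonic() if now is None else now
        if element != self.element:
            # Gaze moved to another element: restart the dwell timer.
            self.element, self.since = element, now
            return None
        if self.since is not None and now - self.since >= self.threshold:
            self.since = None  # fire once per dwell
            return ("trigger", element)
        return None
```

Each gaze sample is fed to `update`; a `("trigger", element)` result corresponds to the trigger instruction sent to the instruction conversion module.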
Further, when the eye-movement instruction analysis module detects that the user's gaze direction moves rapidly from left to right, it sends a left-page-turn eye-movement instruction to the instruction conversion module, so that the electronic device performs an action instruction of turning the page from left to right; when it detects that the user's gaze direction moves rapidly from right to left, it sends a right-page-turn eye-movement instruction to the instruction conversion module, so that the electronic device performs an action instruction of turning the page from right to left.
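One way to detect such a rapid horizontal sweep is to threshold the average gaze velocity over a short sample window. This is a sketch under assumed values; the patent does not disclose a speed threshold or sampling scheme:

```python
def classify_swipe(x_positions, dt, speed_threshold=2000.0):
    """Classify a rapid horizontal gaze sweep into a page-turn command.

    x_positions: screen x-coordinates sampled every dt seconds.
    speed_threshold: assumed value in pixels/second, not from the patent.
    """
    if len(x_positions) < 2:
        return None
    # Average horizontal velocity over the whole window.
    velocity = (x_positions[-1] - x_positions[0]) / (dt * (len(x_positions) - 1))
    if velocity > speed_threshold:
        return "page_left_to_right"
    if velocity < -speed_threshold:
        return "page_right_to_left"
    return None  # ordinary gaze movement, no page turn
```

Slow gaze drift stays well under the threshold, so reading the screen does not accidentally turn pages.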
Further, upon detecting a trigger action of the user, the eye-movement instruction analysis module sets a reference coordinate, continuously detects the user's gaze direction, and records the X-axis displacement and Y-axis displacement of the gaze direction relative to the reference coordinate; when the X-axis displacement or the Y-axis displacement exceeds a threshold value, it sends an eye-movement instruction corresponding to the eye-movement direction and movement distance to the instruction conversion module, so that the electronic device performs an action instruction of scrolling in the eye-movement direction.
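The displacement-to-scroll mapping can be sketched as follows. The 50-pixel threshold and the dominant-axis rule are assumptions for illustration; the patent only requires that a displacement exceed "a threshold value":

```python
def scroll_command(ref, gaze, threshold=50):
    """Map gaze displacement from a reference coordinate to a scroll
    command (direction, distance). Threshold in pixels is assumed."""
    dx, dy = gaze[0] - ref[0], gaze[1] - ref[1]
    if abs(dx) <= threshold and abs(dy) <= threshold:
        return None  # gaze still inside the dead zone around the reference
    # Scroll along the axis with the larger displacement.
    if abs(dx) >= abs(dy):
        return ("scroll_right" if dx > 0 else "scroll_left", abs(dx))
    return ("scroll_down" if dy > 0 else "scroll_up", abs(dy))
```

The dead zone around the reference coordinate prevents small fixational eye movements from producing spurious scrolling.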
Further, the wireless network uses the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point (Wi-Fi soft AP).
Another object of the present invention is to provide a controlled-end electronic device, comprising a display screen, a wireless transmission unit, and a processor connected to the display screen and the wireless transmission unit. The processor has a graphics processing unit for sending image data to the display screen to provide a user interface for the user to operate. The processor includes an arithmetic unit for loading and executing the following programs:

an image output module, for accessing the image data of the electronic device and sending the image data displayed by the electronic device to a wearable device via a wireless network, for the wearable device to output for the user's visual operation; and

an instruction conversion module, for receiving via the wireless network the eye-movement instructions provided by the wearable device, and outputting each eye-movement instruction as an action instruction executable by the electronic device.

Further, the wireless network uses the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point (Wi-Fi soft AP).
Another object of the present invention is to provide a master-end wearable device, comprising an output unit, a wireless transmission unit, an image capture unit, and a processor connected to the output unit, the wireless transmission unit, and the image capture unit. The image capture unit is used to photograph and obtain eye images of the user. The processor has a graphics processing unit for sending image data to the user interface on the output unit for the user to operate. The processor includes an arithmetic unit for loading and executing the following programs:

an image module, for obtaining the image data of an electronic device via a wireless network and sending the image data to the graphics processing unit of the wearable device for display on the output unit, for the user to gaze at and operate; and

an eye-movement instruction analysis module, for obtaining the eye images captured by the image capture unit, extracting eye-movement instructions from the eye images, and sending the eye-movement instructions to the electronic device via the wireless network to start one or more programs of the electronic device.
Further, the image module establishes, on the output unit of the wearable device, a user interface window displaying the image data, and scales the user interface window up or down in equal proportion according to the length and width of the electronic device's display screen.
Further, the eye-movement instruction analysis module analyzes the user's gaze direction from the captured eye images, and forms, on the user interface window, a cursor that moves with the gaze direction.
Further, when the user's gaze direction stays on the user interface window, the eye-movement instruction analysis module records the coordinate position on the electronic device's display screen to which the gaze direction corresponds, and when it detects that the gaze direction stays substantially on the same graphical interface for longer than a set threshold time, it sends a trigger instruction via the wireless network to the electronic device to start one or more programs corresponding to that graphical interface.
Further, when the eye-movement instruction analysis module detects that the user's gaze direction moves rapidly from left to right, it sends a left-page-turn eye-movement instruction to the electronic device, so that the electronic device performs the program of turning the page from left to right; when it detects that the user's gaze direction moves rapidly from right to left, it sends a right-page-turn eye-movement instruction to the electronic device, so that the electronic device performs the program of turning the page from right to left.
Further, upon detecting a trigger action of the user, the eye-movement instruction analysis module sets a reference coordinate, continuously detects the user's gaze direction, and records the X-axis displacement and Y-axis displacement of the gaze direction relative to the reference coordinate; when the X-axis displacement or the Y-axis displacement exceeds a threshold value, it sends an eye-movement instruction corresponding to the eye-movement direction and movement distance to the electronic device, so that the electronic device performs a program of scrolling in the eye-movement direction.
Further, the wireless network uses the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point (Wi-Fi soft AP).
Another object of the present invention is to provide an interface synchronization method for a wearable device and an electronic device, comprising: accessing the image data of the electronic device and sending the image data of the electronic device to the wearable device via a wireless network; displaying, by the wearable device, the image data on the output unit of the wearable device, for the user to gaze at and operate; analyzing the eye-movement instructions captured by the wearable device and sending the eye-movement instructions to the electronic device via the wireless network; and, upon receiving an eye-movement instruction, converting the eye-movement instruction so as to output it as an action instruction executable by the electronic device.
Further, after receiving the image data, the wearable device establishes a user interface window that displays the image data, scaled up or down in equal proportion to the length and width of the electronic device's display screen.
Further, the wearable device analyzes the user's gaze direction from the captured eye images, and forms, on the user interface window, a cursor that moves with the gaze direction.
Further, when the user's gaze direction rests on the user interface window, the coordinate position on the electronic device's display screen to which the gaze direction corresponds is recorded, and when the gaze direction stays substantially on the same graphical interface for longer than a set threshold time, a trigger instruction is sent via the wireless network to the electronic device to start one or more programs corresponding to that graphical interface.
Further, when it is detected that the user's gaze direction moves rapidly from left to right, a left-page-turn eye-movement instruction is sent to the electronic device, so that the electronic device performs an action instruction of turning the page from left to right; when it is detected that the user's gaze direction moves rapidly from right to left, a right-page-turn eye-movement instruction is sent to the electronic device, so that the electronic device performs an action instruction of turning the page from right to left.
Further, upon detecting a trigger action of the user, a reference coordinate is set, the user's gaze direction is continuously detected, and the X-axis displacement and Y-axis displacement of the gaze direction relative to the reference coordinate are recorded; when the X-axis displacement or the Y-axis displacement exceeds a threshold value, an eye-movement instruction corresponding to the eye-movement direction and movement distance is sent to the electronic device, so that the electronic device performs an action instruction of scrolling in the eye-movement direction.
A further object of the present invention is to provide a computer-readable program recording medium on which a program is recorded; after the electronic device and the wearable device load and execute the program, the method described above can be completed.
A further object of the present invention is to provide a computer program product; when the computer program product is loaded into the electronic device and the wearable device and executed, the method described above can be completed.
Therefore, compared with the aforementioned known techniques, the present invention has the following advantageous effects:

1. The user interface synchronization system of the present invention can send the image data of the electronic device to the output unit of the wearable device, so that the electronic device can be operated by tracking the user's eye movements.

2. Because the wearable device is worn on the user's head, a fixed distance can be maintained between its image capture unit and the user's eyes, making it easier to detect the user's eye movements.
Brief description of the drawings

Fig. 1 is a block schematic diagram of the user interface synchronization system of the present invention.
Fig. 2 is a usage view of the user interface synchronization system of the present invention.
Fig. 3 is a schematic diagram (1) of the user interface window.
Fig. 4 is a schematic diagram (1) of a track produced by eye movement on the user interface window.
Fig. 5 is a schematic diagram (2) of a track produced by eye movement on the user interface window.
Fig. 6 is a schematic diagram (2) of the user interface window.
Fig. 7 is a schematic diagram of another user interface window.
Fig. 8 is a schematic flow chart (1) of the user interface synchronization method of the present invention.
Fig. 9 is a schematic flow chart (2) of the user interface synchronization method of the present invention.
Fig. 10 is a schematic flow chart (3) of the user interface synchronization method of the present invention.
Fig. 11 is a schematic flow chart (4) of the user interface synchronization method of the present invention.
Symbol description
100 user interface synchronization system
10 electronic device
11 display screen
12 processing unit
13 graphics processing unit
14 storage unit
16 wireless transmission unit
17 image output module
18 instruction conversion module
CU1 processor
20 wearable device
21 output unit
22 processing unit
23 graphics processing unit
24 storage unit
25 image capture unit
26 wireless transmission unit
27 image module
28 eye-movement instruction analysis module
CU2 processor
W user interface window
W1 cursor
W2 timer
W3, W4, W5, W6 arrows (up, down, left, right)
Steps S201~S205
Steps S2051A~S2054A
Steps S2051B~S2053B
Steps S2051C~S2055C
Detailed description of the invention
The detailed description and technical contents of the present invention are explained below with reference to the drawings. Furthermore, the drawings of the present invention are provided for convenience of explanation; they are not necessarily drawn to actual scale and may be exaggerated, and such drawings and their proportions are not intended to limit the scope of the present invention.
The present invention is a user interface synchronization system 100 for sending the screen image of an electronic device 10 to a wearable device 20, and operating the electronic device 10 by tracking the user's eye movements through the wearable device 20.
The electronic device 10 includes at least a display screen 11, a processing unit 12 (Central Processing Unit, CPU), and a graphics processing unit 13 (Graphics Processing Unit, GPU) capable of inputting and outputting images. Specifically, the electronic device 10 can be, for example, a cellular telephone, a smart phone, a tablet computer, a handheld mobile communication device, a personal digital assistant (Personal Digital Assistant, PDA), or a similar portable electronic device; in addition, the electronic device 10 can also be an electronic device with a display and control interface, such as a computer, a desktop computer, a notebook computer, or a vehicle-mounted computer.
The wearable device 20 refers in particular to a wearable device worn on the user's head. It can provide the user with an operable user interface through an output unit 21, photograph the user's eyes through an image capture unit 25 to obtain eye images, and operate the above user interface according to the user's gaze direction. The wearable device 20 includes at least the output unit 21, which provides images to the user's eyes; the image capture unit 25, which photographs the user's eyes to obtain eye images; a processing unit 22 (Central Processing Unit, CPU); and a graphics processing unit 23 (Graphics Processing Unit, GPU) capable of inputting and outputting images. Specifically, the wearable device 20 can be smart glasses, an eye tracker, an augmented reality device, a virtual reality device, or a similar intelligent wearable device.
Fig. 1 is a block schematic diagram of the user interface synchronization system of the present invention, as shown in the figure:

The hardware architectures of the electronic device 10 and the wearable device 20 are explained separately below; after the hardware architecture has been explained, the software architecture will be described further.
Electronic device:

As mentioned above, the electronic device 10 serves as the controlled end, sending image data to the wearable device 20 and being operated over the wireless network through the eye-tracking function of the wearable device 20. The electronic device 10 includes the display screen 11, the processing unit 12 (Processing Unit), the graphics processing unit 13 (Graphics Processing Unit, GPU) capable of inputting and outputting images, a storage unit 14, and a wireless transmission unit 16.
The processing unit 12 can form a processor CU1 together with the graphics processing unit 13, so that the processing unit 12 and the graphics processing unit 13 can be integrated on one chip, thereby reducing the volume required for the components. For example, the processor CU1 can be a processor from a series developed by ARM Holdings, Ltd., or a Loongson processor developed by the Institute of Computing Technology (ICT) of the Chinese Academy of Sciences, and is not limited in the present invention.
In another preferred embodiment, the processing unit 12 and the graphics processing unit 13 can constitute separate processors, respectively handling work such as logical operations and image processing, and jointly or cooperatively processing subprogram instructions.

In another preferred embodiment, the processing unit 12 can form a processor together with the storage unit 14; the processing unit 12 can load the program pre-stored in the storage unit 14 and execute the corresponding algorithm.
In this embodiment, the processing unit 12 forms the processor CU1 together with the graphics processing unit 13, and the processor CU1 is also coupled to the storage unit 14. The processor CU1 can be a central processing unit (Central Processing Unit, CPU), or another programmable general-purpose or special-purpose microprocessor (Microprocessor), digital signal processor (Digital Signal Processor, DSP), programmable controller, application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), programmable logic device (Programmable Logic Device, PLD), another similar device, or a combination of these devices.
The display screen 11 is used to display graphic data, such as a user operation interface, a graphical interface, or multimedia images, the image or operation interface being displayed on the display screen 11 for the user to read. The display screen 11 can be an active-matrix organic light-emitting diode (AMOLED) display, a thin film transistor (Thin Film Transistor, TFT) display, or a similar display device, and is not limited in the present invention. The display screen 11 is driven by a control circuit, which inputs corresponding signals to a data line drive circuit and a scan line drive circuit to drive the light-emitting units (pixel elements) of the panel at the corresponding coordinates. After the graphics processing unit 13 accesses the data in the storage unit 14, the display screen 11 presents the corresponding multimedia data for the user to view.
The wireless transmission unit 16 can transmit data over a wireless network. Specifically, the wireless network uses the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point (Wi-Fi soft AP).
In another preferred embodiment, the wireless transmission unit 16 can pair with the wearable device 20 through radio frequency identification (Radio Frequency Identification, RFID) technology, so as to carry out short-range wireless data transmission with the wearable device 20.
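Whichever transport is used, the image data must be framed for transmission over the wireless link. The patent does not specify a wire format; the 4-byte length-prefix framing below is an assumed, illustrative sketch of how such a stream could be split back into frames:

```python
import struct

def pack_frame(payload: bytes) -> bytes:
    """Prefix a payload (e.g. one encoded screen image) with its
    4-byte big-endian length."""
    return struct.pack(">I", len(payload)) + payload

def unpack_frames(buffer: bytes):
    """Split a received byte stream into complete payloads.

    Returns (payloads, leftover_bytes) so that a partial frame at the
    end of the buffer can be completed by the next read."""
    payloads, offset = [], 0
    while offset + 4 <= len(buffer):
        (length,) = struct.unpack_from(">I", buffer, offset)
        if offset + 4 + length > len(buffer):
            break  # incomplete frame; wait for more data
        payloads.append(buffer[offset + 4: offset + 4 + length])
        offset += 4 + length
    return payloads, buffer[offset:]
```

The same framing works symmetrically for the much smaller eye-movement instructions sent back from the wearable device.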
Wearable device:

The wearable device 20 can serve as the master end, receiving the image data of the electronic device 10 and outputting the image data to the user's eyes for the user to gaze at and operate. As mentioned above, the wearable device 20 includes the output unit 21, the processing unit 22 (Central Processing Unit, CPU), the graphics processing unit 23 (Graphics Processing Unit, GPU) capable of inputting and outputting images, the image capture unit 25, and the wireless transmission unit 26.
The processing unit 22 is roughly the same as the processing unit 12 of the electronic device 10, so the explanation of the processing unit 22 is not repeated here. As with the processing unit 12 of the electronic device 10, in a preferred embodiment, the processing unit 22 can form a processor CU2 together with the graphics processing unit 23, so that the processing unit 22 and the graphics processing unit 23 can be integrated on one chip. In another preferred embodiment, the processing unit 22 and the graphics processing unit 23 can constitute separate processors, respectively handling work such as logical operations and image processing, and jointly or cooperatively processing subprogram instructions. In another preferred embodiment, the processing unit 22 can form a processor together with the storage unit 24; the processing unit 22 can load the program pre-stored in the storage unit 24 and execute the corresponding algorithm.
In this embodiment, the processing unit 22 forms the processor CU2 together with the graphics processing unit 23, and the processor CU2 is coupled to the storage unit 24. The processing unit 22 can be a central processing unit (Central Processing Unit, CPU), or another programmable general-purpose or special-purpose microprocessor (Microprocessor), digital signal processor (Digital Signal Processor, DSP), programmable controller, application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), programmable logic device (Programmable Logic Device, PLD), another similar device, or a combination of these devices.
The output unit 21 is used to display graphics data and deliver it to the user's eyes for visual operation. The output unit 21 may be a display screen, for example an active-matrix organic light-emitting (AMOLED) display, a thin-film transistor (thin film transistor, TFT) display, or another display device of this kind. In a further preferred embodiment, the output unit 21 may be a retinal display, which uses retinal imaging display (Retinal Imaging Display, RID) technology to project the picture directly onto the user's retina for viewing. A retinal display reflects the light beam off the glasses so that it forms an image directly on the retina, allowing the projected image to merge with the real scene seen by the naked eye. After the graphics processing unit 23 (GPU) accesses the data in the storage unit 24, the output unit 21 presents the corresponding multimedia data to the user's eyes for viewing.
The image capture unit 25 may be a camera equipped with a charge-coupled device (Charge Coupled Device, CCD) or a complementary metal-oxide-semiconductor (Complementary Metal-Oxide Semiconductor, CMOS) sensor, although the invention is not limited thereto. The image capture unit 25 captures images of the user's eyes; the acquired eye images are sent to the eye-movement command analysis module 28 for further analysis, so as to track the user's eye movement and convert that movement into the corresponding eye-movement command.
The wireless transmission unit 26 transmits data over a wireless network. Specifically, the wireless network may use the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point (Wi-Fi soft AP).
The hardware architecture of the electronic device 10 and the wearable device 20 has been described in detail above. Based on this hardware architecture, the architecture of the corresponding user interface synchronization system of the present invention is now described in detail. Referring to Fig. 2, a usage view of the user interface synchronization system of the present invention, as shown in the figure:
In the present invention, the electronic device 10 can be paired with the wearable device 20. After pairing succeeds, the electronic device 10 can send the image data of the display screen 11 to the wearable device 20 over the wireless network, so that the user can read the display screen 11 through the wearable device 20. The wearable device 20 can capture the user's eye images with the image capture unit 25, so that the user can wirelessly operate the electronic device 10 by eye movement and thereby start one or more programs of the electronic device 10.
Referring to Fig. 1, the user interface synchronization system 100 includes an image output module 17 and a command conversion module 18 coupled to the electronic device 10, and an imaging module 27 and an eye-movement command analysis module 28 coupled to the wearable device 20.
In the present embodiment, the image output module 17 and the command conversion module 18 are pre-stored in the storage unit 14 of the electronic device 10, so that the processing unit 12 of the electronic device 10 executes their algorithms after loading the programs of the image output module 17 and the command conversion module 18. The imaging module 27 and the eye-movement command analysis module 28 are pre-stored in the storage unit 24 of the wearable device 20, so that the processing unit 22 of the wearable device 20 executes their algorithms after loading the programs of the imaging module 27 and the eye-movement command analysis module 28. In a further preferred embodiment, the command conversion module 18 may instead be loaded on the wearable device 20: the wearable device 20 uses the command conversion module 18 to convert the eye-movement command into an action command executable by the electronic device 10, and then transmits that action command to the electronic device 10 via the wireless network to start one or more programs. In this variant, the wearable device 20 may encrypt the action command, so that the electronic device 10 only needs to decrypt and execute it; this arrangement can be regarded as another similar implementation of the present invention.
The image output module 17 is coupled to the electronic device 10, accesses the image data of the electronic device 10, and sends that image data to the wearable device 20 via the wireless network. The image output module 17 accesses the image data (the user interface) provided by the graphics processing unit 13; the acquired image data is synchronized with the picture shown on the display screen 11. Alternatively, according to a preset or a user setting, the display screen 11 can be turned off entirely, and the image output module 17 sends the image data directly to the wearable device 20 over the wireless network, reducing the extra burden on the image output module 17 and lowering the power consumption of the electronic device.
The imaging module 27 forms a pairing with the image output module 17 via the wireless transmission unit 26, and displays the image data on the output unit 21 of the wearable device 20 for the user to gaze at and operate. As shown in Fig. 3, the imaging module 27 can create a user interface window W showing the image data, and can enlarge or shrink the window W in equal proportion according to the length and width of the display screen 11 of the electronic device 10, so that the user operates the user interface window W within a comfortable visual range. The imaging module 27 also shows a movable cursor W1 on the user interface window W; the cursor W1 follows the user's gaze direction, which is calculated by the eye-movement command analysis module 28.
In a preferred embodiment, the image output module 17 can compress a series of image data by streaming (streaming media), transmit the data in segments over the wireless network, and have the imaging module 27 coupled to the wearable device 20 decompress the stream packets, so that the image data is shown on the output unit 21 of the wearable device 20 in real time.
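The compress-segment-reassemble cycle described above can be sketched as follows. This is only an illustrative outline: the patent does not specify a codec or segment size, so the `zlib` compression and the `CHUNK` constant here are assumptions standing in for whatever streaming format an implementation would use.

```python
import zlib

CHUNK = 1024  # assumed segment size; the patent does not fix one

def stream_frame(frame: bytes):
    """Compress one frame and yield fixed-size segments for wireless transfer."""
    packed = zlib.compress(frame)
    for i in range(0, len(packed), CHUNK):
        yield packed[i:i + CHUNK]

def receive_frame(segments) -> bytes:
    """Reassemble the received segments and decompress back to the frame."""
    return zlib.decompress(b"".join(segments))

frame = bytes(range(256)) * 40           # stand-in for raw image data
restored = receive_frame(stream_frame(frame))
assert restored == frame
```

A real pipeline would of course stream many frames per second and tolerate packet loss; the round trip above only shows the segmentation contract between modules 17 and 27.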
Once the image data is shown in the user interface window W provided by the output unit 21, the user can operate the cursor W1 on the user interface window W by eye movement, or produce corresponding eye-movement commands through continuous eye movements to operate the user interface window W. After the eye-movement command analysis module 28 analyzes the user's eye movement, it converts the movement into an eye-movement command and sends that command over the wireless network to the command conversion module 18 coupled to the electronic device 10. The command conversion module 18 analyzes the command and further finds the action command of the electronic device 10 corresponding to the eye-movement command, so that the processing unit 12 starts the one or more programs corresponding to that action command. In a preferred embodiment, the command conversion module 18 includes a look-up table, through which it converts each eye-movement command into the corresponding action command for the electronic device 10. Thus, once the wearable device 20 and the electronic device 10 are successfully paired, the user can operate the user interface window W through the wearable device 20: the image capture unit 25 continuously captures the user's eye images, and through the captured eye movement the user operates the electronic device 10 via the wearable device 20.
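The look-up table in the command conversion module 18 amounts to a mapping from eye-movement commands to device action commands. A minimal sketch follows; the command names are hypothetical, since the patent describes the table only abstractly:

```python
# Hypothetical command names; the patent only says a look-up table maps
# eye-movement commands to action commands executable by the device.
LOOKUP = {
    "gaze_swipe_left_to_right": "page_turn_left_to_right",
    "gaze_swipe_right_to_left": "page_turn_right_to_left",
    "dwell_on_icon":            "touch_start",
}

def convert(eye_command: str) -> str:
    """Map an eye-movement command to the action command the device executes."""
    try:
        return LOOKUP[eye_command]
    except KeyError:
        raise ValueError(f"no action mapped for {eye_command!r}")
```

Keeping the table on one side of the wireless link is what allows the patent's variant embodiments: the same `convert` step can run either on the electronic device 10 or on the wearable device 20.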
Specifically, the eye-movement command analysis module 28 provides two functions: first, it analyzes the user's eye images to obtain the corresponding eye movement and uses that movement to determine the user's gaze direction; second, from the gaze direction it determines the eye-movement command the user intends to input.
As for the technique of obtaining eye movement (gaze direction) from eye images, the dual Purkinje image tracking method (Dual-Purkinje-Image, DPI), the infrared video system method (Infra-Red Video System, IRVS), or infrared oculography (Infra-Red Oculography, IROG) may be used, although the invention is not limited thereto. With these methods, multiple eye images of the user can be collected and mapped to multiple sample points on the output unit 21 (as shown in Fig. 4); the acquired sample points are then used to further analyze the eye-movement command the user intends to input.
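One common way to realize the mapping from eye images to on-screen sample points is a calibration fit; the sketch below assumes a simple per-axis linear model (pupil coordinate to screen coordinate), which the patent does not prescribe — real DPI/IRVS pipelines are considerably more elaborate:

```python
# Minimal linear calibration: fit screen = a * pupil + b per axis by
# least squares over calibration pairs gathered while the user looks
# at known points.

def fit_axis(pupil_vals, screen_vals):
    """Least-squares fit of screen = a * pupil + b on one axis."""
    n = len(pupil_vals)
    mp = sum(pupil_vals) / n
    ms = sum(screen_vals) / n
    denom = sum((p - mp) ** 2 for p in pupil_vals)
    a = sum((p - mp) * (s - ms) for p, s in zip(pupil_vals, screen_vals)) / denom
    return a, ms - a * mp

def make_mapper(calibration):
    """calibration: [((px, py), (sx, sy)), ...] pupil -> screen pairs."""
    ax, bx = fit_axis([p[0] for p, _ in calibration], [s[0] for _, s in calibration])
    ay, by = fit_axis([p[1] for p, _ in calibration], [s[1] for _, s in calibration])
    return lambda px, py: (ax * px + bx, ay * py + by)
```

Once fitted, every detected pupil position yields one sample point on the output unit 21, and it is sequences of such points that the later page-turn and scroll analyses consume.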
The following illustrates some of the main eye-movement commands. For example, the user can input a page-turning action command to the electronic device 10 through an eye-movement command, causing the electronic device 10 to flip the picture on the user interface window W and thereby achieve a page-turning effect. Referring to Fig. 4, which discloses a series of sample points produced while the user performs a rightward sweep: by system default, the center position is set as the start and end point of an eye-path command; in the figure, the start position and end position are each shown with a distinct marker, and the detected coordinates are arranged as dots in sequence. When the eye-movement command analysis module 28 detects that the user's gaze direction moves quickly from left (i.e., the center position) to the right (a right-side position), it transmits the left-page-turn eye-movement command to the command conversion module 18 of the electronic device 10; upon receiving this command, the command conversion module 18 retrieves the corresponding action command through the look-up table, causing the processing unit 12 to execute a left-to-right page-turn event.
For the rightward page turn, refer to Fig. 5, which discloses a series of sample points produced while the user performs a right-side page turn. By system default, in the example shown, the start position and end position are each shown with a distinct marker, and the detected coordinates are arranged as dots in sequence. When the eye-movement command analysis module 28 detects that the user's gaze direction moves quickly from right (i.e., the center position) to the left (a left-side position), it transmits the right-page-turn eye-movement command to the command conversion module 18 of the electronic device 10; upon receiving the right-page-turn command, the command conversion module 18 finds the corresponding action command through the look-up table, causing the processing unit 12 to execute a right-to-left page-turn event.
When input of a page-turn command is completed, the electronic device 10 enters a short non-responsive period (for example, one second). This prevents the eyes' return to the center position after a page turn from being misjudged as another page-turn command.
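The page-turn detection just described — a fast horizontal sweep of the gaze followed by a short refractory period — can be sketched as follows. The sweep distance, time window, and sample format are assumptions; the patent only fixes the refractory period at roughly one second:

```python
import time

SWEEP_DISTANCE = 0.3   # assumed: fraction of screen width the gaze must travel
SWEEP_WINDOW   = 0.25  # assumed: maximum sweep duration, in seconds
REFRACTORY     = 1.0   # non-responsive period after a turn (patent suggests ~1 s)

class PageTurnDetector:
    def __init__(self):
        self._last_turn = -REFRACTORY

    def feed(self, samples, now=None):
        """samples: [(t, x), ...] gaze positions, x normalized to 0..1.
        Returns 'left_page_turn', 'right_page_turn', or None."""
        now = time.monotonic() if now is None else now
        if now - self._last_turn < REFRACTORY:   # still in refractory period
            return None
        recent = [(t, x) for t, x in samples if now - t <= SWEEP_WINDOW]
        if len(recent) < 2:
            return None
        dx = recent[-1][1] - recent[0][1]
        if dx >= SWEEP_DISTANCE:                  # fast move left -> right
            self._last_turn = now
            return "left_page_turn"
        if dx <= -SWEEP_DISTANCE:                 # fast move right -> left
            self._last_turn = now
            return "right_page_turn"
        return None
```

The refractory check at the top is what implements the non-responsive period: the reverse sweep the eyes make when returning to center falls inside it and is discarded.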
In another preferred implementation aspect, the user can input a page-scrolling action command to the electronic device 10 through an eye-movement command, making the page in the user interface window W scroll in the direction the user gazes. Specifically, the eye-movement command analysis module 28 detects the user's eye movement, sets a reference point when a trigger action of the user is detected (the trigger action may be a blink, closing the eyes, drawing a circle, or another predefined eye action), and continuously records the X-axis displacement and the Y-axis displacement of the gaze direction relative to this reference coordinate. When the X-axis displacement or the Y-axis displacement exceeds a threshold value, the eye-movement command analysis module 28 transmits an eye-movement command containing the eye-movement direction (i.e., the sign of the X-axis or Y-axis displacement) and the displacement magnitude to the command conversion module 18; the command conversion module 18 sends the command to the processing unit 12, so that the processing unit 12 executes the action command for scrolling in that eye-movement direction. The acquired eye-movement direction serves as the reference value for determining the scrolling direction, and the acquired displacement serves as the reference value for determining the scrolling speed.
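The displacement test above reduces to a small pure function; the threshold value and the axis tie-break are assumptions, as the patent leaves both open:

```python
THRESHOLD = 0.15  # assumed displacement threshold, in normalized screen units

def scroll_command(reference, gaze):
    """Compare the current gaze to the reference point set at the trigger
    action. Returns (direction, magnitude) once a displacement exceeds the
    threshold — magnitude drives the scrolling speed — or None while below it."""
    dx = gaze[0] - reference[0]
    dy = gaze[1] - reference[1]
    if abs(dx) >= THRESHOLD and abs(dx) >= abs(dy):
        return ("right" if dx > 0 else "left", abs(dx))
    if abs(dy) >= THRESHOLD:
        return ("down" if dy > 0 else "up", abs(dy))
    return None
```

Returning `None` below the threshold matches the flow of the third embodiment (Fig. 11), where the module simply keeps recording displacements until one crosses the line.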
Referring to Fig. 6, the user can operate the cursor W1 in the user interface window W by gazing at the window. The eye-movement command analysis module 28 analyzes the user's gaze direction from the acquired eye images and forms, on the user interface window W, a cursor W1 that moves with the gaze direction. When the user's gaze rests on the user interface window W, the eye-movement command analysis module 28 records the coordinate position on the display screen 11 of the electronic device 10 corresponding to the gaze direction; and when the gaze remains substantially on the same graphical interface element beyond a set threshold time, it transmits a trigger command via the wireless network to the command conversion module 18. The command conversion module 18 converts the trigger command into a touch-start action command, launching the software program or command corresponding to the graphical element at that coordinate position.
As shown in Fig. 6, in a preferred embodiment, when the imaging module 27 receives the image data, it can delimit and objectify the region of each graphical element on the output unit 21 (the delimited region can be obtained from the graphics processing unit 13 of the electronic device 10). When the user's gaze rests within the region delimited for one of the graphical elements, the cursor W1 is converted into a timer W2, which indicates the time required to activate that element. As shown, the timer W2 includes a percentage figure and a timing bar: while the user's gaze rests on the element, the cursor W1 becomes the timer W2, and the percentage figure and timing bar show the remaining time. When the percentage figure reaches 100% and the timing bar shows full, the eye-movement command analysis module 28 transmits the trigger command via the wireless network to the command conversion module 18 to start the software program or command corresponding to that graphical element.
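The dwell-to-activate behaviour of the timer W2 can be sketched as a small state holder; the dwell threshold is an assumption within the 1-2 second range the description mentions elsewhere:

```python
DWELL_TIME = 1.5  # assumed threshold time in seconds (patent suggests 1-2 s)

class DwellTimer:
    """Turns the cursor into a countdown timer while the gaze stays on one
    graphical element, triggering activation when the dwell time elapses."""

    def __init__(self):
        self.icon = None    # element currently under the gaze
        self.start = 0.0    # when the gaze first landed on it

    def update(self, icon, now):
        """icon: id of the element under the gaze (None if none).
        Returns (percent_full, triggered)."""
        if icon != self.icon:          # gaze moved to a different element
            self.icon, self.start = icon, now
            return (0, False)
        if icon is None:
            return (0, False)
        pct = min(100, int(100 * (now - self.start) / DWELL_TIME))
        return (pct, pct >= 100)
```

The returned percentage is exactly what the percentage figure and timing bar of W2 would render; a `triggered` result corresponds to sending the trigger command over the wireless network.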
Referring also to Fig. 7, which discloses another preferred embodiment, the user interface window W may be provided with multiple operating areas, and the user can gaze at the graphical element in each operating area to produce the eye-movement command corresponding to that area. For example, when the user rests the cursor W1 (gaze direction) on the right-pointing arrow figure W3, the eye-movement command analysis module 28 transmits the right-page-turn eye-movement command to the command conversion module 18 of the electronic device 10; when the user rests the cursor W1 (gaze direction) on the left-pointing arrow figure W4, the eye-movement command analysis module 28 transmits the left-page-turn eye-movement command to the command conversion module 18 of the electronic device 10; when the user rests the cursor W1 (gaze direction) on the upward arrow figure W5, the eye-movement command analysis module 28 transmits the upward-page-turn eye-movement command to the command conversion module 18 of the electronic device 10; and when the user rests the cursor W1 (gaze direction) on the downward arrow figure W6, the eye-movement command analysis module 28 transmits the downward-page-turn eye-movement command to the command conversion module 18 of the electronic device 10. Upon receiving any of these eye-movement commands, the command conversion module 18 finds the corresponding action command through the look-up table or in an object-oriented manner, and sends the action command to the processing unit 12 to start the corresponding one or more programs.
The user interface synchronization method of the present invention is now described in detail with reference to the drawings. Fig. 8 is a schematic flow chart of the user interface synchronization method of the present invention, as shown in the figure:
The user interface synchronization method of the present invention is applied to the electronic device 10 and the wearable device 20: the picture on the electronic device 10 is transferred to the wearable device 20, and the user interface of the electronic device 10 is wirelessly operated through the eye-control function of the wearable device 20. The specific flow of interface synchronization is as follows:
Initially, the electronic device 10 is paired with the wearable device 20. The pairing may be performed by encryption, by establishing a key, or by another mutual-authentication method, so that a connection is established between the electronic device 10 and the wearable device 20. (Step S201)
When pairing is complete, the image data of the electronic device 10 (for example, the image on the display screen 11) is accessed and sent to the wearable device 20 via the wireless network. (Step S202)
After receiving the image data, the wearable device 20 shows it on its output unit 21 for the user to gaze at and operate. The wearable device 20 creates the user interface window W showing the image data and enlarges or shrinks it in equal proportion according to the length and width of the display screen 11 of the electronic device 10. (Step S203)
The wearable device 20 analyzes the user's gaze direction according to the acquired eye images, and forms, on the user interface window W, the cursor W1 that moves with the gaze direction. (Step S204)
The eye-movement command acquired by the wearable device 20 is analyzed and sent to the electronic device 10 via the wireless network. Upon receiving the eye-movement command, the electronic device 10 converts it, outputting the eye-movement command as an action command executable by the electronic device 10. (Step S205)
Three different embodiments of step S205 are described below; it should be understood that all three embodiments can be carried out simultaneously in step S205:
The first embodiment, referring to Fig. 9. First, the region corresponding to each graphical element in the display screen array is obtained; this region can be obtained from the graphics processing unit 13 of the electronic device 10, and the region of each graphical element is associated with the one or more programs corresponding to that element. (Step S2051A) When the user's gaze rests on the user interface window, the coordinate position on the display screen 11 of the electronic device 10 corresponding to the gaze direction is recorded, to confirm the user's gaze direction. (Step S2052A) When the user's gaze rests on one of the graphical elements, a timer is started to record the dwell time of the gaze, and it is judged whether the gaze exceeds a set threshold time, for example 1-2 seconds. (Step S2053A) When the user's gaze rests on the same graphical element beyond the set threshold time (an eye-movement command), a trigger command is transmitted via the wireless network to the electronic device 10 to start the one or more programs corresponding to that graphical element (an action command). (Step S2054A) Otherwise, if the gaze leaves the region of the graphical element, the flow returns to step S2052A and continues to detect the user's gaze direction.
The second embodiment, referring to Fig. 10. First, a decision program is started, which continuously detects the user's gaze direction during the decision process. (Step S2051B) When it is detected that the user's gaze direction moves quickly from left to right, the left-page-turn eye-movement command is transmitted to the electronic device 10, so that the electronic device 10 executes the action command for turning the page from left to right. (Step S2052B) When it is detected that the user's gaze direction moves quickly from right to left, the right-page-turn eye-movement command is transmitted to the electronic device 10, so that the electronic device 10 executes the action command for turning the page from right to left. (Step S2053B)
The third embodiment, referring to Fig. 11. First, the wearable device 20 detects whether the user performs a trigger action. (Step S2051C) When the user's trigger action is detected, a reference coordinate is set; the user's gaze direction is detected continuously, and the X-axis displacement and Y-axis displacement of the gaze direction relative to the reference coordinate are recorded. (Step S2052C) Next, it is judged whether the X-axis displacement or the Y-axis displacement exceeds the threshold value (Step S2053C); if so, the flow proceeds to the next step; if not, it returns to step S2052C. When the X-axis displacement or the Y-axis displacement exceeds the threshold value, the eye-movement direction is determined. (Step S2054C) An eye-movement command containing the eye-movement direction and the displacement is transmitted to the electronic device 10, so that the electronic device 10 executes the action command for scrolling in that eye-movement direction. (Step S2055C) In the above steps, if the user's gaze direction returns to a state below the threshold value, the flow returns to the procedure before step S2051C and continues to detect whether the user produces a trigger action.
The method steps described in the present invention can also be implemented as a computer-readable recording medium: they are stored on computer-readable recording media such as optical discs, hard disks, or semiconductor memory, and the computer-readable recording medium is loaded into an electronic device for access and use by that electronic device or electronic equipment.
The method steps of the present invention can also be implemented as a computer program product, stored on the hard disk or memory of a network server, such as app store, Google play, windows fair, or another similar online application distribution platform; it can be implemented by uploading the computer program product to the server for users to download, with or without payment.
In summary, the user interface synchronization system of the present invention can send the image data of an electronic device to the output unit of a wearable device, so that the electronic device is operated by tracking the user's eye movement. A fixed distance can be maintained between the front lens of the present invention and the user's eyes, making the user's eye movement easier to detect.
The present invention has been described in detail above; however, the foregoing is merely a preferred embodiment of the present invention and should not limit the scope of its implementation. All equivalent changes and modifications made according to the patent scope of the present invention shall remain within the patent coverage of the present invention.
Claims (24)
1. A user interface synchronization system for pairing an electronic device with a wearable device, characterized in that the user interface synchronization system comprises:
an image output module, coupled to the electronic device, for accessing the image data of the electronic device and sending the image data of the electronic device to the wearable device via a wireless network;
an imaging module, coupled to the wearable device, for showing the image data on an output unit of the wearable device for the user to gaze at and operate;
an eye-movement command analysis module, coupled to the wearable device, for analyzing the eye-movement command acquired by the wearable device; and
a command conversion module, coupled to the electronic device or the wearable device, for converting the eye-movement command upon receiving it and outputting the eye-movement command as an action command executable by the electronic device.
2. The user interface synchronization system of claim 1, characterized in that the imaging module creates, on the output unit of the wearable device, a user interface window showing the image data, and enlarges or shrinks the user interface window in equal proportion according to the length and width of the display screen of the electronic device.
3. The user interface synchronization system of claim 2, characterized in that the eye-movement command analysis module analyzes the user's gaze direction based on the acquired eye images, and forms, on the user interface window, a cursor that moves with the gaze direction.
4. The user interface synchronization system of claim 3, characterized in that when the user's gaze direction rests on the user interface window, the eye-movement command analysis module records the coordinate position on the display screen of the electronic device corresponding to the gaze direction, and when the gaze direction rests substantially on the same graphical interface element beyond a set threshold time, transmits a trigger command via the wireless network to the command conversion module to start one or more programs corresponding to that graphical interface element.
5. The user interface synchronization system of claim 3, characterized in that when the eye-movement command analysis module detects that the user's gaze direction moves quickly from left to right, it transmits the left-page-turn eye-movement command to the command conversion module, so that the electronic device executes the action command for turning the page from left to right; and when the eye-movement command analysis module detects that the user's gaze direction moves quickly from right to left, it transmits the right-page-turn eye-movement command to the command conversion module, so that the electronic device executes the action command for turning the page from right to left.
6. The user interface synchronization system of claim 5, characterized in that the eye-movement command analysis module sets a reference coordinate when a trigger action of the user is detected, continuously detects the user's gaze direction, and records the X-axis displacement and Y-axis displacement of the gaze direction relative to the reference coordinate; when the X-axis displacement or the Y-axis displacement exceeds a threshold value, it transmits the eye-movement command corresponding to the eye-movement direction and the displacement to the command conversion module, so that the electronic device executes the action command for scrolling in that eye-movement direction.
7. The user interface synchronization system of claim 1, characterized in that the wireless network uses the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point.
8. A controlled-end electronic device, characterized by comprising: a display screen, a wireless transmission unit, and a processor connected to the display screen and the wireless transmission unit, the processor comprising:
a graphics processing unit, for sending image data to the display screen to provide a user interface for user operation;
an arithmetic unit, for loading and executing the following programs:
an image output module, for accessing the image data of the electronic device and sending the image data shown by the electronic device via a wireless network to a wearable device, which outputs it for the user's visual operation; and a command conversion module, for receiving via the wireless network the eye-movement command provided by the wearable device and outputting the eye-movement command as an action command executable by the electronic device.
9. The electronic device of claim 8, characterized in that the wireless network uses the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point.
10. A master-control-end wearable device, characterized by comprising: an output unit, a wireless transmission unit, an image capture unit, and a processor connected to the output unit, the wireless transmission unit, and the image capture unit, the image capture unit being used for capturing and obtaining the user's eye images, the processor comprising:
a graphics processing unit, for sending image data to the output unit to provide a user interface for user operation;
an arithmetic unit, for loading and executing the following programs:
an imaging module, for obtaining the image data of an electronic device via the wireless network and sending the image data to the graphics processing unit of the wearable device for display on the output unit, for the user to gaze at and operate;
an eye-movement command analysis module, for obtaining the eye images captured by the image capture unit, extracting an eye-movement command from the eye images, and sending the eye-movement command to the electronic device via the wireless network to start one or more programs of the electronic device.
11. The wearable device of claim 10, characterized in that the imaging module creates, on the output unit of the wearable device, a user interface window showing the image data, and enlarges or shrinks the user interface window in equal proportion according to the length and width of the display screen of the electronic device.
12. The wearable device as claimed in claim 11, characterized in that the eye-movement instruction analysis module analyzes the gaze direction of the user according to the obtained eye image, and forms, on the user interface window, a cursor that moves according to the gaze direction.
13. The wearable device as claimed in claim 12, characterized in that when the gaze direction of the user rests on the user interface window, the eye-movement instruction analysis module records the coordinate position on the display screen of the electronic device corresponding to the gaze direction, and when it detects that the gaze direction rests on the same graphical interface for longer than a set threshold time, it transmits a trigger instruction to the electronic device via the wireless network to start one or more programs corresponding to the graphical interface.
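The dwell-time trigger of claim 13 can be sketched in code. This is an illustrative sketch only, not the patent's implementation; the class name `DwellTrigger`, the `update` interface, and the default threshold value are all assumptions:

```python
import time


class DwellTrigger:
    """Fires once when the gaze rests on the same graphical interface
    element for longer than a set threshold time (claim 13 sketch)."""

    def __init__(self, threshold_s=1.0):
        self.threshold_s = threshold_s  # assumed dwell threshold, in seconds
        self.current_element = None
        self.dwell_start = None

    def update(self, element_id, now=None):
        """Feed the UI element currently under the gaze coordinate.
        Returns the element_id once when the dwell threshold is exceeded,
        otherwise None."""
        now = time.monotonic() if now is None else now
        if element_id != self.current_element:
            # gaze moved to a new element: restart the dwell timer
            self.current_element = element_id
            self.dwell_start = now
            return None
        if self.dwell_start is not None and now - self.dwell_start >= self.threshold_s:
            self.dwell_start = None  # fire only once per dwell
            return element_id
        return None
```

On a trigger, a real system would send the recorded screen coordinate (or element id) to the electronic device over the wireless link as the trigger instruction.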
14. The wearable device as claimed in claim 12, characterized in that when the eye-movement instruction analysis module detects that the gaze direction of the user moves rapidly to the left, it transmits a left-page-turn eye-movement instruction to the electronic device, so that the electronic device executes a program that turns the page from left to right; and when the eye-movement instruction analysis module detects that the gaze direction of the user moves rapidly to the right, it transmits a right-page-turn eye-movement instruction to the electronic device, so that the electronic device executes a program that turns the page from right to left.
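The rapid horizontal gaze movement of claim 14 can be illustrated with a simple velocity classifier. This is a hypothetical sketch, assuming normalized screen coordinates in [0, 1] and an assumed speed threshold; the function name and units are not from the patent:

```python
def classify_page_turn(x_prev, x_curr, dt, speed_threshold=0.8):
    """Classify a rapid horizontal gaze movement into a page-turn
    instruction (claim 14 sketch). Coordinates are normalized screen
    units; speed_threshold (units/second) is an assumed value.
    Returns 'page_left' for a fast leftward move (page turns left to
    right), 'page_right' for a fast rightward move, else None."""
    velocity = (x_curr - x_prev) / dt  # horizontal gaze velocity
    if velocity <= -speed_threshold:
        return "page_left"
    if velocity >= speed_threshold:
        return "page_right"
    return None
```

Slow movements fall below the threshold and produce no instruction, which keeps ordinary reading from triggering page turns.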
15. The wearable device as claimed in claim 12, characterized in that the eye-movement instruction analysis module sets a reference coordinate when a trigger action of the user is detected, continuously detects the gaze direction of the user, and records the X-axis movement distance and Y-axis movement distance of the gaze direction relative to the reference coordinate; when the X-axis movement distance or the Y-axis movement distance exceeds a threshold value, it transmits an eye-movement instruction corresponding to the eye movement direction and movement distance to the electronic device, so that the electronic device executes a scrolling program corresponding to the eye movement direction.
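The reference-coordinate scrolling of claim 15 might look like the following sketch. The `GazeScroller` name, pixel units, and 50-pixel threshold are assumptions for illustration, not taken from the patent:

```python
class GazeScroller:
    """Claim 15 sketch: on a trigger action, set a reference coordinate,
    then emit a scroll instruction once the gaze has moved further than
    a threshold along the X or Y axis relative to that reference."""

    def __init__(self, threshold=50):
        self.threshold = threshold  # assumed threshold, in pixels
        self.ref = None

    def on_trigger(self, x, y):
        """Record the reference coordinate when the trigger action fires."""
        self.ref = (x, y)

    def on_gaze(self, x, y):
        """Continuously fed gaze coordinates; returns a
        (direction, distance) scroll instruction or None."""
        if self.ref is None:
            return None
        dx, dy = x - self.ref[0], y - self.ref[1]
        if abs(dx) < self.threshold and abs(dy) < self.threshold:
            return None
        # pick the dominant axis for the scroll direction
        if abs(dx) >= abs(dy):
            return ("scroll_right" if dx > 0 else "scroll_left", abs(dx))
        return ("scroll_down" if dy > 0 else "scroll_up", abs(dy))
```

The emitted (direction, distance) pair corresponds to the "eye movement direction and movement distance" that the claim says is transmitted to the electronic device.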
16. The wearable device as claimed in claim 10, characterized in that the wireless network uses the Wi-Fi Direct protocol, Bluetooth wireless transmission, or a virtual wireless access point.
17. An interface synchronization method for a wearable device and an electronic device, characterized in that it comprises:
accessing the image data of the electronic device, and transmitting the image data of the electronic device to the wearable device via a wireless network;
displaying, by the wearable device, the image data on an output unit of the wearable device, for the user to gaze at and operate;
analyzing an eye-movement instruction obtained by the wearable device, and transmitting the eye-movement instruction to the electronic device via the wireless network; and
upon receiving the eye-movement instruction, converting the eye-movement instruction so as to output it as an action instruction executable by the electronic device.
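The conversion step at the end of claim 17 — turning a received eye-movement instruction into an action instruction — can be sketched as a lookup table. The instruction names below are invented for illustration; the patent does not specify a wire format:

```python
# Hypothetical mapping from eye-movement instructions (sent by the
# wearable device over the wireless link) to action instructions
# executed by the electronic device. All names are assumptions.
EYE_TO_ACTION = {
    "gaze_dwell": "launch_program",
    "page_left": "turn_page_left_to_right",
    "page_right": "turn_page_right_to_left",
    "scroll_up": "scroll_view_up",
    "scroll_down": "scroll_view_down",
}


def convert_eye_instruction(eye_instruction):
    """Convert a received eye-movement instruction into an action
    instruction executable by the electronic device (claim 17 sketch).
    Unknown instructions are ignored and yield None."""
    return EYE_TO_ACTION.get(eye_instruction)
```

Keeping the mapping on the electronic-device side lets the wearable stay device-agnostic: it only reports what the eye did, and each receiver decides what that means.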
18. The interface synchronization method as claimed in claim 17, characterized in that after receiving the image data, the wearable device establishes a user interface window displaying the image data, enlarged or reduced in equal proportion to the length and width of the display screen of the electronic device.
19. The interface synchronization method as claimed in claim 18, characterized in that the wearable device analyzes the gaze direction of the user based on the obtained eye image, and forms, on the user interface window, a cursor that moves according to the gaze direction.
20. The interface synchronization method as claimed in claim 19, characterized in that when the gaze direction of the user rests on the user interface window, the coordinate position on the display screen of the electronic device corresponding to the gaze direction is recorded, and when the gaze direction substantially rests on the same graphical interface for longer than a set threshold time, a trigger instruction is transmitted to the electronic device via the wireless network to start one or more programs corresponding to the graphical interface.
21. The interface synchronization method as claimed in claim 19, characterized in that when the gaze direction of the user is detected to move rapidly to the left, a left-page-turn eye-movement instruction is transmitted to the electronic device, so that the electronic device executes an action instruction that turns the page from left to right; and when the gaze direction of the user is detected to move rapidly to the right, a right-page-turn eye-movement instruction is transmitted to the electronic device, so that the electronic device executes an action instruction that turns the page from right to left.
22. The interface synchronization method as claimed in claim 19, characterized in that a reference coordinate is set when a trigger action of the user is detected, the gaze direction of the user is continuously detected, and the X-axis movement distance and Y-axis movement distance of the gaze direction relative to the reference coordinate are recorded; when the X-axis movement distance or the Y-axis movement distance exceeds a threshold value, an eye-movement instruction corresponding to the eye movement direction and movement distance is transmitted to the electronic device, so that the electronic device executes a scrolling action instruction corresponding to the eye movement direction.
23. A computer-readable storage medium on which a program is recorded, characterized in that when an electronic device and a wearable device load and execute the program, the method as claimed in any one of claims 17 to 22 can be carried out.
24. A computer program product, characterized in that when the computer program product is loaded into and executed by an electronic device and a wearable device, the method as claimed in any one of claims 17 to 22 can be carried out.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW104113704A TWI571768B (en) | 2015-04-29 | 2015-04-29 | A human interface synchronous system, device, method, computer readable media, and computer program product |
TW104113704 | 2015-04-29 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106201284A true CN106201284A (en) | 2016-12-07 |
CN106201284B CN106201284B (en) | 2020-03-24 |
Family
ID=57453126
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510340993.1A Active CN106201284B (en) | 2015-04-29 | 2015-06-18 | User interface synchronization system and method |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN106201284B (en) |
TW (1) | TWI571768B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106774893A (en) * | 2016-12-15 | 2017-05-31 | 飞狐信息技术(天津)有限公司 | A kind of virtual reality exchange method and virtual reality device |
CN107820599A (en) * | 2016-12-09 | 2018-03-20 | 深圳市柔宇科技有限公司 | The method of adjustment of user interface, adjustment system and wear display device |
CN112560572A (en) * | 2020-10-24 | 2021-03-26 | 北京博睿维讯科技有限公司 | Camera shooting and large screen interaction processing method, device and system |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180150204A1 (en) * | 2016-11-30 | 2018-05-31 | Google Inc. | Switching of active objects in an augmented and/or virtual reality environment |
US10511842B2 (en) * | 2017-10-06 | 2019-12-17 | Qualcomm Incorporated | System and method for foveated compression of image frames in a system on a chip |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101258436A (en) * | 2005-09-08 | 2008-09-03 | 瑞士电信流动电话公司 | Communication device, system and method |
CN101272727A (en) * | 2005-09-27 | 2008-09-24 | 潘尼公司 | A device for controlling an external unit |
US7762665B2 (en) * | 2003-03-21 | 2010-07-27 | Queen's University At Kingston | Method and apparatus for communication between humans and devices |
CN103472915A (en) * | 2013-08-30 | 2013-12-25 | 深圳Tcl新技术有限公司 | Reading control method and reading control device on basis of pupil tracking and display equipment |
TWM472854U (en) * | 2013-11-27 | 2014-02-21 | Chipsip Technology Co Ltd | Wearable display |
CN103885589A (en) * | 2014-03-06 | 2014-06-25 | 华为技术有限公司 | Eye movement tracking method and device |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100235786A1 (en) * | 2009-03-13 | 2010-09-16 | Primesense Ltd. | Enhanced 3d interfacing for remote devices |
JP5539945B2 (en) * | 2011-11-01 | 2014-07-02 | 株式会社コナミデジタルエンタテインメント | GAME DEVICE AND PROGRAM |
2015
- 2015-04-29 TW TW104113704A patent/TWI571768B/en active
- 2015-06-18 CN CN201510340993.1A patent/CN106201284B/en active Active
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107820599A (en) * | 2016-12-09 | 2018-03-20 | 深圳市柔宇科技有限公司 | The method of adjustment of user interface, adjustment system and wear display device |
CN107820599B (en) * | 2016-12-09 | 2021-03-23 | 深圳市柔宇科技股份有限公司 | User interface adjusting method and system and head-mounted display device |
CN106774893A (en) * | 2016-12-15 | 2017-05-31 | 飞狐信息技术(天津)有限公司 | A kind of virtual reality exchange method and virtual reality device |
CN106774893B (en) * | 2016-12-15 | 2019-10-18 | 飞狐信息技术(天津)有限公司 | A kind of virtual reality exchange method and virtual reality device |
CN112560572A (en) * | 2020-10-24 | 2021-03-26 | 北京博睿维讯科技有限公司 | Camera shooting and large screen interaction processing method, device and system |
Also Published As
Publication number | Publication date |
---|---|
CN106201284B (en) | 2020-03-24 |
TW201638723A (en) | 2016-11-01 |
TWI571768B (en) | 2017-02-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10198867B2 (en) | Display control device, display control method, and program | |
US9288388B2 (en) | Method and portable terminal for correcting gaze direction of user in image | |
US20210166486A1 (en) | Electronic apparatus and method for controlling thereof | |
EP3537709B1 (en) | Electronic device photographing method and apparatus | |
CN106201284A (en) | user interface synchronization system and method | |
CN105210144B (en) | Display control unit, display control method and recording medium | |
CN108712603B (en) | Image processing method and mobile terminal | |
US9412190B2 (en) | Image display system, image display apparatus, image display method, and non-transitory storage medium encoded with computer readable program | |
US11798177B2 (en) | Hand tracking method, device and system | |
WO2016197639A1 (en) | Screen picture display method and apparatus | |
CN108536367B (en) | Interactive page jamming processing method, terminal and readable storage medium | |
EP3349095A1 (en) | Method, device, and terminal for displaying panoramic visual content | |
US20180121711A1 (en) | Display control method and apparatus | |
US9313391B1 (en) | Camera interfaces for electronic devices | |
US20160212318A1 (en) | Information processing device, information processing method, and program | |
CN104423568A (en) | control system, input device and control method for display screen | |
US9536133B2 (en) | Display apparatus and control method for adjusting the eyes of a photographed user | |
US20200342833A1 (en) | Head mounted display system and scene scanning method thereof | |
TW201709022A (en) | Non-contact control system and method | |
US11600241B2 (en) | Display control device, imaging device, display control method, and display control program | |
CN110764852B (en) | Screenshot method, terminal and computer readable storage medium | |
US10742883B2 (en) | Data processing method for generating composite image data indicating positional changes of an object | |
KR20150039352A (en) | Electronic device and control method thereof | |
WO2016208216A1 (en) | User interface device and distance sensor | |
JP2014082648A (en) | Electronic apparatus, image correction method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||