CN109885167B - Data processing method, data transmission method and device - Google Patents

Publication number: CN109885167B
Application number: CN201910137060.0A
Authority: CN (China)
Other versions: CN109885167A (Chinese, zh)
Inventors: 王云飞, 黄通兵
Original and current assignee: Beijing 7Invensun Technology Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application filed by Beijing 7Invensun Technology Co Ltd
Priority to CN201910137060.0A
Prior art keywords: information data, data, eye movement, eye, movement information

Classifications (landscapes)

  • User Interface Of Digital Computer (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

The application discloses a data processing method, a data transmission method and a data transmission device. An upper layer unit acquires a valid flag and an eye movement information data packet sent by a lower layer unit. The eye movement information data packet comprises at least left eye information data, right eye information data and synthesized gaze information data; the valid flag indicates whether the data in the eye movement information data packet is valid. The upper layer unit parses the eye movement information data packet according to the valid flag and its own requirement information to obtain the eye movement information data that corresponds to the requirement information and is valid. Because the eye movement information data packet contains multiple kinds of data (left eye information data, right eye information data, synthesized gaze information data, and so on), the types and forms of the data obtained by the various applications in the upper layer unit are the same and mutually compatible. Moreover, the upper layer unit only needs to acquire the eye movement information data packet once to obtain all the eye movement information data it requires, which improves efficiency.

Description

Data processing method, data transmission method and device
Technical Field
The present invention relates to the field of electronic information technologies, and in particular, to a data processing method, a data transmission method, and a data transmission device.
Background
Currently, many applications on the market, such as games and user interfaces (UIs), need to acquire users' eye movement data in order to provide related services.
In the prior art, a component that collects a user's eye movement data usually collects and transmits data only according to the requirements of a particular application. Because the types and forms of the transmitted eye movement data differ from application to application, different applications obtain different data, and the data of different applications are not mutually compatible. Moreover, because such a component transmits data separately for each application requirement, when an application has multiple requirements the component must transmit multiple times, which is inefficient.
Disclosure of Invention
To address the above deficiencies of the prior art, the present application provides a data processing method, a data transmission method and corresponding devices, so that a single eye movement information data packet sent by a lower layer unit is suitable for various applications and data transmission efficiency is improved.
In order to achieve the above object, the following solutions are proposed:
The first aspect of the invention discloses a data processing method, which comprises the following steps:
an upper layer unit acquires a valid flag and an eye movement information data packet sent by a lower layer unit, wherein the eye movement information data packet comprises at least left eye information data, right eye information data and synthesized gaze information data, and the valid flag indicates whether the data in the eye movement information data packet is valid; and
the upper layer unit parses the eye movement information data packet according to the valid flag and the requirement information of the upper layer unit to obtain eye movement information data that corresponds to the requirement information and is valid.
Optionally, in the data processing method, the parsing, by the upper layer unit, of the eye movement information data packet according to the valid flag and the requirement information of the upper layer unit to obtain eye movement information data that corresponds to the requirement information and is valid comprises:
the upper layer unit judging, according to the valid flag, whether the data in the eye movement information data packet contains valid data; and
if the eye movement information data packet contains valid data, the upper layer unit parsing the eye movement information data packet according to the valid flag and the requirement information to obtain the eye movement information data that corresponds to the requirement information and is valid.
Optionally, in the data processing method, the synthesized gaze information data is obtained from at least one of the left eye information data and the right eye information data.
Optionally, in the data processing method, the left eye information data comprises at least left eye gaze information data, and the right eye information data comprises at least right eye gaze information data.
The second aspect of the present invention discloses a data transmission method, comprising:
a lower layer unit collecting multiple kinds of eye movement information data of a user, the multiple kinds of eye movement information data comprising at least left eye information data, right eye information data and synthesized gaze information data;
the lower layer unit creating a valid flag according to the collected eye movement information data, the valid flag indicating whether each kind of eye movement information data collected by the lower layer unit is valid;
the lower layer unit combining the multiple kinds of eye movement information data into an eye movement information data packet according to a specified structure; and
the lower layer unit transmitting the valid flag and the eye movement information data packet to an upper layer unit.
Optionally, in the data transmission method, the synthesized gaze information data is obtained from at least one of the left eye information data and the right eye information data.
Optionally, in the data transmission method, the left eye information data comprises at least left eye gaze information data, and the right eye information data comprises at least right eye gaze information data.
A third aspect of the present invention discloses a data processing apparatus, where the data processing apparatus is an upper layer unit comprising:
an acquisition unit, configured to acquire a valid flag and an eye movement information data packet sent by a lower layer unit, wherein the eye movement information data packet comprises at least left eye information data, right eye information data and synthesized gaze information data, and the valid flag indicates whether the data in the eye movement information data packet is valid; and
a parsing unit, configured to parse the eye movement information data packet according to the valid flag and the requirement information of the upper layer unit to obtain eye movement information data that corresponds to the requirement information and is valid.
Optionally, in the data processing apparatus, the parsing unit comprises:
a judging unit, configured to judge, according to the valid flag, whether the data in the eye movement information data packet contains valid data; and
a parsing subunit, configured to, if the eye movement information data packet contains valid data, parse the eye movement information data packet according to the valid flag and the requirement information of the upper layer unit to obtain the eye movement information data that corresponds to the requirement information and is valid.
The fourth aspect of the present invention discloses a data transmission device, where the data transmission device is a lower layer unit comprising:
a collection unit, configured to collect multiple kinds of eye movement information data of a user, the multiple kinds of eye movement information data comprising at least left eye information data, right eye information data and synthesized gaze information data;
a creating unit, configured to create a valid flag according to the collected eye movement information data, the valid flag indicating whether each kind of eye movement information data collected by the lower layer unit is valid;
a synthesizing unit, configured to combine the multiple kinds of eye movement information data into an eye movement information data packet according to a specified structure; and
a transmission unit, configured to transmit the valid flag and the eye movement information data packet to an upper layer unit.
According to the above technical solutions, the upper layer unit acquires the valid flag and the eye movement information data packet sent by the lower layer unit, and parses the packet according to the valid flag and its own requirement information to obtain the eye movement information data that corresponds to the requirement information and is valid. The eye movement information data packet comprises at least left eye information data, right eye information data and synthesized gaze information data, and the valid flag indicates whether the data in the packet is valid. Because the packet contains multiple kinds of data (left eye information data, right eye information data, synthesized gaze information data, and so on), the types and forms of the data obtained by the various applications in the upper layer unit are the same, and the data of different applications are mutually compatible. Moreover, the upper layer unit only needs to acquire the eye movement information data packet once and parse it according to the valid flag and the requirement information to obtain the eye movement information data it requires, which improves efficiency.
Drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings described below illustrate only embodiments of the present invention; a person skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic flow chart of a data processing method according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of a data transmission method according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a data transmission device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. It is obvious that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments derived by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
Referring to fig. 1, an embodiment of the present application discloses a data processing method, including the following steps:
s101, the upper layer unit obtains the effective mark and the eye movement information data packet sent by the lower layer unit.
The eye movement information data packet at least comprises left eye information data, right eye information data and synthesized watching information data. The valid flag is used to indicate whether the data in the eye movement information packet is valid data.
It should be noted that the order in which the upper layer unit obtains the valid flag and the eye movement information data packet does not affect the implementation of this embodiment: the upper layer unit may obtain them simultaneously or sequentially. The upper layer unit mainly refers to various applications, such as games and UIs. The lower layer unit mainly refers to hardware, software, chips and other components that can provide eye movement information data to the upper layer unit. The upper layer unit and the lower layer unit may belong to the same device or to different devices.
Because every application in the upper layer unit obtains the same eye movement information data packet, the types and forms of the data obtained by the various applications are the same, and the data of different applications are mutually compatible. Moreover, because the packet acquired in S101 contains multiple kinds of data (left eye information data, right eye information data, synthesized gaze information data, and so on), the upper layer unit only needs to acquire the packet once, that is, only needs to execute S101 once, and then parse it according to the valid flag and its requirement information to obtain the required eye movement information data; it does not need to receive data from the lower layer unit repeatedly for different requirement information, which improves efficiency.
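The patent does not fix a concrete field layout for the packet; a minimal sketch of the shared structure described above might look as follows (the class and field names, and the use of 3-component gaze vectors as the payload, are illustrative assumptions):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

# Hypothetical layout: the patent only requires that the packet contain
# at least left eye data, right eye data and synthesized gaze data.
@dataclass
class EyeMovementPacket:
    left_eye: Optional[Vec3]       # e.g. a left-eye line-of-sight vector
    right_eye: Optional[Vec3]      # e.g. a right-eye line-of-sight vector
    combined_gaze: Optional[Vec3]  # synthesized gaze information data

# Every application receives this same structure, regardless of which
# fields it actually needs.
packet = EyeMovementPacket(
    left_eye=(0.1, -0.2, 0.97),
    right_eye=(0.12, -0.19, 0.97),
    combined_gaze=(0.11, -0.195, 0.97),
)
```

Because all applications share one packet type, an application that only needs the synthesized gaze simply ignores the per-eye fields rather than requesting a different transmission.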
Optionally, in another embodiment of the present application, the left eye information data comprises at least left eye gaze information data, and the right eye information data comprises at least right eye gaze information data. Specifically, gaze information data may include any one or more of a line-of-sight vector, a gaze point and a gaze area. The line-of-sight vector parameter may describe a three-dimensional gaze direction. The gaze point may be the intersection of the line-of-sight vector with an object, where the object may be a real object or a virtual object. The gaze area may be configured as a conical, circular or other-shaped region centered on the line of sight or on the gaze point.
For example, the eyeball center position or the corneal curvature center position may be used as the starting point of the line of sight in three-dimensional space. After the upper layer unit parses the line-of-sight vector parameter, the line of sight may be intersected with objects in a virtual space to determine the object the user is gazing at and obtain the user's gaze point, and the user's gaze area may then be calculated from the gaze point and the line of sight according to a preset area shape.
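As a simplified illustration of the intersection step above, the sketch below intersects a line of sight (origin plus direction) with a flat virtual object, a plane of constant z; the function name and the choice of a plane rather than an arbitrary mesh are assumptions for the example:

```python
def gaze_point_on_plane(origin, direction, plane_z):
    """Intersect the line of sight origin + t*direction with the plane z = plane_z.

    Returns the 3D gaze point, or None if the gaze never reaches the plane.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None  # line of sight parallel to the plane: no intersection
    t = (plane_z - oz) / dz
    if t < 0:
        return None  # the plane lies behind the eye
    return (ox + t * dx, oy + t * dy, plane_z)

# Eye at the origin, looking mostly along +z; virtual screen plane at z = 2.
point = gaze_point_on_plane((0.0, 0.0, 0.0), (0.1, -0.2, 1.0), 2.0)
```

A circular gaze area could then be derived by taking all points of the plane within a preset radius of this gaze point.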
Optionally, in another embodiment of the present application, the left eye information data may further include left eye feature information data, and the right eye information data may further include right eye feature information data. Optionally, the left eye feature information data or the right eye feature information data may include one or more of pupil shape parameters (such as a pupil position parameter, a pupil diameter, a pupil length, a pupil area and a pupil axis), an eye opening degree parameter, an eyelid position parameter, an eye corner position parameter, an iris shape parameter, an iris feature parameter and a Purkinje spot position parameter.
Optionally, in another embodiment of the present application, the synthesized gaze information data is obtained from at least one of the left eye information data and the right eye information data. Specifically, the synthesized gaze information may be equal to the left eye gaze information data or the right eye gaze information data, or may be calculated from both. It may also be obtained by combining the left eye information data and/or the right eye information data with other data. For example, the left eye information data and the right eye information data may be combined with head movement information data obtained from a device such as a gyroscope to calculate the synthesized gaze information data.
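The patent leaves the synthesis rule open; one simple possibility consistent with "calculated from the left eye and right eye gaze information data" is to average the two line-of-sight vectors and renormalize (the function name and this particular rule are assumptions, not the claimed method):

```python
import math

def synthesize_gaze(left_vec, right_vec):
    """One possible synthesis: average the two unit gaze vectors, then renormalize."""
    avg = [(l + r) / 2.0 for l, r in zip(left_vec, right_vec)]
    norm = math.sqrt(sum(c * c for c in avg))
    return tuple(c / norm for c in avg)

# Left eye looking straight ahead, right eye converged slightly inward.
combined = synthesize_gaze((0.0, 0.0, 1.0), (0.6, 0.0, 0.8))
```

Under the other options the patent mentions, `synthesize_gaze` could instead return one eye's vector unchanged, or additionally rotate the result by a head pose obtained from a gyroscope.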
The valid flag indicates whether the data in the eye movement information data packet is valid. Specifically, it indicates whether each kind of data in the packet, such as the left eye information data, the right eye information data and the synthesized gaze information data, is valid.
When the eye movement information data packet contains only the left eye information data, the right eye information data and the synthesized gaze information data, the valid flag may optionally be set as three bits of data, where each bit indicates whether one kind of data is valid. For example, a bit value of 0 indicates that the corresponding data is invalid, and a value of 1 indicates that it is valid.
Alternatively, the valid flag may be set as a single field whose different values jointly represent whether the left eye information data, the right eye information data and the synthesized gaze information data are valid. For example, a value of 0 indicates that all three kinds of data are invalid; a value of 1 indicates that the left eye information data is valid and the others are invalid; a value of 2 indicates that the right eye information data is valid and the others are invalid; and a value of 3 indicates that all three kinds of data are valid.
Alternatively, the valid flag may be set as two bits of data, where the two bits respectively indicate whether the left eye information data and the right eye information data are valid. For example, a bit value of 0 indicates that the corresponding data is invalid, and a value of 1 indicates that it is valid. When both bits indicate valid data, the synthesized gaze information data can be inferred to be valid; when either bit indicates invalid data, the synthesized gaze information data is inferred to be invalid.
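The three-bit variant can be sketched with ordinary bit masks; the constant names and bit positions below are illustrative assumptions, since the patent does not fix which bit corresponds to which kind of data:

```python
# Three-bit valid flag: one bit per kind of data in the packet.
LEFT_VALID = 0b001      # bit 0: left eye information data
RIGHT_VALID = 0b010     # bit 1: right eye information data
COMBINED_VALID = 0b100  # bit 2: synthesized gaze information data

def is_valid(flag, mask):
    """True if every data kind selected by `mask` is marked valid in `flag`."""
    return (flag & mask) == mask

def combined_valid_from_two_bits(two_bit_flag):
    """Two-bit variant: synthesized gaze is inferred valid only when both eyes are valid."""
    return two_bit_flag == 0b11

# Left eye and synthesized gaze valid; right eye invalid.
flag = LEFT_VALID | COMBINED_VALID
```

The two-bit variant saves one bit at the cost of being unable to mark the synthesized gaze valid while an eye is invalid, which the three-bit variant permits.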
S102, the upper layer unit parses the eye movement information data packet according to the valid flag and the requirement information of the upper layer unit to obtain eye movement information data that corresponds to the requirement information and is valid.
It should be noted that the requirement information of the upper layer unit varies with the scene it is in and with the needs of users and designers, so the requirement information of a given application in the upper layer unit is not constant. The upper layer unit comprises multiple applications, and the requirement information of different applications differs. For example, when a user wants to perform gaze-point rendering, the upper layer unit needs the screen positions or areas gazed at by each eye, so the requirement information is the left eye information data and the right eye information data. When a user is aiming, the upper layer unit needs the synthesized gaze information. When gaze depth analysis is needed, the upper layer unit needs the line-of-sight vectors of both eyes, so the requirement information is again the left eye information data and the right eye information data.
Optionally, in another specific embodiment of the present application, an implementation manner of S102 includes:
The upper layer unit judges, according to the valid flag, whether the data in the eye movement information data packet contains valid data.
If the eye movement information data packet contains valid data, the upper layer unit parses the packet according to the valid flag and its requirement information to obtain the eye movement information data that corresponds to the requirement information and is valid.
Specifically, the upper layer unit can determine from the valid flag whether any data in the eye movement information data packet is valid. When the valid flag indicates that none of the data in the packet is valid, the upper layer unit does not parse the packet. When the valid flag indicates that the packet contains valid data, the upper layer unit parses out, according to the valid flag and the requirement information, the eye movement information data that corresponds to the requirement information and is valid. If the valid data indicated by the valid flag does not include the eye movement information data corresponding to the requirement information, the upper layer unit likewise does not parse the packet. The upper layer unit finally obtains the eye movement information data that corresponds to the requirement information and is valid.
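The parsing logic just described can be sketched as a single function on the upper-unit side; the field names, the three-bit mask assignment and the dictionary representation of the packet are assumptions carried over for illustration:

```python
def parse_packet(packet, valid_flag, required):
    """Upper-unit side: return only the requested fields whose valid bit is set.

    `packet` maps a field name to its data, `valid_flag` is a three-bit flag,
    and `required` is the requirement information, a set of field names.
    """
    masks = {"left": 0b001, "right": 0b010, "combined": 0b100}
    if valid_flag == 0:
        return {}  # no valid data at all: skip parsing the packet entirely
    result = {}
    for name in required:
        if valid_flag & masks[name]:  # keep a field only if required AND valid
            result[name] = packet[name]
    return result

data = {"left": (0.1, 0.0, 1.0), "right": (0.2, 0.0, 1.0), "combined": (0.15, 0.0, 1.0)}
# Flag 0b011: left and right valid, synthesized gaze invalid.
picked = parse_packet(data, 0b011, {"left", "combined"})
```

Here `picked` contains only the left eye data: "combined" was required but not valid, so it is not returned, matching the rule that only data that both corresponds to the requirement information and is valid is extracted.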
According to the data processing method above, the upper layer unit acquires the valid flag and the eye movement information data packet sent by the lower layer unit, and parses the packet according to the valid flag and its requirement information to obtain the eye movement information data that corresponds to the requirement information and is valid. The eye movement information data packet comprises at least left eye information data, right eye information data and synthesized gaze information data; the valid flag indicates whether the data in the packet is valid. Because the packet contains multiple kinds of data, the types and forms of the data obtained by the various applications in the upper layer unit are the same, and the data of different applications are mutually compatible. Moreover, the upper layer unit only needs to acquire the packet once and parse it according to the valid flag and the requirement information to obtain the required eye movement information data, which improves efficiency.
Referring to FIG. 2, based on the data processing method disclosed above, an embodiment of the present application correspondingly discloses a data processing apparatus, where the data processing apparatus is an upper layer unit comprising an acquisition unit 201 and a parsing unit 202.
The acquisition unit 201 is configured to acquire the valid flag and the eye movement information data packet sent by the lower layer unit. The eye movement information data packet comprises at least left eye information data, right eye information data and synthesized gaze information data; the valid flag indicates whether the data in the packet is valid.
Optionally, in another embodiment of the present application, the synthesized gaze information data is obtained from at least one of the left eye information data and the right eye information data.
Optionally, in another embodiment of the present application, the left eye information data comprises at least left eye gaze information data, and the right eye information data comprises at least right eye gaze information data.
Optionally, in another embodiment of the present application, the left eye information data may further include left eye feature information data, and the right eye information data may further include right eye feature information data.
The parsing unit 202 is configured to parse the eye movement information data packet according to the valid flag and the requirement information of the upper layer unit to obtain eye movement information data that corresponds to the requirement information and is valid.
Optionally, in an embodiment of the present application, the parsing unit 202 comprises:
a judging unit, configured to judge, according to the valid flag, whether the data in the eye movement information data packet contains valid data; and
a parsing subunit, configured to, if the eye movement information data packet contains valid data, parse the packet according to the valid flag and the requirement information of the upper layer unit to obtain the eye movement information data that corresponds to the requirement information and is valid.
The specific principles and implementation processes of the units and subunits in the data processing apparatus disclosed in this embodiment are the same as those of the data processing method disclosed above; reference may be made to the corresponding parts of the method embodiment, which are not repeated here.
In the data processing apparatus provided by the application, the acquisition unit 201 acquires the valid flag and the eye movement information data packet sent by the lower layer unit, and the parsing unit 202 parses the packet according to the valid flag and the requirement information of the upper layer unit to obtain the eye movement information data that corresponds to the requirement information and is valid. The eye movement information data packet comprises at least left eye information data, right eye information data and synthesized gaze information data; the valid flag indicates whether the data in the packet is valid. Because the packet contains multiple kinds of data, the types and forms of the data obtained by the various applications are the same, and the data of different applications are mutually compatible. The acquisition unit 201 only needs to acquire the packet once; the parsing unit 202 then parses it according to the valid flag and the requirement information to obtain the required eye movement information data, which improves efficiency.
Referring to FIG. 3, an embodiment of the present application provides a data transmission method, comprising the following steps:
S301, the lower layer unit collects multiple kinds of eye movement information data of the user.
The multiple kinds of eye movement information data comprise at least left eye information data, right eye information data and synthesized gaze information data.
Optionally, in an embodiment of the present application, the left eye information data comprises at least left eye gaze information data, and the right eye information data comprises at least right eye gaze information data. Specifically, gaze information data may include any one or more of a line-of-sight vector, a gaze point and a gaze area. The line-of-sight vector parameter may describe a three-dimensional gaze direction. The gaze point may be the intersection of the line-of-sight vector with an object, where the object may be a real object or a virtual object. The gaze area may be configured as a conical, circular or other-shaped region centered on the line of sight or on the gaze point.
For example, the eyeball center position or the corneal curvature center position may be used as the starting point of the line of sight in three-dimensional space. After the upper layer unit parses the line-of-sight vector parameter, the line of sight may be intersected with objects in a virtual space to determine the object the user is gazing at and obtain the user's gaze point, and the user's gaze area may then be calculated from the gaze point and the line of sight according to a preset area shape.
Optionally, in an embodiment of the present application, the left eye information data may further include left eye feature information data, and the right eye information data may further include right eye feature information data. Optionally, the left eye feature information data or the right eye feature information data may include one or more of pupil shape parameters (such as a pupil position parameter, a pupil diameter, a pupil length, a pupil area and a pupil axis), an eye opening degree parameter, an eyelid position parameter, an eye corner position parameter, an iris shape parameter, an iris feature parameter and a Purkinje spot position parameter.
Optionally, in an embodiment of the present application, the synthesized gaze information data is obtained from at least one of the left eye information data and the right eye information data. Specifically, the synthesized gaze information may be equal to the left eye gaze information data or the right eye gaze information data, or may be calculated from both. It may also be obtained by combining the left eye information data and/or the right eye information data with other data. For example, the left eye information data and the right eye information data may be combined with head movement information data obtained from a device such as a gyroscope to calculate the synthesized gaze information data.
S302, the lower layer unit creates an effective mark according to the collected various eye movement information data.
Here, the valid flag indicates whether the data in the eye movement information data packet are valid. Specifically, it indicates whether each item in the packet, such as the left-eye information data, the right-eye information data, and the synthesized gaze information data, is valid data. When the packet contains only these three items, the valid flag may optionally be set as three-bit data, where each bit indicates whether one item is valid. For example, a value of 0 in a bit indicates that the corresponding data is invalid, and a value of 1 indicates that it is valid.
Alternatively, the valid flag may be set as a single value, where different values represent the validity of the left-eye information data, the right-eye information data, and the synthesized gaze information data. For example, a value of 0 indicates that all three items are invalid; a value of 1 indicates that only the left-eye information data is valid; a value of 2 indicates that only the right-eye information data is valid; and a value of 3 indicates that all three items are valid.
Alternatively, the valid flag may be set as two-bit data, where one bit indicates whether the left-eye information data is valid and the other indicates whether the right-eye information data is valid. For example, a value of 0 in a bit indicates invalid data and a value of 1 indicates valid data. When both bits indicate valid data, the synthesized gaze information data can be inferred to be valid; when either bit indicates invalid data, the synthesized gaze information data is inferred to be invalid.
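The bit-per-item scheme and the two-bit inference scheme described above can be sketched as follows; the concrete bit positions and names are assumptions for illustration, since the embodiment does not fix an assignment.

```python
# Assumed bit assignment for the valid flag.
LEFT_VALID  = 0b001  # left-eye information data
RIGHT_VALID = 0b010  # right-eye information data
GAZE_VALID  = 0b100  # synthesized gaze information data

def make_flag3(left_ok, right_ok, gaze_ok):
    """Pack three validity indicators into a three-bit valid flag."""
    flag = 0
    if left_ok:
        flag |= LEFT_VALID
    if right_ok:
        flag |= RIGHT_VALID
    if gaze_ok:
        flag |= GAZE_VALID
    return flag

def gaze_valid_from_two_bits(flag2):
    """Two-bit variant: the synthesized gaze data is inferred valid
    only when both the left-eye bit and the right-eye bit are set."""
    return (flag2 & LEFT_VALID) != 0 and (flag2 & RIGHT_VALID) != 0
```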
And S303, combining the multiple kinds of eye movement information data into an eye movement information data packet according to a specified structure by the lower layer unit.
The lower layer unit combines the multiple kinds of eye movement information data, such as the left-eye information data, the right-eye information data, and the synthesized gaze information data, into an eye movement information data packet according to a specified combination mode, arrangement order, data type, and the like.
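A minimal sketch of one such specified structure, assuming a hypothetical fixed layout of one flag byte followed by the left, right, and synthesized gaze vectors as three little-endian 32-bit floats each (the layout and names are illustrative, not the patent's concrete format):

```python
import struct

# Hypothetical layout: 1 flag byte, then left, right, and synthesized
# gaze vectors, each as three little-endian 32-bit floats.
PACKET_FMT = "<B9f"

def pack_eye_packet(flag, left, right, combined):
    """Combine the eye movement information data into one packet."""
    return struct.pack(PACKET_FMT, flag, *left, *right, *combined)

def unpack_eye_packet(data):
    """Inverse operation, as an upper layer unit might perform it."""
    fields = struct.unpack(PACKET_FMT, data)
    return fields[0], fields[1:4], fields[4:7], fields[7:10]
```

Because the structure is fixed, every application in the upper layer unit that receives the packet recovers the same fields in the same types and forms.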
It should be noted that the order in which S302 and S303 are executed does not affect the implementation of this embodiment. The lower layer unit may first create the valid flag and then combine the multiple kinds of eye movement information data into an eye movement information data packet according to the specified structure, or combine the data into a packet first and then create the valid flag; S302 and S303 may also be performed simultaneously.
S304, the lower layer unit transmits the effective mark and the eye movement information data packet to the upper layer unit.
The lower layer unit mainly refers to hardware, software, chips, and the like that can provide eye movement information data to the upper layer unit, and the upper layer unit mainly refers to various applications, such as games and UIs. After the lower layer unit executes S304, the upper layer unit processes the valid flag and the eye movement information data packet transmitted by the lower layer unit according to the method shown in fig. 1. The specific execution principle and process of the upper layer unit are the same as those of the data processing method disclosed in the embodiment of the present application; reference may be made to the corresponding parts of that method, which are not described herein again.
It should be noted that the lower layer unit may send the valid flag and the eye movement information data packet at the same time or sequentially. Because the eye movement information data packet transmitted by the lower layer unit contains multiple kinds of data, such as the left-eye information data, the right-eye information data, and the synthesized gaze information data, the types and forms of data obtained by the various applications in the upper layer unit are the same, and data between different applications are compatible with each other. Moreover, the lower layer unit only needs to transmit the eye movement information data packet to the upper layer unit once; the multiple applications in the upper layer unit can each parse it according to the valid flag and their own requirement information, so the packet does not need to be transmitted multiple times for the different requirements of the multiple applications, which also improves efficiency.
In the data transmission method provided by the embodiment of the application, the lower layer unit collects various eye movement information data of a user, then the effective mark is created according to the collected various eye movement information data, the lower layer unit also combines the various eye movement information data into an eye movement information data packet according to a specified structure, and the effective mark and the eye movement information data packet are transmitted to the upper layer unit. The multiple eye movement information data at least comprise left eye information data, right eye information data and synthesized gaze information data, and the effective marks are used for indicating whether the multiple eye movement information data collected by the lower layer unit are effective data or not. Because the eye movement information data packet contains various data such as left eye information data, right eye information data, synthesized gaze information data and the like, the types and forms of data transmitted by the lower layer unit to the various applications in the upper layer unit are the same, so that the data between different applications can be compatible with each other. And the lower layer unit only needs to transmit the eye movement information data packet once, and does not need to transmit the demand information of different applications for multiple times, thereby improving the efficiency.
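To close the loop, the upper layer unit's side of the exchange can be sketched as follows: given the valid flag and the unpacked packet fields, each application extracts only the item named by its requirement information, and only when the flag marks that item valid. The bit assignment and dictionary keys are illustrative assumptions.

```python
# Assumed bit assignment, matching a three-bit valid flag.
REQUIREMENT_BITS = {"left": 0b001, "right": 0b010, "gaze": 0b100}

def parse_for_requirement(flag, packet_fields, requirement):
    """Return the eye movement data named by the requirement information
    when the valid flag marks it as valid data, otherwise None."""
    bit = REQUIREMENT_BITS[requirement]
    if flag & bit:
        return packet_fields[requirement]
    return None
```

Each application calls this with its own requirement against the same packet, which is why the lower layer unit never has to re-send the packet per application.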
Referring to fig. 4, based on the data transmission method disclosed in the embodiment of the present application, the embodiment of the present application correspondingly discloses a data transmission device. The data transmission device is a lower layer unit, and the lower layer unit includes: a collecting unit 401, a creating unit 402, a combining unit 403, and a transmission unit 404.
The collecting unit 401 is configured to collect various eye movement information data of the user. The plurality of eye movement information data at least comprise left eye information data, right eye information data and synthesized watching information data.
Optionally, in another specific embodiment of the present application, the synthesized gaze information data is synthesized gaze information data obtained from at least one of left eye information data and right eye information data.
Optionally, in another specific embodiment of the present application, the left-eye information data at least includes: left eye gaze information data; the right-eye information data includes at least: right eye gaze information data.
Optionally, in another specific embodiment of the present application, the left-eye information data may further include left-eye feature information data; the right-eye information data may further include right-eye characteristic information data.
A creating unit 402, configured to create a valid indicator according to the collected multiple kinds of eye movement information data. The effective mark is used for indicating whether the various eye movement information data collected by the lower layer unit are effective data or not.
A combining unit 403 for combining the plurality of types of eye movement information data into an eye movement information packet in a specified structure.
A transmission unit 404, configured to transmit the valid flag and the eye movement information packet to the upper layer unit.
The specific principle and the implementation process of each unit and subunit in the data transmission device disclosed in the embodiment of the present application are the same as those of the data transmission method disclosed in the embodiment of the present application, and reference may be made to corresponding parts in the data transmission method disclosed in the embodiment of the present application, which are not described herein again.
In the data transmission device provided in the embodiment of the present application, the collecting unit 401 collects multiple kinds of eye movement information data of a user, the creating unit 402 creates a valid flag according to the collected data, the combining unit 403 combines the multiple kinds of eye movement information data into an eye movement information data packet according to a specified structure, and the transmission unit 404 transmits the valid flag and the eye movement information data packet to the upper layer unit. The multiple kinds of eye movement information data include at least left-eye information data, right-eye information data, and synthesized gaze information data, and the valid flag indicates whether the eye movement information data collected by the lower layer unit are valid. Because the eye movement information data packet contains multiple kinds of data such as left-eye information data, right-eye information data, and synthesized gaze information data, the types and forms of data transmitted by the transmission unit 404 to the various applications in the upper layer unit are the same, so that data between different applications are compatible with each other. Moreover, the transmission unit 404 only needs to transmit the eye movement information data packet once, rather than transmitting multiple times for the requirement information of different applications, which improves efficiency.

Claims (10)

1. A data processing method, comprising:
the upper layer unit acquires an effective mark and an eye movement information data packet sent by the lower layer unit; the eye movement information data packet at least comprises left eye information data, right eye information data and synthesized watching information data; the valid flag is used for indicating whether the data in the eye movement information data packet is valid data or not; the upper layer unit is used for multiple applications, the multiple applications all obtain eye movement information data packets, the obtained data types and the obtained forms are the same, and the data of different applications are compatible with each other;
and the upper layer unit analyzes the eye movement information data packet according to the effective mark and the requirement information of the upper layer unit to obtain eye movement information data which corresponds to the requirement information and is effective data.
2. The method according to claim 1, wherein the upper layer unit parses the eye movement information packet according to the valid indicator and the requirement information of the upper layer unit to obtain eye movement information data which corresponds to the requirement information and is valid data, and the method comprises:
the upper layer unit judges whether the data in the eye movement information data packet is valid data or not according to the valid mark;
if the data in the eye movement information data packet has valid data, the upper layer unit analyzes the eye movement information data packet according to the valid mark and the requirement information of the upper layer unit to obtain the eye movement information data which corresponds to the requirement information and is valid data.
3. The method of claim 1, wherein the synthetic gaze information data is: the synthesized gaze information data is obtained from at least one of the left eye information data and the right eye information data.
4. The method according to claim 1, wherein the left-eye information data includes at least: left eye gaze information data; the right-eye information data includes at least: right eye gaze information data.
5. A method of data transmission, comprising:
the lower layer unit collects various eye movement information data of a user; the plurality of eye movement information data at least comprise left eye information data, right eye information data and synthesized watching information data;
the lower layer unit creates an effective mark according to the collected various eye movement information data; the effective mark is used for indicating whether the various eye movement information data collected by the lower layer unit are effective data or not;
the lower layer unit combines the various eye movement information data into an eye movement information data packet according to a specified structure;
the lower layer unit transmits the effective mark and the eye movement information data packet to an upper layer unit; the upper layer unit is used for multiple applications, the multiple applications all obtain eye movement information data packets, the obtained data types and the obtained forms are the same, and data among different applications are mutually compatible.
6. The method of claim 5, wherein the synthetic gaze information data is: the synthesized gaze information data is obtained from at least one of the left eye information data and the right eye information data.
7. The method according to claim 5, wherein the left-eye information data includes at least: left eye gaze information data; the right-eye information data includes at least: right eye gaze information data.
8. A data processing apparatus, wherein the data processing apparatus is an upper unit, and the upper unit includes:
the acquisition unit is used for acquiring the effective mark and the eye movement information data packet sent by the lower layer unit; the eye movement information data packet at least comprises left eye information data, right eye information data and synthesized watching information data; the valid flag is used for indicating whether the data in the eye movement information data packet is valid data or not; the upper layer unit is used for multiple applications, the multiple applications all obtain eye movement information data packets, the obtained data types and the obtained forms are the same, and the data of different applications are compatible with each other;
and the analysis unit is used for analyzing the eye movement information data packet according to the effective mark and the requirement information of the upper layer unit to obtain eye movement information data which corresponds to the requirement information and is effective data.
9. The apparatus of claim 8, wherein the parsing unit comprises:
the judging unit is used for judging whether the data in the eye movement information data packet is valid data or not according to the valid mark;
and the analysis subunit is used for analyzing the eye movement information data packet according to the effective mark and the requirement information of the upper layer unit to obtain the eye movement information data which corresponds to the requirement information and is effective data if the data in the eye movement information data packet has effective data.
10. A data transmission device, wherein the data transmission device is a lower layer unit, and the lower layer unit includes:
the acquisition unit is used for acquiring various eye movement information data of a user; the plurality of eye movement information data at least comprise left eye information data, right eye information data and synthesized watching information data;
the creating unit is used for creating an effective mark according to the collected various eye movement information data; the effective mark is used for indicating whether the various eye movement information data collected by the lower layer unit are effective data or not;
a synthesizing unit configured to combine the plurality of types of eye movement information data into an eye movement information packet according to a specified structure;
the transmission unit is used for transmitting the effective mark and the eye movement information data packet to an upper layer unit; the upper layer unit is used for multiple applications, the multiple applications all obtain eye movement information data packets, the obtained data types and the obtained forms are the same, and data among different applications are mutually compatible.
CN201910137060.0A 2019-02-25 2019-02-25 Data processing method, data transmission method and device Active CN109885167B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910137060.0A CN109885167B (en) 2019-02-25 2019-02-25 Data processing method, data transmission method and device

Publications (2)

Publication Number Publication Date
CN109885167A CN109885167A (en) 2019-06-14
CN109885167B true CN109885167B (en) 2022-04-01

Family

ID=66929157

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910137060.0A Active CN109885167B (en) 2019-02-25 2019-02-25 Data processing method, data transmission method and device

Country Status (1)

Country Link
CN (1) CN109885167B (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104834446B (en) * 2015-05-04 2018-10-26 惠州Tcl移动通信有限公司 A kind of display screen multi-screen control method and system based on eyeball tracking technology
CN105205379A (en) * 2015-10-28 2015-12-30 广东欧珀移动通信有限公司 Control method and device for terminal application and terminal
EP3371973B1 (en) * 2015-11-06 2023-08-09 Facebook Technologies, LLC Eye tracking using a diffraction pattern of coherent light on the surface of the eye
CN108937965B (en) * 2018-05-03 2021-04-20 华东师范大学 Attention evaluation system and method based on sitting posture analysis


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant