CN110941344A - Method for obtaining gazing point data and related device - Google Patents

Method for obtaining gazing point data and related device

Info

Publication number
CN110941344A
Authority
CN
China
Prior art keywords
service module
camera
party application
data
media
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911253148.5A
Other languages
Chinese (zh)
Other versions
CN110941344B (en)
Inventor
韩世广
陈岩
方攀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911253148.5A
Publication of CN110941344A
Application granted
Publication of CN110941344B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 Eye characteristics, e.g. of the iris
    • G06V 40/19 Sensors therefor

Abstract

The embodiment of the application discloses a method for acquiring gazing point data and a related device, wherein the method comprises the following steps: a third-party application acquires media platform version information; determines, according to the media platform version information, at least one capability supported by a camera of the electronic device; determines that the at least one capability comprises an eyeball tracking capability; sends a gazing point data acquisition request to an eyeball tracking service module through a media service module; and receives, through the media service module, the gazing point data sent by the eyeball tracking service module. The method thus helps a third-party application obtain the gazing point data of the electronic device and improves the compatibility of the electronic device.

Description

Method for obtaining gazing point data and related device
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a method for obtaining gaze point data and a related apparatus.
Background
With the progress of electronic technology, an electronic device may be provided with an eyeball tracking sensor, through which the device can determine the position on the display screen at which the user is currently gazing. As the hardware improves, however, its full capability can only be brought out with matching support from the software system, so that the user's need to perform more operations with the gazing point can be met.
Disclosure of Invention
The embodiment of the application provides a method and a related device for obtaining gazing point data, which help a third-party application obtain the gazing point data of an electronic device and improve the compatibility of the electronic device.
In a first aspect, an embodiment of the present application provides a method for obtaining gazing point data, which is applied to an electronic device, where the electronic device includes a media service module and an android system, and the android system includes an application layer and a framework layer; the application layer is provided with a third party application and a local camera application, the framework layer comprises an eyeball tracking service module, and the method comprises the following steps:
the third party application acquires the version information of the media platform;
the third-party application determines at least one capability supported by a camera of the electronic equipment according to the media platform version information;
the third-party application determining that the at least one capability includes an eye tracking capability;
the third-party application sends a request for obtaining the data of the gazing point to the eyeball tracking service module through the media service module;
and the third-party application receives the gazing point data sent by the eyeball tracking service module through the media service module.
In a second aspect, an embodiment of the present application provides an apparatus for obtaining gazing point data, which is applied to an electronic device, where the electronic device includes a media service module and an android system, and the android system includes an application layer and a framework layer; the application layer is provided with a third party application and a local camera application, the framework layer comprises an eyeball tracking service module, the device for acquiring the gazing point data comprises a processing unit, wherein,
the processing unit is used for the third-party application to acquire the version information of the media platform; the third-party application determines at least one capability supported by a camera of the electronic equipment according to the media platform version information; and for the third party application to determine that the at least one capability includes an eye tracking capability; the third-party application sends a request for obtaining the data of the gazing point to the eyeball tracking service module through the media service module; and the third-party application receives the gazing point data sent by the eyeball tracking service module through the media service module.
In a third aspect, an embodiment of the present application provides an electronic device, including a controller, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the controller, and the program includes instructions for executing steps in any method of the first aspect of the embodiment of the present application.
In a fourth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps described in any one of the methods of the first aspect of the present application.
In a fifth aspect, the present application provides a computer program product, wherein the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform some or all of the steps as described in any one of the methods of the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiment of the present application, the third-party application in the electronic device first obtains the version information of the media platform; secondly, the third-party application determines, according to the version information of the media platform, at least one capability supported by a camera of the electronic device; in the case that the third-party application determines that the at least one capability includes the eyeball tracking capability, the third-party application sends a gazing point data acquisition request to the eyeball tracking service module through the media service module; and finally, the third-party application receives, through the media service module, the gazing point data sent by the eyeball tracking service module. Therefore, in the embodiment of the application, the third-party application in the electronic device establishes a communication connection with the eyeball tracking service module built into the system through the modules in the media platform framework, so as to obtain the gazing point data, which enriches the ways in which the gazing point data can be used and improves the compatibility of the electronic device.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2A is a schematic flowchart of a method for obtaining gazing point data according to an embodiment of the present application;
FIG. 2B is a system framework diagram provided by an embodiment of the present application;
fig. 3 is a schematic flowchart of another method for obtaining gazing point data according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 5 is a block diagram of functional units of an apparatus for obtaining gaze point data according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The following describes embodiments of the present application in detail.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present disclosure. As shown in the figure, the plane where the display screen of the electronic device 100 is located includes an eyeball tracking apparatus 101, and the eyeball tracking apparatus 101 may include an infrared device (e.g., an infrared lamp) and an image acquisition device (e.g., a camera). When the electronic device 100 needs to obtain the gazing point data of the user, it starts the eyeball tracking apparatus 101; the image acquisition device captures a picture, the electronic device processes the captured picture according to a preset algorithm to obtain face information and the eyeball motion information corresponding to the face information, and the gazing point data is obtained through calculation according to the eyeball motion information. Specifically, the eyeball tracking apparatus 101 may first start the infrared device, the infrared device illuminates the human eyes to generate light spots, and the image acquisition device takes pictures of the light spots and the pupils; the pictures are then processed through a sight line estimation algorithm to calculate the gazing direction of the human eyes and the gazing point on the display screen.
Because the eyeball tracking apparatus comprises an infrared device (such as an infrared lamp) and an image acquisition device (such as a camera), characteristic information related to eye changes can be acquired, for example by extracting the changing characteristics through image capture or scanning. By tracking the changes of the eyes in real time, the state and the needs of the user can be predicted and responded to, so that the purpose of controlling the device through the eyes is achieved.
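As an illustration of the kind of sight line estimation described above, the following is a minimal sketch of a pupil-glint (pupil to corneal reflection) mapping. The class name, the linear calibration model and its coefficients are assumptions introduced for illustration; the application does not disclose the actual algorithm used by the device.

```java
// Illustrative sketch only: a simplified pupil-glint gaze estimate.
// The linear calibration below is a common textbook approximation and is
// NOT the preset algorithm of the application, which is not disclosed.
public final class GazeEstimator {

    // Calibration coefficients mapping the pupil-glint vector to screen
    // coordinates (hypothetical values obtained from a calibration phase).
    private final double ax, bx, cx;
    private final double ay, by, cy;

    public GazeEstimator(double ax, double bx, double cx,
                         double ay, double by, double cy) {
        this.ax = ax; this.bx = bx; this.cx = cx;
        this.ay = ay; this.by = by; this.cy = cy;
    }

    /**
     * Estimates the on-screen gazing point from the pupil centre and the
     * corneal reflection (the infrared light spot) found in one frame.
     */
    public double[] estimate(double pupilX, double pupilY,
                             double glintX, double glintY) {
        double dx = pupilX - glintX; // pupil-glint vector, x component
        double dy = pupilY - glintY; // pupil-glint vector, y component
        double screenX = ax * dx + bx * dy + cx;
        double screenY = ay * dx + by * dy + cy;
        return new double[] { screenX, screenY };
    }
}
```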
Among other things, the electronic devices may include various handheld devices, vehicle-mounted devices, wearable devices (e.g., smartwatches, smartbands, pedometers, etc.), computing devices or other processing devices connected to wireless modems, as well as various forms of User Equipment (UE), Mobile Stations (MS), terminal devices (terminal device), and so on, having wireless communication functions. For convenience of description, the above-mentioned devices are collectively referred to as a terminal.
Referring to fig. 2A, fig. 2A is a schematic flowchart of a method for obtaining gazing point data according to an embodiment of the present application, where the method is applied to an electronic device, where the electronic device includes a media service module and an android system, and the android system includes an application layer and a framework layer; the application layer is provided with a third party application and a local camera application, the framework layer comprises an eyeball tracking service module, and as shown in the figure, the method for acquiring the gazing point data comprises the following steps:
step 201, a third party application acquires version information of a media platform.
Specifically, the third-party application sends a version information request carrying an authentication code to the media service module; the media service module authenticates the authentication code; if the authentication passes, the media service module sends the correct version information to the third-party application, and if the authentication fails, the media service module sends an empty string to the third-party application.
Optionally, the application layer of the electronic device may include a media management module, the third-party application and the media service module may communicate through the media management module, and the media management module may include a control interface and the like.
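The following is a minimal sketch of the client side of step 201, under the assumption of hypothetical names (MediaPlatformClient, MediaService, getVersionInfo); the concrete control interface exposed by the media management module is not specified in this application.

```java
// Hypothetical client-side sketch of step 201. Only the behaviour follows the
// description: the request carries an authentication code, and a failed
// authentication yields an empty string instead of version information.
public final class MediaPlatformClient {

    /** Minimal stand-in for the control interface of the media management module. */
    public interface MediaService {
        /** Returns version info if the authentication code passes, or "" otherwise. */
        String getVersionInfo(String authenticationCode);
    }

    private final MediaService mediaService;

    public MediaPlatformClient(MediaService mediaService) {
        this.mediaService = mediaService;
    }

    /** Step 201: the third-party application obtains the media platform version info. */
    public String requestVersionInfo(String authenticationCode) {
        String versionInfo = mediaService.getVersionInfo(authenticationCode);
        return versionInfo == null ? "" : versionInfo;
    }
}
```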
Step 202, the third party application determines at least one capability supported by a camera of the electronic device according to the media platform version information.
The media platform version information comprises information about the cameras provided on the electronic device, and the camera information comprises the capabilities supported by each camera. For example, the version information may indicate that the current electronic device includes six cameras, of which three are front cameras and three are rear cameras; the three front cameras include a 3D depth camera, a portrait camera, and an infrared sensing camera that works together with an infrared lamp to realize eyeball tracking. The three rear cameras may include an ultra-wide-angle camera, a wide-angle camera and a telephoto camera. The version information includes the functions that each camera can realize, and the third-party application can select among them.
In step 203, the third party application determines that the at least one capability includes an eye tracking capability.
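A sketch of how steps 202 and 203 might look on the third-party application side is given below. The text-based encoding of the version information and the capability identifier are assumptions for illustration only; the application does not define a concrete format.

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch of steps 202-203: parsing the capabilities advertised
// for each camera out of the media platform version info and checking whether
// any camera supports eyeball tracking. The "camera:cap1,cap2;..." encoding
// is an assumption made for this example.
public final class CameraCapabilityChecker {

    public static final String EYE_TRACKING = "eye_tracking";

    /** e.g. "front_ir:eye_tracking,face_unlock;rear_tele:zoom" (assumed format). */
    public static boolean supportsEyeTracking(String versionInfo) {
        if (versionInfo == null || versionInfo.isEmpty()) {
            return false; // authentication failed or no information available
        }
        for (String camera : versionInfo.split(";")) {
            String[] parts = camera.split(":");
            if (parts.length < 2) {
                continue; // camera entry without an explicit capability list
            }
            List<String> capabilities = Arrays.asList(parts[1].split(","));
            if (capabilities.contains(EYE_TRACKING)) {
                return true; // at least one camera supports eyeball tracking
            }
        }
        return false;
    }
}
```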
And 204, the third-party application sends a request for obtaining the data of the gazing point to the eyeball tracking service module through the media service module.
The media service module provides a resident system service that runs after the electronic device is started; it performs authentication, responds to configuration requests from third-party applications, and delivers the configuration information of the third-party application to the bottom layer of the android system.
Specifically, the third-party application first sends a gazing point data acquisition request to the media management module, and the media management module forwards the acquisition request to the media service module; the media service module can determine, according to the acquisition request, the data (gazing point data) that the current third-party application needs to acquire, and then the media service module sends the data request information to the eyeball tracking service module.
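The forwarding chain of step 204 might be sketched as below. All class and method names are assumptions; only the routing order follows the description: third-party application, media management module, media service module, eyeball tracking service module.

```java
// Hypothetical sketch of the request routing in step 204; names are illustrative.
public final class GazeRequestRouting {

    public interface EyeTrackingService {
        void onGazeDataRequest(String clientPackage);
    }

    /** Stand-in for the media service module (resident system service). */
    public static final class MediaService {
        private final EyeTrackingService eyeTracking;

        public MediaService(EyeTrackingService eyeTracking) {
            this.eyeTracking = eyeTracking;
        }

        void forwardGazeRequest(String clientPackage) {
            // The media service determines that the requested data is gazing
            // point data and forwards the request to the eye tracking service.
            eyeTracking.onGazeDataRequest(clientPackage);
        }
    }

    /** Stand-in for the media management module in the application layer. */
    public static final class MediaManager {
        private final MediaService mediaService;

        public MediaManager(MediaService mediaService) {
            this.mediaService = mediaService;
        }

        /** Entry point used by the third-party application. */
        public void requestGazeData(String clientPackage) {
            mediaService.forwardGazeRequest(clientPackage);
        }
    }
}
```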
In step 205, the third-party application receives, through the media service module, the gaze point data sent by the eyeball tracking service module.
The specific process of the third-party application receiving is as follows: the eyeball tracking service module sends the point-of-regard data to the media service module, the media service module forwards the point-of-regard data to the media management module, the media management module sends the point-of-regard data to the third-party application, and the third-party application receives the point-of-regard data.
Referring to fig. 2B, fig. 2B is a system framework diagram provided in an embodiment of the present application. The electronic device includes a media service module and an android system. The application layer of the android system is provided with a third-party application and a local camera application. The framework layer of the android system includes an eyeball tracking service module, the application interfaces of various native applications (e.g., a native camera application program interface), application services (e.g., a camera service module), and a framework layer interface (e.g., the Google HAL3 interface). The hardware abstraction layer of the android system includes a hardware abstraction layer interface (e.g., HAL3.0) and the hardware abstraction modules of various native applications (e.g., a native camera hardware abstraction module, CameraHAL). The native architecture of the android system further includes a kernel (also called a driver) and a hardware layer, which include the drivers of various hardware (e.g., screen drivers, audio drivers, etc.) and the various hardware itself (e.g., an eyeball tracking sensor and a front image sensor).
The media service module is independent of the android system; the third-party application can communicate with the media service module, and the media service module can communicate with the eyeball tracking service module and the camera service module. The eyeball tracking service module can communicate with the camera service module so as to acquire the image data of the camera service module. The specific generation process of the image data is as follows: a sensor of the kernel and hardware layer obtains raw image data and sends it to the image signal processor for processing; the image signal processor sends the processed raw image data through the driver to the camera hardware abstraction module of the hardware abstraction layer; the camera hardware abstraction module sends the processed raw image data to the camera service module through the hardware abstraction layer interface and the framework layer interface; the camera service module sends the raw image data to the eyeball tracking service module; and the eyeball tracking service module calls a preset eyeball tracking algorithm to process the raw image data and generate the gazing point data. The eyeball tracking service module then sends the gazing point data to the third-party application through the media service module.
Based on the above framework, the media service module may invoke the corresponding driver through the android native information link to enable certain hardware, thereby opening up hardware associated with native applications to third-party applications.
It can be seen that, in the embodiment of the present application, the third-party application in the electronic device first obtains the version information of the media platform; secondly, the third-party application determines, according to the version information of the media platform, at least one capability supported by a camera of the electronic device; in the case that the third-party application determines that the at least one capability includes the eyeball tracking capability, the third-party application sends a gazing point data acquisition request to the eyeball tracking service module through the media service module; and finally, the third-party application receives, through the media service module, the gazing point data sent by the eyeball tracking service module. Therefore, in the embodiment of the application, the third-party application in the electronic device establishes a communication connection with the eyeball tracking service module built into the system through the modules in the media platform framework, so as to obtain the gazing point data, which enriches the ways in which the gazing point data can be used and improves the compatibility of the electronic device.
In one possible example, the framework layer further includes a camera service module, the android system further includes a hardware abstraction layer, the hardware abstraction layer includes a camera hardware abstraction module, and after the third party application sends the obtaining request of the gaze point data to the eye tracking service module through the media service module, before the third party application receives the gaze point data sent by the eye tracking service module through the media service module, the method further includes: the eyeball tracking service module sends an eyeball tracking data acquisition request to the camera service module; the camera service module sends an eyeball tracking data acquisition request to the camera hardware abstraction module; the camera hardware abstraction module acquires image data; the camera hardware abstraction module sends the image data to the camera service module; the camera service module sends the image data to the eyeball tracking service module; and the eyeball tracking service module generates fixation point data according to the image data.
The image data sent by the camera service module to the eyeball tracking service module comprises an eye image of the user; the eye image contains information about the pupils and the bright spots, and the bright spots are generated by the infrared lamp at the local end of the electronic device illuminating the human eyes.
The eyeball tracking service module is internally provided with a preset algorithm for calculating the fixation point, and the eyeball tracking module can calculate according to the image data sent by the camera service module to generate the fixation point data.
As can be seen, in this example, when the eyeball tracking service module receives a request of a third-party application for gaze point data, the eyeball tracking service module sends an instruction to the camera service module to obtain data of eyes of a user, and then generates gaze point data according to an algorithm, so that the advantage of computing the gaze point data at a system level is fully exerted, the time of waiting for a response by the third-party application is shortened, and the efficiency of data processing is improved.
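A compact sketch of this frame path is shown below. The module names mirror the description, while the method names and the byte-array frame representation are assumptions; the GazeAlgorithm interface stands in for the preset gazing point algorithm, which is not disclosed.

```java
// Hypothetical sketch of the data path: eyeball tracking service -> camera
// service -> camera HAL for the request, and back up for the image data,
// which the eyeball tracking service turns into gazing point data.
public final class EyeTrackingPipeline {

    public interface CameraHal {            // camera hardware abstraction module
        byte[] captureFrame();
    }

    public interface GazeAlgorithm {        // preset gazing point algorithm (not disclosed)
        float[] computeGazePoint(byte[] image);
    }

    public static final class CameraService {
        private final CameraHal hal;

        public CameraService(CameraHal hal) {
            this.hal = hal;
        }

        /** Forwards an eyeball tracking data acquisition request down to the camera HAL. */
        byte[] acquireEyeTrackingFrame() {
            return hal.captureFrame();
        }
    }

    public static final class EyeTrackingService {
        private final CameraService cameraService;
        private final GazeAlgorithm algorithm;

        public EyeTrackingService(CameraService cameraService, GazeAlgorithm algorithm) {
            this.cameraService = cameraService;
            this.algorithm = algorithm;
        }

        /** Requests image data via the camera service and turns it into gazing point data. */
        public float[] produceGazePoint() {
            byte[] eyeImage = cameraService.acquireEyeTrackingFrame();
            return algorithm.computeGazePoint(eyeImage);
        }
    }
}
```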
In one possible example, after the third-party application receives, through the media service module, the gaze point data sent by the eye tracking service module, the method further includes: if the fixation point represented by the fixation point data is a first position point, and the duration of the first position point is greater than the preset duration; determining the content information displayed at the first position point on the display screen of the electronic equipment; determining a category of the content information; determining at least one operation to be performed according to the category of the content information; displaying the at least one operation on a display screen of the electronic device; determining a first operation of the at least one operation according to the updated gazing point data; the first operation is performed with respect to the content information.
The preset time duration may be a time duration preset by a third-party application, or a time duration set by a user, for example, 3 seconds.
The at least one operation determined to be performed according to the category of the content information may be preset. Specifically, if the content information is text information, the at least one operation may include: copy, cut, change font, and so on. If the content information is a video, the at least one operation may include: pause, play, screen capture, double-speed play, and fast forward or fast rewind for a preset duration. Specifically, in the process of playing a video on the electronic device, if the gazing point stays at the first position point for longer than 3 seconds and then moves clockwise/counterclockwise around the first position point, the electronic device detects the number of turns of clockwise/counterclockwise movement around the first position point and determines the playing speed according to the number of turns; for example, each clockwise turn increases the playing speed by 0.1 and each counterclockwise turn decreases it by 0.1, so if the current speed is 1.0 times and 5 clockwise turns are detected, the playing speed is adjusted to 1.5 times.
The third-party application can acquire the gazing point data in real time, so the gazing point data is continuously updated. After the at least one operation is displayed, the user can gaze at the region of the display screen showing a first operation and keep gazing for the preset duration to indicate that the first operation is selected, and the first operation is then executed on the content information.
Therefore, in the example, after the third-party application in the electronic device obtains the gazing point data, different operations can be executed on the content information according to the type of the currently displayed content information and the gazing point data, so that the use scenes of the gazing point data are enriched, and the flexibility and the intelligence of the electronic device are improved.
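A small sketch of the playback-rate arithmetic from the example above follows. The dwell threshold and the 0.1-per-turn step come from the example; the clamping limits are added assumptions, and dwell detection and circle counting are assumed to be provided elsewhere.

```java
// Illustrative sketch only: adjusting the playing speed from the number of
// clockwise/counterclockwise turns around the dwelled-on gazing point,
// following the 0.1-per-turn example in the text.
public final class GazePlaybackControl {

    public static final long DWELL_MILLIS = 3_000; // example preset duration (3 seconds)
    private static final float RATE_STEP = 0.1f;   // change per full turn
    private static final float MIN_RATE = 0.1f;    // assumed lower clamp
    private static final float MAX_RATE = 4.0f;    // assumed upper clamp

    /**
     * @param currentRate           current playing speed, e.g. 1.0f
     * @param clockwiseTurns        full clockwise turns around the gazing point
     * @param counterClockwiseTurns full counterclockwise turns
     * @return the adjusted playing speed, e.g. 1.0f + 5 * 0.1f = 1.5f
     */
    public static float adjustRate(float currentRate,
                                   int clockwiseTurns,
                                   int counterClockwiseTurns) {
        float rate = currentRate
                + RATE_STEP * clockwiseTurns
                - RATE_STEP * counterClockwiseTurns;
        return Math.max(MIN_RATE, Math.min(MAX_RATE, rate));
    }
}
```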
In one possible example, the application layer is provided with a local camera application. In the course of executing the method of any of the above examples, if the camera service module detects a data communication request sent by the local camera application, the camera service module stops sending the image data to the eyeball tracking service module, and the media service module sends busy state information to the third-party application.
At present, the camera service module only responds to another service after the communication connection of the current service is disconnected; without the third-party application disconnecting its communication connection, the camera service module cannot respond to the data request of the local camera application. In the embodiment of the application, the camera service module and the media service module can be in communication connection. When the camera service module receives the data acquisition request of the local camera application, it can send the media service module a notification to stop acquiring the gazing point data; the media service module stops acquiring the gazing point data from the eyeball tracking service module, the eyeball tracking service module stops sending eyeball tracking data acquisition requests to the camera service module, and the camera service module stops acquiring the data required by the eyeball tracking service module and responds to the data request of the local camera application.
As can be seen, in the present example, while the third-party application is using the gazing point service, the use of the local camera application is not affected, and the user does not need to deliberately stop the third-party application, which improves the intelligence of the electronic device.
In one possible example, after the media service module sends busy state information to the third party application, the method further comprises: if the camera service module detects a service termination instruction sent by the local camera application; the camera service module resumes sending the image data to the eyeball tracking service module; and the media service module sends working state information to the third-party application.
The camera service module recovers data service with the eyeball tracking service module, the camera service module sends image data to the eyeball tracking service module, the eyeball tracking service module processes the image data to generate fixation point data, and the eyeball tracking service module sends the fixation point data to a third party application through the media service module. Meanwhile, after receiving the working state information, the third party application receives the fixation point data and executes the service of the associated fixation point data.
As can be seen, in this example, when the camera service module in the electronic device detects that the local camera application stops using the service, the service for the eyeball tracking service module is recovered, so that the intelligence of the electronic device is improved.
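A combined sketch of the suspend and resume behaviour in the two examples above is given below. All class and method names are assumptions; only the behaviour follows the description: a request from the local camera application pauses frame delivery to the eyeball tracking service and the third-party application is told the capability is busy, and a service termination instruction restores delivery and reports a working state.

```java
// Hypothetical arbitration sketch; "BUSY" / "WORKING" stand in for the busy
// state information and working state information mentioned in the text.
public final class CameraArbiter {

    public interface MediaServiceNotifier {
        void notifyThirdPartyState(String state); // e.g. "BUSY" or "WORKING"
    }

    private final MediaServiceNotifier mediaService;
    private boolean feedingEyeTracking = true;

    public CameraArbiter(MediaServiceNotifier mediaService) {
        this.mediaService = mediaService;
    }

    /** Camera service detected a data communication request from the local camera app. */
    public void onLocalCameraRequest() {
        feedingEyeTracking = false;                    // stop sending image data upward
        mediaService.notifyThirdPartyState("BUSY");    // media service informs the third-party app
    }

    /** Camera service detected the local camera app's service termination instruction. */
    public void onLocalCameraStopped() {
        feedingEyeTracking = true;                     // resume sending image data
        mediaService.notifyThirdPartyState("WORKING"); // third-party app may consume gazing point data again
    }

    public boolean isFeedingEyeTracking() {
        return feedingEyeTracking;
    }
}
```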
In one possible example, the electronic device is provided with an eyeball tracking sensor, the android system further includes a kernel and a hardware layer, the kernel and the hardware layer further include an image signal processing module, and the camera hardware abstraction module acquiring image data includes: the camera hardware abstraction module sends a camera data acquisition request to the image signal processing module; the image signal processing module calls the eyeball tracking sensor to acquire image data; the image signal processing module processes the image data according to a preset algorithm so as to update the image data; and the image signal processing module sends the updated image data to the camera hardware abstraction module.
Therefore, in this example, after the eyeball tracking sensor acquires the image data, the image data is processed by the image signal processing module, so that the quality of the image is improved, and the gaze point calculated according to the image is more accurate.
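The image signal processing step might be sketched as follows. The concrete processing shown (a simple brightness lift) is a placeholder assumption; the application only states that a preset algorithm updates the image data before it is returned to the camera hardware abstraction module.

```java
// Hypothetical sketch of the ISP step: the camera HAL asks the image signal
// processing module for data, the ISP invokes the eyeball tracking sensor,
// processes the raw frame, and returns the updated image data.
public final class IspPipeline {

    public interface EyeTrackingSensor {
        byte[] capture(); // raw infrared frame
    }

    private final EyeTrackingSensor sensor;

    public IspPipeline(EyeTrackingSensor sensor) {
        this.sensor = sensor;
    }

    /** Handles a camera data acquisition request coming from the camera HAL. */
    public byte[] acquireProcessedFrame() {
        byte[] raw = sensor.capture(); // the ISP calls the eyeball tracking sensor
        return process(raw);           // update the image data before returning it
    }

    private byte[] process(byte[] raw) {
        // Placeholder processing: a real ISP would perform demosaicing,
        // denoising, exposure correction and similar steps.
        byte[] out = raw.clone();
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Math.min(255, (out[i] & 0xFF) + 8); // simple brightness lift
        }
        return out;
    }
}
```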
In one possible example, the third party application obtains media platform version information, including: the third party application sends a request for acquiring version information to the media service module, wherein the request comprises an authentication code; the media service module authenticates the authentication code; and if the authentication code passes authentication, the media service module sends the version information to the third-party application.
Wherein the authentication code may be an authentication code generated by an RSA encryption algorithm.
Wherein, the process of the media service module for authenticating the authentication code comprises the following steps: the media service module acquires an asymmetric private key of a preconfigured third-party application; the media service module decrypts the authentication code by using the asymmetric private key to obtain an APP signature key, a system date and an appointed field of the third-party camera application; and the media service module determines whether the authentication code can be verified according to the APP signature key, the system date and the appointed field.
As can be seen, in this example, when the third-party application sends a data acquisition request, the media service module in the electronic device authenticates the information of the third-party application, thereby ensuring the security of the system.
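A sketch of the authentication step on the media service side is given below. The payload layout (signature key, system date and agreed field separated by '|') and the Base64 transport encoding are assumptions; the application only states that decrypting the RSA authentication code with the preconfigured private key yields these three items, which are then checked.

```java
import java.nio.charset.StandardCharsets;
import java.security.PrivateKey;
import java.util.Base64;
import javax.crypto.Cipher;

// Hypothetical sketch of authentication in the media service module.
public final class AuthCodeVerifier {

    private final PrivateKey appPrivateKey;    // preconfigured for this third-party application
    private final String expectedSignatureKey; // APP signature key agreed in advance
    private final String expectedAgreedField;  // appointed field agreed in advance

    public AuthCodeVerifier(PrivateKey appPrivateKey,
                            String expectedSignatureKey,
                            String expectedAgreedField) {
        this.appPrivateKey = appPrivateKey;
        this.expectedSignatureKey = expectedSignatureKey;
        this.expectedAgreedField = expectedAgreedField;
    }

    /** Returns true if the RSA-encrypted authentication code can be verified. */
    public boolean verify(String base64AuthCode) {
        try {
            Cipher cipher = Cipher.getInstance("RSA");
            cipher.init(Cipher.DECRYPT_MODE, appPrivateKey);
            byte[] plain = cipher.doFinal(Base64.getDecoder().decode(base64AuthCode));
            String[] fields = new String(plain, StandardCharsets.UTF_8).split("\\|");
            if (fields.length != 3) {
                return false;
            }
            String signatureKey = fields[0];
            String systemDate = fields[1]; // could additionally be checked for freshness
            String agreedField = fields[2];
            return expectedSignatureKey.equals(signatureKey)
                    && expectedAgreedField.equals(agreedField)
                    && !systemDate.isEmpty();
        } catch (Exception e) {
            return false; // malformed or tampered authentication code
        }
    }
}
```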
Consistent with the embodiment shown in fig. 2A, please refer to fig. 3, where fig. 3 is a schematic flowchart of a method for obtaining gazing point data provided in the embodiment of the present application, and is applied to an electronic device, where the electronic device includes a media service module and an android system, and the android system includes an application layer and a framework layer; the application layer is provided with a third party application and a local camera application, the framework layer comprises an eyeball tracking service module, and as shown in the figure, the method for acquiring the gazing point data comprises the following steps:
step 301, a third party application sends a request for obtaining version information of a media platform to the media service module, where the request includes an authentication code.
Step 302, the media service module authenticates the authentication code.
Step 303, if the authentication code passes the authentication, the media service module sends the version information of the media platform to the third party application.
Step 304, the third party application determines at least one capability supported by a camera of the electronic device according to the media platform version information.
Step 305, the third party application determines that the at least one capability includes an eye tracking capability.
Step 306, the third-party application sends a request for obtaining the gaze point data to the eyeball tracking service module through the media service module.
Step 307, the eyeball tracking service module sends an eyeball tracking data acquisition request to the camera service module.
Step 308, the camera service module sends an eyeball tracking data acquisition request to the camera hardware abstraction module.
Step 309, the camera hardware abstraction module obtains image data.
Step 310, the camera hardware abstraction module sends the image data to the camera service module.
In step 311, the camera service module sends the image data to the eyeball tracking service module.
In step 312, the eyeball tracking service module generates gaze point data according to the image data.
Step 313, the eye tracking service module sends the gaze point data to the third party application through the media service module.
It can be seen that, in the embodiment of the present application, the third-party application in the electronic device first obtains the version information of the media platform; secondly, the third-party application determines, according to the version information of the media platform, at least one capability supported by a camera of the electronic device; in the case that the third-party application determines that the at least one capability includes the eyeball tracking capability, the third-party application sends a gazing point data acquisition request to the eyeball tracking service module through the media service module; and finally, the third-party application receives, through the media service module, the gazing point data sent by the eyeball tracking service module. Therefore, in the embodiment of the application, the third-party application in the electronic device establishes a communication connection with the eyeball tracking service module built into the system through the modules in the media platform framework, so as to obtain the gazing point data, which enriches the ways in which the gazing point data can be used and improves the compatibility of the electronic device.
Consistent with the embodiments shown in fig. 2A and fig. 3, please refer to fig. 4, where fig. 4 is a schematic structural diagram of an electronic device 400 provided in the embodiment of the present application, and as shown in the figure, the electronic device includes a media service module and an android system, and the android system includes an application layer and a framework layer; the application layer is provided with a third party application and a local camera application, the framework layer comprises an eye tracking service module, the electronic device 400 comprises an application processor 410, a memory 420, a communication interface 430 and one or more programs 421, wherein the one or more programs 421 are stored in the memory 420 and configured to be executed by the application processor 410, and the one or more programs 421 comprise instructions for:
the third party application acquires the version information of the media platform;
the third-party application determines at least one capability supported by a camera of the electronic equipment according to the media platform version information;
the third-party application determining that the at least one capability includes an eye tracking capability;
the third-party application sends a request for obtaining the data of the gazing point to the eyeball tracking service module through the media service module;
and the third-party application receives the gazing point data sent by the eyeball tracking service module through the media service module.
It can be seen that, in the embodiment of the present application, the third-party application in the electronic device first obtains the version information of the media platform; secondly, the third-party application determines, according to the version information of the media platform, at least one capability supported by a camera of the electronic device; in the case that the third-party application determines that the at least one capability includes the eyeball tracking capability, the third-party application sends a gazing point data acquisition request to the eyeball tracking service module through the media service module; and finally, the third-party application receives, through the media service module, the gazing point data sent by the eyeball tracking service module. Therefore, in the embodiment of the application, the third-party application in the electronic device establishes a communication connection with the eyeball tracking service module built into the system through the modules in the media platform framework, so as to obtain the gazing point data, which enriches the ways in which the gazing point data can be used and improves the compatibility of the electronic device.
In one possible example, the framework layer further includes a camera service module, the android system further includes a hardware abstraction layer, the hardware abstraction layer includes a camera hardware abstraction module, and after the third party application sends an acquisition request of the gaze point data to the eye tracking service module through the media service module, in terms of before the third party application receives the gaze point data sent by the eye tracking service module through the media service module, the instructions in the program are specifically configured to perform the following operations: the eyeball tracking service module sends an eyeball tracking data acquisition request to the camera service module; the camera service module sends an eyeball tracking data acquisition request to the camera hardware abstraction module; the camera hardware abstraction module acquires image data; the camera hardware abstraction module sends the image data to the camera service module; the camera service module sends the image data to the eyeball tracking service module; and the eyeball tracking service module generates fixation point data according to the image data.
In one possible example, in terms of the third-party application receiving, by the media service module, the gaze point data sent by the eye tracking service module, the instructions in the program are specifically configured to: if the fixation point represented by the fixation point data is a first position point, and the duration of the first position point is greater than the preset duration; determining the content information displayed at the first position point on the display screen of the electronic equipment; determining a category of the content information; determining at least one operation to be performed according to the category of the content information; displaying the at least one operation on a display screen of the electronic device; determining a first operation of the at least one operation according to the updated gazing point data; the first operation is performed with respect to the content information.
In one possible example, the application layer is provided with a local camera application, and in the process of executing the method of any one of the above examples, the instructions in the program are specifically configured to perform the following operations; if the camera service module detects a data communication request sent by the local camera application, the camera service module stops sending the image data to the eyeball tracking service module; the media service module sends busy state information to the third party application.
In one possible example, the instructions in the program are specifically to, in connection with the media service module sending busy state information to the third party application: if the camera service module detects a service termination instruction sent by the local camera application; the camera service module resumes sending the image data to the eyeball tracking service module; and the media service module sends working state information to the third-party application.
In one possible example, the electronic device is provided with an eye tracking sensor, the android system further includes a kernel and a hardware layer, the kernel and the hardware layer further include an image signal processing module, and in terms of acquiring image data by the camera hardware abstraction module, the instructions in the program are specifically configured to perform the following operations: the camera hardware abstraction module sends a camera data acquisition request to the image signal processing module; the image signal processing module calls the eyeball tracking sensor to acquire image data; the image signal processing module processes the image data according to a preset algorithm so as to update the image data; and the image signal processing module sends the updated image data to the camera hardware abstraction module.
In one possible example, in terms of the third-party application obtaining media platform version information, the instructions in the program are specifically configured to: the third party application sends a request for acquiring version information to the media service module, wherein the request comprises an authentication code; the media service module authenticates the authentication code; and if the authentication code passes authentication, the media service module sends the version information to the third-party application.
The above description has introduced the solution of the embodiment of the present application mainly from the perspective of the method-side implementation process. It is understood that the electronic device comprises corresponding hardware structures and/or software modules for performing the respective functions in order to realize the above-mentioned functions. Those of skill in the art would readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one control unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Fig. 5 is a block diagram of functional units of an apparatus 500 for acquiring gaze point data according to an embodiment of the present application. The device 500 for obtaining the gazing point data is applied to electronic equipment, wherein the electronic equipment comprises a media service module and an android system, and the android system comprises an application layer and a framework layer; the application layer is provided with a third party application and a local camera application, the framework layer includes an eyeball tracking service module, the apparatus 500 for obtaining gaze point data includes a processing unit 501, wherein:
the processing unit 501 is configured to obtain version information of a media platform by the third-party application; the third-party application determines at least one capability supported by a camera of the electronic equipment according to the media platform version information; and for the third party application to determine that the at least one capability includes an eye tracking capability; the third-party application sends a request for obtaining the data of the gazing point to the eyeball tracking service module through the media service module; and the third-party application receives the gazing point data sent by the eyeball tracking service module through the media service module.
The apparatus 500 for obtaining gaze point data may further include a communication unit 502 and a storage unit 503, where the storage unit 503 is configured to store program codes and data of an electronic device. The processing unit 501 may be a processor, the communication unit 502 may be a touch display screen or a transceiver, and the storage unit 503 may be a memory.
It can be seen that, in the embodiment of the present application, the third-party application in the electronic device first obtains the version information of the media platform; secondly, the third-party application determines, according to the version information of the media platform, at least one capability supported by a camera of the electronic device; in the case that the third-party application determines that the at least one capability includes the eyeball tracking capability, the third-party application sends a gazing point data acquisition request to the eyeball tracking service module through the media service module; and finally, the third-party application receives, through the media service module, the gazing point data sent by the eyeball tracking service module. Therefore, in the embodiment of the application, the third-party application in the electronic device establishes a communication connection with the eyeball tracking service module built into the system through the modules in the media platform framework, so as to obtain the gazing point data, which enriches the ways in which the gazing point data can be used and improves the compatibility of the electronic device.
In a possible example, the framework layer further includes a camera service module, the android system further includes a hardware abstraction layer, the hardware abstraction layer includes a camera hardware abstraction module, and after the third-party application sends an acquisition request of the gaze point data to the eye tracking service module through the media service module, before the third-party application receives the gaze point data sent by the eye tracking service module through the media service module, the processing unit 501 is specifically configured to: the eyeball tracking service module sends an eyeball tracking data acquisition request to the camera service module; the camera service module sends an eyeball tracking data acquisition request to the camera hardware abstraction module; the camera hardware abstraction module acquires image data; the camera hardware abstraction module sends the image data to the camera service module; the camera service module sends the image data to the eyeball tracking service module; and the eyeball tracking service module generates fixation point data according to the image data.
In one possible example, after the third-party application receives, through the media service module, the gaze point data sent by the eye tracking service module, the processing unit 501 is specifically configured to: if the fixation point represented by the fixation point data is a first position point, and the duration of the first position point is greater than the preset duration; determining the content information displayed at the first position point on the display screen of the electronic equipment; determining a category of the content information; determining at least one operation to be performed according to the category of the content information; displaying the at least one operation on a display screen of the electronic device; determining a first operation of the at least one operation according to the updated gazing point data; the first operation is performed with respect to the content information.
In a possible example, the application layer is provided with a local camera application, and in the process of executing the method in any of the above examples, the processing unit 501 is specifically configured to: if the camera service module detects a data communication request sent by the local camera application, the camera service module stops transmission to the eyeball tracking service module; the media service module sends busy state information to the third party application.
In one possible example, in terms of the media service module sending busy state information to the third party application, the processing unit 501 is specifically configured to: if the camera service module detects a service termination instruction sent by the local camera application; the camera service module resumes sending the image data to the eyeball tracking service module; and the media service module sends working state information to the third-party application.
In one possible example, the electronic device is provided with an eye tracking sensor, the android system further includes a kernel and a hardware layer, the kernel and the hardware layer further include an image signal processing module, and in terms of acquiring image data by the camera hardware abstraction module, the processing unit 501 is specifically configured to: the camera hardware abstraction module sends a camera data acquisition request to the image signal processing module; the image signal processing module calls the eyeball tracking sensor to acquire image data; the image signal processing module processes the image data according to a preset algorithm so as to update the image data; and the image signal processing module sends the updated image data to the camera hardware abstraction module.
In a possible example, in terms of acquiring, by the third-party application, the media platform version information, the processing unit 501 is specifically configured to: the third party application sends a request for acquiring version information to the media service module, wherein the request comprises an authentication code; the media service module authenticates the authentication code; and if the authentication code passes authentication, the media service module sends the version information to the third-party application.
Embodiments of the present application also provide a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any one of the methods described in the above method embodiments, and the computer includes a mobile terminal.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising a mobile terminal.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the above-described division of the units is only one type of division of logical functions, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one control unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The integrated unit may be stored in a computer-readable memory if it is implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, which includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned memory includes: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, and other media capable of storing program code.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing related hardware; the program may be stored in a computer-readable memory, which may include: a flash memory disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
The foregoing has described the embodiments of the present application in detail to illustrate its principles and implementations; the above description of the embodiments is provided only to help understand the method and core concept of the present application. Meanwhile, a person skilled in the art may, based on the idea of the present application, make changes to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. A method for obtaining gazing point data, applied to electronic equipment, wherein the electronic equipment comprises a media service module and an android system, and the android system comprises an application layer and a framework layer; the application layer is provided with a third-party application, the framework layer comprises an eyeball tracking service module, and the method comprises:
the third-party application acquires media platform version information;
the third-party application determines at least one capability supported by a camera of the electronic equipment according to the media platform version information;
the third-party application determines that the at least one capability includes an eye tracking capability;
the third-party application sends a request for obtaining gazing point data to the eyeball tracking service module through the media service module;
and the third-party application receives the gazing point data sent by the eyeball tracking service module through the media service module.
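A minimal sketch of the flow recited in claim 1 is given below, assuming hypothetical names (Capability, MediaServiceClient, GazePointClient); none of these are real Android framework types, and the gazing point data is represented simply as a pair of screen coordinates:

    import java.util.Set;

    // Hypothetical capability set a media platform version might advertise.
    enum Capability { EYE_TRACKING, FACE_DETECTION }

    // Hypothetical client-side view of the media service module.
    interface MediaServiceClient {
        String getPlatformVersion();
        Set<Capability> capabilitiesFor(String version);
        double[] requestGazePointData(); // e.g. {x, y} screen coordinates
    }

    // Third-party-application side of the flow recited in claim 1.
    class GazePointClient {
        private final MediaServiceClient media;

        GazePointClient(MediaServiceClient media) { this.media = media; }

        double[] requestGazePoint() {
            String version = media.getPlatformVersion();           // acquire version information
            Set<Capability> caps = media.capabilitiesFor(version); // determine supported capabilities
            if (!caps.contains(Capability.EYE_TRACKING)) {         // check for eye tracking capability
                throw new UnsupportedOperationException("eye tracking not supported");
            }
            return media.requestGazePointData();                   // request and receive gazing point data
        }
    }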
2. The method according to claim 1, wherein the framework layer further comprises a camera service module, the android system further comprises a hardware abstraction layer, and the hardware abstraction layer comprises a camera hardware abstraction module; after the third-party application sends the request for obtaining the gazing point data to the eyeball tracking service module through the media service module, and before the third-party application receives the gazing point data sent by the eyeball tracking service module through the media service module, the method further comprises:
the eyeball tracking service module sends an eyeball tracking data acquisition request to the camera service module;
the camera service module sends an eyeball tracking data acquisition request to the camera hardware abstraction module;
the camera hardware abstraction module acquires image data;
the camera hardware abstraction module sends the image data to the camera service module;
the camera service module sends the image data to the eyeball tracking service module;
and the eyeball tracking service module generates fixation point data according to the image data.
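The relay chain recited in claim 2 can be sketched as follows; all class names and the gaze estimation placeholder are illustrative assumptions, not the actual implementation:

    // Hypothetical sketch of the relay chain in claim 2; names are illustrative only.
    class CameraHal {
        // Stand-in for the camera hardware abstraction module acquiring a frame.
        byte[] acquireImageData() {
            return new byte[0];
        }
    }

    class CameraService {
        private final CameraHal hal;

        CameraService(CameraHal hal) { this.hal = hal; }

        // Forwards the eyeball tracking data acquisition request to the HAL
        // and relays the resulting image data back to the caller.
        byte[] fetchEyeTrackingFrame() {
            return hal.acquireImageData();
        }
    }

    class EyeTrackingService {
        private final CameraService cameraService;

        EyeTrackingService(CameraService cameraService) { this.cameraService = cameraService; }

        // Requests image data from the camera service and turns it into gazing point data.
        double[] obtainGazePoint() {
            byte[] frame = cameraService.fetchEyeTrackingFrame();
            return computeGazePoint(frame);
        }

        // Placeholder for the actual gaze estimation algorithm.
        private double[] computeGazePoint(byte[] frame) {
            return new double[] { 0.0, 0.0 };
        }
    }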
3. The method of claim 1 or 2, wherein after the third-party application receives, through the media service module, the gazing point data sent by the eyeball tracking service module, the method further comprises:
if the gazing point represented by the gazing point data is a first position point and the duration of the gazing point at the first position point is greater than a preset duration:
determining the content information displayed at the first position point on the display screen of the electronic equipment;
determining a category of the content information;
determining at least one operation to be performed according to the category of the content information;
displaying the at least one operation on a display screen of the electronic device;
determining a first operation of the at least one operation according to the updated gazing point data;
the first operation is performed with respect to the content information.
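The dwell-based interaction recited in claim 3 can be sketched as follows; the threshold value, the position-point representation, the content categories, and the operation names are all illustrative assumptions:

    import java.util.List;

    // Hypothetical sketch of the dwell-based selection in claim 3.
    class DwellInteraction {
        private static final long PRESET_DURATION_MS = 800; // assumed preset duration

        private int lastPositionPoint = -1;
        private long dwellStartMs;

        // Called whenever new gazing point data arrives; positionPoint identifies the
        // screen location the gazing point maps to. Once the same point has been held
        // longer than the preset duration, returns the operations to display.
        List<String> onGazePoint(int positionPoint, long nowMs) {
            if (positionPoint != lastPositionPoint) {
                lastPositionPoint = positionPoint;
                dwellStartMs = nowMs;
                return List.of();
            }
            if (nowMs - dwellStartMs <= PRESET_DURATION_MS) {
                return List.of();
            }
            String category = categorizeContentAt(positionPoint);
            return operationsFor(category);
        }

        // Placeholder: classify the content information displayed at the position point.
        private String categorizeContentAt(int positionPoint) {
            return "text";
        }

        // Map a content category to candidate operations (assumed examples).
        private List<String> operationsFor(String category) {
            return "text".equals(category)
                    ? List.of("copy", "translate")
                    : List.of("save", "share");
        }
    }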
4. The method according to claim 2 or 3, wherein the application layer is further provided with a local camera application, and during execution of the method:
if the camera service module detects a data communication request sent by the local camera application, the camera service module stops sending the image data to the eyeball tracking service module;
the media service module sends busy state information to the third party application.
5. The method of claim 4, wherein after the media service module sends busy state information to the third party application, the method further comprises:
if the camera service module detects a communication termination instruction sent by the local camera application,
the camera service module resumes sending the image data to the eyeball tracking service module;
and the media service module sends working state information to the third-party application.
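The camera arbitration recited in claims 4 and 5 can be sketched as follows; the listener interface and class names are illustrative assumptions for how the busy and working state notifications might be relayed to the third-party application:

    // Hypothetical sketch of the camera arbitration in claims 4 and 5.
    class CameraArbiter {
        // How the media service module might relay camera state to the third-party application.
        interface StateListener {
            void onBusy();    // sent when the local camera application takes the camera
            void onWorking(); // sent when eyeball tracking data delivery resumes
        }

        private final StateListener mediaServiceListener;
        private boolean feedingEyeTracking = true;

        CameraArbiter(StateListener mediaServiceListener) {
            this.mediaServiceListener = mediaServiceListener;
        }

        // Claim 4: a data communication request from the local camera application
        // stops the image data feed to the eyeball tracking service module.
        void onLocalCameraRequest() {
            feedingEyeTracking = false;
            mediaServiceListener.onBusy();
        }

        // Claim 5: a communication termination instruction resumes the feed.
        void onLocalCameraRelease() {
            feedingEyeTracking = true;
            mediaServiceListener.onWorking();
        }

        boolean isFeedingEyeTracking() {
            return feedingEyeTracking;
        }
    }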
6. The method according to claim 3, wherein the electronic device is provided with an eye tracking sensor, the android system further comprises a kernel and a hardware layer, the kernel and the hardware layer comprise an image signal processing module, and the acquiring of image data by the camera hardware abstraction module comprises:
the camera hardware abstraction module sends a camera data acquisition request to the image signal processing module;
the image signal processing module invokes the eye tracking sensor to acquire image data;
the image signal processing module processes the image data according to a preset algorithm so as to update the image data;
and the image signal processing module sends the updated image data to the camera hardware abstraction module.
7. The method of any one of claims 1-6, wherein the acquiring, by the third-party application, of the media platform version information comprises:
the third party application sends a request for acquiring the version information of the media platform to the media service module, wherein the request comprises an authentication code;
the media service module authenticates the authentication code;
and if the authentication code passes authentication, the media service module sends the version information to the third-party application.
8. A device for obtaining gazing point data, applied to electronic equipment, wherein the electronic equipment comprises a media service module and an android system, and the android system comprises an application layer and a framework layer; the application layer is provided with a third-party application and a local camera application, the framework layer comprises an eyeball tracking service module, and the device for obtaining gazing point data comprises a processing unit, wherein:
the processing unit is configured for the third-party application to acquire media platform version information; for the third-party application to determine, according to the media platform version information, at least one capability supported by a camera of the electronic equipment; for the third-party application to determine that the at least one capability includes an eye tracking capability; for the third-party application to send a request for obtaining gazing point data to the eyeball tracking service module through the media service module; and for the third-party application to receive, through the media service module, the gazing point data sent by the eyeball tracking service module.
9. An electronic device comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of any of claims 1-7.
10. A computer-readable storage medium, characterized in that it stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 1-7.
CN201911253148.5A 2019-12-09 2019-12-09 Method for obtaining gazing point data and related device Active CN110941344B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911253148.5A CN110941344B (en) 2019-12-09 2019-12-09 Method for obtaining gazing point data and related device


Publications (2)

Publication Number Publication Date
CN110941344A (en) 2020-03-31
CN110941344B CN110941344B (en) 2022-03-15

Family

ID=69909566

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911253148.5A Active CN110941344B (en) 2019-12-09 2019-12-09 Method for obtaining gazing point data and related device

Country Status (1)

Country Link
CN (1) CN110941344B (en)


Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160025972A1 (en) * 2014-07-23 2016-01-28 OrCam Technologies, Ltd. Systems and methods for analyzing advertisement effectiveness using wearable camera systems
CN105359062A (en) * 2013-04-16 2016-02-24 眼球控制技术有限公司 Systems and methods of eye tracking data analysis
CN106412563A (en) * 2016-09-30 2017-02-15 珠海市魅族科技有限公司 Image display method and apparatus
CN106901686A (en) * 2017-02-28 2017-06-30 北京七鑫易维信息技术有限公司 The execution method of test of eye movement task, server, test lead and system
CN109101352A (en) * 2018-08-30 2018-12-28 Oppo广东移动通信有限公司 Algorithm framework, algorithm call method, device, storage medium and mobile terminal
CN109522789A (en) * 2018-09-30 2019-03-26 北京七鑫易维信息技术有限公司 Eyeball tracking method, apparatus and system applied to terminal device
CN109683705A (en) * 2018-11-30 2019-04-26 北京七鑫易维信息技术有限公司 The methods, devices and systems of eyeball fixes control interactive controls
CN109857537A (en) * 2019-03-06 2019-06-07 网易传媒科技(北京)有限公司 Background service starts method, apparatus, medium and electronic equipment
CN109976528A (en) * 2019-03-22 2019-07-05 北京七鑫易维信息技术有限公司 A kind of method and terminal device based on the dynamic adjustment watching area of head
CN110007482A (en) * 2019-04-12 2019-07-12 上海交通大学医学院附属第九人民医院 A kind of measuring and calculating platform and its application method for calculating multifocal spectacles focal position
CN110086967A (en) * 2019-04-10 2019-08-02 Oppo广东移动通信有限公司 Image processing method, image processor, filming apparatus and electronic equipment


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111953848A (en) * 2020-08-19 2020-11-17 Oppo广东移动通信有限公司 System, method and related device for realizing application function through context awareness
CN111953848B (en) * 2020-08-19 2022-03-11 Oppo广东移动通信有限公司 System, method, related device and storage medium for realizing application function through context awareness

Also Published As

Publication number Publication date
CN110941344B (en) 2022-03-15

Similar Documents

Publication Publication Date Title
US10003769B2 (en) Video telephony system, image display apparatus, driving method of image display apparatus, method for generating realistic image, and non-transitory computer readable recording medium
WO2020216054A1 (en) Sight line tracking model training method, and sight line tracking method and device
US9817235B2 (en) Method and apparatus for prompting based on smart glasses
EP2991339B1 (en) Photographing method and electronic device
KR102083596B1 (en) Display device and operation method thereof
KR102499139B1 (en) Electronic device for displaying image and method for controlling thereof
CN108848313B (en) Multi-person photographing method, terminal and storage medium
CN109040524B (en) Artifact eliminating method and device, storage medium and terminal
WO2015142971A1 (en) Receiver-controlled panoramic view video share
CN112527174B (en) Information processing method and electronic equipment
CN110348198B (en) Identity recognition method, related device and system of simulation object
US9319468B2 (en) Information processing apparatus and information processing method
US20180144546A1 (en) Method, device and terminal for processing live shows
CN112527222A (en) Information processing method and electronic equipment
CN106774849B (en) Virtual reality equipment control method and device
CN112954212B (en) Video generation method, device and equipment
CN109002248B (en) VR scene screenshot method, equipment and storage medium
CN111045518B (en) Method and related device for acquiring attitude data
CN112241199B (en) Interaction method and device in virtual reality scene
CN111352560B (en) Screen splitting method and device, electronic equipment and computer readable storage medium
CN110941344B (en) Method for obtaining gazing point data and related device
CN108156386B (en) Panoramic photographing method and mobile terminal
CN110933314A (en) Focus-following shooting method and related product
KR102164686B1 (en) Image processing method and apparatus of tile images
CN110971924A (en) Method, device, storage medium and system for beautifying in live broadcast process

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant