CN111045518B - Method and related device for acquiring attitude data - Google Patents

Method and related device for acquiring attitude data

Info

Publication number
CN111045518B
Authority
CN
China
Prior art keywords
service module
gesture data
party application
camera
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911253915.2A
Other languages
Chinese (zh)
Other versions
CN111045518A (en)
Inventor
韩世广
方攀
陈岩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jinsheng Communication Technology Co ltd
Original Assignee
Shanghai Jinsheng Communication Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jinsheng Communication Technology Co ltd filed Critical Shanghai Jinsheng Communication Technology Co ltd
Priority to CN201911253915.2A
Publication of CN111045518A
Application granted
Publication of CN111045518B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107Static hand or arm
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language

Abstract

The embodiment of the application discloses a method and a related device for acquiring gesture data. The method includes the following steps: the third party application obtains media platform version information; the third party application determines, according to the media platform version information, at least one capability supported by a camera of the electronic device; the third party application determines whether the at least one capability includes a gesture detection capability; if so, the third party application sends an acquisition request for gesture data to the media service module; the media service module interacts with the camera service module according to the acquisition request to obtain the gesture data; the media service module sends the gesture data to the third party application; and the third party application receives the gesture data. The embodiment of the application thus helps a third party application acquire gesture data of the electronic device and improves the compatibility of the electronic device.

Description

Method and related device for acquiring attitude data
Technical Field
The application relates to the technical field of electronic equipment, in particular to a method for acquiring gesture data and a related device.
Background
With the advancement of electronic technology, electronic devices may be provided with low-power real-time enabling sensors, through which the electronic devices can determine the gesture of the current user. As the hardware improves, only synchronized support from the software system can bring the hardware into full play and thereby meet users' demands to perform more operations with gestures.
Disclosure of Invention
The embodiment of the application provides a method and a related device for acquiring gesture data, which help a third party application to acquire gesture data of an electronic device and improve the compatibility of the electronic device.
In a first aspect, an embodiment of the present application provides a method for acquiring gesture data, which is applied to an electronic device, where the electronic device includes a media service module and an android system, and the android system includes an application layer and a framework layer; the application layer is provided with a third party application, the framework layer includes a camera service module, and the method includes the following steps:
the third party application obtains the version information of the media platform;
the third party application determines at least one capability supported by a camera of the electronic device according to the media platform version information;
the third party application determining whether the at least one capability includes gesture detection capability;
if yes, the third party application sends an acquisition request for gesture data to the media service module;
the media service module interacts with the camera service module according to the gesture data acquisition request to obtain gesture data;
the media service module sends the gesture data to the third party application;
The third party application receives the gesture data.
In a second aspect, an embodiment of the present application provides a device for acquiring gesture data, where the device is applied to an electronic device, and the electronic device includes a media service module and an android system, and the android system includes an application layer and a framework layer; the application layer is provided with a third party application, the framework layer includes a camera service module, and the device for acquiring gesture data includes a processing unit, wherein,
the processing unit is configured to: obtain, by the third party application, the media platform version information; determine, by the third party application, at least one capability supported by a camera of the electronic device according to the media platform version information; determine, by the third party application, whether the at least one capability includes a gesture detection capability; if so, send, by the third party application, an acquisition request for gesture data to the media service module; interact, by the media service module, with the camera service module according to the acquisition request to obtain the gesture data; send, by the media service module, the gesture data to the third party application; and receive, by the third party application, the gesture data.
In a third aspect, an embodiment of the present application provides an electronic device, including a controller, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the controller, the programs including instructions for performing steps in any of the methods of the first aspect of the embodiments of the present application.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program causes a computer to perform some or all of the steps as described in any of the methods of the first aspect of the embodiments of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, wherein the computer program product comprises a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps described in any of the methods of the first aspect of embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiment of the present application, a third party application in an electronic device first obtains media platform version information; the third party application then determines, according to the version information, at least one capability supported by a camera of the electronic device; the third party application determines whether the at least one capability includes a gesture detection capability; if so, the third party application sends an acquisition request for gesture data to the media service module; the media service module interacts with the camera service module according to the acquisition request to obtain the gesture data; the media service module sends the gesture data to the third party application; and finally the third party application receives the gesture data. Therefore, the third party application in the embodiment of the present application can acquire system-level gesture data through the media service module, which enriches the service path of the gesture data and improves the compatibility of the electronic device.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 2A is a flowchart of a method for acquiring gesture data according to an embodiment of the present application;
FIG. 2B is a system framework diagram provided by an embodiment of the present application;
FIG. 2C is a schematic diagram of determining a facial tilt angle according to an embodiment of the present application;
FIG. 3 is a flowchart of another method for acquiring gesture data according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 5 is a functional unit block diagram of an apparatus for acquiring gesture data according to an embodiment of the present application.
Detailed Description
In order to make the present application solution better understood by those skilled in the art, the following description will clearly and completely describe the technical solution in the embodiments of the present application with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The terms "first", "second" and the like in the description, claims and drawings of the present application are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have", as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
The embodiments of the present application are described in detail below.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an electronic device 100 provided in an embodiment of the present application. As shown in the figure, the plane on which the display screen of the electronic device 100 is located includes a real-time enabling sensor (AON sensor) 101. The real-time enabling sensor 101 has ultra-low power consumption and can remain in a real-time on state, so as to provide an always-available intelligent sensing service for the electronic device 100. Specifically, the real-time enabling sensor 101 includes a camera, and can acquire image data and process the image data according to an algorithm preset in the sensor, so as to achieve gesture recognition and face recognition. The real-time enabling sensor 101 includes hardware formed by a trained AI model and can directly output face recognition and gesture recognition results from the image data.
The electronic device may include various handheld devices, vehicle-mounted devices, wearable devices (e.g., smart watches, smart bracelets, pedometers), computing devices or other processing devices connected to a wireless modem, as well as various forms of user equipment (UE), mobile stations (MS), terminal devices, and the like, all having wireless communication capabilities. For convenience of description, the above-mentioned devices are collectively referred to as terminals.
Referring to fig. 2A, fig. 2A is a flow chart of a method for acquiring gesture data, which is provided in an embodiment of the present application, and is applied to an electronic device, where the electronic device includes a media service module and an android system, and the android system includes an application layer and a framework layer; the application layer is provided with a third party application, and the framework layer comprises a camera service module. As shown in the figure, the method for acquiring gesture data includes:
In step 201, the third party application obtains media platform version information.
The third party application sends a version information request carrying an authentication code to the media service module; the media service module authenticates the authentication code; if the authentication passes, the media service module sends the correct version information to the third party application, and if the authentication fails, the media service module sends an empty string to the third party application.
Alternatively, the application layer of the electronic device may include a media management module, and the third party application and the media service module may communicate through the media management module, which may include a control interface, etc.
In step 202, a third party application determines at least one capability supported by a camera of an electronic device based on media platform version information.
The media platform version information includes information about the cameras provided on the electronic device, and the camera information includes the capabilities supported by each camera. For example, the version information may indicate that the current electronic device includes six cameras, of which three are front cameras and three are rear cameras; the three front cameras may include a 3D depth camera, a portrait camera and a real-time enabling sensor, and the three rear cameras may include an ultra-wide-angle camera, a wide-angle camera and a telephoto camera. The version information includes the functions achievable by each camera, and the third party application can select among the achievable functions.
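To make steps 202 and 203 concrete, the following is a minimal sketch of how a third party application might inspect the camera capabilities reported in the media platform version information; the class, field and method names here (MediaPlatformInfo, CameraInfo, Capability, supportsGestureDetection) are hypothetical illustrations and not part of any published platform API.

```java
import java.util.EnumSet;
import java.util.List;

// Hypothetical capability flags reported per camera in the version information.
enum Capability { GESTURE_DETECTION, FACE_DETECTION, PORTRAIT, DEPTH }

class CameraInfo {
    final String name;                       // e.g. "front_aon_sensor" (illustrative)
    final boolean isFront;                   // front or rear camera
    final EnumSet<Capability> capabilities;  // functions achievable by this camera

    CameraInfo(String name, boolean isFront, EnumSet<Capability> capabilities) {
        this.name = name;
        this.isFront = isFront;
        this.capabilities = capabilities;
    }
}

class MediaPlatformInfo {
    final String version;
    final List<CameraInfo> cameras;

    MediaPlatformInfo(String version, List<CameraInfo> cameras) {
        this.version = version;
        this.cameras = cameras;
    }

    // Step 203: does any camera support gesture detection?
    boolean supportsGestureDetection() {
        return cameras.stream()
                .anyMatch(c -> c.capabilities.contains(Capability.GESTURE_DETECTION));
    }
}
```

If supportsGestureDetection() returns true, the third party application would proceed to step 204 and send the gesture data acquisition request to the media service module.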
In step 203, the third party application determines whether the at least one capability includes gesture detection capability.
In step 204, if yes, the third party application sends an acquisition request for gesture data to the media service module.
The media service module provides a resident system service that runs after the electronic device is started; it performs authentication, responds to configuration requests from the third party application, and configures the configuration information of the third party application to the bottom layer of the android system.
In step 205, the media service module interacts with the camera service module according to the gesture data acquisition request to obtain gesture data.
In step 206, the media service module sends the gesture data to the third party application.
In step 207, the third party application receives gesture data.
Referring to fig. 2B, fig. 2B is a system framework diagram provided by the embodiment of the present application. The electronic device includes a media service module and an android system. The application layer of the android system is provided with a third party application and a native camera application. The framework layer of the android system includes application interfaces of the various native applications (such as a native camera application program interface), application services (such as the camera service module) and a framework layer interface (such as the Google HAL3 interface). The hardware abstraction layer of the android system is provided with a hardware abstraction layer interface and hardware abstraction modules (such as a native camera hardware abstraction module). In addition, the android native architecture further includes a kernel (also referred to as a driver layer) and a hardware layer, where the kernel and hardware layer include the drivers of various hardware (such as a screen driver, an audio driver, etc.) and the hardware itself (such as an image signal processor and image sensors).
The media service module is set independently of the android system; the third party application can communicate with the media service module, and the media service module can communicate with the camera service module to obtain image data from it. The specific generation process of the image data is as follows: a sensor at the kernel and hardware layer acquires raw image data; the sensor sends the raw image data to the image signal processor for processing; the image signal processor sends the processed image data to the camera hardware abstraction module of the hardware abstraction layer through the driver; the camera hardware abstraction module sends the processed image data to the camera service module through the hardware abstraction layer interface and the framework layer interface; and the camera service module calls a preset algorithm to process the image data and generate gesture data. The camera service module then sends the gesture data to the third party application through the media service module.
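The layered hand-off described above can be summarized by the following sketch, in which each layer is reduced to a single method; all interface and class names are assumptions made for illustration and are not real Android framework APIs.

```java
// Illustrative layering of the image-data path; the names are assumptions for this sketch.
interface AonSensor            { byte[] captureRaw(); }              // kernel and hardware layer
interface ImageSignalProcessor { byte[] process(byte[] raw); }       // kernel and hardware layer
interface CameraHal            { byte[] forward(byte[] processed); } // hardware abstraction layer
interface CameraService        { PoseData analyze(byte[] frame); }   // framework layer, runs the preset algorithm
interface MediaService         { void deliver(PoseData pose); }      // delivers the result to the third party application

class PoseData { String description; } // e.g. "SWIPE_LEFT" or a face orientation

class GesturePipeline {
    private final AonSensor sensor;
    private final ImageSignalProcessor isp;
    private final CameraHal hal;
    private final CameraService cameraService;
    private final MediaService mediaService;

    GesturePipeline(AonSensor sensor, ImageSignalProcessor isp, CameraHal hal,
                    CameraService cameraService, MediaService mediaService) {
        this.sensor = sensor;
        this.isp = isp;
        this.hal = hal;
        this.cameraService = cameraService;
        this.mediaService = mediaService;
    }

    // One pass of the flow described above.
    void onAcquisitionRequest() {
        byte[] raw = sensor.captureRaw();              // sensor acquires raw image data
        byte[] processed = isp.process(raw);           // image signal processor improves the frame
        byte[] frame = hal.forward(processed);         // camera hardware abstraction module passes it up
        PoseData pose = cameraService.analyze(frame);  // camera service module generates gesture data
        mediaService.deliver(pose);                    // media service module delivers it to the third party app
    }
}
```

The design point is that the third party application never talks to the hardware abstraction layer directly; it only receives the final gesture data from the media service module.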
Based on this architecture, the media service module can invoke the drivers through the android native information link to enable certain hardware, so that hardware associated with the native applications can also be opened to the third-party application.
It can be seen that, in the embodiment of the present application, a third party application in an electronic device first obtains media platform version information; the third party application then determines, according to the version information, at least one capability supported by a camera of the electronic device; the third party application determines whether the at least one capability includes a gesture detection capability; if so, the third party application sends an acquisition request for gesture data to the media service module; the media service module interacts with the camera service module according to the acquisition request to obtain the gesture data; the media service module sends the gesture data to the third party application; and finally the third party application receives the gesture data. Therefore, the third party application in the embodiment of the present application can acquire system-level gesture data through the media service module, which enriches the service path of the gesture data and improves the compatibility of the electronic device.
In one possible example, the android system further includes a hardware abstraction layer, the hardware abstraction layer includes a camera hardware abstraction module, the electronic device is provided with a real-time enabling sensor, and the media service module interacts with the camera service module according to the gesture data obtaining request to obtain gesture data, including: the media service module judges a control mode supported by the third party application according to the gesture data acquisition request, wherein the control mode comprises at least one of the following: face control and gesture control; the media service module sends an attitude acquisition request corresponding to the control mode to the camera service module; the camera service module sends a data acquisition instruction to the camera hardware abstraction module according to the gesture acquisition request; the camera hardware abstraction module calls the real-time enabling sensor to acquire image data; the camera hardware abstraction module sends the image data to the camera service module; the camera service module processes the image data to obtain gesture data corresponding to the control mode; the camera service module sends the gesture data to the media service module; the media service module receives the gesture data.
The electronic device sends a corresponding gesture acquisition request according to the control mode: if the third party application supports gesture control, the gesture acquisition request is used for acquiring hand gesture data; if the third party application supports face control, the gesture acquisition request is used for acquiring face data; and if the third party application supports combined gesture and face control, the gesture acquisition request is used for acquiring both hand gesture data and face data.
The real-time enabling sensor can directly output gesture data, for example hand gesture data such as an upward slide, a downward slide, a leftward slide and a rightward slide; the face data includes the orientation of the face and the face recognition result. The real-time enabling sensor sends the gesture data to the camera hardware abstraction module, the camera hardware abstraction module sends the gesture data to the camera service module, and the camera service module sends the gesture data to the third party application through the media service module. Alternatively, the real-time enabling sensor is responsible for acquiring image data in real time and transmitting the image data to the camera hardware abstraction module; the camera hardware abstraction module transmits the image data to the camera service module; the camera service module processes the image data according to a preset algorithm to obtain the gesture data, and the gesture data is transmitted to the third party application through the media service module.
In this example, the third party application obtains the gesture data through system-level modules, so that system-level functions are fully exploited and the efficiency of data processing is improved.
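A minimal sketch of how the media service module might map the control mode declared in the acquisition request to the kind of pose data it asks the camera service module for; the enum, request class and returned request strings are hypothetical.

```java
import java.util.EnumSet;

// Hypothetical control modes supported by a third party application.
enum ControlMode { FACE_CONTROL, GESTURE_CONTROL }

// Hypothetical acquisition request sent by the third party application.
class PoseAcquisitionRequest {
    final EnumSet<ControlMode> modes;
    PoseAcquisitionRequest(EnumSet<ControlMode> modes) { this.modes = modes; }
}

class MediaServiceModule {
    // Decide which pose data to ask the camera service module for.
    String buildCameraRequest(PoseAcquisitionRequest request) {
        boolean face = request.modes.contains(ControlMode.FACE_CONTROL);
        boolean gesture = request.modes.contains(ControlMode.GESTURE_CONTROL);
        if (face && gesture) return "FACE_AND_GESTURE"; // combined face and hand-gesture control
        if (face)            return "FACE_ONLY";
        if (gesture)         return "GESTURE_ONLY";
        throw new IllegalArgumentException("request declares no supported control mode");
    }
}
```

The returned value stands in for the gesture acquisition request that the media service module sends to the camera service module in the flow above.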
In one possible example, the third party application is audio playing software, the gesture data includes hand gesture data, and after the third party application receives the gesture data, the method further includes: if the hand gesture data is a leftward slide, the third party application switches the current song to the previous song; and if the hand gesture data is a rightward slide, the third party application switches the current song to the next song.
In this example, the third party application in the electronic device can perform song switching by using the gesture data provided at the system level, which improves convenience for the user of third party music software and improves the compatibility and intelligence of the electronic device.
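The song-switching behaviour in this example could look like the following sketch, assuming a hypothetical callback through which the audio playing software receives hand gesture data from the media service module.

```java
// Hypothetical gesture types delivered to the audio player.
enum HandGesture { SWIPE_LEFT, SWIPE_RIGHT, SWIPE_UP, SWIPE_DOWN }

interface Playlist {
    void previousSong(); // switch to the previous song
    void nextSong();     // switch to the next song
}

class AudioGestureHandler {
    private final Playlist playlist;

    AudioGestureHandler(Playlist playlist) { this.playlist = playlist; }

    // Called when the third party application receives hand gesture data.
    void onGesture(HandGesture gesture) {
        switch (gesture) {
            case SWIPE_LEFT:  playlist.previousSong(); break; // leftward slide -> previous song
            case SWIPE_RIGHT: playlist.nextSong();     break; // rightward slide -> next song
            default: /* other gestures are ignored in this example */ break;
        }
    }
}
```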
In one possible example, the third party application is audio playing software, the gesture data includes face data and hand gesture data, the face data includes the tilt angle of the face relative to a preset direction of the electronic device, and after the third party application receives the gesture data, the method further includes: if the tilt angle is within a first preset range and the hand gesture data is a leftward slide, switching the current playlist to the previous playlist, determining a first song in the previous playlist according to the current play mode, and switching the currently played song to the first song; if the tilt angle is within a second preset range and the hand gesture data is a rightward slide, switching the current playlist to the next playlist, determining a second song in the next playlist according to the current play mode, and switching the currently played song to the second song, where the first preset range and the second preset range are different.
Referring to fig. 2C, fig. 2C is a schematic diagram illustrating determination of the face tilt angle according to an embodiment of the present application. As shown in the figure, the electronic device determines the pupils from the face data and connects the positions of the pupils of the user's two eyes to obtain a target line. The electronic device may set a reference line; as in fig. 2C, the reference line may be a line parallel to the upper and lower sides of the electronic device (it may also be vertical, a diagonal of the electronic device, or the like). After the electronic device obtains the face image, the face image may be placed in a preset area above the reference line, and the position and size of the included angle between the target line and the reference line are determined. If the included angle is on the left side of the electronic device, it is determined that the current face is tilted to the left; if the included angle is on the right side of the electronic device, it is determined that the current face is tilted to the right. Specifically, in this example, the first preset range means that the face is tilted to the left at an angle between 15° and 45°, and the second preset range means that the face is tilted to the right at an angle between 15° and 45°.
In this example, the electronic device can perform operations by combining the face data and the hand gesture data, which improves convenience for the user, enriches the functionality of the electronic device, and improves the intelligence of the electronic device.
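The tilt-angle determination of fig. 2C can be sketched as follows: the line through the two pupils is compared with a horizontal reference line, and the resulting angle is tested against the 15°-45° ranges used in the example above. The coordinate and sign conventions are assumptions made for illustration.

```java
class PupilTiltDetector {
    // Tilt angle of the line through both pupils relative to a horizontal reference line,
    // in degrees. In this sketch, positive values are interpreted as a tilt to the left of
    // the device and negative values as a tilt to the right (an assumed convention).
    static double tiltAngleDegrees(double leftPupilX, double leftPupilY,
                                   double rightPupilX, double rightPupilY) {
        // Image coordinates: y grows downward, so the difference is taken accordingly.
        double dy = leftPupilY - rightPupilY;
        double dx = rightPupilX - leftPupilX;
        return Math.toDegrees(Math.atan2(dy, dx));
    }

    // First preset range: face tilted left between 15 and 45 degrees.
    static boolean inFirstPresetRange(double angle) {
        return angle >= 15.0 && angle <= 45.0;
    }

    // Second preset range: face tilted right between 15 and 45 degrees.
    static boolean inSecondPresetRange(double angle) {
        return angle <= -15.0 && angle >= -45.0;
    }
}
```

Combined with a leftward or rightward slide, these range checks would select the previous or next playlist as described above.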
In one possible example, the electronic device is communicatively connected to a vehicle-mounted device, the third party application is navigation software, the vehicle-mounted device can implement driving control of the vehicle, the gesture data includes hand gesture data, and after the third party application receives the gesture data, the method further includes: if the hand gesture data is an upward slide, the electronic device sends an acceleration instruction to the vehicle-mounted device, where the acceleration instruction is used to instruct the vehicle-mounted device to increase the speed of the vehicle by a first preset value; if the hand gesture data is a downward slide, the electronic device sends a deceleration instruction to the vehicle-mounted device, where the deceleration instruction is used to instruct the vehicle-mounted device to reduce the speed of the vehicle by a second preset value; if the hand gesture data is a leftward slide, the electronic device sends a left lane-change instruction to the vehicle-mounted device to instruct the vehicle-mounted device to control the vehicle to change to the lane to the left of the current lane; and if the hand gesture data is a rightward slide, the electronic device sends a right lane-change instruction to the vehicle-mounted device to instruct the vehicle-mounted device to control the vehicle to change to the lane to the right of the current lane.
The first preset value and the second preset value are preset. Specifically, the electronic device may execute different strategies according to different locations; for example, if the electronic device determines from the positioning system that the vehicle is currently traveling on an urban road, the first preset value and the second preset value may be 10 km/h, and if the electronic device determines from the positioning system that the vehicle is traveling on an expressway, the first preset value and the second preset value may be 20 km/h.
It can be seen that, in this example, the electronic device is linked with the vehicle-mounted device, and intelligent control of the vehicle is realized through hand gesture data, which enriches the functions of the electronic device and improves the intelligence of the electronic device.
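A sketch of the gesture-to-driving-command mapping in this example; the speed steps of 10 km/h and 20 km/h and the road-type distinction follow the description above, while the VehicleLink interface and its methods are hypothetical placeholders for the communication with the vehicle-mounted device.

```java
enum DrivingGesture { SWIPE_UP, SWIPE_DOWN, SWIPE_LEFT, SWIPE_RIGHT }
enum RoadType { URBAN, EXPRESSWAY }

// Hypothetical link to the vehicle-mounted device.
interface VehicleLink {
    void changeSpeed(int deltaKmPerHour); // positive accelerates, negative decelerates
    void changeLane(boolean toLeft);
}

class DrivingGestureController {
    private final VehicleLink vehicle;

    DrivingGestureController(VehicleLink vehicle) { this.vehicle = vehicle; }

    // Speed step depends on the road type reported by the positioning system.
    private int speedStep(RoadType road) {
        return road == RoadType.EXPRESSWAY ? 20 : 10; // km/h, as in the example above
    }

    void onGesture(DrivingGesture gesture, RoadType road) {
        switch (gesture) {
            case SWIPE_UP:    vehicle.changeSpeed(+speedStep(road)); break; // acceleration instruction
            case SWIPE_DOWN:  vehicle.changeSpeed(-speedStep(road)); break; // deceleration instruction
            case SWIPE_LEFT:  vehicle.changeLane(true);              break; // change to the left lane
            case SWIPE_RIGHT: vehicle.changeLane(false);             break; // change to the right lane
        }
    }
}
```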
In one possible example, the android system further includes a kernel and a hardware layer, the kernel and hardware layer further includes an image signal processor, and the camera hardware abstraction module invokes the real-time enabling sensor to acquire image data, including: the camera hardware abstraction module sends a camera data acquisition request to the image signal processor; the image signal processor calls the real-time enabling sensor to acquire image data; the image signal processor processes the image data according to a preset algorithm to update the image data; the image signal processor sends the updated image data to the camera hardware abstraction module.
In this example, after the image data is acquired by the real-time enabling sensor, the image data is processed by the image signal processor, so that the quality of the image is improved, and further, the gesture data calculated by the camera service module according to the image is more accurate.
In one possible example, the third party application obtaining media platform version information includes: the third party application sends a request for obtaining version information to the media service module, wherein the request comprises an authentication code; the media service module authenticates the authentication code; and if the authentication code passes the authentication, the media service module sends the version information to the third party application.
The authentication code may be an authentication code generated by an RSA encryption algorithm.
The process by which the media service module authenticates the authentication code includes the following steps: the media service module acquires the preconfigured asymmetric private key of the third party application; the media service module decrypts the authentication code using the asymmetric private key to obtain the APP signature key, the system date and the agreed field of the third party application; and the media service module determines, according to the APP signature key, the system date and the agreed field, whether the authentication code passes verification.
In this example, when the third party application sends a data acquisition request, the media service module in the electronic device authenticates the information of the third party application, thereby ensuring the security of the system.
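The authentication step can be sketched with standard Java RSA APIs as follows: the media service module decrypts the authentication code with the preconfigured asymmetric private key of the third party application and checks the recovered APP signature key, system date and agreed field. The payload layout (three fields separated by '|') and the exact comparison rules are assumptions made for illustration.

```java
import java.nio.charset.StandardCharsets;
import java.security.PrivateKey;
import java.time.LocalDate;
import java.util.Base64;
import javax.crypto.Cipher;

class AuthCodeVerifier {
    private final PrivateKey appPrivateKey;   // preconfigured asymmetric private key of the third party application
    private final String expectedSignatureKey;
    private final String agreedField;

    AuthCodeVerifier(PrivateKey appPrivateKey, String expectedSignatureKey, String agreedField) {
        this.appPrivateKey = appPrivateKey;
        this.expectedSignatureKey = expectedSignatureKey;
        this.agreedField = agreedField;
    }

    // Returns true if the authentication code passes verification, false otherwise.
    boolean verify(String base64AuthCode) {
        try {
            Cipher cipher = Cipher.getInstance("RSA");
            cipher.init(Cipher.DECRYPT_MODE, appPrivateKey);
            byte[] plain = cipher.doFinal(Base64.getDecoder().decode(base64AuthCode));

            // Assumed payload layout: "<app signature key>|<system date>|<agreed field>".
            String[] parts = new String(plain, StandardCharsets.UTF_8).split("\\|");
            if (parts.length != 3) return false;

            boolean keyOk   = expectedSignatureKey.equals(parts[0]);
            boolean dateOk  = LocalDate.parse(parts[1]).equals(LocalDate.now()); // same system date
            boolean fieldOk = agreedField.equals(parts[2]);
            return keyOk && dateOk && fieldOk;
        } catch (Exception e) {
            return false; // any decryption or parsing failure means authentication fails
        }
    }
}
```

If verify() returns true, the media service module returns the correct version information; otherwise it returns an empty string, as described in step 201 above.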
Referring to fig. 3, fig. 3 is a schematic flow chart of a method for acquiring gesture data, which is provided in an embodiment of the present application and is applied to an electronic device, where the electronic device includes a media service module and an android system, and the android system includes an application layer and a framework layer; the application layer is provided with a third party application, and the framework layer comprises a camera service module. As shown in the figure, the method for acquiring gesture data includes:
In step 301, the third party application sends a request for obtaining media platform version information to the media service module, where the request includes an authentication code.
In step 302, the media service module authenticates the authentication code.
In step 303, if the authentication code passes the authentication, the media service module sends the version information to the third party application.
In step 304, the third party application determines at least one capability supported by a camera of the electronic device based on the media platform version information.
In step 305, if the at least one capability includes a gesture detection capability, the third party application sends a request for acquiring gesture data to the media service module.
In step 306, the media service module determines a control mode supported by the third party application according to the gesture data acquisition request.
In step 307, the media service module sends an attitude acquisition request corresponding to the control mode to the camera service module.
In step 308, the camera service module sends a data acquisition instruction to the camera hardware abstraction module according to the gesture acquisition request.
In step 309, the camera hardware abstraction module invokes the real-time enabled sensor to acquire image data.
In step 310, the camera hardware abstraction module sends image data to the camera service module.
In step 311, the camera service module processes the image data to obtain gesture data corresponding to the control mode.
In step 312, the camera service module sends the gesture data to the third party application via the media service module.
It can be seen that, in the embodiment of the present application, a third party application in an electronic device first obtains media platform version information; the third party application then determines, according to the version information, at least one capability supported by a camera of the electronic device; the third party application determines whether the at least one capability includes a gesture detection capability; if so, the third party application sends an acquisition request for gesture data to the media service module; the media service module interacts with the camera service module according to the acquisition request to obtain the gesture data; the media service module sends the gesture data to the third party application; and finally the third party application receives the gesture data. Therefore, the third party application in the embodiment of the present application can acquire system-level gesture data through the media service module, which enriches the service path of the gesture data and improves the compatibility of the electronic device.
Referring to fig. 4, fig. 4 is a schematic structural diagram of an electronic device 400 according to an embodiment of the present application. Consistent with the embodiments shown in fig. 2A and fig. 3, the electronic device includes a media service module and an android system, and the android system includes an application layer and a framework layer; the application layer is provided with a third party application, and the framework layer includes a camera service module. The electronic device 400 includes an application processor 410, a memory 420, a communication interface 430 and one or more programs 421, where the one or more programs 421 are stored in the memory 420 and configured to be executed by the application processor 410, the one or more programs 421 including instructions for performing the following steps:
the third party application obtains the version information of the media platform;
the third party application determines at least one capability supported by a camera of the electronic device according to the media platform version information;
the third party application determining whether the at least one capability includes gesture detection capability;
if yes, the third party application sends an acquisition request for gesture data to the media service module;
the media service module interacts with the camera service module according to the gesture data acquisition request to obtain gesture data;
The media service module sends the gesture data to the third party application;
the third party application receives the gesture data.
It can be seen that, in the embodiment of the present application, a third party application in an electronic device first obtains media platform version information; the third party application then determines, according to the version information, at least one capability supported by a camera of the electronic device; the third party application determines whether the at least one capability includes a gesture detection capability; if so, the third party application sends an acquisition request for gesture data to the media service module; the media service module interacts with the camera service module according to the acquisition request to obtain the gesture data; the media service module sends the gesture data to the third party application; and finally the third party application receives the gesture data. Therefore, the third party application in the embodiment of the present application can acquire system-level gesture data through the media service module, which enriches the service path of the gesture data and improves the compatibility of the electronic device.
In one possible example, the android system further includes a hardware abstraction layer, the hardware abstraction layer includes a camera hardware abstraction module, the electronic device is provided with a real-time enabling sensor, and in terms of the media service module interacting with the camera service module according to the gesture data acquisition request to obtain the gesture data, the instructions in the program are specifically configured to perform the following operations: the media service module judges a control mode supported by the third party application according to the gesture data acquisition request, wherein the control mode comprises at least one of the following: face control and gesture control; the media service module sends an attitude acquisition request corresponding to the control mode to the camera service module; the camera service module sends a data acquisition instruction to the camera hardware abstraction module according to the gesture acquisition request; the camera hardware abstraction module calls the real-time enabling sensor to acquire image data; the camera hardware abstraction module sends the image data to the camera service module; the camera service module processes the image data to obtain gesture data corresponding to the control mode; the camera service module sends the gesture data to the media service module; the media service module receives the gesture data.
In one possible example, the third party application is audio playing software, the gesture data includes hand gesture data, and after the third party application receives the gesture data, the instructions in the program are specifically configured to perform the following operations: if the hand gesture data is a leftward slide, the third party application switches the current song to the previous song; and if the hand gesture data is a rightward slide, the third party application switches the current song to the next song.
In one possible example, the third party application is audio playing software, the gesture data includes face data and hand gesture data, the face data includes the tilt angle of the face relative to a preset direction of the electronic device, and after the third party application receives the gesture data, the instructions in the program are specifically configured to perform the following operations: if the tilt angle is within a first preset range and the hand gesture data is a leftward slide, switching the current playlist to the previous playlist, determining a first song in the previous playlist according to the current play mode, and switching the currently played song to the first song; if the tilt angle is within a second preset range and the hand gesture data is a rightward slide, switching the current playlist to the next playlist, determining a second song in the next playlist according to the current play mode, and switching the currently played song to the second song, where the first preset range and the second preset range are different.
In one possible example, the electronic device is communicatively connected to a vehicle-mounted device, the third party application is navigation software, the vehicle-mounted device can implement driving control of the vehicle, the gesture data includes hand gesture data, and after the third party application receives the gesture data, the instructions in the program are specifically configured to perform the following operations: if the hand gesture data is an upward slide, the electronic device sends an acceleration instruction to the vehicle-mounted device, where the acceleration instruction is used to instruct the vehicle-mounted device to increase the speed of the vehicle by a first preset value; if the hand gesture data is a downward slide, the electronic device sends a deceleration instruction to the vehicle-mounted device, where the deceleration instruction is used to instruct the vehicle-mounted device to reduce the speed of the vehicle by a second preset value; if the hand gesture data is a leftward slide, the electronic device sends a left lane-change instruction to the vehicle-mounted device to instruct the vehicle-mounted device to control the vehicle to change to the lane to the left of the current lane; and if the hand gesture data is a rightward slide, the electronic device sends a right lane-change instruction to the vehicle-mounted device to instruct the vehicle-mounted device to control the vehicle to change to the lane to the right of the current lane.
In one possible example, the android system further includes a kernel and a hardware layer, the kernel and the hardware layer further include an image signal processor, and in the aspect that the camera hardware abstraction module invokes the real-time enabling sensor to acquire image data, the instructions in the program are specifically configured to perform the following operations: the camera hardware abstraction module sends a camera data acquisition request to the image signal processor; the image signal processor calls the real-time enabling sensor to acquire image data; the image signal processor processes the image data according to a preset algorithm to update the image data; the image signal processor sends the updated image data to the camera hardware abstraction module.
In one possible example, in terms of the third party application obtaining media platform version information, the instructions in the program are specifically for: the third party application sends a request for obtaining version information to the media service module, wherein the request comprises an authentication code; the media service module authenticates the authentication code; and if the authentication code passes the authentication, the media service module sends the version information to the third party application.
The foregoing description of the embodiments of the present application has been presented primarily in terms of a method-side implementation. It will be appreciated that the electronic device, in order to achieve the above-described functions, includes corresponding hardware structures and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application may divide the functional units of the electronic device according to the above method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated in one control unit. The integrated units may be implemented in hardware or in software functional units. It should be noted that, in the embodiment of the present application, the division of the units is schematic, which is merely a logic function division, and other division manners may be implemented in actual practice.
Fig. 5 is a functional unit block diagram of an apparatus 500 for acquiring gesture data according to an embodiment of the present application. The apparatus 500 for acquiring gesture data is applied to an electronic device, where the electronic device includes a media service module and an android system, and the android system includes an application layer and a framework layer; the application layer is provided with a third party application, the framework layer includes a camera service module, and the apparatus 500 for acquiring gesture data includes a processing unit 501, wherein:
the processing unit 501 is configured to: obtain, by the third party application, the media platform version information; determine, by the third party application, at least one capability supported by a camera of the electronic device according to the media platform version information; determine, by the third party application, whether the at least one capability includes a gesture detection capability; if so, send, by the third party application, an acquisition request for gesture data to the media service module; interact, by the media service module, with the camera service module according to the acquisition request to obtain the gesture data; send, by the media service module, the gesture data to the third party application; and receive, by the third party application, the gesture data.
The apparatus 500 for acquiring gesture data may further include a communication unit 502 and a storage unit 503, where the storage unit 503 is configured to store program codes and data of an electronic device. The processing unit 501 may be a processor, the communication unit 502 may be a touch display or a transceiver, and the storage unit 503 may be a memory.
It can be seen that, in the embodiment of the present application, a third party application in an electronic device first obtains media platform version information; the third party application then determines, according to the version information, at least one capability supported by a camera of the electronic device; the third party application determines whether the at least one capability includes a gesture detection capability; if so, the third party application sends an acquisition request for gesture data to the media service module; the media service module interacts with the camera service module according to the acquisition request to obtain the gesture data; the media service module sends the gesture data to the third party application; and finally the third party application receives the gesture data. Therefore, the third party application in the embodiment of the present application can acquire system-level gesture data through the media service module, which enriches the service path of the gesture data and improves the compatibility of the electronic device.
In one possible example, the android system further includes a hardware abstraction layer, the hardware abstraction layer includes a camera hardware abstraction module, the electronic device is provided with a real-time enabling sensor, in that the media service module interacts with the camera service module according to the gesture data obtaining request to obtain gesture data, the processing unit 501 is specifically configured to: the media service module judges a control mode supported by the third party application according to the gesture data acquisition request, wherein the control mode comprises at least one of the following: face control and gesture control; the media service module sends an attitude acquisition request corresponding to the control mode to the camera service module; the camera service module sends a data acquisition instruction to the camera hardware abstraction module according to the gesture acquisition request; the camera hardware abstraction module calls the real-time enabling sensor to acquire image data; the camera hardware abstraction module sends the image data to the camera service module; the camera service module processes the image data to obtain gesture data corresponding to the control mode; the camera service module sends the gesture data to the media service module; the media service module receives the gesture data.
In one possible example, the third party application is audio playing software, the gesture data includes hand gesture data, and after the third party application receives the gesture data, the processing unit 501 is specifically configured to: if the hand gesture data is a leftward slide, switch, by the third party application, the current song to the previous song; and if the hand gesture data is a rightward slide, switch, by the third party application, the current song to the next song.
In one possible example, the third party application is audio playing software, the gesture data includes face data and hand gesture data, the face data includes the tilt angle of the face relative to a preset direction of the electronic device, and after the third party application receives the gesture data, the processing unit 501 is specifically configured to: if the tilt angle is within a first preset range and the hand gesture data is a leftward slide, switch the current playlist to the previous playlist, determine a first song in the previous playlist according to the current play mode, and switch the currently played song to the first song; and if the tilt angle is within a second preset range and the hand gesture data is a rightward slide, switch the current playlist to the next playlist, determine a second song in the next playlist according to the current play mode, and switch the currently played song to the second song, where the first preset range and the second preset range are different.
In one possible example, the electronic device is communicatively connected to a vehicle-mounted device, the third party application is navigation software, the vehicle-mounted device can implement driving control of the vehicle, the gesture data includes hand gesture data, and after the third party application receives the gesture data, the processing unit 501 is specifically configured to: if the hand gesture data is an upward slide, send an acceleration instruction to the vehicle-mounted device, where the acceleration instruction is used to instruct the vehicle-mounted device to increase the speed of the vehicle by a first preset value; if the hand gesture data is a downward slide, send a deceleration instruction to the vehicle-mounted device, where the deceleration instruction is used to instruct the vehicle-mounted device to reduce the speed of the vehicle by a second preset value; if the hand gesture data is a leftward slide, send a left lane-change instruction to the vehicle-mounted device to instruct the vehicle-mounted device to control the vehicle to change to the lane to the left of the current lane; and if the hand gesture data is a rightward slide, send a right lane-change instruction to the vehicle-mounted device to instruct the vehicle-mounted device to control the vehicle to change to the lane to the right of the current lane.
In one possible example, the android system further includes a kernel and a hardware layer, where the kernel and the hardware layer further include an image signal processor, and the processing unit 501 is specifically configured to: the camera hardware abstraction module sends a camera data acquisition request to the image signal processor; the image signal processor calls the real-time enabling sensor to acquire image data; the image signal processor processes the image data according to a preset algorithm to update the image data; the image signal processor sends the updated image data to the camera hardware abstraction module.
In one possible example, in terms of the third party application obtaining media platform version information, the processing unit 501 is specifically configured to: the third party application sends a request for obtaining version information to the media service module, wherein the request comprises an authentication code; the media service module authenticates the authentication code; and if the authentication code passes the authentication, the media service module sends the version information to the third party application.
The embodiment of the application also provides a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program makes a computer execute part or all of the steps of any one of the above method embodiments, and the computer includes a mobile terminal.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any one of the methods described in the method embodiments above. The computer program product may be a software installation package, said computer comprising a mobile terminal.
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations, but it should be understood by those skilled in the art that the present application is not limited by the order of actions described, as some steps may be performed in other order or simultaneously in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required in the present application.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts of an embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; for instance, the division into units described above is merely a division by logical function, and other manners of division are possible in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections via some interfaces, devices, or units, and may be electrical or take other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one control unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable memory. Based on such an understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, which includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the various embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those of ordinary skill in the art will appreciate that all or a portion of the steps in the various methods of the above embodiments may be implemented by a program that instructs associated hardware, and the program may be stored in a computer readable memory, which may include: flash disk, read-Only Memory (ROM), random access Memory (Random Access Memory, RAM), magnetic disk or optical disk.
The embodiments of the present application have been described in detail above. Specific examples are used herein to illustrate the principles and implementations of the present application, and the above description of the embodiments is intended only to help in understanding the methods of the present application and their core ideas. Meanwhile, a person skilled in the art may make changes to the specific implementations and the scope of application in accordance with the ideas of the present application. In view of the above, the content of this specification should not be construed as limiting the present application.

Claims (8)

1. A method for acquiring attitude data, characterized in that the method is applied to an electronic device, wherein the electronic device comprises a media service module and an android system, and the android system comprises an application layer and a framework layer; the application layer is provided with a third party application, and the framework layer comprises a camera service module; the android system further comprises a hardware abstraction layer, the hardware abstraction layer comprises a camera hardware abstraction module, and the electronic device is provided with a real-time enabling sensor; the android system further comprises a kernel and a hardware layer, and the kernel and the hardware layer further comprise an image signal processor; the method comprises the following steps:
the third party application obtains media platform version information;
the third party application determines at least one capability supported by a camera of the electronic device according to the media platform version information;
the third party application determines whether the at least one capability includes a gesture detection capability;
if so, the third party application sends an attitude data acquisition request to the media service module;
the media service module interacts with the camera service module according to the attitude data acquisition request to obtain attitude data, which comprises: the media service module determines, according to the attitude data acquisition request, a control mode supported by the third party application, where the control mode includes at least one of: face control and gesture control; the media service module sends an attitude acquisition request corresponding to the control mode to the camera service module; the camera service module sends a data acquisition instruction to the camera hardware abstraction module according to the attitude acquisition request; the camera hardware abstraction module invokes the real-time enabling sensor to acquire image data; the camera hardware abstraction module sends the image data to the camera service module; the camera service module processes the image data to obtain attitude data corresponding to the control mode; the camera service module sends the attitude data to the media service module; and the media service module receives the attitude data;
the media service module sends the attitude data to the third party application;
the third party application receives the attitude data;
the camera hardware abstraction module invoking the real-time enabling sensor to acquire image data, comprising: the camera hardware abstraction module sends a camera data acquisition request to the image signal processor, the image signal processor calls the real-time enabling sensor to acquire image data, the image signal processor processes the image data according to a preset algorithm to update the image data, and the image signal processor sends the updated image data to the camera hardware abstraction module.
2. The method of claim 1, wherein the third party application is audio playback software, the attitude data includes gesture data, and after the third party application receives the attitude data, the method further comprises:
if the gesture data is a leftward swipe, the third party application switches the currently played song to the previous song;
and if the gesture data is a rightward swipe, the third party application switches the currently played song to the next song.
3. The method of claim 1, wherein the third party application is audio playback software, the attitude data includes face data and gesture data, the face data includes an inclination angle between a face and a preset direction of the electronic device, and after the third party application receives the attitude data, the method further comprises:
if the inclination angle is within a first preset range and the gesture data is a leftward swipe, switching the current playlist to the previous playlist, determining a first song in the previous playlist according to the current play mode, and switching the currently played song to the first song;
if the inclination angle is within a second preset range and the gesture data is a rightward swipe, switching the current playlist to the next playlist, determining a second song in the next playlist according to the current play mode, and switching the currently played song to the second song, wherein the first preset range and the second preset range are different.
4. The method of claim 1, wherein the electronic device is communicatively connected to an in-vehicle device, the third party application is navigation software, the in-vehicle device is capable of implementing driving control of a vehicle, the attitude data includes gesture data, and after the third party application receives the attitude data, the method further comprises:
if the gesture data is an upward swipe, the electronic device sends an acceleration instruction to the in-vehicle device, the acceleration instruction instructing the in-vehicle device to increase the speed of the vehicle by a first preset value;
if the gesture data is a downward swipe, the electronic device sends a deceleration instruction to the in-vehicle device, the deceleration instruction instructing the in-vehicle device to reduce the speed of the vehicle by a second preset value;
if the gesture data is a leftward swipe, the electronic device sends a left lane-change instruction to the in-vehicle device to instruct the in-vehicle device to control the vehicle to change lanes to the lane to the left of the current lane;
if the gesture data is a rightward swipe, the electronic device sends a right lane-change instruction to the in-vehicle device to instruct the in-vehicle device to control the vehicle to change lanes to the lane to the right of the current lane.
5. The method of any of claims 1-4, wherein the third party application obtaining media platform version information comprises:
the third party application sends a request for obtaining version information to the media service module, wherein the request comprises an authentication code;
the media service module authenticates the authentication code;
and if the authentication code passes the authentication, the media service module sends the version information to the third party application.
6. A device for acquiring attitude data, characterized in that the device is applied to an electronic device, wherein the electronic device comprises a media service module and an android system, and the android system comprises an application layer and a framework layer; the application layer is provided with a third party application, and the framework layer comprises a camera service module; the android system further comprises a hardware abstraction layer, the hardware abstraction layer comprises a camera hardware abstraction module, and the electronic device is provided with a real-time enabling sensor; the android system further comprises a kernel and a hardware layer, and the kernel and the hardware layer further comprise an image signal processor; the device for acquiring attitude data comprises a processing unit, wherein:
the processing unit is configured for the third party application to obtain media platform version information; for the third party application to determine, according to the media platform version information, at least one capability supported by a camera of the electronic device; for the third party application to determine whether the at least one capability includes a gesture detection capability; and, if so, for the third party application to send an attitude data acquisition request to the media service module; for the media service module to interact with the camera service module according to the attitude data acquisition request to obtain attitude data; for the media service module to send the attitude data to the third party application; and for the third party application to receive the attitude data;
in respect of the media service module interacting with the camera service module according to the attitude data acquisition request to obtain attitude data, the processing unit is specifically configured such that: the media service module determines, according to the attitude data acquisition request, a control mode supported by the third party application, where the control mode includes at least one of: face control and gesture control; the media service module sends an attitude acquisition request corresponding to the control mode to the camera service module; the camera service module sends a data acquisition instruction to the camera hardware abstraction module according to the attitude acquisition request; the camera hardware abstraction module invokes the real-time enabling sensor to acquire image data; the camera hardware abstraction module sends the image data to the camera service module; the camera service module processes the image data to obtain attitude data corresponding to the control mode; the camera service module sends the attitude data to the media service module; and the media service module receives the attitude data;
in respect of the camera hardware abstraction module invoking the real-time enabling sensor to acquire image data, the processing unit is specifically configured such that: the camera hardware abstraction module sends a camera data acquisition request to the image signal processor; the image signal processor invokes the real-time enabling sensor to acquire image data; the image signal processor processes the image data according to a preset algorithm to update the image data; and the image signal processor sends the updated image data to the camera hardware abstraction module.
7. An electronic device comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of any of claims 1-5.
8. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 1-5.
CN201911253915.2A 2019-12-09 2019-12-09 Method and related device for acquiring attitude data Active CN111045518B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911253915.2A CN111045518B (en) 2019-12-09 2019-12-09 Method and related device for acquiring attitude data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911253915.2A CN111045518B (en) 2019-12-09 2019-12-09 Method and related device for acquiring attitude data

Publications (2)

Publication Number Publication Date
CN111045518A CN111045518A (en) 2020-04-21
CN111045518B true CN111045518B (en) 2023-06-30

Family

ID=70235324

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911253915.2A Active CN111045518B (en) 2019-12-09 2019-12-09 Method and related device for acquiring attitude data

Country Status (1)

Country Link
CN (1) CN111045518B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111061524A (en) * 2019-12-09 2020-04-24 Oppo广东移动通信有限公司 Application data processing method and related device
CN111953848B (en) * 2020-08-19 2022-03-11 Oppo广东移动通信有限公司 System, method, related device and storage medium for realizing application function through context awareness
CN112667079A (en) * 2020-12-25 2021-04-16 海信视像科技股份有限公司 Virtual reality equipment and reverse prompt picture display method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017186007A1 (en) * 2016-04-26 2017-11-02 斑马网络技术有限公司 Media processing method, device, apparatus and system
CN110169056A (en) * 2016-12-12 2019-08-23 华为技术有限公司 A kind of method and apparatus that dynamic 3 D image obtains
CN107317918A (en) * 2017-05-26 2017-11-03 广东欧珀移动通信有限公司 Parameter setting method and related product
WO2018214734A1 (en) * 2017-05-26 2018-11-29 Oppo广东移动通信有限公司 Photographing control method and related product
WO2019072132A1 (en) * 2017-10-11 2019-04-18 Oppo广东移动通信有限公司 Face recognition method and related product
WO2019208915A1 (en) * 2018-04-25 2019-10-31 삼성전자 주식회사 Electronic device for acquiring image using plurality of cameras through position adjustment of external device, and method therefor
CN110191280A (en) * 2019-05-24 2019-08-30 Oppo广东移动通信有限公司 The photographic method and Related product shown based on cover board
CN110502109A (en) * 2019-07-31 2019-11-26 Oppo广东移动通信有限公司 Information processing method, device, electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
姚远; 程依平; 熊勇. Design of a wireless monitoring terminal based on an Arduino attitude data cloud. 物流科技, 2018, (05), full text. *
徐战亚; 吴信才. A portable embedded GIS platform for mobile spatial information services. 地球科学(中国地质大学学报), 2010, (03), full text. *

Also Published As

Publication number Publication date
CN111045518A (en) 2020-04-21

Similar Documents

Publication Publication Date Title
JP6966572B2 (en) Signature generation method, electronic devices and storage media
CN111045518B (en) Method and related device for acquiring attitude data
CN110148294B (en) Road condition state determining method and device
US20200173804A1 (en) Map display method, device, storage medium and terminal
KR102130503B1 (en) Mobile terminal
CN104742903B (en) Realize the method and device of cruise
KR102225918B1 (en) Artificial intelligence device
CN112307642B (en) Data processing method, device, system, computer equipment and storage medium
KR20160071887A (en) Mobile terminal and method for controlling the same
KR20100124591A (en) Mobile terminal system and control method thereof
KR101692433B1 (en) Method, system and mobile terminal for video service providing
US20180307819A1 (en) Terminal control method and terminal, storage medium
CN105450883A (en) An operating system conversion device and method, an operating system transmission device and method, and a vehicle
CN110044638A (en) Test method, device and the storage medium of lane holding function
CN113160427A (en) Virtual scene creating method, device, equipment and storage medium
CN110991369A (en) Image data processing method and related device
CN110493635B (en) Video playing method and device and terminal
CN111241499A (en) Application program login method, device, terminal and storage medium
KR20160090584A (en) Display device and method for controlling the same
CN110457082A (en) The method, apparatus and storage medium that control application program is registered
CN112330380B (en) Order creation method, order creation device, computer equipment and computer readable storage medium
CN115623271A (en) Processing method of video to be injected and electronic equipment
CN110941344B (en) Method for obtaining gazing point data and related device
KR101774807B1 (en) Mobile terminal and operation method thereof
KR20130015975A (en) Apparatus and method for detecting a vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant