CN113031838A - Screen recording method and device and electronic equipment - Google Patents

Screen recording method and device and electronic equipment

Info

Publication number
CN113031838A
CN113031838A (application CN201911357660.4A; granted as CN113031838B)
Authority
CN
China
Prior art keywords
control
screen recording
operation instruction
screen
file
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911357660.4A
Other languages
Chinese (zh)
Other versions
CN113031838B (en)
Inventor
王小童
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Device Co Ltd
Petal Cloud Technology Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201911357660.4A priority Critical patent/CN113031838B/en
Publication of CN113031838A publication Critical patent/CN113031838A/en
Application granted granted Critical
Publication of CN113031838B publication Critical patent/CN113031838B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3438Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment monitoring of user actions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A screen recording method comprises the following steps: when recording a screen, acquiring an operation instruction and operation time of a user in an application program interface; acquiring a control identifier corresponding to the operation instruction; and generating a screen recording file according to the operation time, the operation instruction, and the control identifier corresponding to the operation instruction. Because a control identifier uniquely determines a UI control within its application program, when the playback device differs from the recording device, for example when the screens have different sizes or resolutions, or when the screen state changes, the corresponding UI control can still be accurately found according to the control identifier. The screen recording file can therefore be accurately played back, which improves its adaptability to different devices.

Description

Screen recording method and device and electronic equipment
Technical Field
The application belongs to the technical field of screen recording, and particularly relates to a screen recording method and device and electronic equipment.
Background
Screen recording refers to recording the content shown on the display screen of an electronic device and synthesizing a video or an instruction set that captures how the screen content changes. Screen recording makes it convenient for users to share screen content, for example a solution to a problem or an operation tutorial.
In current screen recording practice, in order to enable recorded instructions to be played back, an accessibility function of the system is usually used to collect the user's operation instructions, optionally together with the trigger position corresponding to each instruction. The operation instructions may include keyboard input, touch instructions, and the like. During playback, the collected operation instructions are replayed, optionally at the recorded trigger positions. However, when the playback device and the recording device have different screen sizes or resolutions, or the screen display state switches between landscape and portrait, playback may fail; the screen recording file therefore adapts poorly to the playback device.
Disclosure of Invention
The embodiments of the present application provide a screen recording method and apparatus and an electronic device, which can solve the problem that a screen recording file produced in the prior art is easily affected by the device or the screen state, causing playback failure.
In a first aspect, an embodiment of the present application provides a screen recording method, which includes: when recording a screen, acquiring an operation instruction and operation time of a user in an application program interface; acquiring a control identifier corresponding to the operation instruction; and generating a screen recording file according to the operation time, the operation instruction, and the control identifier corresponding to the operation instruction.
The application program interface may include one or more UI controls, and when it includes multiple UI controls, two or more of them may be stacked. When the screen is recorded, the operation time and the control identifier corresponding to each operation instruction are obtained, and the screen recording file is generated from the correspondence between the operation instruction, the operation time, and the control identifier. Because a control identifier uniquely determines a UI control within the application program, the corresponding UI control can be accurately located in the application program interface when the screen recording file is played back. Therefore, even when the playback device differs from the recording device, when the screens have different sizes or resolutions, or when the screen state changes, the UI control corresponding to an operation instruction can still be found accurately from its control identifier, the screen recording file can be played back correctly, and its adaptability to different devices is improved.
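The recording step described above can be sketched as follows. This is a minimal illustrative sketch in Python, not part of the patent; a real recorder would hook into the platform's input and accessibility pipeline, and all names here are assumptions:

```python
import json

class ScreenRecorder:
    """Stores each user operation as a (time, instruction, control identifier)
    record instead of raw screen coordinates, so playback can re-locate the
    control on any device."""

    def __init__(self):
        self.events = []

    def record(self, op_time_ms, instruction, control_id):
        self.events.append({
            "time": op_time_ms,          # operation time
            "instruction": instruction,  # e.g. "tap", "long_press"
            "control_id": control_id,    # uniquely identifies the UI control
        })

    def to_instruction_file(self):
        # Serialize the events, ordered by operation time, as the
        # instruction part of the screen recording file.
        return json.dumps({"version": 1,
                           "events": sorted(self.events,
                                            key=lambda e: e["time"])})
```

Because only the control identifier is stored, the same instruction file can be replayed on a device with a different resolution by resolving the identifier against that device's current layout.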
In some implementations, if an application upgrade changes the identifiers of UI controls in the application, the corresponding control identifiers in the screen recording file may be updated according to the change information, so that the screen recording file adapts to scenarios in which control identifiers change because of an application upgrade.
Illustratively, the step of obtaining the control identifier corresponding to the operation instruction includes: acquiring the coordinate position of the operation instruction on the screen; searching for one or more UI controls corresponding to the coordinate position according to the update information of the UI controls; and obtaining, from the found UI controls, the control identifier corresponding to the operation instruction.
When determining the control identifier corresponding to an operation instruction, the layout information of the UI controls at the operation time can be determined from the real-time update information of the UI controls, that is, which UI controls the application program interface contains and where they are positioned. By combining the position information of the UI controls with the coordinate position triggered by the operation instruction, the UI controls corresponding to the operation instruction can be found, and the control identifier is then obtained from them.
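The coordinate-to-control lookup can be sketched as a simple hit test over the current layout. The dict shape (`id`, `bounds`) is an assumption for illustration, not the patent's actual data structure:

```python
def controls_at(controls, x, y):
    """Return every control whose bounding box contains the point (x, y).
    Each control is a dict with an 'id' and 'bounds' = (left, top, right,
    bottom), mirroring the layout information of the current interface."""
    hits = []
    for control in controls:
        left, top, right, bottom = control["bounds"]
        if left <= x < right and top <= y < bottom:
            hits.append(control)
    return hits
```

When several controls overlap at the touched point, all of them are returned, and the stacking-order search described next picks the one that actually handles the event.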
In one implementation, when the coordinate position corresponds to multiple UI controls, the step of obtaining the control identifier corresponding to the operation instruction from the found UI controls includes: acquiring the layer stacking order of the found UI controls; and searching the UI controls in that stacking order to obtain the control identifier of the UI control corresponding to the operation instruction.
When searching, the UI controls can be traversed from top to bottom in the layer stacking order; for each one, the events it listens for are obtained; and when the operation instruction matches an event listened for by the UI control, the control identifier of that UI control is obtained. That is, if the list of events monitored by a UI control includes the event corresponding to the operation instruction, its control identifier is taken as the control identifier corresponding to the operation.
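The top-down stacking-order search can be sketched as follows. The `z` field and `listens_for` list are assumed names for the layer order and the control's monitored-event list:

```python
def resolve_control_id(hit_controls, instruction):
    """Walk the overlapping controls from the topmost layer downward and
    return the identifier of the first control whose listened-for events
    include the operation instruction. 'z' is the layer stacking order
    (larger means closer to the top)."""
    for control in sorted(hit_controls, key=lambda c: c["z"], reverse=True):
        if instruction in control.get("listens_for", ()):
            return control["id"]
    return None  # no control at this position listens for the event
```

This mirrors how input dispatch normally works: a transparent overlay that does not listen for taps will not capture a tap meant for the button beneath it.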
To record the screen more comprehensively, when the home screen desktop is displayed, the launch process of an application program may also be recorded, including: when the screen displays the desktop home page, monitoring for an application launch instruction; and acquiring the package name of the launched application and recording the correspondence between the application launch instruction and the package name. Because a package name uniquely corresponds to an application, no matter how the position of the application icon on the home screen changes, the icon can be accurately located through the package name during playback, and the application can be correctly launched according to the correspondence between the package name and the launch instruction.
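Recording the launch step then amounts to binding the instruction to the package name instead of to the icon's coordinates. A hedged sketch, with illustrative field names:

```python
def record_launch(events, op_time_ms, package_name):
    """Record an application launch as an instruction bound to the package
    name rather than to the icon's screen position, so the icon can be
    re-located on playback even if the home screen layout has changed."""
    events.append({"time": op_time_ms,
                   "instruction": "launch_app",
                   "package": package_name})
```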
To make it more convenient for the user to obtain information when the screen recording file is played back, the method may further include: acquiring the control content corresponding to the operation instruction; generating subtitle information for the operation instruction from the operation instruction and the control content; and generating a subtitle file from the subtitle information and the operation time. When the screen recording file is played, the current operation instruction and its operation object can then be prompted in time through the subtitle file.
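Pairing the operation instruction with the control content and the operation time maps naturally onto a timed subtitle format. The sketch below renders such entries as SubRip (SRT) text; SRT is an assumption for illustration, since the patent does not mandate a particular subtitle format:

```python
def to_srt(entries):
    """Render subtitle entries in SubRip (SRT) form. Each entry is
    (start_ms, end_ms, instruction, control_content)."""
    def ts(ms):
        hours, rem = divmod(ms, 3_600_000)
        minutes, rem = divmod(rem, 60_000)
        seconds, millis = divmod(rem, 1_000)
        return f"{hours:02}:{minutes:02}:{seconds:02},{millis:03}"

    blocks = []
    for index, (start, end, instruction, content) in enumerate(entries, 1):
        blocks.append(f"{index}\n{ts(start)} --> {ts(end)}\n"
                      f"{instruction}: {content}\n")
    return "\n".join(blocks)
```

Any ordinary video player that understands SRT can then show "tap: Send button" at the moment the recorded tap occurs.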
In one implementation, the step of generating a screen recording file according to the operation time, the operation instruction, and the control identifier corresponding to the operation instruction includes: generating a video file from the recorded screen content; generating an instruction file from the operation instruction, the operation time, and the control identifier corresponding to the operation instruction; and generating the screen recording file from the video file together with the instruction file and/or the subtitle file. The screen recording file thus comprises a video file, a subtitle file, and an instruction file. When the video file is played, a player can parse the subtitle information in the subtitle file and play the subtitles synchronously with the video content. A specific player can parse the screen recording file, recognize the identifier corresponding to the instruction file, and thereby obtain the instruction file included in the screen recording file. When an ordinary player cannot parse the instruction file, it can still play back the recording through the parsed video file and subtitle file.
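One way to picture the composite file is a container with a recognizable identifier for the instruction part. The JSON manifest and the `"SCREENREC/1"` magic string below are assumptions for illustration; the patent does not specify a container format:

```python
import json

def make_recording_file(video_path, instruction_events, srt_text):
    """Bundle the three parts of a screen recording file into one JSON
    manifest. A format-aware player checks the 'magic' identifier and uses
    the instruction events; an ordinary player can fall back to the video
    and subtitles alone."""
    return json.dumps({
        "magic": "SCREENREC/1",   # identifier a specific player recognizes
        "video": video_path,      # path or reference to the video file
        "subtitles": srt_text,    # subtitle file contents
        "instructions": instruction_events,
    })
```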
In a possible implementation, a playback parameter of the screen recording file may also be set. The playback parameter may include one or more of: an execution condition that the device position belongs to a predetermined position range, an execution condition that the start time belongs to a predetermined time range, a step-by-step execution parameter, and a number of executions. For example, the current position of the electronic device may be compared with the predetermined position range, and playback is not performed if the current position is outside that range; or the current time may be compared with the predetermined time range, and playback is not performed if the current time is outside that range. This improves the security and intelligence of the screen recording file. Alternatively, step-by-step execution can be set, that is, the next operation instruction is executed only after the user confirms each step, which makes it convenient for the user to follow the operation process. Alternatively, a playback count may be set so that playback is automatically performed the set number of times.
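The playback-parameter checks above can be sketched as a single gate function. Key names and value shapes are assumptions for illustration:

```python
def playback_allowed(params, now_ms, position, plays_so_far):
    """Gate playback on the recording's playback parameters. Supported keys
    (all optional): 'time_range' = (start_ms, end_ms), 'position_range' =
    ((lat_min, lat_max), (lon_min, lon_max)), 'max_plays' = int."""
    time_range = params.get("time_range")
    if time_range and not (time_range[0] <= now_ms <= time_range[1]):
        return False  # outside the predetermined time range
    position_range = params.get("position_range")
    if position_range:
        (lat_lo, lat_hi), (lon_lo, lon_hi) = position_range
        lat, lon = position
        if not (lat_lo <= lat <= lat_hi and lon_lo <= lon <= lon_hi):
            return False  # outside the predetermined position range
    if plays_so_far >= params.get("max_plays", float("inf")):
        return False  # playback count exhausted
    return True
```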
In a second aspect, an embodiment of the present application provides a playback method for a screen recording file, including: parsing the screen recording file to obtain the control identifier and operation instruction corresponding to each playback time; searching the screen for the coordinate position of the corresponding control according to the control identifier; and triggering the operation event corresponding to the operation instruction at the found coordinate position.
It can be understood that, according to the control identifier included in the screen recording file, the corresponding UI control can be found in the interface of the corresponding application program, its position determined, and the operation instruction triggered at that position, so that the operation instruction is played back correctly and effectively on the intended UI control.
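The playback loop described above can be sketched as follows, reusing the same illustrative record shape as the recording sketch (field names are assumptions). The key point is that coordinates are computed from the control's *current* bounds, not the recorded ones:

```python
def play_back(events, current_layout, dispatch):
    """Replay recorded events: resolve each control identifier against the
    current layout, then trigger the instruction at the control's present
    centre point, so a different screen size does not break playback."""
    for event in sorted(events, key=lambda e: e["time"]):
        control = next((c for c in current_layout
                        if c["id"] == event["control_id"]), None)
        if control is None:
            continue  # control not in this interface; skip or report an error
        left, top, right, bottom = control["bounds"]
        centre = ((left + right) // 2, (top + bottom) // 2)
        dispatch(event["instruction"], centre)
```

Here `dispatch` stands in for whatever mechanism injects the input event on the playback device.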
In one implementation, the method further comprises: obtaining, from the screen recording file, the package name corresponding to an application launch instruction; searching the screen for the position of the application icon corresponding to that package name; and triggering the operation event corresponding to the application launch instruction at the found icon position.
It can be understood that, by recording the correspondence between the package name and the application launch instruction, when the screen recording file includes an application launch instruction, the package name corresponding to that instruction is determined from the screen recording file, and the application icon corresponding to the package name is searched for on the home screen. Because the package name is unique, correct playback of the screen recording file is still achieved even when the position of the application icon has changed.
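Replaying a launch then becomes a lookup in the current home screen's icon map rather than a tap at a recorded coordinate. A hedged sketch with illustrative names:

```python
def replay_launch(event, home_icons, dispatch):
    """Replay an application launch: look up the recorded package name in
    the current home screen's icon map (package name -> icon position) and
    tap the icon wherever it is now."""
    position = home_icons.get(event["package"])
    if position is None:
        return False  # application not installed or icon not found
    dispatch("tap", position)
    return True
```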
To further improve the convenience of obtaining operation information when the screen recording file is played back, the subtitle file included in the screen recording file can be obtained, the subtitle file including subtitle information and operation times; and the subtitle information is played, according to the operation times, while the video file in the screen recording file is played.
The subtitle information can include the operation instruction content and the UI control content; from the prompts in the subtitle information, the user can clearly understand the current operation object and operation mode and thus obtain information from the screen recording file more efficiently.
In addition, playback parameters may also be obtained from the screen recording file, and playback conditions may be controlled according to them, for example whether the position of the playback device falls within a predetermined position range, or whether the playback time falls within a predetermined time range. The playback mode can also be set to step-by-step playback, in which playing pauses after each operation instruction and continues to the next instruction only after a confirmation instruction from the user is received. Playback may also loop automatically according to a predetermined playback count, and so on.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the screen recording method according to any one of the first aspect, or implements the playback method of the screen recording file according to any one of the second aspect when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored, and the computer program, when executed by a processor, implements the screen recording method according to any one of the first aspect, or implements the playback method of the screen recording file according to any one of the second aspect.
It is understood that the electronic device according to the third aspect and the computer-readable storage medium according to the fourth aspect are configured to perform the corresponding methods provided above; therefore, for the beneficial effects they achieve, reference may be made to the beneficial effects of the corresponding methods, which are not repeated here.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a schematic diagram of a hardware structure of a mobile phone to which a screen recording method or a screen recording file playback method provided in an embodiment of the present application is applied;
FIG. 2 is a system level diagram of an implementation device according to an embodiment of the present application;
FIG. 3 is a schematic diagram illustrating a suitable scenario of a screen recording and playback method according to an embodiment of the present application;
fig. 4 is a schematic view of another scene of the screen recording and playback method according to the embodiment of the present application;
fig. 5 is a schematic view of another scene of the screen recording and playback method according to the embodiment of the present application;
fig. 6 is a schematic flowchart illustrating an implementation process of a screen recording method according to an embodiment of the present application;
fig. 7 is a schematic flow chart illustrating an implementation process of obtaining a control identifier corresponding to an operation instruction according to an embodiment of the present application;
fig. 8 is a schematic view of an implementation flow of finding a UI control according to an embodiment of the present application;
fig. 9 is a schematic view of an implementation process for generating a subtitle file according to an embodiment of the present application;
fig. 10 is a schematic view of subtitle information presentation according to an embodiment of the present application;
fig. 11 is a schematic view of an implementation scenario of a screen recording or playback method according to an embodiment of the present application;
fig. 12 is a schematic flowchart illustrating an implementation flow of a playback method for a screen recording file according to an embodiment of the present application;
fig. 13 is a block diagram of a screen recording apparatus according to an embodiment of the present disclosure;
fig. 14 is a block diagram illustrating the structure of a playback apparatus for a screen recording file according to an embodiment of the present application;
fig. 15 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The terminology used in the following examples is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, such as "one or more", unless the context clearly indicates otherwise. It should also be understood that in the embodiments of the present application, "one or more" means one, two, or more than two; "and/or" describes the association relationship of associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
The screen recording method or the screen recording file playback method provided by the embodiment of the application can be applied to terminal devices such as a mobile phone, a tablet personal computer, a wearable device, a vehicle-mounted device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, a super-mobile personal computer (UMPC), a netbook, and a Personal Digital Assistant (PDA), and the embodiment of the application does not limit the specific types of the terminal devices at all.
For example, the terminal device may be a Station (ST) in a WLAN, and may be a cellular phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a Wireless Local Loop (WLL) station, a Personal Digital Assistant (PDA) device, a handheld device with wireless communication capability, a computing device or other processing device connected to a wireless modem, a vehicle-mounted device, a vehicle-networking terminal, a computer, a laptop, a handheld communication device, a handheld computing device, a satellite wireless device, a wireless modem card, a television set-top box (STB), Customer Premises Equipment (CPE), and/or other devices for communicating over a wireless system, as well as a device in a next-generation communication system, such as a mobile terminal in a 5G network or in a future evolved Public Land Mobile Network (PLMN), etc.
By way of example and not limitation, when the terminal device is a wearable device, the wearable device may be a general term for devices designed for daily wear using wearable technology, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not merely a hardware device; it realizes powerful functions through software support, data interaction, and cloud interaction. Broadly, wearable intelligent devices include full-featured, large-sized devices that can realize all or part of their functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on a certain application function and need to be used together with other devices such as a smartphone, for example various smart bracelets and smart jewelry for monitoring physical signs.
Take the terminal device as a mobile phone as an example. Fig. 1 is a block diagram illustrating a partial structure of a mobile phone according to an embodiment of the present disclosure. Referring to fig. 1, the cellular phone includes: a Radio Frequency (RF) circuit 110, a memory 120, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a wireless fidelity (WiFi) module 170, a processor 180, and a power supply 190. Those skilled in the art will appreciate that the handset configuration shown in fig. 1 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 1:
The RF circuit 110 may be used for receiving and transmitting signals during information transmission and reception or during a call; in particular, it receives downlink information from a base station and passes it to the processor 180 for processing, and it transmits uplink data to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuitry 110 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
The memory 120 may be used to store software programs and modules, and the processor 180 executes various functional applications and data processing of the mobile phone by operating the software programs and modules stored in the memory 120. The memory 120 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 120 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The input unit 130 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile phone 100. Specifically, the input unit 130 may include a touch panel 131 and other input devices 132. The touch panel 131, also referred to as a touch screen, may collect touch operations of a user on or near it (e.g., operations performed by the user on or near the touch panel 131 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Alternatively, the touch panel 131 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 180, and can receive and execute commands sent by the processor 180. In addition, the touch panel 131 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 131, the input unit 130 may include other input devices 132, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and an on/off key), a trackball, a mouse, a joystick, and the like.
The display unit 140 may be used to display information input by a user or information provided to the user and various menus of the mobile phone. The Display unit 140 may include a Display panel 141, and optionally, the Display panel 141 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 131 can cover the display panel 141, and when the touch panel 131 detects a touch operation on or near the touch panel 131, the touch operation is transmitted to the processor 180 to determine the type of the touch event, and then the processor 180 provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although the touch panel 131 and the display panel 141 are shown as two separate components in fig. 1 to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 131 and the display panel 141 may be integrated to implement the input and output functions of the mobile phone.
The handset 100 may also include at least one sensor 150, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 141 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 141 and/or the backlight when the mobile phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The audio circuit 160, speaker 161, and microphone 162 may provide an audio interface between the user and the handset. The audio circuit 160 may transmit an electrical signal converted from received audio data to the speaker 161, which converts it into a sound signal for output; on the other hand, the microphone 162 converts a collected sound signal into an electrical signal, which is received by the audio circuit 160 and converted into audio data. The audio data is then output to the processor 180 for processing and transmitted via the RF circuit 110 to, for example, another mobile phone, or output to the memory 120 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 170, the mobile phone can help the user receive and send e-mails, browse web pages, access streaming media, and the like, providing wireless broadband Internet access for the user. Although fig. 1 shows the WiFi module 170, it is understood that it is not an essential component of the handset 100 and can be omitted entirely as needed without changing the essence of the invention.
The processor 180 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby integrally monitoring the mobile phone. Alternatively, processor 180 may include one or more processing units; preferably, the processor 180 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 180.
The handset 100 also includes a power supply 190 (e.g., a battery) for powering the various components, which may preferably be logically connected to the processor 180 via a power management system, such that the power management system may be used to manage charging, discharging, and power consumption.
Although not shown, the handset 100 may also include a camera. Optionally, the position of the camera on the mobile phone 100 may be front-located or rear-located, which is not limited in this embodiment of the application.
Optionally, the mobile phone 100 may include a single camera, a dual camera, or a triple camera, which is not limited in this embodiment.
For example, the cell phone 100 may include three cameras, one being a main camera, one being a wide camera, and one being a tele camera.
Optionally, when the mobile phone 100 includes a plurality of cameras, the plurality of cameras may be all front-mounted, all rear-mounted, or a part of the cameras front-mounted and another part of the cameras rear-mounted, which is not limited in this embodiment of the present application.
In addition, although not shown, the mobile phone 100 may further include a bluetooth module or the like, which is not described herein.
Fig. 2 is a schematic diagram of a software structure of the mobile phone 100 according to the embodiment of the present application. Taking the operating system of the mobile phone 100 as an Android system as an example, in some embodiments, the Android system is divided into four layers, which are an application layer, an application Framework (FWK) layer, a system layer and a hardware abstraction layer, and the layers communicate with each other through a software interface.
As shown in fig. 2, the application layer may be a series of application packages, which may include short message, calendar, camera, video, navigation, gallery, call, and other applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer may include some predefined functions, such as functions for receiving events sent by the application framework layer.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a resource manager, and a notification manager, among others.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a short dwell, and does not require user interaction. Such as a notification manager used to inform download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scroll bar text at the top status bar of the system, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, flashing an indicator light, etc.
The application framework layer may further include:
The view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and the like. The view system may be used to build applications. A display interface may be composed of one or more views. For example, a display interface including a short message notification icon may include a view for displaying text and a view for displaying a picture.
The phone manager is used to provide the communication functions of the handset 100. Such as management of call status (including on, off, etc.).
The system layer may include a plurality of functional modules. For example: a sensor service module, a physical state identification module, a three-dimensional graphics processing library (such as OpenGL ES), and the like.
The sensor service module is used for monitoring sensor data uploaded by various sensors in a hardware layer and determining the physical state of the mobile phone 100;
the physical state recognition module is used for analyzing and recognizing user gestures, human faces and the like;
the three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The system layer may further include:
the surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, and the like.
The hardware abstraction layer is a layer between hardware and software. The hardware abstraction layer may include a display driver, a camera driver, a sensor driver, etc. for driving the relevant hardware of the hardware layer, such as a display screen, a camera, a sensor, etc.
The following embodiments may be implemented on the cellular phone 100 having the above-described hardware structure/software structure. The following embodiments will describe a screen recording method or a screen recording file playback method provided in the embodiments of the present application, taking the mobile phone 100 as an example.
Fig. 3 is a schematic view of an implementation scenario of a screen recording method and a screen recording file playback method according to an embodiment of the present application. The device that performs screen recording and the device that plays back the screen recording file may be the same device or different devices. For convenience of description, the device that records the screen is called the first device, and the device that plays back the screen recording file is called the second device. When application A is displayed on the first device, the display area and display position of UI control X1 (for example, the center position of UI control X1) may differ from those when application A is displayed on the second device. Therefore, recording the correspondence between the position of the UI control and the operation instruction cannot reliably implement playback of the operation instruction. The screen recording method and the screen recording file playback method of the present application instead record the correspondence between the operation instruction and the UI control itself, so that even if the position of the UI control changes, the UI control corresponding to the operation instruction can still be determined.
Fig. 4 is a schematic view of an implementation scenario of another screen recording method and screen recording file playback method provided in the embodiment of the present application. Here, screen recording and playback of the screen recording file are completed on two devices with different screen sizes. For convenience of description, the device that records the screen is called the first device, and the device that plays back the screen recording file is called the second device. The first device and the second device may differ in screen size, in the pixel aspect ratio of the screen, or in screen resolution. Due to these differences, the position and area in which the same UI control X of application A is displayed on the first device may differ from those on the second device. Therefore, recording the correspondence between the position of the UI control and the operation instruction cannot reliably implement playback of the operation instruction. The screen recording method and the screen recording file playback method of the present application instead record the correspondence between the operation instruction and the UI control itself, so that even if the position of the UI control changes, the UI control corresponding to the operation instruction can still be determined.
Fig. 5 is a schematic view of an implementation scenario of yet another screen recording method and screen recording file playback method provided in the embodiment of the present application. As shown in fig. 5, the application interface on the screen includes stacked UI controls. If only the correspondence between the position of a UI control and the operation instruction is recorded during screen recording, and the stacking order of the UI controls at that position, or their response behavior, changes later, the picture content during playback may differ from the recorded picture. To overcome this defect, the present application searches for the UI control corresponding to an operation instruction based on a control identifier defined by the application, so that the UI control corresponding to the operation instruction can still be found even when UI controls are stacked.
Fig. 6 shows a schematic flowchart of a screen recording method or a screen recording file playback method provided by the present application, and the method can be applied to the mobile phone 100 described above by way of example and not limitation.
In step S601, when recording a screen, acquiring an operation instruction and an operation time of a user in an application program interface;
after entering the application program interface, a plurality of UI controls are usually displayed on the screen, and a plurality of operation instructions input by the user can be received. In order to record the operation process of the user in the application program, the operation instruction and the operation time of the user in the application program interface can be acquired, and the operation instruction determined according to the operation time can be played back sequentially according to the operation time during playback.
When the received operation instruction of the user is a key instruction, since the key instruction usually does not have a specific trigger position, a key event corresponding to the key instruction and operation time corresponding to the key event can be recorded.
In one implementation, in order to make the current operation content clearly known during playback, subtitle information corresponding to the operation time of the key instruction may be generated according to the key content of the key instruction. For example, when it is detected that the user clicks the ok button, the subtitle information "click the 'ok' button" may be automatically generated; when the user clicks the volume up button, the subtitle information "click 'volume +' button", etc. is automatically generated.
When the operation instruction is a touch instruction, the operation instruction includes, for example, sliding, clicking, and the like. When the operation instruction is a slide instruction, the start time of the slide instruction may be used as the operation time of the operation instruction, and when the operation instruction is a click instruction, the click time of the click instruction may be used as the operation time of the operation instruction.
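The capture of operation instructions and operation times described above can be sketched in a few lines. The following is an illustrative Python sketch, not the patent's implementation; the `RecordedOperation` fields and the `make_operation` helper are hypothetical names chosen for illustration.

```python
from dataclasses import dataclass

@dataclass
class RecordedOperation:
    op_time: float  # operation time, relative to the start of recording
    op_type: str    # "key", "click", or "slide"
    payload: dict   # e.g. key code, or touch coordinates

def make_operation(op_type, payload, start_ts, now_ts):
    # For a slide instruction, the caller passes the slide's start
    # timestamp as now_ts; for a click instruction, the click timestamp.
    return RecordedOperation(op_time=now_ts - start_ts,
                             op_type=op_type, payload=payload)

# A click received 2.5 s after recording started:
op = make_operation("click", {"x": 120, "y": 640}, start_ts=100.0, now_ts=102.5)
print(op.op_time)  # → 2.5
```

Recording the time relative to the start of the recording lets the playback side replay operations in order against the video timeline.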
In step S602, acquiring a control identifier corresponding to the operation instruction;
in an application program interface, a plurality of UI controls are generally included for receiving different control instructions. For example, clicking the position a and the position B in the application program interface may correspond to different UI controls, and trigger contents corresponding to the different UI controls. The control identification is preset during application program development, and in the same application program, the UI control in the application program can be uniquely determined according to the control identification.
The process of obtaining the control identifier corresponding to the operation instruction may be as shown in fig. 7, and includes:
in step S701, acquiring a coordinate position of the operation instruction in a screen;
when the operation instruction is a click instruction, the click position of the click instruction in the screen can be used as the coordinate position of the operation instruction.
When the operation instruction is a sliding instruction, the coordinate position corresponding to the sliding operation instruction can be determined according to the initial position of the sliding instruction.
In step S702, one or more Ui controls corresponding to the coordinate position are searched according to the update information of the Ui control;
during screen recording, screen rendering information can be obtained in real time, from which the position information of each UI control in the screen is obtained. The UI controls found at the coordinate position may include multiple UI controls in the same layer, or multiple UI controls in different layers; UI controls in different layers may partially or completely overlap.
In step S703, a control identifier corresponding to the operation instruction is obtained from the searched UI control.
When the coordinate position of the operation instruction in the screen corresponds to a certain UI control in the same layer, the uniquely determined UI control can be directly searched according to the coordinate position corresponding to the operation instruction, and the control identifier of the UI control can be obtained according to the searched UI control.
When the coordinate position of the operation instruction in the screen corresponds to multiple UI controls in multiple layers, the layer stacking order of those UI controls at the operation time may be obtained. For example, suppose three UI controls correspond to the coordinate position and operation time of the operation instruction, namely UI control 1, UI control 2, and UI control 3, and their layer stacking order from top to bottom is UI control 3, UI control 2, UI control 1. The UI controls are then searched sequentially from top to bottom according to the layer stacking order to obtain the control identifier of the UI control corresponding to the operation instruction.
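The coordinate-to-controls lookup of steps S701 and S702 amounts to a hit test over the rendered controls, ordered by layer. Below is a minimal Python sketch under assumed data shapes (each control is a dict with hypothetical `id`, `bounds`, and `z` fields); an Android implementation would instead walk the view hierarchy.

```python
def controls_at(controls, x, y):
    """Return the controls whose bounds contain (x, y), ordered top-down.

    'bounds' is (left, top, right, bottom); 'z' is the layer stacking
    index, where a larger value means closer to the top of the stack.
    """
    hits = [c for c in controls
            if c["bounds"][0] <= x < c["bounds"][2]
            and c["bounds"][1] <= y < c["bounds"][3]]
    return sorted(hits, key=lambda c: c["z"], reverse=True)

controls = [
    {"id": "ui1", "bounds": (0, 0, 100, 100), "z": 1},
    {"id": "ui2", "bounds": (0, 0, 100, 100), "z": 2},
    {"id": "ui3", "bounds": (0, 0, 100, 100), "z": 3},
    {"id": "other", "bounds": (200, 0, 300, 100), "z": 1},
]
print([c["id"] for c in controls_at(controls, 50, 50)])  # → ['ui3', 'ui2', 'ui1']
```

The top-down ordering produced here is exactly the layer stacking order used by the subsequent search in step S703.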
When searching for the UI control, according to the difference of specific operation instructions and the difference of monitoring events defined by the UI control, at the same time and at the same position, the UI controls triggered by different operation instructions may be different, and in order to accurately obtain the UI control corresponding to the operation instruction, a process of searching for the UI control may be as shown in fig. 8, including:
in step S801, sequentially acquiring UI controls from top to bottom according to the layer stacking order;
a Z coordinate axis can be established perpendicular to the screen, and the layer stacking order of the UI controls at the coordinate position corresponding to the operation instruction, from top to bottom along the Z axis, can be obtained from the rendering information of the UI controls. For example, assume the layer stacking order from top to bottom is UI control 3, UI control 2, UI control 1.
In step S802, an event monitored by the UI control is acquired;
after the layer stacking order of the UI controls is determined, the event types monitored by the stacked UI controls are further determined. For example, a UI control may listen for one or more events, including a click event, a slide event, a long press event, and the like.
In step S803, when the operation instruction matches the event monitored by the UI control, a control identifier corresponding to the UI control is obtained.
The operation instruction is matched against the events monitored by the UI controls in sequence. If an event monitored by a UI control matches the operation instruction, the control identifier corresponding to that UI control is obtained, and the correspondence between the control identifier and the operation instruction is recorded.
For example, among the UI controls at the position of the operation instruction, UI control 3 monitors a click event, UI control 2 monitors a slide event, and UI control 1 monitors a long press event. When the operation instruction is a long press, it is compared from top to bottom: first with UI control 3, then with UI control 2, and finally with UI control 1, whose monitored event matches the operation instruction. UI control 1 is therefore determined to correspond to the operation instruction, and its control identifier can be obtained from the identification information of the UI control in the application. The control identifier may include a control ID, or a control ID together with a control name.
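The top-down matching of steps S801 through S803 can be sketched as follows. This is an illustrative Python sketch with assumed field names (`id`, `listens`), not the patent's implementation; on Android the "monitored events" would correspond to which listeners each view has registered.

```python
def find_target_control(stacked_controls, op_type):
    """Walk the stacked controls top-down and return the identifier of the
    first one that listens for the incoming operation type, or None."""
    for control in stacked_controls:  # already ordered top to bottom
        if op_type in control["listens"]:
            return control["id"]
    return None

stack = [
    {"id": "ui3", "listens": {"click"}},
    {"id": "ui2", "listens": {"slide"}},
    {"id": "ui1", "listens": {"long_press"}},
]
print(find_target_control(stack, "long_press"))  # → ui1
```

A long press skips past ui3 and ui2 because neither listens for it, reproducing the example in the text where UI control 1 is selected.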
In step S603, a screen recording file is generated according to the operation time, the operation instruction, and the control identifier corresponding to the operation instruction.
During screen recording, when an operation instruction is detected, the operation time corresponding to the operation instruction is recorded and the control identifier corresponding to the operation instruction is obtained. An instruction file can then be built from the control identifiers and operation instructions collected at the various operation times during screen recording, and playback can be performed according to the screen recording file generated from the instruction file. When the instruction file is played back, the corresponding operation instruction is selected according to the operation time, and the UI control corresponding to the control identifier is triggered. The instruction file and the video file can be encoded together, and the instruction file can be extracted by a specific video player according to a preset encoding identifier.
A video file is obtained from the screen image information captured during screen recording. Through the video file, the screen content can be played back directly.
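One plausible shape for the instruction file described above is a list of (operation time, operation instruction, control identifier) records; the JSON layout and field names below are assumptions for illustration, since the patent does not specify a format.

```python
import json

def build_instruction_file(operations):
    """Serialize operation records into a JSON instruction file,
    sorted by operation time so playback can replay them in order."""
    records = sorted(operations, key=lambda r: r["op_time"])
    return json.dumps({"version": 1, "operations": records}, indent=2)

ops = [
    {"op_time": 4.0, "instruction": "click", "control_id": "btn_ok"},
    {"op_time": 1.5, "instruction": "long_press", "control_id": "item_1"},
]
doc = build_instruction_file(ops)
parsed = json.loads(doc)
print(parsed["operations"][0]["control_id"])  # → item_1
```

Sorting by operation time means the playback side can simply iterate the list against the video timeline.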
In one implementation, the screen recording file may further include a subtitle file, and the subtitle file is used to record an operation instruction and an operation object of a user. As shown in fig. 9, the generating of the subtitle file may include:
in step S901, acquiring control content corresponding to the operation instruction;
the control content may include main text information included in the control, and may also include the control name. For example, when a user clicks a "my" button, the text information of the button may be acquired as "my", or the control name corresponding to the click instruction is "my page".
In step S902, generating subtitle information corresponding to the operation instruction according to the operation instruction and the control content;
according to the operation instruction, combined with the control content of the UI control triggered by the operation instruction, subtitle information corresponding to the operation instruction is generated. For example, if the operation instruction is a "click" instruction and the triggered UI control is "my page", the subtitle information "click my page" may be generated.
Of course, to let the user follow the current operation process more clearly, the subtitle information may further include an operation sequence number corresponding to the current operation instruction; for example, the sequence numbers may be "instruction one", "instruction two", and so on. As shown in fig. 10, the complete subtitle information obtained may be "instruction one: click my page".
In step S903, a subtitle file is generated according to the subtitle information and the operation time.
A subtitle file can be generated from the recorded operation times and the corresponding subtitle information. The subtitle file can be parsed by a video player and displayed in the video picture when the video is played.
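Since the patent does not name a subtitle format, the sketch below assumes the common SRT format as one way to pair operation times with subtitle text; the two-second display duration is an arbitrary illustrative choice.

```python
def to_srt_time(seconds):
    """Format a time offset as an SRT timestamp (HH:MM:SS,mmm)."""
    ms = int(round(seconds * 1000))
    h, rem = divmod(ms, 3600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def build_subtitle_file(entries, duration=2.0):
    """Build an SRT subtitle file from (operation time, subtitle text)
    pairs, showing each subtitle for `duration` seconds."""
    blocks = []
    for i, (t, text) in enumerate(sorted(entries), start=1):
        blocks.append(f"{i}\n{to_srt_time(t)} --> {to_srt_time(t + duration)}\n{text}\n")
    return "\n".join(blocks)

srt = build_subtitle_file([(1.5, "Instruction one: click my page")])
print(srt)
```

Any player that supports external SRT subtitles could then render the operation descriptions over the recorded video.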
In one implementation, when recording a screen, voice information input by a user may be received, and corresponding subtitle information is identified and obtained according to the voice information.
Considering that the received click command may be located at any position of the screen during screen recording, and the color displayed at the same position in the screen may be changed, the display position of the subtitle information may be adjusted according to the coordinate position corresponding to the operation command, and the color of the subtitle information may be correspondingly adjusted according to the screen color corresponding to the display area of the subtitle information, so that the subtitle information can be played more clearly.
In addition, in an implementation scenario of the present application, as shown in fig. 11, the position of the recorded application, for example, the icon of application a, in the desktop homepage may be changed, and the position of the icon of application may be changed in the same device, or may be different in different devices. In order to effectively play back the process of the screen recording, the corresponding relation between the application program and the starting instruction can be further recorded.
In the screen recording process, when the device is in a desktop homepage, the starting process of the application program can be recorded, including picture recording and instruction recording. In the instruction recording process, the trigger position corresponding to the operation instruction can be determined according to the operation instruction. One implementation may be:
if the content displayed on the screen is the desktop homepage, that is, the page displaying application icons, whether the user triggers a start instruction of an application is monitored. If so, the started application is found according to the position of the start instruction and its package name is obtained, or the package name is obtained directly from the running application that was started; the package name corresponding to the start instruction and the operation time are then recorded. When the screen recording file is played back, the position of the application icon corresponding to the package name can be found according to the package name recorded with the start instruction, and the application start instruction is triggered at the found position. Thus, the instruction can still be recorded and played back effectively in a scenario where the position of the application icon has changed, such as the scenario shown in fig. 11.
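The package-name indirection described above can be sketched as follows; the record layout and the `icon_positions` mapping are assumptions for illustration (on a real device the icon position would come from querying the launcher's current layout).

```python
def record_launch(package_name, op_time, instruction_log):
    """During recording, store the launched application's package name
    together with the launch instruction's operation time."""
    instruction_log.append({"op_time": op_time, "instruction": "launch",
                            "package": package_name})

def replay_launch(record, icon_positions):
    """During playback, look up the icon position for the recorded package
    name on the current desktop and return the coordinate to tap."""
    return icon_positions.get(record["package"])

log = []
record_launch("com.example.appA", 0.8, log)
# On the playback device the icon may sit somewhere else entirely:
pos = replay_launch(log[0], {"com.example.appA": (320, 910)})
print(pos)  # → (320, 910)
```

Because the package name, not the icon coordinate, is what gets recorded, the launch replays correctly even after the icon has been moved.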
In an implementation manner of the present application, a playback parameter input by a user and used for setting the playback of the screen recording file may be further received, where the playback parameter may include one or more of an execution condition that a device position belongs to a predetermined position range, an execution condition that a start time belongs to a predetermined time range, a step-by-step execution parameter, and an execution number.
For example, when the playback parameter includes an execution condition of a position range, a current position of the device is obtained when playback is performed, and if the current position of the device does not coincide with the position range in the preset playback parameter, playback of the screen recording file may be rejected. By setting the execution condition of the position range, the screen recording file can be played back in the designated area range, so that the safety of the screen recording file can be improved, and the screen recording file is prevented from being played back illegally in other areas.
When the playback parameters include the execution condition of the starting time, the screen recording file can be kept secret before the starting time, so that the requirement on the confidentiality of the screen recording file in a time dimension can be met.
In addition, the playback parameters may also include a step-by-step execution parameter, that is, during playback, each time an operation instruction is executed, the playing of the video file is stopped, or the execution of the operation instruction is stopped, until a confirmation instruction of the user is received, the video is continuously played, or the next operation instruction is continuously executed. The method and the device can be suitable for scenes with complex operation flows, so that a user can effectively acquire operation information in the screen recording process.
Or, the playback parameters may further include setting of playback times, so that when the screen recording file is played back, the playback operation is automatically performed on the screen recording file according to the set playback times.
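The execution conditions above reduce to a gate checked before playback starts. The sketch below is a minimal Python illustration under assumed parameter names (`region`, `not_before`); real device positions would typically be latitude/longitude from a location service.

```python
def may_play_back(params, device_pos=None, now=None):
    """Check the execution conditions attached to a screen recording file.

    params may contain:
      'region'     - ((min_x, min_y), (max_x, max_y)) allowed device positions
      'not_before' - earliest allowed start time (seconds since epoch)
    Returns True only when every configured condition is satisfied.
    """
    if "region" in params:
        (x0, y0), (x1, y1) = params["region"]
        if device_pos is None or not (x0 <= device_pos[0] <= x1
                                      and y0 <= device_pos[1] <= y1):
            return False
    if "not_before" in params and (now is None or now < params["not_before"]):
        return False
    return True

params = {"region": ((0, 0), (10, 10)), "not_before": 1000}
print(may_play_back(params, device_pos=(5, 5), now=1200))   # → True
print(may_play_back(params, device_pos=(50, 5), now=1200))  # → False
```

Step-by-step execution and a playback count would be handled inside the playback loop itself rather than in this pre-check.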
In one implementation, after the screen recording is completed, the recorded screen recording file can be shared with other users, so that the other users can perform playback according to the instruction file or perform playback according to the video file in the screen recording file.
Fig. 12 is a schematic view of an implementation flow of a playback method for a screen recording file according to an embodiment of the present application, where the playback method for the screen recording file corresponds to the screen recording method described above, and the playback method for the screen recording file includes:
In step S1201, a control identifier and an operation instruction corresponding to the playback time are obtained by parsing the screen recording file;
when the screen recording file comprises an instruction file, a video file included in the screen recording file can be identified through a specific player, a coding identifier corresponding to the instruction file is identified, and the instruction file is extracted from the screen recording file. When the player cannot identify the instruction file in the screen recording file, the subtitle corresponding to the instruction operation can be added in the video picture through the parsed video file, through playback of the screen picture in the video content, or in combination with the subtitle file.
The instruction file obtained by parsing may include the operation instructions, the operation time corresponding to each operation instruction, and the control identifier corresponding to each operation instruction. The operation time can be used to distinguish the order of different operation instructions; in a possible implementation, the operation time may instead be expressed directly as a sequence number of the operation instruction.
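The parsed instruction file described above can be modeled as a time-ordered list of records. The JSON field names below (`time_ms`, `op`, `control_id`) are illustrative assumptions; the application does not fix a serialization format:

```python
import json

# One record per user operation: when it happened, what it was,
# and which UI control it targeted.
instruction_file = json.dumps([
    {"time_ms": 1200, "op": "click",      "control_id": "btn_login"},
    {"time_ms": 3400, "op": "input_text", "control_id": "edit_user", "text": "alice"},
    {"time_ms": 5100, "op": "click",      "control_id": "btn_submit"},
])

def parse_instructions(raw: str):
    """Parse the instruction file and return records sorted by operation
    time, so that the operation time also fixes the execution order."""
    records = json.loads(raw)
    return sorted(records, key=lambda r: r["time_ms"])
```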
In step S1202, the coordinate position of the corresponding control is searched in the screen according to the control identifier;
Within a given application program, the control identifier (the control ID, or the control ID together with the control name) uniquely determines the corresponding control in that application. Therefore, according to the control identifier corresponding to the operation instruction included in the screen recording file, the UI control targeted by the control instruction currently being played back can be found.
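Locating the control for a recorded identifier can be sketched as a depth-first search over the application's control tree; the `Control` class below is a hypothetical stand-in for the platform's actual UI hierarchy:

```python
from typing import Optional

class Control:
    """A minimal UI-control node: unique ID, name, on-screen bounds, children."""
    def __init__(self, control_id, name, bounds, children=()):
        self.control_id = control_id      # unique within the application
        self.name = name
        self.bounds = bounds              # (left, top, right, bottom) in px
        self.children = list(children)

def find_by_id(root: Control, control_id: str) -> Optional[Control]:
    """Depth-first search for the control whose ID matches the recorded identifier."""
    if root.control_id == control_id:
        return root
    for child in root.children:
        found = find_by_id(child, control_id)
        if found:
            return found
    return None
```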
In step S1203, an operation event corresponding to the operation instruction is triggered at the searched coordinate position.
From the found UI control, combined with its current rendering information, the position of the control can be obtained, and the operation instruction can then be triggered at the center of the control. For example, a click instruction can be triggered through a send-event mechanism. In this way, even when the position of the UI control changes during playback of the screen recording file, for instance because the device has been replaced or the screen resolution differs, reliable playback is still achieved by locating the control through its identifier.
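The center-position trigger can be sketched as follows; `inject_tap` is a hypothetical stand-in for the platform's event-injection API (e.g. a send-event mechanism), not a real call:

```python
def center_of(bounds):
    """Center point of a control's current on-screen bounds (l, t, r, b)."""
    left, top, right, bottom = bounds
    return ((left + right) // 2, (top + bottom) // 2)

def replay_click(bounds, inject_tap):
    """Resolve the control's current center and inject a tap there.
    `inject_tap(x, y)` stands in for a platform event-injection API."""
    x, y = center_of(bounds)
    inject_tap(x, y)
    return (x, y)
```

Because the center is computed from the control's bounds at playback time, the tap lands correctly even if the control has moved since recording.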
In one playback implementation, when the screen recording file contains the application package name corresponding to an application start instruction, that package name is obtained from the screen recording file. Because application package names are unique, the position of the application icon corresponding to the package name can be found on the screen, and the operation event corresponding to the application start instruction can be triggered at the center of the found icon. Thanks to this uniqueness, the playback operation can still be performed according to the instruction file even if the application icon occupies a different position on the playback device.
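Replaying an application start instruction by package name can be sketched like this; the `icon_layout` mapping and the `inject_tap` callback are hypothetical stand-ins for the launcher's icon layout and the platform's event-injection API:

```python
def find_icon_position(icon_layout, package_name):
    """Look up the home-screen position of the icon for a package name.
    `icon_layout` maps package name -> (x, y) icon center; the lookup is
    unambiguous because package names are unique."""
    return icon_layout.get(package_name)

def replay_app_launch(icon_layout, package_name, inject_tap):
    """Tap the icon's current center to replay an app-start instruction."""
    pos = find_icon_position(icon_layout, package_name)
    if pos is None:
        raise LookupError(f"{package_name} not found on this device")
    inject_tap(*pos)
    return pos
```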
When the screen recording file further comprises a video file and a subtitle file, the subtitle information and operation times contained in the subtitle file can be parsed while the recorded video plays, and each subtitle inserted into the corresponding picture on the screen according to its operation time, so that the operation instruction currently being replayed can be seen immediately, making the operation information easier to follow.
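Selecting the subtitle to show at a given playback time can be sketched as a scan over the time-sorted subtitle entries; the tuple format used here is an illustrative assumption:

```python
def active_subtitle(subtitles, playback_ms):
    """Return the text of the most recent subtitle at or before playback_ms.
    `subtitles` is a list of (operation_time_ms, text) sorted by time;
    returns None before the first subtitle."""
    current = None
    for time_ms, text in subtitles:
        if time_ms <= playback_ms:
            current = text
        else:
            break
    return current
```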
In one implementation, the playback parameters included in the screen recording file may also be acquired, and playback performed according to them. The playback parameters include, but are not limited to, one or more of: an execution condition that the device position belongs to a predetermined position range, an execution condition that the start time belongs to a predetermined time range, a step-by-step execution parameter, and an execution count.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 13 is a block diagram of a screen recording apparatus according to an embodiment of the present application, which corresponds to the screen recording method according to the foregoing embodiment, and only shows portions related to the embodiment of the present application for convenience of description.
Referring to fig. 13, the apparatus includes:
an operation information obtaining unit 1301, configured to obtain an operation instruction and operation time of a user in an application interface when recording a screen;
a control identifier obtaining unit 1302, configured to obtain a control identifier corresponding to the operation instruction;
and a screen recording file generating unit 1303, configured to generate a screen recording file according to the operation time, the operation instruction, and the control identifier corresponding to the operation instruction.
Fig. 14 shows a block diagram of a playback apparatus of a screen recording file provided in an embodiment of the present application, and for convenience of explanation, only the parts related to the embodiment of the present application are shown.
Referring to fig. 14, the apparatus includes:
an operation information analysis unit 1401, configured to obtain, according to the screen recording file, a control identifier and an operation instruction corresponding to the playback time through analysis;
a coordinate position searching unit 1402, configured to search, according to the control identifier, a coordinate position of a corresponding control in a screen;
a triggering unit 1403, configured to trigger, at the searched coordinate position, an operation event corresponding to the operation instruction.
Fig. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 15, the electronic device 15 of this embodiment includes: at least one processor 150 (only one is shown in fig. 15), a memory 151, and a computer program 152 stored in the memory 151 and executable on the at least one processor 150; the processor 150 implements the steps of any of the screen recording method or screen recording file playback method embodiments described above when executing the computer program 152.
The electronic device 15 may be a desktop computer, a notebook computer, a palmtop computer, a cloud server, or another computing device. The electronic device may include, but is not limited to, the processor 150 and the memory 151. Those skilled in the art will appreciate that fig. 15 is merely an example of the electronic device 15 and does not constitute a limitation on it; the device may include more or fewer components than shown, combine some components, or use different components, such as input/output devices or network access devices.
The processor 150 may be a Central Processing Unit (CPU) or another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or the like. A general-purpose processor may be a microprocessor, or any conventional processor.
The memory 151 may, in some embodiments, be an internal storage unit of the electronic device 15, such as a hard disk or internal memory of the electronic device 15. In other embodiments, the memory 151 may be an external storage device of the electronic device 15, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the electronic device 15. Further, the memory 151 may include both an internal storage unit and an external storage device of the electronic device 15. The memory 151 is used to store an operating system, application programs, a boot loader (BootLoader), data, and other programs, such as the program code of the computer programs; it may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described system embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the method embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of those method embodiments. The computer program comprises computer program code, which may be in source-code form, object-code form, an executable file, or some intermediate form. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to an electronic device, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random-Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, consistent with legislation and patent practice, computer-readable media may not include electrical carrier signals or telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (15)

1. A screen recording method is characterized by comprising the following steps:
when recording a screen, acquiring an operation instruction and operation time of a user in an application program interface;
acquiring a control identification corresponding to the operation instruction;
and generating a screen recording file according to the operation time, the operation instruction and the control identification corresponding to the operation instruction.
2. The screen recording method of claim 1, wherein the step of obtaining the control identifier corresponding to the operation instruction comprises:
acquiring the coordinate position of the operation instruction in a screen;
searching for one or more UI controls corresponding to the coordinate position according to update information of the UI controls;
and acquiring a control identification corresponding to the operation instruction in the searched UI control.
3. The screen recording method according to claim 2, wherein when the coordinate position corresponds to a plurality of UI controls, the step of obtaining, from the searched UI controls, the control identifier corresponding to the operation instruction comprises:
acquiring a layer stacking sequence corresponding to the UI control in the searched UI control;
and searching the UI control according to the layer stacking sequence to obtain the control identification of the UI control corresponding to the operation instruction.
4. The screen recording method according to claim 3, wherein the step of searching for the UI control according to the layer stacking sequence and obtaining the control identifier of the UI control corresponding to the operation instruction comprises:
sequentially acquiring UI controls from top to bottom according to the layer stacking sequence;
acquiring an event monitored by the UI control;
and when the operation instruction is matched with the event monitored by the UI control, acquiring a control identification corresponding to the UI control.
5. The screen recording method of claim 1, further comprising:
when the screen display content is a desktop homepage, monitoring an application program starting instruction;
and acquiring the application program package name of the started application program, and recording the corresponding relation between the application program starting instruction and the application program package name.
6. The screen recording method according to claim 1, wherein after the step of obtaining the control identifier corresponding to the operation instruction, the method further comprises:
acquiring control content corresponding to the operation instruction;
generating subtitle information corresponding to the operation instruction according to the operation instruction and the control content;
and generating a subtitle file according to the subtitle information and the operation time.
7. The screen recording method of claim 6, wherein the step of generating a screen recording file according to the operation time, the operation instruction, and the control identifier corresponding to the operation instruction comprises:
generating a video file according to the recorded screen content;
generating an instruction file according to the operation instruction, the operation time and the control identification corresponding to the operation instruction;
and generating a screen recording file according to the video file and the instruction file or the subtitle file.
8. The screen recording method of claim 1, further comprising:
and setting the playback parameters of the screen recording file.
9. The screen recording method according to claim 8, wherein the playback parameters of the screen recording file include one or more of an execution condition that a device position belongs to a predetermined position range, an execution condition that a start time belongs to a predetermined time range, a step-by-step execution parameter, and a number of execution times.
10. A playback method of a screen recording file is characterized by comprising the following steps:
analyzing according to the screen recording file to obtain a control identification and an operation instruction corresponding to the playback time;
searching the coordinate position of the corresponding control in the screen according to the control identification;
and triggering the operation event corresponding to the operation instruction at the searched coordinate position.
11. The playback method of the screen recording file according to claim 10, characterized in that the method further comprises:
acquiring an application program package name corresponding to an application program starting instruction according to the screen recording file;
searching an application program icon position corresponding to the application program package name in a screen according to the application program package name;
and triggering an operation event corresponding to the application program starting instruction according to the searched position of the application program icon.
12. The playback method of the screen recording file according to claim 10, characterized in that the method further comprises:
acquiring a subtitle file included in the screen recording file, wherein the subtitle file comprises subtitle information and operation time;
and playing the subtitle information when the video file in the screen recording file is played according to the operation time.
13. The playback method of the screen recording file according to claim 10, characterized in that the method further comprises:
acquiring playback parameters of the screen recording file;
and controlling the playback of the screen recording file according to the playback parameters.
14. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the screen recording method according to any one of claims 1 to 9 or the playback method of the screen recording file according to any one of claims 10 to 13 when executing the computer program.
15. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the screen recording method according to any one of claims 1 to 9, or implements the playback method of the screen recording file according to any one of claims 10 to 13.
CN201911357660.4A 2019-12-25 2019-12-25 Screen recording method and device and electronic equipment Active CN113031838B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911357660.4A CN113031838B (en) 2019-12-25 2019-12-25 Screen recording method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911357660.4A CN113031838B (en) 2019-12-25 2019-12-25 Screen recording method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN113031838A true CN113031838A (en) 2021-06-25
CN113031838B CN113031838B (en) 2023-03-24

Family

ID=76458472

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911357660.4A Active CN113031838B (en) 2019-12-25 2019-12-25 Screen recording method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN113031838B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113961158A (en) * 2021-09-08 2022-01-21 北京房江湖科技有限公司 Cross-platform painting brush synchronization method and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104731478A (en) * 2015-03-24 2015-06-24 广东欧珀移动通信有限公司 Single-hand operation method and device of intelligent terminal
US20160306523A1 (en) * 2013-08-08 2016-10-20 Honeywell International Inc. System and method for visualization of history of events using bim model
CN108052261A (en) * 2017-12-07 2018-05-18 广东欧珀移动通信有限公司 Record screen method, apparatus and terminal
CN108323239A (en) * 2016-11-29 2018-07-24 华为技术有限公司 Recording, playback method, record screen terminal and the playback terminal of film recording
CN109521935A (en) * 2018-10-24 2019-03-26 维沃移动通信有限公司 It is a kind of to record screen, the reproducing method based on record screen, terminal
CN110446097A (en) * 2019-08-26 2019-11-12 维沃移动通信有限公司 Record screen method and mobile terminal

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160306523A1 (en) * 2013-08-08 2016-10-20 Honeywell International Inc. System and method for visualization of history of events using bim model
CN104731478A (en) * 2015-03-24 2015-06-24 广东欧珀移动通信有限公司 Single-hand operation method and device of intelligent terminal
CN108323239A (en) * 2016-11-29 2018-07-24 华为技术有限公司 Recording, playback method, record screen terminal and the playback terminal of film recording
CN108052261A (en) * 2017-12-07 2018-05-18 广东欧珀移动通信有限公司 Record screen method, apparatus and terminal
CN109521935A (en) * 2018-10-24 2019-03-26 维沃移动通信有限公司 It is a kind of to record screen, the reproducing method based on record screen, terminal
CN110446097A (en) * 2019-08-26 2019-11-12 维沃移动通信有限公司 Record screen method and mobile terminal

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113961158A (en) * 2021-09-08 2022-01-21 北京房江湖科技有限公司 Cross-platform painting brush synchronization method and device

Also Published As

Publication number Publication date
CN113031838B (en) 2023-03-24

Similar Documents

Publication Publication Date Title
CN107885533B (en) Method and device for managing component codes
CN109992231B (en) Screen projection method and terminal
CN108959361B (en) Form management method and device
CN106921791B (en) Multimedia file storage and viewing method and device and mobile terminal
CN113542825B (en) Screen projection display method, system, terminal device and storage medium
CN108763317B (en) Method for assisting in selecting picture and terminal equipment
CN110430022B (en) Data transmission method and device
CN110795007B (en) Method and device for acquiring screenshot information
CN112996042A (en) Network acceleration method, terminal device, server and storage medium
CN113038434B (en) Device registration method and device, mobile terminal and storage medium
CN107948429B (en) Content demonstration method, terminal equipment and computer readable storage medium
CN112423138A (en) Search result display method and terminal equipment
CN113018868B (en) Cloud game login method, device and system
CN113552986A (en) Multi-window screen capturing method and device and terminal equipment
CN114816617A (en) Content presentation method and device, terminal equipment and computer readable storage medium
CN114125546B (en) Information sharing method and device, terminal equipment and storage medium
CN112311740B (en) Data encryption method, data decryption method, terminal and storage medium
CN110851350A (en) Method and device for monitoring white screen of web page interface
CN111274842B (en) Coded image identification method and electronic equipment
CN113518243A (en) Image processing method and device
CN113157357A (en) Page display method, device, terminal and storage medium
CN110908638A (en) Operation flow creating method and electronic equipment
CN109067975B (en) Contact person information management method and terminal equipment
CN110751028A (en) Transaction method and device based on intelligent sales counter
CN113031838B (en) Screen recording method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20220513

Address after: 523799 Room 101, building 4, No. 15, Huanhu Road, Songshanhu Park, Dongguan City, Guangdong Province

Applicant after: Petal cloud Technology Co.,Ltd.

Address before: 523808 Southern Factory Building (Phase I) Project B2 Production Plant-5, New Town Avenue, Songshan Lake High-tech Industrial Development Zone, Dongguan City, Guangdong Province

Applicant before: HUAWEI DEVICE Co.,Ltd.

Effective date of registration: 20220513

Address after: 523808 Southern Factory Building (Phase I) Project B2 Production Plant-5, New Town Avenue, Songshan Lake High-tech Industrial Development Zone, Dongguan City, Guangdong Province

Applicant after: HUAWEI DEVICE Co.,Ltd.

Address before: 518129 Bantian HUAWEI headquarters office building, Longgang District, Guangdong, Shenzhen

Applicant before: HUAWEI TECHNOLOGIES Co.,Ltd.

GR01 Patent grant
GR01 Patent grant