CN118337905A - Note generation method, readable storage medium, program product, and electronic device - Google Patents


Info

Publication number
CN118337905A
Authority
CN
China
Prior art keywords
note
video
screen
electronic device
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310035813.3A
Other languages
Chinese (zh)
Inventor
毛俊伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Device Co Ltd
Original Assignee
Huawei Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Device Co Ltd filed Critical Huawei Device Co Ltd
Priority to CN202310035813.3A
Publication of CN118337905A
Legal status: Pending

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to the field of terminal technologies, and discloses a note generation method, a readable storage medium, a program product, and an electronic device. The electronic device may display an intelligent note control while playing a video through a video playing application. If the electronic device detects that the user has performed a screen capture or screen recording operation, by means of the intelligent note control, on the video being played or on the interface being displayed, a note editing window for the screen capture image or screen recording video is automatically displayed once that image or video is generated, so that the user can add notes to it in the note editing window. The user therefore only needs to perform the screen capture or screen recording operation with the intelligent note control to bring up a note editing window for the screen capture image or screen recording video, without switching back and forth between the video playing application and a note recording application; the operation is convenient and the user experience is improved.

Description

Note generation method, readable storage medium, program product, and electronic device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a note generation method, a readable storage medium, a program product, and an electronic device.
Background
In a scenario where a user plays a video through an electronic device, for example an online lesson or an online meeting, the user may capture or record the screen of the played video and then annotate and edit the screen capture image or screen recording video, so that its content can be reviewed after playback has finished.
However, at present the user first needs to capture or record the video through a screen capture or screen recording mode defined by the video playing application or by the operating system of the electronic device, and then add notes to the screen capture image or screen recording video through a note recording application, such as a memo application. The operation is cumbersome, which affects the user experience.
Disclosure of Invention
In view of this, embodiments of the present application provide a note generation method, a readable storage medium, a program product, and an electronic device. When the electronic equipment plays the video, the intelligent note control can be displayed, and when the operation of the user on the intelligent note control is detected, a note editing interface of a screen capturing image or a screen recording video corresponding to the video played by the electronic equipment is directly displayed, so that the user does not need to switch between a video playing application program and a note recording application program, and the user experience is improved.
In a first aspect, an embodiment of the present application provides a note generation method, applied to a first electronic device, where the method includes: displaying a first application interface of a first application, wherein a first video is played in the first application interface, and the first application interface comprises a first control; detecting a first operation of a user on a first control, and acquiring first display picture data of a first video played in a first application interface; and displaying a note editing window for the first display screen data, wherein the note editing window is used for generating notes of the first display screen data according to the content input by a user in the note editing window.
According to this method, the user can trigger the first electronic device to display the note editing window for the first display screen data through the first operation on the first control, without switching between the first application and a note recording application. The operation is convenient, which improves the user's experience of adding notes to the first display screen data.
Alternatively, in some implementations, the first video may be a video stored in the first electronic device, or may be a video acquired from a server in real time, such as a web class video, a conference video, a live video, or the like.
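The first-aspect flow (display the control, detect the first operation, acquire the display data, open the note editor) can be sketched in Python. The class `SmartNotePlayer` and its method names are invented for illustration and are not part of the disclosed implementation:

```python
class SmartNotePlayer:
    """Illustrative sketch of the first-aspect flow: a video view hosting
    a smart note control that captures display data and opens a note editor."""

    def __init__(self):
        self.events = []  # trace of UI actions, for illustration only

    def capture_display_data(self):
        # Placeholder for grabbing a screen capture of the playing video.
        self.events.append("capture")
        return {"type": "screenshot", "source": "first_video"}

    def show_note_editor(self, display_data):
        # The note editing window is bound to the captured display data.
        self.events.append("edit:" + display_data["type"])
        return {"note_for": display_data, "text": ""}

    def on_first_operation(self):
        # First operation on the first control: capture, then open the editor.
        data = self.capture_display_data()
        return self.show_note_editor(data)

player = SmartNotePlayer()
note = player.on_first_operation()
assert player.events == ["capture", "edit:screenshot"]
```

The single entry point mirrors the claim: one user operation yields both the capture and the editing window, with no application switch in between.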
In a possible implementation of the first aspect, the method further includes: after the first display picture data is acquired, pausing playing the first video; after detecting that the note of the first display screen data has been generated, playback of the first video is resumed.
That is, after the first electronic device acquires the first display screen data, it may automatically pause the first video so that the user can add notes to the data; after detecting that the user has finished adding the note, it automatically resumes playing the first video, which helps improve the user experience.
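The pause-on-capture and resume-on-save behavior amounts to a two-state machine. This sketch (the `PlaybackController` name is invented) assumes only the two transitions described above:

```python
class PlaybackController:
    """Illustrative pause/resume behavior around note taking."""

    def __init__(self):
        self.state = "playing"

    def on_capture(self):
        self.state = "paused"    # pause once the display data is acquired

    def on_note_saved(self):
        self.state = "playing"   # resume once the note has been generated

pc = PlaybackController()
pc.on_capture()
assert pc.state == "paused"
pc.on_note_saved()
assert pc.state == "playing"
```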
In a possible implementation of the first aspect, the first display screen data includes a screen capture image or a screen recording of the first application interface or the first video.
That is, the first display screen data may be a screen capture image or a screen recording.
Optionally, in some implementations, different first display screen data may correspond to different first operations (e.g., the operations, described below, of capturing or recording the screen of a video played by the electronic device, or of an interface displayed by the electronic device, with the smart note control). For example, when the first operation is a single-click or multi-click operation on the first control, the first display screen data may be a screen capture image corresponding to the first application interface or the first video; when the first operation is a long-press operation on the first control, the first display screen data may be a screen recording of the first application interface or the first video.
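The operation-to-data mapping just described can be expressed as a simple dispatch table. The `capture_type` function and its event strings are illustrative only; the click/long-press pairing follows the example in the text:

```python
def capture_type(operation: str) -> str:
    """Map a first operation to the kind of display screen data captured.
    Illustrative sketch: click-like operations yield a screen capture,
    a long press yields a screen recording."""
    mapping = {
        "click": "screenshot",
        "double_click": "screenshot",
        "long_press": "screen_recording",
    }
    return mapping.get(operation, "unsupported")

assert capture_type("click") == "screenshot"
assert capture_type("long_press") == "screen_recording"
```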
In one possible implementation of the first aspect, the first operation includes any one of the following operations: a single or multiple click operation on the first control; a long-press operation on the first control; a sliding operation or a pinch operation on the first control.
In one possible implementation manner of the first aspect, the note editing window is displayed on an upper layer of the first application interface, or is displayed on a screen with the first application interface, or is displayed on a display screen of the first electronic device in a full screen manner.
In a possible implementation of the first aspect, the method further includes:
After the first display screen data of the first video played in the first application interface is acquired, an editing interface for the first display screen data is displayed, and the first electronic device displays a note editing window after detecting that the user completes editing the first display screen data.
That is, after the first electronic device obtains the first display screen data, the first electronic device may display an editing interface for the first display screen data; and after the fact that the user edits the first display picture data is detected, the note editing window is displayed, so that the user edits the first display picture data conveniently, and user experience is improved.
In a second aspect, an embodiment of the present application provides a note generation method, including: the second electronic device displays a second application interface of a second application, wherein a second video is played in the second application interface, and the second application interface comprises a second control; the second electronic equipment detects a second operation of a user on a second control, and second display picture data of a second video played in a second application interface are obtained; the second electronic device sends the second display picture data to the third electronic device; the third electronic device displays a note editing window for the second display screen data, and the note editing window is used for generating notes of the second display screen data according to the content input by the user in the note editing window.
According to this method, the user can trigger the third electronic device to display the note editing window for the second display screen data through the second operation on the second control of the second electronic device, without switching between the second application and a note recording application. The operation is convenient, which helps improve the user's experience of adding notes to the second display screen data.
Optionally, in some implementations, the second video may be a video stored in the second electronic device, or a video acquired from a server in real time, such as a web class video, a conference video, or a live video.
In a possible implementation of the second aspect, the method further includes: after the second electronic equipment acquires the second display picture data, the second electronic equipment pauses playing of the second video; and the second electronic device resumes playing the second video after detecting that the third electronic device has generated the note of the second display screen data.
That is, after the second electronic device obtains the second display image data, the second electronic device may automatically pause playing of the second video, so that the user may add notes to the second display image data, and after it is detected that the user completes adding notes in the third electronic device, the second video may automatically resume playing, which is beneficial to improving user experience.
In a possible implementation of the second aspect, the second display screen data includes a screen capture image or a screen recording of the second application interface or the second video.
In one possible implementation of the second aspect, the second operation includes any one of the following operations: a single or multiple click operation on the second control; a long-press operation on the second control; a sliding operation or a pinch operation on the second control.
In a possible implementation of the second aspect, the method further includes: after generating the note of the second display picture data, the third electronic device sends a note adding completion notice to the second electronic device; after receiving the note addition completion notification, the second electronic device detects that the third electronic device has generated a note of the second display screen data.
In a possible implementation of the second aspect, the method further includes: the third electronic device displays an editing interface for the second display screen data after receiving the second display screen data, and the third electronic device displays a note editing window after detecting that the user has completed editing the second display screen data.
That is, after receiving the second display screen data, the third electronic device may display an editing interface for the second display screen data first; and after the fact that the user edits the second display picture data is detected, the note editing window is displayed, so that the user edits the second display picture data conveniently, and user experience is improved.
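The second-aspect message flow between the capturing device and the note-editing device can be sketched as two cooperating objects. `SecondDevice`, `ThirdDevice`, and the in-process callback stand in for whatever real transport the devices would use; they are assumptions for illustration, not the disclosed mechanism:

```python
class SecondDevice:
    """Illustrative sketch of the second-aspect flow: capture on one device,
    edit the note on a connected peer, resume playback on completion notice."""

    def __init__(self, peer):
        self.peer = peer
        self.playing = True

    def on_second_operation(self):
        data = {"type": "screenshot"}      # the second display screen data
        self.playing = False               # pause while the note is edited
        self.peer.receive(data, reply_to=self)

    def on_note_done(self):
        self.playing = True                # resume after the completion notice

class ThirdDevice:
    def __init__(self):
        self.notes = []

    def receive(self, data, reply_to):
        # Display the note editing window, collect the note, then send the
        # note-addition-completion notification back to the capturing device.
        self.notes.append({"for": data, "text": "user note"})
        reply_to.on_note_done()

third = ThirdDevice()
second = SecondDevice(third)
second.on_second_operation()
assert second.playing and len(third.notes) == 1
```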
In a third aspect, an embodiment of the present application provides a computer readable storage medium, where the readable storage medium includes instructions that, when executed by an electronic device, cause the electronic device to implement any one of the above first aspect and various possible implementations of the above first aspect, the above second aspect and various possible implementations of the above second aspect.
In a fourth aspect, an embodiment of the present application provides an electronic device, including: a memory in which instructions are stored; at least one processor configured to execute instructions to cause an electronic device to implement any of the above first aspect and the various possible implementations of the first aspect, the above second aspect and the various possible implementations of the second aspect.
In a fifth aspect, embodiments of the present application provide a computer program product, which when run on an electronic device, causes the electronic device to implement any one of the above-mentioned first aspect and various possible implementations of the above-mentioned first aspect, the above-mentioned second aspect and various possible implementations of the above-mentioned second aspect.
Drawings
FIG. 1A illustrates a schematic view of a user's screen shot of a web class video played by a web class application, according to some embodiments of the application;
FIG. 1B illustrates a schematic diagram of a user launching a memo application in the handset 10, in accordance with some embodiments of the present application;
FIG. 1C is a schematic diagram illustrating a process by which a user adds notes for a screen shot image in a memo application, in accordance with some embodiments of the present application;
FIG. 2A illustrates a schematic diagram of a setup interface for a memo application according to some embodiments of the present application;
FIG. 2B illustrates a schematic view of a scenario in which notes are added to a screenshot image based on a smart note control, according to some embodiments of the application;
FIG. 3A is a schematic diagram of a cell phone 10 split screen display of a note editing window and a window of a web class application, according to some embodiments of the present application;
FIG. 3B illustrates a schematic diagram of a full screen display of a note editing window for the handset 10, in accordance with some embodiments of the application;
FIG. 4A illustrates a schematic diagram of the handset 10 first displaying an image editing interface for a screen shot image and then displaying a note editing window for the screen shot image, in accordance with some embodiments of the application;
FIG. 4B is a schematic diagram of a mobile phone 10 displaying a video editing interface of a video recording and then displaying a note editing window of the video recording according to some embodiments of the present application;
FIG. 5 illustrates a schematic diagram of a handset 10 transferring a screenshot image or video stream to a tablet 20 for adding notes, according to some embodiments of the application;
FIG. 6 illustrates a schematic diagram of a prompt window displayed when a memo application is launched, in accordance with some embodiments of the present application;
FIG. 7 illustrates a flow diagram of a note generation method, according to some embodiments of the application;
FIG. 8 illustrates a flow diagram of yet another note generation method, according to some embodiments of the application;
FIG. 9 illustrates a schematic diagram of a tablet computer 20 displaying an image editing interface of a screenshot image and then displaying a note editing window, according to some embodiments of the application;
FIG. 10 illustrates an interactive schematic diagram of yet another note generation method, according to some embodiments of the application;
Fig. 11 illustrates a schematic diagram of a cell phone 10, according to some embodiments of the application.
Detailed Description
Illustrative embodiments of the application include, but are not limited to, note generation methods, readable media, program products, and electronic devices.
The technical scheme of the embodiment of the application is described below with reference to the accompanying drawings.
It can be appreciated that the method provided by the embodiment of the application is applicable to any electronic device capable of performing screen capturing or screen recording of its displayed content, including, but not limited to, a mobile phone, a tablet computer, a wearable device (such as a smart watch or a smart bracelet), a vehicle-mounted device (such as an in-vehicle head unit), an internet of things device, a smart home device (such as a smart television), a notebook computer, a desktop computer, and a laptop computer. For convenience of description, the technical scheme of the application is described below by taking a mobile phone as an example of the electronic device.
In some embodiments, the user needs to perform the screen capturing or recording on the video through the screen capturing operation or the screen recording operation defined by the video playing application program or the operating system of the electronic device, and then adds the note on the screen capturing image or the screen recording video through the note recording application program, for example, the memo application program, which is complex in operation.
For example, referring to fig. 1A, when the online class application of the mobile phone 10 plays an online class video, if the user wants to capture the screen and take notes on the video, the user first needs to perform a screen capturing operation predefined by the operating system of the mobile phone 10, for example a double tap with a finger knuckle on the screen, so as to obtain a screen capture image of the video. Next, referring to fig. 1B, the user starts the memo application by clicking its icon 11. Then, referring to fig. 1C, after the memo application has started, the user enters the picture selection interface by clicking the add note control 12 and the insert image control 13 in sequence, selects the previously captured image by clicking the control 14, and completes the insertion by clicking the selection complete control 15, which opens the note editing interface 16 for the screen capture image. Finally, the user can add notes to the screen capture image of the online class video through operations in the note editing interface 16. The whole procedure is cumbersome, and the user experience is poor.
It is to be understood that the application that can add notes to the screen shot image or the video can be any application, and for convenience of description, a memo application will be described below as an example.
In order to improve the convenience of adding notes to a screen capturing image or a screen recording video by a user, the embodiment of the application provides a note generation method, when the video is played by electronic equipment, an intelligent note control can be displayed, and the user can screen capture or record the video played by the electronic equipment or the content displayed by the electronic equipment through the operation of the intelligent note control. If the electronic device detects that the user performs screen capturing or screen recording operation (such as clicking or long-press operation on the intelligent note control) on the video played by the electronic device or the interface displayed by the electronic device by utilizing the intelligent note control, a note editing window aiming at the screen capturing image or the screen recording video is automatically displayed after the screen capturing image or the screen recording video is generated, so that the user can add notes for the screen capturing image or the screen recording video in the note editing window. Therefore, a user can trigger a note editing window aiming at a screen capturing image or a screen recording video only by executing screen capturing or screen recording operation by using the intelligent note control, the operation is convenient, and the user experience is improved.
It is understood that the operations of capturing or recording the screen of a video played by the electronic device, or of an interface displayed by the electronic device, with the intelligent note control may be any predefined operations, including, but not limited to, a single or multiple click operation, a long-press operation, a sliding operation, or a pinch operation on the intelligent note control. For convenience of description, the following description takes a click on the intelligent note control as the screen capturing operation and a long press on the intelligent note control as the screen recording operation.
It is understood that the intelligent note control may be any shape and form of control, such as a suspension ball, a suspension polygon, a suspension polyhedron, or the like, or may be any shape and form of button, or the like, and the form of the intelligent note control is not limited herein.
It will be appreciated that, in some embodiments, the electronic device displays the smart note control while playing a video after the user turns on the "smart note" function (i.e., the function of displaying the smart note control when the electronic device plays a video, and automatically bringing up the note editing window when the user captures or records the screen through the smart note control). For example, referring to FIG. 2A, the user can enter the setting interface of the memo application shown in FIG. 2A by clicking the setting control 14 in FIG. 1C described above. After detecting that the user has turned on the switch 21 corresponding to "smart note", the mobile phone 10 may display the smart note control whenever it plays a video. In some embodiments, the user may also turn on the "note transfer" function through the switch 22 on the setting interface shown in FIG. 2A (i.e., after the smart note function is turned on, if another terminal device is connected to the electronic device, then upon detecting that the user captures or records the screen through the smart note control, the electronic device automatically transfers the captured content to the other terminal device, which displays a note editing window for it), so that the screen capture image or screen recording video stream is transferred to the other electronic device for adding notes.
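The interaction between the "smart note" and "note transfer" switches can be modeled as below. The `SmartNoteSettings` class and its predicates are hypothetical, reflecting only the conditions stated above: the control is shown only while a video plays with smart note on, and transfer additionally requires the transfer switch and a connected peer.

```python
class SmartNoteSettings:
    """Illustrative model of the two switches in the FIG. 2A setting interface."""

    def __init__(self):
        self.smart_note = False      # switch 21
        self.note_transfer = False   # switch 22

    def should_show_control(self, video_playing: bool) -> bool:
        # The smart note control appears only while a video is playing
        # and the "smart note" switch is on.
        return self.smart_note and video_playing

    def should_transfer(self, peer_connected: bool) -> bool:
        # Transfer requires smart note on, note transfer on, and a connected peer.
        return self.smart_note and self.note_transfer and peer_connected

s = SmartNoteSettings()
assert not s.should_show_control(True)
s.smart_note = True
assert s.should_show_control(True)
s.note_transfer = True
assert s.should_transfer(True)
```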
In particular, FIG. 2B illustrates a schematic view of a scenario in which notes are added to a screenshot image based on a smart note control, according to some embodiments of the application.
As shown in fig. 2B, the smart note control 23 may be displayed while the mobile phone 10 plays a web class video through the web classroom application. After detecting that the user performs a screen capturing operation with the smart note control 23 (for example, clicks the smart note control 23), the mobile phone 10 pauses the web class video, generates a screen capture image of the displayed content, and displays a note editing window 24 for the screen capture image superimposed on the window of the web classroom application. The user can then enter the note content to be added for the screen capture image, such as the "patent note … …" shown in fig. 2B, in the text box 241 of the note editing window 24. After finishing the note, the user can click the completion control 242; upon detecting this click, the mobile phone 10 switches back to the window of the web classroom application and continues playing the web class video.
It will be appreciated that in other embodiments, the note editing window may not be superimposed on the top of the window of the web classroom application, and the display position and display manner of the note editing window are not limited herein. For example, referring to fig. 3A, the note editing window may be displayed on the display screen of the mobile phone 10 separately from the window of the web class application, that is, the note editing window and the window of the web class application are displayed in different areas on the display screen of the mobile phone 10. For another example, referring to fig. 3B, after detecting that the user clicks the intelligent note control 23, the mobile phone 10 may also display the note editing window 31 for the screen capturing image in full screen, and after detecting that the user has added a note, for example, after detecting that the user clicks the completion control 311, switch to a window of the online class application program, and continue playing the online class video.
It can be appreciated that, in some embodiments, after detecting that the user performs the screen capturing or screen recording operation using the intelligent note control 23, the mobile phone 10 may also display an editing interface for the screen capturing image or screen recording video first, and display a note editing window for the screen capturing image or screen recording video after the user completes editing the screen capturing image or screen recording video, so that the user can adjust the screen capturing image or screen recording video first, and then add notes, which is beneficial to improving user experience.
For example, referring to fig. 4A, after detecting that the user clicks the smart note control 23, the mobile phone 10 pauses the web class video, generates a screen capture image of the displayed content, and displays an editing interface 41 for the screen capture image; the user can crop, adjust, annotate, and otherwise edit the screen capture image through operations in the editing interface 41. After detecting that the user has finished editing the screen capture image, for example after detecting a click on the completion control 42, the mobile phone 10 switches to the window of the online class application and displays, superimposed on it, the aforementioned note editing window 24 for the edited screen capture image. It will be appreciated that, in other embodiments, after detecting that the user has finished editing the screen capture image, the mobile phone 10 may instead display the window of the web class application and the note editing window for the edited image in a split-screen manner, or display the note editing window in full screen; the display manner of the note editing window for the edited screen capture image is not limited here.
For another example, referring to fig. 4B, after detecting that the user long-presses the smart note control 23, the mobile phone 10 pauses the web class video, generates a screen recording of the displayed content, and displays an editing interface 43 for the recording; the user can trim, split, adjust the volume of, and otherwise edit the recording through operations in the editing interface 43. After detecting that the user has finished editing the recording, for example after detecting a click on the completion control 44, the mobile phone 10 displays the window of the web class application and the note editing window for the edited recording in a split-screen manner; the display manner of the note editing window for the edited recording is likewise not limited.
It will be appreciated that in some embodiments, if other electronic devices are connected to the mobile phone 10 and an application program capable of adding notes to a screen capturing image or a video recording is installed in the electronic device, for example, the aforementioned memo application program, the mobile phone 10 may also transfer a note editing interface for the screen capturing image or the video recording to the other electronic devices, so that a user may add notes to the screen capturing image or the video recording on the other electronic devices without occupying the display area of the mobile phone 10, which is beneficial to further improving the user experience.
For example, referring to fig. 5, assume that the tablet computer 20 is connected to the mobile phone 10 and also has a memo application installed. After detecting that the user clicks or long-presses the smart note control 23, the mobile phone 10 captures or records the content displayed on its screen, pauses the web class video, and transmits the screen capture image or screen recording video to the tablet computer 20. After receiving it, the tablet computer 20 displays a note editing interface 51 for the received image or video, so that the user can add notes to it through operations in the note editing interface 51 of the tablet computer 20. After detecting that the user has finished adding the note, for example after detecting a click on the completion control 52, the tablet computer 20 sends a note-addition-completion notification to the mobile phone 10, which resumes playing the web class video upon receiving the notification.
It will be appreciated that, in some embodiments, if the user adds multiple notes via the smart note control during video playback, the memo application may also automatically typeset the added notes in the background. For example, referring to FIG. 6, when the user launches the memo application and it detects that the user previously added note 1 and note 2 through the smart note control, a prompt box 61 may be displayed to prompt the user: "2 smart notes detected, have been automatically typeset for you".
The technical solution of the present application will now be described with reference to the scenarios shown in fig. 2A to 6.
First, the technical solution for the case where the mobile phone 10 is not connected to another electronic device, or the "note transfer" function is not turned on, will be described.
Specifically, fig. 7 illustrates a flow diagram of a note generation method according to some embodiments of the application. The process is executed by the mobile phone 10 and, as shown in fig. 7, includes the following steps:
S701: the foreground plays the video.
The method provided by the embodiment of the present application is triggered when the mobile phone 10 plays a video in the foreground, for example, when playing a video using the aforementioned web class application.
It can be appreciated that the method provided by the embodiment of the present application may be triggered when the mobile phone 10 plays any video through any application program capable of playing video.
It will be appreciated that the video played by the mobile phone 10 may be a video stored in the mobile phone 10, or may be an online video, a live video, or the like acquired from other electronic devices through a network, which is not limited herein.
It will be appreciated that when the mobile phone 10 plays the video in the foreground, if the user has turned on the foregoing smart note function, the smart note control may be displayed.
S702: it is determined whether a trigger operation is detected.
The mobile phone 10 determines whether a triggering operation is detected. If a triggering operation is detected, the user may wish to capture or record the content displayed by the mobile phone 10 and add notes to the screen capture image or screen recording video, so the flow goes to step S703; otherwise, step S702 is repeated and the mobile phone continues to monitor for a triggering operation.
It is to be appreciated that the triggering operation may be any operation performed on the smart note control, including, but not limited to, a single or multiple click operation, a long press operation, a sliding operation on the smart note control, a pinching operation, and the like.
S703: and judging whether the triggering operation is a screen capturing operation or not.
When the mobile phone 10 determines that a triggering operation is detected, it judges whether the triggering operation is a screen capture operation, performed via the smart note control, on the content displayed or the video played by the mobile phone 10. If so, the user intends to capture the displayed content or played video and add notes to the screen capture image, and the flow goes to step S704; otherwise, the flow goes to step S706 for further judgment.
It is to be appreciated that the screen capture operation can be any operation predefined for the smart note control including, but not limited to, a single or multiple click operation, a long press operation, a sliding operation on the smart note control, a pinch operation, and the like. In some embodiments, the screen capture operation may be a user click operation on a smart note control.
S704: the screen capture generates a screen capture image.
When determining that the user's triggering operation on the smart note control is a screen capture operation, the mobile phone 10 may capture the content displayed or the video played by the mobile phone 10 to generate a screen capture image.
It will be appreciated that in some embodiments, the handset 10 may pause the video played in the foreground after generating the screenshot image.
S705: displaying an image editing interface/note editing window, and generating notes of the screen capturing image according to user operation.
The mobile phone 10 may display an image editing interface for the screen shot image after generating the screen shot image, and display a note editing window for the edited screen shot image after the user edits the screen shot image. After displaying the note editing window for the edited screenshot image, the mobile phone 10 may generate a note for the edited screenshot image according to the user's operation in the note editing window.
For example, referring to fig. 4A, after the user clicks the smart note control 23, the mobile phone 10 may screen-capture the content played by the web class application, generate a screen capture image, and display the image editing interface 41. After the user completes editing the screenshot, for example, after the user clicks the completion control 42, the note editing interface 24 for the edited screenshot is displayed.
It will be appreciated that in some embodiments, after the mobile phone 10 generates the screenshot image, the image editing interface for the screenshot image may not be displayed, and the note editing window for the screenshot image may be directly displayed, which is not limited herein. For example, referring to fig. 2B, after the content played by the web class application is captured and the captured image is generated, the mobile phone 10 may directly display the note editing window 24 for the captured image.
S706: judging whether the triggering operation is a screen recording operation or not.
When determining that the triggering operation is not a screen capture operation, the mobile phone 10 judges whether the triggering operation is a screen recording operation, performed via the smart note control, on the content displayed or the video played by the mobile phone 10. If so, the user intends to record the displayed content or played video and add notes to the screen recording video, and the flow goes to step S707; otherwise, the flow goes to step S702.
It is to be appreciated that the screen recording operation can be any predefined operation on the smart note control that differs from the screen capture operation described above, including, but not limited to, a single or multiple click operation, a long press operation, a sliding operation on the smart note control, a pinch operation, and the like. In some embodiments, the screen recording operation may be a long press operation on the smart note control by the user.
It can be appreciated that in some embodiments, the mobile phone 10 may also determine whether the triggering operation is a screen recording operation, and if it is determined that the triggering operation is a screen recording operation, perform operations of S707, S708, S709, S710, S711; if the triggering operation is judged not to be the screen recording operation, judging whether the triggering operation is the screen capturing operation or not. The mobile phone 10 performs the operations of steps S704, S705, S709, S710, and S711 when it is determined that the trigger operation is a screen capturing operation, and proceeds to step S702 when it is determined that the trigger operation is not a screen capturing operation.
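The branching of steps S702 to S707 can be sketched as a simple dispatch on the operation type. The gesture names (`"click"`, `"long_press"`) and the handler callbacks below are illustrative assumptions; the disclosure only requires that the screen capture and screen recording operations be predefined and distinct.

```python
def handle_trigger(operation, capture_screen, record_screen):
    """Dispatch a smart-note triggering operation (steps S703/S706).

    Returns the generated content, or None when the operation matches
    neither predefined gesture (the flow then returns to step S702).
    """
    if operation == "click":        # assumed screen capture gesture (S703)
        return capture_screen()     # S704: generate a screen capture image
    if operation == "long_press":   # assumed screen recording gesture (S706)
        return record_screen()      # S707: generate a screen recording video
    return None                     # neither: continue monitoring (S702)

# Hypothetical callbacks standing in for the phone's capture/record routines.
result = handle_trigger("click", lambda: "screenshot.png", lambda: "recording.mp4")
```

Checking the screen recording gesture first, as the variant in the paragraph above describes, only swaps the order of the two `if` branches; the outcomes are identical because the gestures are distinct.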
S707: and generating a video recording.
When determining that the triggering operation of the user on the intelligent note control is a screen recording operation, the mobile phone 10 can record the content displayed by the mobile phone 10 or the video played by the mobile phone 10 to generate a screen recording video.
It will be appreciated that in some embodiments, the handset 10 may pause the video played in the foreground after generating the video recording.
S708: and displaying a video editing interface/note editing window, and generating notes of the recorded video according to user operation.
After the video recording is generated, the mobile phone 10 may display a video editing interface for the video recording, and after the user edits the video recording, display a note editing window for the edited video recording. After displaying the note editing window for the edited video, the mobile phone 10 may generate a note for the edited video according to the operation of the user in the note editing window.
For example, referring to fig. 4B, after the user clicks the smart note control 23, the mobile phone 10 may record the content played by the online class application, generate a recorded video, and display the video editing interface 43. After the user completes editing the video, for example, after the user clicks the completion control 44, a note editing interface 45 for the edited video is displayed.
It will be appreciated that, in some embodiments, after the mobile phone 10 generates the video recording, the video editing interface for the video recording may not be displayed, but the note editing window for the video recording may be directly displayed, which is not limited herein.
S709: it is detected that the addition of notes is completed.
After detecting that the addition of the notes is completed, the mobile phone 10 performs step S710 and step S711.
Illustratively, referring to fig. 2B, upon detecting that the user clicks the completion control 242, the mobile phone 10 may determine that the note addition is complete; referring to fig. 3B, upon detecting that the user clicks the completion control 311, the mobile phone 10 may likewise determine that the note addition is complete.
S710: and typesetting the notes by the background.
After detecting that note addition is complete, the mobile phone 10 typesets the note in the background, which avoids affecting the user experience in the foreground.
S711: and switching the video back to the foreground to continue playing.
After detecting that the note addition is completed, the mobile phone 10 may switch the video back to the foreground to continue playing, and go to step S702 to determine whether a triggering operation is detected.
It will be appreciated that in some embodiments, the mobile phone 10 may perform the step S710 and the step S711 in parallel, or may perform the steps sequentially, which is not limited herein.
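As one illustration of the parallel option, steps S710 and S711 can be sketched with a background thread. The function bodies are placeholders of our own, not the memo application's actual implementation:

```python
import threading

notes_typeset = []

def typeset_in_background(note):
    notes_typeset.append(note)   # stand-in for S710: background typesetting

def resume_playback():
    return "playing"             # stand-in for S711: foreground resumes the video

worker = threading.Thread(target=typeset_in_background, args=("note 1",))
worker.start()                   # S710 runs on a worker thread...
state = resume_playback()        # ...while S711 proceeds in the foreground
worker.join()                    # typesetting finishes without blocking playback start
```

Running the two steps sequentially instead simply means calling `typeset_in_background` before `resume_playback` on the same thread; the disclosure permits either ordering.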
According to the method provided by the embodiment of the application, a user can trigger a note editing window for a screen capture image or screen recording video in the mobile phone 10 simply by operating the smart note control, for example by clicking or long pressing it, and then add notes to the screen capture image or screen recording video in that window. The operation is convenient and improves the user experience.
The following describes the technical solution for the case where another electronic device is connected to the mobile phone 10, and the mobile phone 10 streams the screen capture image or screen recording video to the other electronic device for note addition.
In particular, FIG. 8 illustrates a flow diagram of yet another note generation method, according to some embodiments of the application.
As shown in fig. 8, the flow includes the steps of:
s801: the handset 10 plays the video.
The method provided by the embodiment of the present application is triggered when the mobile phone 10 plays a video, for example, when playing a video using the aforementioned web class application.
It can be appreciated that the method provided by the embodiment of the present application may be triggered when the mobile phone 10 plays any video through any application program capable of playing video.
It will be appreciated that the video played by the mobile phone 10 may be a video stored in the mobile phone 10, or may be an online video, a live video, or the like acquired from other electronic devices through a network, which is not limited herein.
It can be appreciated that when the mobile phone 10 plays the video in the foreground, if the user has turned on the foregoing smart note function and note streaming function, the smart note control may be displayed.
S802: the handset 10 determines whether a trigger operation is detected.
The mobile phone 10 determines whether a triggering operation is detected. If a triggering operation is detected, the user may wish to capture or record the content displayed by the mobile phone 10 and add notes to the screen capture image or screen recording video, so the flow goes to step S803; otherwise, step S802 is repeated and the mobile phone continues to monitor for a triggering operation.
It is to be appreciated that the triggering operation may be any operation performed on the smart note control, including, but not limited to, a single or multiple click operation, a long press operation, a sliding operation on the smart note control, a pinching operation, and the like.
S803: the handset 10 determines whether the trigger operation is a screen capturing operation.
When the mobile phone 10 determines that a triggering operation is detected, it judges whether the triggering operation is a screen capture operation, performed via the smart note control, on the content displayed or the video played by the mobile phone 10. If so, the user intends to capture the displayed content or played video and add notes to the screen capture image, and the flow goes to step S804; otherwise, the flow goes to step S807 for further judgment.
It is to be appreciated that the screen capture operation can be any operation predefined for the smart note control including, but not limited to, a single or multiple click operation, a long press operation, a sliding operation on the smart note control, a pinch operation, and the like. In some embodiments, the screen capture operation may be a user click operation on a smart note control.
S804: the handset 10 captures a screen to generate a screen capture image.
When determining that the triggering operation of the intelligent note control by the user is the screen capturing operation, the mobile phone 10 can capture the screen of the content displayed by the mobile phone 10 or the video played by the mobile phone 10, so as to generate a screen capturing image.
It will be appreciated that in some embodiments, the handset 10 may pause the video played in the foreground after generating the screenshot image.
S805: the handset 10 transfers the screenshot image stream to the tablet computer 20.
After generating the screenshot image, the handset 10 may send the screenshot image to the tablet 20.
S806: the tablet personal computer 20 displays an image editing interface/note editing interface and generates notes of a screen capturing image according to a user operation.
After receiving the screen capturing image sent by the mobile phone 10, the tablet computer 20 may display an image editing interface for the screen capturing image, and after the user edits the screen capturing image, display a note editing window for the edited screen capturing image. After displaying the note editing window for the edited screenshot image, the tablet computer 20 may generate a note for the edited screenshot image according to the user's operation in the note editing window.
For example, referring to fig. 5, after receiving the screen capture image transmitted from the mobile phone 10, the tablet computer 20 may display the note editing window 51 for the screen capture image in full screen, and generate a note for the screen capture image according to the user's operations in the note editing window 51.
It will be appreciated that, in some embodiments, after receiving the screenshot image, the tablet computer 20 may display an image editing interface for the screenshot image first, and display a note editing window for the edited screenshot image after detecting that the user edits the screenshot image, which is not limited herein. For example, referring to fig. 9, the tablet computer 20 may first display an image editing interface 91 for the screen shot image after receiving the screen shot image, and display a note editing window 92 for the edited screen shot image after the user edits the screen shot image.
S807: the mobile phone 10 determines whether the trigger operation is a screen recording operation.
When determining that the triggering operation is not a screen capture operation, the mobile phone 10 judges whether the triggering operation is a screen recording operation, performed via the smart note control, on the content displayed or the video played by the mobile phone 10. If so, the user intends to record the displayed content or played video and add notes to the screen recording video, and the flow goes to step S808; otherwise, the flow goes to step S802.
It is to be appreciated that the screen recording operation can be any predefined operation on the smart note control that differs from the screen capture operation described above, including, but not limited to, a single or multiple click operation, a long press operation, a sliding operation on the smart note control, a pinch operation, and the like. In some embodiments, the screen recording operation may be a long press operation on the smart note control by the user.
It can be appreciated that in some embodiments, the mobile phone 10 may also determine whether the triggering operation is a screen recording operation, and if it is determined that the triggering operation is a screen recording operation, perform operations S808, S809, S810, S811, S812; if the triggering operation is judged not to be the screen recording operation, judging whether the triggering operation is the screen capturing operation or not. The mobile phone 10 executes the operations of steps S804, S805, S806, S811, and S812 when it is determined that the trigger operation is a screen capturing operation, and proceeds to step S802 when it is determined that the trigger operation is not a screen capturing operation.
S808: the handset 10 records a screen to generate a recorded video.
When determining that the triggering operation of the user on the intelligent note control is a screen recording operation, the mobile phone 10 can record the content displayed by the mobile phone 10 or the video played by the mobile phone 10 to generate a screen recording video.
It will be appreciated that in some embodiments, the handset 10 may pause playing the video after generating the video.
S809: the handset 10 transfers the video stream to the tablet computer 20.
After generating the video recording, the mobile phone 10 may send the video recording to the tablet computer 20.
S810: the tablet computer 20 displays a video editing interface/note editing interface and generates notes of the recorded video according to user operations.
After receiving the video recorded by the mobile phone 10, the tablet computer 20 can display a video editing interface for the video recorded, and after the user edits the video recorded, display a note editing window for the edited video recorded. After displaying the note editing window for the edited video, the tablet computer 20 may generate notes for the edited video according to the user's operation in the note editing window.
It will be appreciated that in some embodiments, after receiving the video, the tablet computer 20 may also display the note editing window for the video directly instead of displaying the video editing interface for the video, which is not limited herein.
S811: the tablet computer 20 transmits a post-addition notification to the mobile phone 10 after the addition of the note is completed.
After detecting that the user has completed adding a note to the screen capture image or screen recording video, the tablet computer 20 may send a note completion notification to the mobile phone 10.
Illustratively, referring to fig. 5, upon detecting that the user clicks the completion control 511, the tablet computer 20 may determine that the note addition is complete and send a note completion notification to the handset 10; referring to fig. 9, upon detecting that the user clicks the completion control 921, the tablet computer 20 may likewise determine that the note addition is complete and send a note completion notification to the mobile phone 10.
S811: the handset 10 continues to play the video.
After receiving the note completion notification from the tablet computer 20, the mobile phone 10 continues playing the video in response to the notification, and proceeds to step S802 to determine whether a triggering operation is detected.
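The interaction of steps S801 to S812 amounts to a small request/notification exchange between the two devices: the phone pauses, streams the content, and resumes only upon the tablet's notification. The sketch below simulates this with in-memory objects; the class names, method names, and the `"note_added"` message are illustrative assumptions, and real devices would communicate over the connection between the mobile phone 10 and the tablet computer 20 (the transport is not specified in this disclosure).

```python
class Tablet:
    """Stand-in for the tablet computer 20."""
    def receive_content(self, content):
        # S806/S810: display an editing interface and collect the user's note
        self.note = f"note for {content}"
        return "note_added"          # S811: note completion notification

class Phone:
    """Stand-in for the mobile phone 10."""
    def __init__(self, tablet):
        self.tablet = tablet
        self.playing = True          # S801: video playing

    def on_trigger(self, content):
        self.playing = False                        # pause playback
        ack = self.tablet.receive_content(content)  # S805/S809: stream content
        if ack == "note_added":                     # S812: resume on notification
            self.playing = True

phone = Phone(Tablet())
phone.on_trigger("screenshot.png")
```

In this simulation the call returns synchronously; on real devices the phone would stay paused until the notification arrives asynchronously, then resume playback.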
According to the method provided by the embodiment of the application, a user can trigger the display of a note editing window for a screen capture image or screen recording video on the tablet computer 20 by operating the smart note control on the mobile phone 10, for example by clicking or long pressing it, and then add notes to the screen capture image or screen recording video in that window. The operation is convenient and improves the user experience.
The application also provides a note generation method combining the embodiments shown in fig. 2A to 9.
Specifically, fig. 10 illustrates an interactive flow diagram of a note generation method, according to some embodiments of the application. As shown in fig. 10, the interaction flow includes the following steps:
S1001: the mobile phone 10 detects that the smart note function has been turned on and displays the smart note control while playing the video.
When the mobile phone 10 detects that the smart note function of the memo application is turned on and that a video is being played, it displays the smart note control.
It will be appreciated that in some embodiments, the handset 10 may display the smart note control only when the application playing the video is a preset or user-set application; in other embodiments, the mobile phone 10 may also display the smart note control when the mobile phone 10 plays the video through any application, which is not limited herein.
S1002: the mobile phone 10 determines whether an operation is detected in which the user obtains the content displayed by the mobile phone 10 using the smart note control.
The mobile phone 10 judges whether an operation in which the user obtains the displayed content using the smart note control is detected. If so, the user may wish to capture or record the content displayed by the mobile phone 10 or the played video content, and add notes to the screen capture image or screen recording video, so the flow goes to step S1003; otherwise, step S1002 is repeated and the mobile phone continues to monitor for such an operation.
It will be appreciated that, in some embodiments, the user's operation to obtain the played video content using the smart note control may be the screen capture operation or the screen recording operation described above. For example, in some embodiments, this operation may include a click on the smart note control (corresponding to a screen capture) and a long press on the smart note control (corresponding to a screen recording).
It will be appreciated that in other embodiments, the user's operation of obtaining the played video content using the smart note control may include more or fewer operations, and is not limited in this regard.
S1003: the handset 10 generates a content acquisition result.
When determining that the operation of acquiring the played video content by the user through the intelligent note control is detected, the mobile phone 10 generates a corresponding content acquisition result according to the specific operation of the user. For example, when the mobile phone 10 detects that the operation of the user is a screen capturing operation (for example, the operation of the user clicking on the smart note control), the content acquisition result generated by the mobile phone 10 may be a screen capturing image of the content displayed by the mobile phone 10; for another example, when the mobile phone 10 detects that the operation of the user is a screen recording operation (for example, the user presses the smart note control for a long time), the content acquisition result generated by the mobile phone 10 may be a screen recording video of the content displayed by the mobile phone 10.
S1004: the mobile phone 10 determines whether the note transfer function of the mobile phone 10 is turned on.
After generating the content acquisition result, the mobile phone 10 judges whether its note transfer function is turned on. If the note transfer function is on, the content acquisition result may be transferred to another electronic device for note addition, and the flow goes to step S1005 for further judgment; otherwise, notes can only be added to the content acquisition result on the mobile phone 10, and the flow goes to step S1006.
S1005: the handset 10 determines whether a streamable electronic device is present.
When determining that its note transfer function is turned on, the mobile phone 10 judges whether a streamable electronic device exists. If so, the mobile phone 10 may transfer the content acquisition result to that device for note addition, and the flow goes to step S1009; otherwise, notes can only be added on the mobile phone 10, and the flow goes to step S1006.
It will be appreciated that, in some embodiments, the handset 10 may determine that a streamable electronic device exists when another electronic device is connected to the handset 10 and has the same note-taking application or service installed as the handset 10, for example when both have the aforementioned memo application installed.
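The check in steps S1004 and S1005 can be sketched as a predicate over the phone's state: the note transfer function must be on, a device must be connected, and that device must have the same note-taking application installed. The dictionary fields and the `"memo"` identifier below are illustrative assumptions:

```python
def find_streamable_device(note_transfer_on, connected_devices):
    """Steps S1004/S1005: choose a device to stream the content acquisition result to.

    Returns the first eligible device, or None, in which case the note is
    added locally on the phone (steps S1006-S1008).
    """
    if not note_transfer_on:               # S1004: note transfer function off
        return None
    for device in connected_devices:
        if "memo" in device["apps"]:       # same note-taking app installed
            return device                  # S1009: stream to this device
    return None                            # no eligible device: add note locally

target = find_streamable_device(True, [{"name": "tablet 20", "apps": ["memo"]}])
```

With the note transfer function off, or with no connected device carrying the memo application, the function returns `None` and both branches fall back to local note editing, matching the flow of fig. 10.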
S1006: the mobile phone 10 pauses playing the video, and displays an editing interface for the content acquisition result.
When determining that its note transfer function is not turned on or that no streamable electronic device exists, the mobile phone 10 pauses playing the video and displays an editing interface for the content acquisition result.
Illustratively, referring to fig. 4A, the handset 10 may display an image editing interface 41 for the screenshot image when the content acquisition result is the screenshot image.
Illustratively, referring to fig. 4B, when the content acquisition result is a video recording, the handset 10 may display a video editing interface 43 for the video recording.
S1007: after detecting that the user edits the content acquisition result, the mobile phone 10 displays a note editing window for the edited content acquisition result.
After detecting that the user edits the content acquisition result, the mobile phone 10 displays a note editing window for the edited content acquisition result.
Illustratively, referring to FIG. 4A, upon detecting that the user clicks the done control 42 in the image editing interface 41, the handset 10 detects that the user has completed editing the screen capture image and displays the note editing interface 24 for the edited screen capture image.
Illustratively, referring to fig. 4B, upon detecting that the user clicks the completion control 44 in the video editing interface 43, the handset 10 detects that the user has completed editing the video on the screen, and displays the note editing interface 45 for the edited video on the screen.
It will be appreciated that, in some embodiments, the mobile phone 10 may not display an editing interface for the content acquisition result, but directly display the note editing interface for the content acquisition result when it is determined that the note transfer function of the mobile phone 10 is not turned on or no electronic device capable of transferring is present, which is not limited herein.
S1008: the mobile phone 10 detects that the user has added notes for the content acquisition result, and resumes the video playback.
The mobile phone 10 resumes the video playback upon detecting that the user has added notes for the content acquisition result.
Illustratively, referring to fig. 2B, when detecting that the user clicks the completion control 242, the mobile phone 10 may determine that the user has added a note for the content acquisition result, and resume video playback; referring to fig. 3B, when detecting that the user clicks the completion control 311, the mobile phone 10 may likewise determine that the note has been added, and resume video playback.
Illustratively, after receiving the note adding completion notification sent by the tablet computer 20, the mobile phone 10 detects that the user has added a note for the content acquisition result, and resumes the video playing.
S1009: the mobile phone 10 pauses playing the video and transmits the content acquisition result to the tablet computer 20.
In the case where it is determined that a streamable electronic device (e.g., tablet computer 20) exists, the handset 10 pauses playing the video and sends the content acquisition result to the tablet computer 20.
S1010: the tablet computer 20 displays an editing interface of the content acquisition result.
After receiving the content acquisition result (for example, the screen capturing image or the screen recording video) sent by the mobile phone 10, the tablet computer 20 displays an editing interface for the screen capturing image or the screen recording video.
Illustratively, referring to fig. 9, upon receiving the screenshot image, the tablet 20 may display an image editing interface 91 for the screenshot image.
It will be appreciated that after the tablet computer 20 displays the editing interface of the content acquisition result, the user may edit the content acquisition result in the editing interface.
S1011: after detecting that the user edits the content acquisition result, the tablet computer 20 displays a note editing window for the edited content acquisition result.
After detecting that the user edits the content acquisition result, the tablet computer 20 displays a note editing window for the edited content acquisition result.
Illustratively, referring to fig. 9, upon detecting that the user clicks the completion control 911 in the image editing interface 91, the tablet computer 20 may determine that the user has completed editing the screen capture image, and display a note editing window 92 for the edited screen capture image.
It will be appreciated that after the tablet computer 20 displays the note editing window for the edited content acquisition result, the user may add notes to the edited content acquisition result in the note editing window.
It will be appreciated that in some embodiments, the tablet computer 20 may not display an editing interface for the content acquisition result, but directly display a note editing interface for the content acquisition result when the content acquisition result is received, which is not limited herein.
S1012: after detecting that the user completes adding the note for the content acquisition result, the tablet computer 20 sends a note addition completion notification to the mobile phone 10.
After detecting that the user completes adding the note for the content acquisition result, the tablet computer 20 sends a note addition completion notification to the mobile phone 10.
Illustratively, referring to fig. 9, when detecting that the user clicks the completion control 921, the tablet computer 20 may detect that the user has completed adding notes for the content acquisition result, and send a note addition completion notification to the mobile phone 10. Further, after receiving the note adding completion notification, the mobile phone 10 may detect that the user has added a note for the content acquisition result, and resume video playback.
According to the method provided by the embodiments of the present application, a user can trigger a note editing window for a screen capture image or screen recording video on the mobile phone 10 or the tablet computer 20 by operating the intelligent note control on the mobile phone 10, for example by clicking or long-pressing the control, and then add notes to the screen capture image or screen recording video in that window. The operation is convenient and improves the user experience.
It should be understood that describing the video playing application in the mobile phone 10 as a web-type application is merely an example; in other embodiments, the video playing application in the mobile phone 10 may be any other application, which is not limited herein.
Further, fig. 11 illustrates a schematic diagram of the mobile phone 10, according to some embodiments of the present application.
As shown in fig. 11, the mobile phone 10 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, a communication device 196, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate operation control signals according to the instruction opcode and timing signals, controlling instruction fetch and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or will use again. If the processor 110 needs to reuse the instructions or data, it can call them directly from this memory, avoiding repeated accesses and reducing the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may be configured to execute instructions of the note generation method provided by the embodiments of the present application.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the mobile phone 10, or to transfer data between the mobile phone 10 and peripheral devices. It may also be used to connect a headset and play audio through the headset. The interface may further be used to connect other electronic devices, such as AR devices.
The charge management module 140 is configured to receive a charge input from a charger. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the mobile phone 10 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals.
The mobile communication module 150 may provide solutions for wireless communication applied to the mobile phone 10, including 2G/3G/4G/5G, etc. The mobile communication module 150 may include at least one filter, switch, power amplifier, low-noise amplifier (LNA), and the like. The mobile communication module 150 may receive electromagnetic waves via the antenna 1, perform filtering, amplification, and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor and convert it into electromagnetic waves for radiation via the antenna 1. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the mobile phone 10, including wireless local area networks (WLAN) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), etc. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 may receive electromagnetic waves via the antenna 2, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The wireless communication module 160 may also amplify the signal modulated by the modem processor and convert it into electromagnetic waves for radiation via the antenna 2.
The handset 10 implements display functions through a GPU, a display 194, and an application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini-LED, a micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the mobile phone 10 may include 1 or N display screens 194, N being a positive integer greater than 1.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and passes it to the ISP, which converts it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the mobile phone 10 may include 1 or N cameras 193, N being a positive integer greater than 1.
The external memory interface 120 may be used to interface with an external memory card, such as a Micro SD card, to extend the memory capabilities of the handset 10. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions, such as the memory 103 described above. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store the operating system and applications required for at least one function, such as the aforementioned memo application. The data storage area may store data created during use of the mobile phone 10, for example screen capture images, screen recording videos, and notes for screen capture images or screen recording videos. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 110 performs various functional applications of the mobile phone 10 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor 110.
The handset 10 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals.
The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals.
The earphone interface 170D is used to connect a wired earphone.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material; when a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the mobile phone 10 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display 194, the mobile phone 10 detects the intensity of the touch operation via the pressure sensor 180A. The mobile phone 10 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A.
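As a rough illustration of how a capacitive sensor like the pressure sensor 180A could map a capacitance change to a pressure strength, consider the sketch below. The function, its name, and the thresholds are illustrative assumptions, not values from the source.

```python
def pressure_level(c_baseline: float, c_measured: float,
                   light_thresh: float = 0.05, firm_thresh: float = 0.20) -> str:
    """Hypothetical mapping from relative capacitance change to a
    touch-pressure level. Pressing the plates closer together raises
    the capacitance, so a larger relative change means a firmer press."""
    delta = (c_measured - c_baseline) / c_baseline
    if delta >= firm_thresh:
        return "firm"
    if delta >= light_thresh:
        return "light"
    return "none"


assert pressure_level(100.0, 102.0) == "none"   # +2%: below the light threshold
assert pressure_level(100.0, 112.0) == "light"  # +12%: light press
assert pressure_level(100.0, 125.0) == "firm"   # +25%: firm press
```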
The acceleration sensor 180E can detect the magnitude of the acceleration of the mobile phone 10 in various directions (typically three axes). When the mobile phone 10 is stationary, it can detect the magnitude and direction of gravity. It can also be used to recognize the attitude of the electronic device, for example to determine whether the mobile phone 10 is in landscape orientation, and in applications such as pedometers.
The ambient light sensor 180L is used to sense the ambient light level. The mobile phone 10 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. The ambient light sensor 180L may also cooperate with the proximity sensor 180G to detect whether the mobile phone 10 is in a pocket, to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The mobile phone 10 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access application locks, fingerprint photographing, fingerprint incoming call answering and the like.
The touch sensor 180K is also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form a touchscreen. The touch sensor 180K is used to detect touch operations acting on or near it, and may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the mobile phone 10 at a location different from that of the display 194. For example, in some embodiments, when the touchscreen detects a user operation on it, it may send the operation to the processor 110, so that the processor 110 can identify the user's operation on the intelligent note control and trigger the note generation method provided by the embodiments of the present application.
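The dispatch path described above — the touch sensor reports an operation and the processor classifies it (e.g., click, multi-click, or long press, cf. the operation types listed in the claims) before triggering the note flow — could be sketched as follows. The threshold value and the function name are hypothetical assumptions, not part of the source.

```python
LONG_PRESS_THRESHOLD_MS = 500  # assumed threshold; the source does not specify one

def classify_touch(duration_ms: int, taps: int = 1) -> str:
    """Hypothetical classifier for operations on the intelligent note
    control: a long touch becomes a long press, otherwise the tap count
    distinguishes a single click from a multi-click."""
    if duration_ms >= LONG_PRESS_THRESHOLD_MS:
        return "long_press"
    return "multi_click" if taps > 1 else "click"


assert classify_touch(80) == "click"
assert classify_touch(80, taps=2) == "multi_click"
assert classify_touch(700) == "long_press"
```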
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of a vibrating bone mass of the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive blood pressure pulsation signals.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The mobile phone 10 may receive key inputs and generate key signal inputs related to user settings and function control of the mobile phone 10.
The motor 191 may generate a vibration cue.
The indicator 192 may be an indicator light and may be used to indicate charging status, battery level changes, messages, missed calls, notifications, and the like.
The SIM card interface 195 is used to connect a SIM card.
It will be appreciated that the structure of the handset 10 shown in fig. 11 does not constitute a specific limitation on the handset 10. In other embodiments of the application, the handset 10 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Embodiments of the disclosed mechanisms may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the application may be implemented as a computer program or program code that is executed on a programmable system comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described herein and generate output information. The output information may be applied to one or more output devices in a known manner. For the purposes of this application, a processing system includes any system having a processor such as, for example, a Digital Signal Processor (DSP), a microcontroller, an Application Specific Integrated Circuit (ASIC), or a microprocessor.
The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. Program code may also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in the present application are not limited in scope by any particular programming language. In either case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed over a network or through other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including but not limited to floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or tangible machine-readable memory used to transmit information over the Internet in an electrical, optical, acoustical, or other form of propagated signal (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
In the drawings, some structural or methodological features may be shown in a particular arrangement and/or order. However, it should be understood that such a particular arrangement and/or ordering may not be required. Rather, in some embodiments, these features may be arranged in a different manner and/or order than shown in the illustrative figures. Additionally, the inclusion of structural or methodological features in a particular figure is not meant to imply that such features are required in all embodiments, and in some embodiments, may not be included or may be combined with other features.
It should be noted that, in the embodiments of the present application, each unit/module mentioned in each device is a logic unit/module. Physically, one logic unit/module may be one physical unit/module, may be part of one physical unit/module, or may be implemented by a combination of multiple physical units/modules; the physical implementation of the logic unit/module itself is not the most important point, as the combination of functions implemented by these logic units/modules is the key to solving the technical problem posed by the present application. Furthermore, in order to highlight the innovative part of the present application, the above device embodiments do not introduce units/modules that are less closely related to solving the technical problem posed by the present application, which does not mean that the above device embodiments contain no other units/modules.
It should be noted that in the examples and descriptions of this patent, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
While the application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the application.

Claims (15)

1. A note generation method, applied to a first electronic device, characterized by comprising:
displaying a first application interface of a first application, wherein a first video is played in the first application interface, and the first application interface comprises a first control;
detecting a first operation of a user on the first control, and acquiring first display picture data of the first video played in the first application interface;
and displaying a note editing window for the first display picture data, wherein the note editing window is used for generating a note of the first display picture data according to content input by the user in the note editing window.
2. The method according to claim 1, wherein the method further comprises:
after the first display picture data is acquired, pausing playing of the first video;
and after detecting that the note of the first display picture data has been generated, resuming playing of the first video.
3. The method of claim 1, wherein the first display data comprises a screen shot image or a video recording for the first application interface or the first video.
4. The method of claim 1, wherein the first operation comprises any one of:
a single or multiple click operation on the first control;
a long-press operation on the first control;
a sliding operation or a pinch operation on the first control.
5. The method of claim 1, wherein the note editing window is displayed on top of the first application interface, or split-screen with the first application interface, or full-screen in a display screen of the first electronic device.
6. The method according to claim 1, wherein the method further comprises:
after the first display picture data of the first video played in the first application interface is acquired, displaying an editing interface for the first display picture data, and displaying the note editing window after the first electronic device detects that the user has finished editing the first display picture data.
7. A note generation method, comprising:
displaying, by a second electronic device, a second application interface of a second application, wherein a second video is played in the second application interface, and the second application interface comprises a second control;
detecting, by the second electronic device, a second operation of a user on the second control, and acquiring second display screen data of the second video played in the second application interface;
sending, by the second electronic device, the second display screen data to a third electronic device;
and displaying, by the third electronic device, a note editing window for the second display screen data, wherein the note editing window is used for generating a note of the second display screen data according to content input by the user in the note editing window.
8. The method of claim 7, wherein the method further comprises:
after the second electronic device acquires the second display screen data, pausing, by the second electronic device, playing of the second video;
and after the second electronic device detects that the third electronic device has generated the note of the second display screen data, resuming playing of the second video.
9. The method of claim 7, wherein the second display data comprises a screen shot image or a video recording for the second application interface or the second video.
10. The method of claim 7, wherein the second operation comprises any one of:
a single or multiple click operation on the second control;
a long-press operation on the second control;
a sliding operation or a pinch operation on the second control.
11. The method of claim 8, wherein the method further comprises:
after generating the note of the second display screen data, sending, by the third electronic device, a note addition completion notification to the second electronic device;
and after receiving the note addition completion notification, determining, by the second electronic device, that the third electronic device has generated the note of the second display screen data.
12. The method of claim 7, wherein the method further comprises:
the third electronic device displays an editing interface for the second display screen data after receiving the second display screen data, and displays the note editing window after detecting that the user has finished editing the second display screen data.
13. A readable storage medium, comprising instructions that, when executed by an electronic device, cause the electronic device to implement the note generation method of any one of claims 1 to 6 or the note generation method of any one of claims 7 to 12.
14. An electronic device, comprising:
A memory in which instructions are stored;
At least one processor configured to execute the instructions to cause the electronic device to implement the note generation method of any one of claims 1 to 6, or the note generation method of any one of claims 7 to 12.
15. A program product, characterized in that, when run on an electronic device, the program product causes the electronic device to implement the note generation method of any one of claims 1 to 6 or the note generation method of any one of claims 7 to 12.
CN202310035813.3A 2023-01-10 2023-01-10 Note generation method, readable storage medium, program product, and electronic device Pending CN118337905A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310035813.3A CN118337905A (en) 2023-01-10 2023-01-10 Note generation method, readable storage medium, program product, and electronic device

Publications (1)

Publication Number Publication Date
CN118337905A true CN118337905A (en) 2024-07-12

Family

ID=91768987

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310035813.3A Pending CN118337905A (en) 2023-01-10 2023-01-10 Note generation method, readable storage medium, program product, and electronic device

Country Status (1)

Country Link
CN (1) CN118337905A (en)


Legal Events

Date Code Title Description
PB01 Publication