CN110134237B - Interface control method and related equipment - Google Patents

Interface control method and related equipment

Info

Publication number
CN110134237B
Authority
CN
China
Prior art keywords
screen
video playing
interface
video
control instruction
Prior art date
Legal status
Active
Application number
CN201910394436.6A
Other languages
Chinese (zh)
Other versions
CN110134237A (en)
Inventor
张海平
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910394436.6A
Publication of CN110134237A
Application granted
Publication of CN110134237B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469 User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons

Abstract

The application discloses an interface control method and related equipment, applied to electronic equipment comprising a main screen, a plurality of cameras arranged side by side, and an auxiliary screen covered above the cameras, where the main screen currently displays a video playing interface. The method comprises the following steps: when an auxiliary-screen awakening instruction input for the video playing interface is detected, awakening the auxiliary screen, the awakened auxiliary screen being used for triggering events associated with the video playing interface; when a first touch operation on the auxiliary screen is detected, determining a first control instruction corresponding to the first touch operation, the first control instruction being used for triggering a first event associated with the video playing interface; and executing the first control instruction. By adopting the method and the device, the problem of the video playing interface being obscured while video playback is adjusted can be solved.

Description

Interface control method and related equipment
Technical Field
The present application relates to the field of electronic technologies, and in particular, to an interface control method and a related device.
Background
Watching videos on electronic equipment such as a mobile phone or a tablet computer is a very common form of entertainment in daily life; however, current mobile phones and computers generally have only one screen. To control video playback, the user usually has to tap the screen and then click or slide on the displayed adjustment control area to trigger events associated with the video playing interface. Operating on the screen with the fingers in this way easily obscures the video playing interface, resulting in a poor video playing experience.
Disclosure of Invention
The embodiment of the application provides an interface control method and related equipment, which are used for solving the problem of the video playing interface being obscured while video playback is adjusted, and for improving the user's experience when watching videos.
In a first aspect, an embodiment of the present application provides an interface control method, which is applied to an electronic device including a main screen, multiple cameras, and a secondary screen, where the multiple cameras are arranged side by side, the secondary screen is covered above the multiple cameras, and the main screen currently displays a video playing interface; the method comprises the following steps:
when an auxiliary screen awakening instruction input aiming at the video playing interface is detected, awakening the auxiliary screen, wherein the awakened auxiliary screen is used for triggering events related to the video playing interface;
when a first touch operation for the auxiliary screen is detected, determining a first control instruction corresponding to the first touch operation, wherein the first control instruction is used for triggering a first event related to the video playing interface;
and executing the first control instruction.
In a second aspect, an embodiment of the present application provides an interface control apparatus, which is applied to an electronic device that includes a main screen, a plurality of cameras, and a secondary screen, where the plurality of cameras are arranged side by side, the secondary screen is covered above the plurality of cameras, and the main screen currently displays a video playing interface; the device comprises:
the auxiliary screen control unit is used for awakening the auxiliary screen when an auxiliary screen awakening instruction input aiming at the video playing interface is detected, and the awakened auxiliary screen is used for triggering events related to the video playing interface;
the determining unit is used for determining a first control instruction corresponding to a first touch operation when the first touch operation for the secondary screen is detected, wherein the first control instruction is used for triggering a first event associated with the video playing interface;
and the execution unit is used for executing the first control instruction.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor, a memory, a communication interface, and one or more programs, stored in the memory and configured to be executed by the processor, the programs including instructions for performing some or all of the steps described in the method according to the first aspect of the embodiments of the present application.
In a fourth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium is used to store a computer program, where the computer program is executed by a processor to implement part or all of the steps described in the method according to the first aspect of the present application.
In a fifth aspect, the present application provides a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps described in the method according to the first aspect of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiment of the present application, the electronic device includes a main screen, a plurality of cameras arranged side by side, and an auxiliary screen covered above the cameras, and the main screen currently displays a video playing interface; when an auxiliary-screen awakening instruction input for the video playing interface is detected, the auxiliary screen is awakened, the awakened auxiliary screen being used for triggering events associated with the video playing interface; when a first touch operation on the auxiliary screen is detected, a first control instruction corresponding to the first touch operation is determined, the first control instruction being used for triggering a first event associated with the video playing interface; and the first control instruction is executed. Different from current video-playing-interface control methods, in the embodiment of the present application the video playing interface is controlled through the auxiliary screen, so that the video currently playing on the main screen is not obscured, which improves the user's experience when watching videos.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the embodiments or the prior art descriptions will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1A is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 1B is a schematic structural diagram of a program runtime provided in an embodiment of the present application;
fig. 1C is a schematic structural framework diagram of an Android system provided in an embodiment of the present application;
fig. 1D is a schematic position diagram of a pressure sensor, a camera and a sub-screen provided in the embodiment of the present application;
fig. 2A is a schematic flowchart of an interface control method according to an embodiment of the present application;
fig. 2B is a schematic structural diagram of a video playing interface and a first sub-interface provided in an embodiment of the present application;
fig. 2C is a schematic diagram of a second message display provided in the embodiment of the present application;
FIG. 3 is a schematic flow chart diagram illustrating another interface control method provided in an embodiment of the present application;
FIG. 4 is a schematic flow chart diagram illustrating another interface control method provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of another electronic device provided in an embodiment of the present application;
fig. 6 is a schematic structural diagram of an interface control device according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The following are detailed below.
The terms "first," "second," "third," and "fourth," etc. in the description and claims of this application and in the accompanying drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Hereinafter, some terms in the present application are explained to facilitate understanding by those skilled in the art.
As shown in fig. 1A, fig. 1A is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. In the embodiment of the application, the electronic equipment comprises a main screen, a plurality of cameras and an auxiliary screen; the cameras are arranged side by side, the auxiliary screen is covered above the cameras, and the main screen currently displays a video playing interface. As shown in fig. 1A, the plurality of cameras are all rear cameras. The sub-screen is transparent or has a certain light transmittance (e.g., 85%, 90%, etc.), which is not limited herein. Like the main screen, the sub-screen may be used to sense touch operations, display content, and so on, as shown in fig. 1B.
Further, the electronic device further includes a pressure sensor disposed below the sub-screen (as shown in fig. 1D), the pressure sensor being transparent or having a certain light transmittance (e.g., 85%, 90%, etc.), which is not limited herein. The pressure sensor is used for sensing a pressing operation, measuring pressure, and the like.
As shown in fig. 1B, fig. 1B is a schematic structural diagram of a program runtime space provided in an embodiment of the present application. At present, electronic devices such as smart phones are generally provided with a program running space, where the program running space includes a user space and an operating system space, where the user space runs one or more application programs, the one or more application programs are third-party application programs installed on the electronic devices, and the operating system space runs an operating system of the electronic devices.
The electronic device may specifically run the Android system, the mobile operating system iOS developed by Apple Inc., and the like, which is not limited herein. As shown in fig. 1C, fig. 1C is a schematic view of the structural framework of the Android system provided in an embodiment of the present application. Taking an electronic device running the Android system as an example, the corresponding user space includes the Application layer (Applications) of the Android system, and the operating system space may include the Application Framework layer, the system runtime layer (comprising the system libraries, Libraries, and the Android Runtime), and the Linux Kernel layer of the Android system. The application layer comprises the various application programs that interact directly with the user, as well as service programs written in the Java language that run in the background; for example, programs that implement common basic functions on smartphones, such as Short Messaging Service (SMS) messaging, phone dialing, a picture viewer, a calendar, games, maps, a World Wide Web (Web) browser, and other applications developed by developers. The application framework layer provides a series of class libraries required for Android application development; it enables the reuse of components and also allows personalized extension through inheritance. The system runtime layer supports the application framework and provides services for all components in the Android system; it is composed of the system class libraries and the Android Runtime. The Android Runtime comprises two parts, namely the core libraries and the Dalvik virtual machine. The Linux kernel layer implements core functions such as hardware device drivers, process and memory management, the network protocol stack, power management, and wireless communication.
Electronic devices may include various handheld devices, vehicle-mounted devices, wearable devices (e.g., smartwatches, smartbands, pedometers, etc.), and computing devices or other processing devices communicatively connected to wireless modems, as well as various forms of User Equipment (UE), Mobile Stations (MS), Terminal Equipment (terminal devices), and so forth having wireless communication capabilities. For convenience of description, the above-mentioned devices are collectively referred to as electronic devices.
The following describes embodiments of the present application in detail.
Referring to fig. 2A, fig. 2A is a schematic flowchart of an interface control method according to an embodiment of the present disclosure, and is applied to an electronic device including a main screen, a plurality of cameras, and a sub-screen, where the cameras are arranged in parallel, the sub-screen is covered above the cameras, and the main screen currently displays a video playing interface; the method comprises the following steps:
step 201: and when an auxiliary screen awakening instruction input aiming at the video playing interface is detected, awakening the auxiliary screen, wherein the awakened auxiliary screen is used for triggering events related to the video playing interface.
The secondary screen wake-up command may be a voice command, may be an operation such as clicking or sliding on the secondary screen or the main screen, or may be a pressing operation on a physical key on the electronic device, which is not limited herein.
Wherein the event associated with the video playing interface comprises at least one of the following events: adjusting the video playing progress, adjusting the interface display brightness, adjusting the video playing volume, intercepting the video, intercepting the image, adjusting the video switch, adjusting the barrage, adjusting the video playing speed, adjusting the video definition and projecting the screen.
Step 202: when a first touch operation for the auxiliary screen is detected, a first control instruction corresponding to the first touch operation is determined, and the first control instruction is used for triggering a first event associated with the video playing interface.
Wherein the first touch operation may be a click operation, a slide operation, a double-click operation, or the like for the secondary screen.
The first control instruction can be video fast forward, video fast backward, interface display brightness increasing, interface display brightness reducing, video playing volume increasing, video playing volume reducing, video capturing, image capturing, video pausing, video playing, bullet screen launching, bullet screen closing, bullet screen displaying, video playing progress increasing, video playing progress reducing, blue light playing, super-definition playing, high-definition playing, smooth playing, stream-saving playing, screen projection playing and the like.
The first event may be one of events associated with the video playing interface.
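The correspondence between touch operations and control instructions in step 202 can be sketched as a simple lookup table. This is an illustrative sketch only: the operation names and the instruction identifiers below are assumptions, not terms defined by the patent.

```python
from typing import Optional

# Hypothetical mapping from a touch operation detected on the secondary
# screen to a control instruction for the video playing interface.
# All keys and values here are illustrative names.
TOUCH_TO_INSTRUCTION = {
    "swipe_right": "video_fast_forward",
    "swipe_left": "video_rewind",
    "single_tap": "toggle_play_pause",
    "double_tap": "capture_image",
    "swipe_up": "increase_volume",
    "swipe_down": "decrease_volume",
}

def determine_control_instruction(touch_operation: str) -> Optional[str]:
    """Return the control instruction triggered by a touch operation,
    or None if the operation has no associated instruction."""
    return TOUCH_TO_INSTRUCTION.get(touch_operation)
```

For instance, `determine_control_instruction("swipe_right")` would yield the fast-forward instruction, which the device then executes in step 203.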
Step 203: and executing the first control instruction.
It can be seen that, in the embodiment of the present application, the electronic device includes a main screen, a plurality of cameras, and an auxiliary screen, the plurality of cameras are arranged side by side, the auxiliary screen is covered above the plurality of cameras, and the main screen currently displays a video playing interface; when an auxiliary screen awakening instruction input aiming at the video playing interface is detected, awakening the auxiliary screen, wherein the awakened auxiliary screen is used for triggering events related to the video playing interface; when a first touch operation aiming at the auxiliary screen is detected, determining a first control instruction corresponding to the first touch operation, wherein the first control instruction is used for triggering a first event related to a video playing interface; and executing the first control instruction. Different from the current video playing interface control method, the video playing interface is controlled through the auxiliary screen in the embodiment of the application, so that the video played currently on the main screen cannot be shielded, and the experience of a user in watching the video is improved.
In an implementation manner of the present application, the method further includes:
when a second touch operation aiming at the auxiliary screen is detected, acquiring first information through target hardware, wherein the target hardware comprises the multiple cameras and/or pressure sensors, and the electronic equipment further comprises the pressure sensors;
determining a second control instruction corresponding to the first information, wherein the second control instruction is used for triggering a second event associated with the video playing interface;
and executing the second control instruction.
The second control instruction may be the same as the first control instruction, or may be different from the first control instruction; the second event may be the same as the first event or different from the first event; and are not limited herein.
In an implementation manner of the present application, if the target hardware is a plurality of cameras, the first information is gesture information, the gesture information includes a gesture video and/or a gesture picture, and determining a second control instruction corresponding to the first information includes:
and analyzing and recognizing the gesture video and/or the gesture picture to obtain a second control instruction corresponding to the gesture video and/or the gesture picture.
In an implementation manner of the present application, if the target hardware is a pressure sensor, the first information is pressure information, the pressure information includes at least one of a pressure magnitude, a pressing duration, and a pressing frequency, and the determining the second control instruction corresponding to the first information includes: determining the second control command based on the pressure information.
For example, assuming that the pressure is 1 N, the pressing duration is 3 s, and the pressing frequency is 1, the corresponding second control instruction is to increase the video playing volume by 30 from the original playing volume; if the pressure is 1 N and the pressing duration is 5 s, the corresponding second control instruction is to increase the video playing volume by 50 from the original playing volume. The maximum video playing volume is 100; once the volume has been increased to 100, it is not increased further.
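The worked example above can be reproduced with a small sketch. The scaling rule (10 volume units per second of pressing) is an assumption made only to match the numbers in the example; the patent does not specify a formula.

```python
# Hypothetical sketch: the volume increment scales with the pressing
# duration (assumed here as 10 units per second, matching the example:
# 3 s -> +30, 5 s -> +50), and the volume is capped at 100.
MAX_VOLUME = 100

def adjusted_volume(current_volume: int, press_duration_s: float) -> int:
    """Return the new playing volume after a press of the given duration."""
    delta = int(10 * press_duration_s)
    return min(current_volume + delta, MAX_VOLUME)
```

Starting from a volume of 40, a 3 s press yields 70 and a 5 s press yields 90; from 90, a 5 s press saturates at the cap of 100.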
In an implementation manner of the present application, the plurality of cameras are in an open state, and the method further includes:
when a fourth touch operation for the auxiliary screen is detected and third information is acquired through the multiple cameras, determining a third control instruction based on the fourth touch operation and the third information, wherein the third control instruction is used for triggering a third event associated with the video playing interface; and executing the third control instruction.
For example, if the fourth touch operation is a single-click operation and the third information is that the hand swings left, the corresponding third control instruction is to increase the video playing volume; assuming that the fourth touch operation is a single-click operation and the third information is that the hand swings to the right, the corresponding third control instruction is to reduce the video playing volume; if the fourth touch operation is a double-click operation and the third information is that the hand swings left, the corresponding third control instruction is to increase the video playing progress; and assuming that the fourth touch operation is a double-click operation, and the gesture information is that the hand swings to the right, the corresponding third control instruction is to reduce the video playing progress.
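The four cases above amount to a dispatch on the pair (touch operation, gesture). A minimal sketch, with illustrative names for the operations, gestures, and instructions:

```python
# Hypothetical dispatch table for the third control instruction: the
# touch operation on the secondary screen and the hand gesture captured
# by the cameras jointly select the instruction.
COMBINED_DISPATCH = {
    ("single_click", "hand_swings_left"): "increase_volume",
    ("single_click", "hand_swings_right"): "decrease_volume",
    ("double_click", "hand_swings_left"): "advance_playback_progress",
    ("double_click", "hand_swings_right"): "rewind_playback_progress",
}

def third_control_instruction(touch_op: str, gesture: str):
    """Return the third control instruction for a (touch, gesture) pair,
    or None if the combination is not recognized."""
    return COMBINED_DISPATCH.get((touch_op, gesture))
```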
In an implementation manner of the present application, the pressure sensor is in an on state, and the method further includes:
when a fifth touch operation for the auxiliary screen is detected and fourth information is acquired through the pressure sensor, determining a fourth control instruction based on the fifth touch operation and the fourth information, wherein the fourth control instruction is used for triggering a fourth event related to the video playing interface; and executing the fourth control instruction.
For example, assuming that a fifth touch operation is a sliding operation from left to right on the secondary screen, and the fourth information is that the pressure sensor is lightly pressed once, the corresponding fourth control instruction is to increase the video playing volume; assuming that a fifth touch operation is a right-to-left sliding operation on the secondary screen, and the fourth information is that the pressure sensor is lightly pressed once, a corresponding fourth control instruction is to reduce video playing volume; assuming that a fifth touch operation is a sliding operation from left to right on the secondary screen, and the pressure information is obtained by slightly pressing the pressure sensor twice, the corresponding fourth control instruction is to increase the video playing progress; assuming that the fifth touch operation is a sliding operation from right to left on the secondary screen, and the pressure information is obtained by slightly pressing the pressure sensor twice, the corresponding fourth control instruction is to reduce the video playing progress.
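In the four cases above, the slide direction selects increase versus decrease, while the number of light presses selects what is adjusted (volume for one press, playback progress for two). A sketch under those assumptions, with illustrative names:

```python
# Hypothetical sketch of the fourth control instruction: slide direction
# on the secondary screen picks the action, press count on the pressure
# sensor picks the quantity being adjusted.
def fourth_control_instruction(slide_direction: str, press_count: int):
    quantity = {1: "volume", 2: "playback_progress"}.get(press_count)
    action = {"left_to_right": "increase",
              "right_to_left": "decrease"}.get(slide_direction)
    if quantity is None or action is None:
        return None  # unrecognized combination
    return f"{action}_{quantity}"
```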
It should be noted that the fourth touch operation and the fifth touch operation may be the same as the first touch operation, the second touch operation, and the third touch operation, or may be different from the first touch operation, the second touch operation, and the third touch operation; the third control command and the fourth control command may be the same as the first control command and the second control command, or may be different from the first control command and the second control command; the third event and the fourth event may be the same as the first event and the second event, or may be different from the first event and the second event; and are not limited herein.
In an implementation manner of the present application, the first event or the second event includes adjusting a video playing progress, adjusting interface display brightness, or adjusting a video playing volume, and the method further includes:
and when the first control instruction or the second control instruction is executed, dynamically displaying an adjusting progress bar corresponding to the first event or the second event on the main screen.
It can be seen that, in the embodiment of the application, dynamically displaying the adjustment progress bar on the main screen makes it convenient for the user to confirm the progress of the video adjustment and to continue adjusting accordingly. Compared with current video adjustment methods, the user's finger does not press on and block the progress bar during adjustment, so the current adjustment progress is always visible, which improves the video watching experience.
In an implementation manner of the present application, the first event or the second event includes capturing a video or capturing an image, and the method further includes:
and after the first control instruction or the second control instruction is executed, sharing the intercepted image or the intercepted video to a target contact person.
In an implementation manner of the present application, the first event or the second event includes capturing a video or capturing an image, and the method further includes:
and after the first control instruction or the second control instruction is executed, sharing the intercepted image or the recorded video to a target platform.
The target contact may be a contact with a contact frequency greater than or equal to a preset frequency, or a contact selected in a target platform, where the target platform may be, for example, FaceTime, WeChat, QQ, PayPal, Weibo, or the like.
The preset frequency is, for example, 1 time per day, 5 times per week, 20 times per month, or other values, which are not limited herein.
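The frequency-based selection described above can be sketched as follows. The contact data and the per-month threshold are invented for illustration; the embodiment does not fix a data structure or a unit for the preset frequency.

```python
# Illustrative sketch: pick target contacts whose contact frequency is
# greater than or equal to the preset frequency. Data is made up.

def frequent_contacts(contact_counts, preset_frequency):
    """Return names whose recorded frequency meets the preset threshold."""
    return [name for name, freq in contact_counts
            if freq >= preset_frequency]

# Contacts with their contact count for the current month (assumed unit).
contacts = [('Alice', 21), ('Bob', 4), ('Carol', 20)]
print(frequent_contacts(contacts, 20))  # ['Alice', 'Carol']
```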
In an implementation manner of the present application, before sharing the captured image or the captured video to the target contact, the method further includes:
displaying a first sub-interface on the video playing interface, wherein the first sub-interface displays the identification of a plurality of contacts;
when a second touch operation aiming at the identification of one contact in the first sub-interface is detected, the one contact is taken as a target contact.
In an implementation manner of the present application, the method further includes:
when a sixth touch operation for the secondary screen is detected, the target contact is determined from the plurality of contacts based on the sixth touch operation.
Wherein the sixth touch operation may be a click operation, a slide operation, a double click operation, or the like for the sub screen.
For example, as shown in fig. 2B, fig. 2B is a schematic structural diagram of a video playing interface and a first sub-interface provided in the embodiment of the present application. The first sub-interface is displayed on the video playing interface. The target contact may be determined by directly clicking or sliding on a contact identifier displayed on the first sub-interface. Alternatively, a mouse identifier may be displayed on the first sub-interface; the position of the mouse identifier is controlled by sliding on the secondary screen, the mouse identifier is dragged onto the identifier of the contact to be selected, and the selected contact is then determined as the target contact by clicking.
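The mouse-identifier interaction described above can be modeled as a cursor moved by slides on the secondary screen and confirmed by a click. The gesture names and the one-dimensional contact list are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical model of the secondary-screen selection: 'left'/'right'
# slides move a cursor over the contact identifiers, 'click' confirms.

def select_contact(contacts, gestures):
    """Walk a gesture stream and return the selected target contact."""
    cursor = 0
    for g in gestures:
        if g == 'right':
            cursor = min(cursor + 1, len(contacts) - 1)
        elif g == 'left':
            cursor = max(cursor - 1, 0)
        elif g == 'click':
            return contacts[cursor]  # the selected target contact
    return None  # no click: no target contact chosen

print(select_contact(['Alice', 'Bob', 'Carol'],
                     ['right', 'right', 'left', 'click']))  # Bob
```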
In an implementation manner of the present application, after sharing the captured image or the captured video to the target contact, the method further includes:
receiving second information sent by the target contact person;
when a third touch operation for the secondary screen is detected, the second information is displayed on the main screen in a bullet screen mode.
Wherein the third touch operation may be a click operation, a slide operation, a double-click operation, or the like for the sub screen.
The second message may be displayed on the upper, lower, left, middle, or right portion of the main screen; the color of the second message may be red, orange, yellow, green, cyan, indigo, or purple; and the display manner of the second message may be scrolling display or static display, which is not limited herein.
For example, as shown in fig. 2C, fig. 2C is a schematic diagram of displaying a second message according to an embodiment of the present application. Assume that the second message is "This video is great! Is it always available? Let's watch it together"; upon detecting a double-tap operation on the secondary screen, the second message is displayed in bullet-screen form on the main screen.
Therefore, in the embodiment of the application, the interaction with the target contact person can be carried out while the video is watched, so that the interactive experience in the video watching process is enhanced; in addition, the interactive message with the target contact person is displayed on the video playing interface in a bullet screen mode, the interactive message cannot be missed while the video is watched, the video playing interface cannot be shielded by the message, and the experience of watching the video is further improved.
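The bullet-screen ("danmaku") display mentioned above can be sketched as a message scrolling across the main screen without pausing playback. The screen width, scroll speed, and frame stepping are assumed values, not fixed by the embodiment.

```python
# Minimal bullet-screen sketch: the received second message enters from
# the right edge of the main screen and scrolls left frame by frame,
# overlaying the video without hiding it behind a chat window.

def bullet_positions(screen_width, speed, frames):
    """Yield the x position of the message's left edge for each frame."""
    x = screen_width
    for _ in range(frames):
        yield x
        x -= speed  # scroll left by `speed` pixels per frame

# Assumed 1080-px-wide main screen, 40 px per frame, first four frames.
print(list(bullet_positions(1080, 40, 4)))  # [1080, 1040, 1000, 960]
```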
In an implementation manner of the present application, the method further includes:
when a secondary screen sleep instruction input aiming at the video playing interface is detected, or when no operation aiming at the secondary screen is detected within preset time, the secondary screen is in a closed state.
The preset time may be, for example, 1min, 3min, 5min, 10min or other values, which is not limited herein.
It can be seen that, in the embodiment of the application, when a sub-screen sleep instruction input for the video playing interface is detected or when no operation for the sub-screen is detected within a preset time, the sub-screen is in a closed state, so that energy consumption of the electronic device can be saved.
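The sleep condition above can be sketched as a simple rule: the secondary screen closes either on an explicit sleep instruction or when no touch has been seen within the preset time. The 3-minute default and all names are illustrative assumptions.

```python
# Sketch of the secondary-screen sleep rule. Times are in seconds; the
# 180 s (3 min) preset is one of the example values from the text.

def secondary_screen_state(sleep_command, last_touch_s, now_s, preset_s=180):
    """Return 'closed' if a sleep command arrived or the screen idled out."""
    if sleep_command or (now_s - last_touch_s) >= preset_s:
        return 'closed'
    return 'awake'

print(secondary_screen_state(False, 0, 200))    # idle 200 s >= 180 s
print(secondary_screen_state(False, 100, 200))  # idle only 100 s
print(secondary_screen_state(True, 190, 200))   # explicit sleep command
```

Closing the secondary screen in either case is what saves the energy consumption the paragraph above refers to.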
Referring to fig. 3, fig. 3 is a schematic flowchart of another interface control method according to an embodiment of the present disclosure, which is applied to an electronic device including a main screen, a plurality of cameras, and a secondary screen, where the plurality of cameras are arranged in parallel, the secondary screen is covered above the plurality of cameras, and the main screen currently displays a video playing interface; the method comprises the following steps:
Step 301: when a secondary screen awakening instruction input aiming at the video playing interface is detected, awakening the secondary screen, wherein the awakened secondary screen is used for triggering events related to the video playing interface.
Step 302: when a first touch operation for the auxiliary screen is detected, a first control instruction corresponding to the first touch operation is determined, and the first control instruction is used for triggering a first event related to the video playing interface.
Step 303: and executing the first control instruction, wherein the first event comprises the adjustment of video playing progress, the adjustment of interface display brightness or the adjustment of video playing volume.
Step 304: and dynamically displaying an adjusting progress bar corresponding to the first event on the main screen.
Step 305: executing the first control instruction, wherein the first event comprises intercepting a video or intercepting an image.
Step 306: and displaying a first sub-interface on the video playing interface, wherein the first sub-interface displays the identifications of a plurality of contacts.
Step 307: when a second touch operation aiming at the identification of one contact in the first sub-interface is detected, the one contact is taken as a target contact.
Step 308: and sharing the intercepted image or the intercepted video to the target contact person.
Step 309: and receiving second information sent by the target contact person.
Step 310: when a third touch operation for the secondary screen is detected, the second information is displayed on the main screen in a bullet screen mode.
Step 311: when a secondary screen sleep instruction input for the video playing interface is detected, or when no operation for the secondary screen is detected within preset time, the secondary screen is in a closed state.
It should be noted that step 311 may be after step 304, or after step 310; the specific implementation process of this embodiment may refer to the specific implementation process described in the above method embodiment, and will not be described here.
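The dispatch in steps 302 and 303 can be sketched as a lookup from the detected touch operation to the control instruction's event. The mapping below is invented for illustration; the embodiment does not fix any particular correspondence between touch operations and events.

```python
# Condensed sketch of steps 302-303: resolve the first control
# instruction for a secondary-screen touch, then execute it.

TOUCH_TO_EVENT = {            # assumed mapping, not fixed by the patent
    'slide_horizontal': 'adjust_progress',
    'slide_vertical': 'adjust_volume',
    'double_tap': 'capture_image',
}

def handle_touch(touch):
    """Return the executed event for a recognized touch, else 'ignored'."""
    event = TOUCH_TO_EVENT.get(touch)
    if event is None:
        return 'ignored'
    return 'executed:' + event  # stands in for executing the instruction

print(handle_touch('double_tap'))  # executed:capture_image
print(handle_touch('long_press'))  # ignored
```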
Referring to fig. 4, fig. 4 is a schematic flowchart of another interface control method according to an embodiment of the present disclosure, which is applied to an electronic device including a main screen, a plurality of cameras, and a secondary screen, where the plurality of cameras are arranged in parallel, the secondary screen is covered above the plurality of cameras, and the main screen currently displays a video playing interface; the method comprises the following steps:
Step 401: when a secondary screen awakening instruction input aiming at the video playing interface is detected, awakening the secondary screen, wherein the awakened secondary screen is used for triggering events related to the video playing interface.
Step 402: when a first touch operation for the secondary screen is detected, first information is acquired through target hardware, the target hardware comprises the multiple cameras and/or the pressure sensor, and the electronic equipment further comprises the pressure sensor.
Step 403: and determining a second control instruction corresponding to the first information, wherein the second control instruction is used for triggering a second event associated with the video playing interface.
Step 404: and executing the second control instruction, wherein the second event comprises the adjustment of video playing progress, the adjustment of interface display brightness or the adjustment of video playing volume.
Step 405: and dynamically displaying an adjustment progress bar corresponding to the second event on the main screen.
Step 406: and executing the second control instruction, wherein the second event comprises intercepting a video or intercepting an image.
Step 407: and displaying a first sub-interface on the video playing interface, wherein the first sub-interface displays the identifications of a plurality of contacts.
Step 408: when a second touch operation aiming at the identification of one contact in the first sub-interface is detected, the one contact is taken as a target contact.
Step 409: and sharing the intercepted image or the intercepted video to the target contact person.
Step 410: and receiving second information sent by the target contact person.
Step 411: when a third touch operation for the secondary screen is detected, the second information is displayed on the main screen in a bullet screen mode.
Step 412: when a secondary screen sleep instruction input aiming at the video playing interface is detected, or when no operation aiming at the secondary screen is detected within preset time, the secondary screen is in a closed state.
It should be noted that step 412 may be after step 405 or after step 411; the specific implementation process of this embodiment may refer to the specific implementation process described in the above method embodiment, and will not be described here.
In accordance with the embodiments shown in fig. 2, fig. 3, and fig. 4, please refer to fig. 5, fig. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present application, and as shown in fig. 5, the electronic device further includes a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for performing the following steps:
when an auxiliary screen awakening instruction input aiming at the video playing interface is detected, awakening the auxiliary screen, wherein the awakened auxiliary screen is used for triggering events related to the video playing interface;
when a first touch operation aiming at the auxiliary screen is detected, determining a first control instruction corresponding to the first touch operation, wherein the first control instruction is used for triggering a first event associated with the video playing interface;
and executing the first control instruction.
It can be seen that, in the embodiment of the present application, the electronic device includes a main screen, a plurality of cameras, and an auxiliary screen, the plurality of cameras are arranged side by side, the auxiliary screen is covered above the plurality of cameras, and the main screen currently displays a video playing interface; when an auxiliary screen awakening instruction input aiming at the video playing interface is detected, awakening the auxiliary screen, wherein the awakened auxiliary screen is used for triggering events related to the video playing interface; when a first touch operation aiming at the auxiliary screen is detected, determining a first control instruction corresponding to the first touch operation, wherein the first control instruction is used for triggering a first event related to a video playing interface; and executing the first control instruction. Different from the current video playing interface control method, the video playing interface is controlled through the auxiliary screen in the embodiment of the application, so that the video played currently on the main screen cannot be shielded, and the experience of a user in watching the video is improved.
In an implementation manner of the present application, the program includes instructions for further performing the following steps:
when a first touch operation aiming at the auxiliary screen is detected, acquiring first information through target hardware, wherein the target hardware comprises the multiple cameras and/or pressure sensors, and the electronic equipment further comprises the pressure sensors;
determining a second control instruction corresponding to the first information, wherein the second control instruction is used for triggering a second event associated with the video playing interface;
and executing the second control instruction.
In an implementation manner of the present application, the first event or the second event includes adjusting a video playing progress, adjusting an interface display brightness, or adjusting a video playing volume, and the program includes instructions further configured to perform the following steps:
and when the first control instruction or the second control instruction is executed, dynamically displaying an adjusting progress bar corresponding to the first event or the second event on the main screen.
In an implementation manner of the present application, the first event or the second event includes capturing a video or capturing an image, and the program includes instructions further for performing the following steps:
and after the first control instruction or the second control instruction is executed, sharing the intercepted image or the intercepted video to a target contact person.
In an implementation manner of the present application, before sharing the captured image or the captured video to the target contact, the program includes instructions further configured to:
displaying a first sub-interface on the video playing interface, wherein the first sub-interface displays the identification of a plurality of contacts;
when a second touch operation aiming at the identification of one contact in the first sub-interface is detected, the one contact is taken as a target contact.
In an implementation manner of the present application, after sharing the captured image or the captured video to the target contact, the program further includes instructions for executing the following steps:
receiving second information sent by the target contact person;
when a third touch operation for the secondary screen is detected, the second information is displayed on the main screen in a bullet screen mode.
In an implementation manner of the present application, the program includes instructions for further performing the following steps:
when a secondary screen sleep instruction input aiming at the video playing interface is detected, or when no operation aiming at the secondary screen is detected within preset time, the secondary screen is in a closed state.
It should be noted that, for the specific implementation process of the present embodiment, reference may be made to the specific implementation process described in the above method embodiment, and a description thereof is omitted here.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
The following is an embodiment of an apparatus of the present application, which is configured to execute a method implemented in an embodiment of the method of the present application. Referring to fig. 6, fig. 6 is a schematic structural diagram of an interface control apparatus according to an embodiment of the present disclosure, which is applied to an electronic device including a main screen, a plurality of cameras, and a sub-screen, where the plurality of cameras are arranged in parallel, the sub-screen is covered above the plurality of cameras, and the main screen currently displays a video playing interface; the device comprises:
a sub-screen control unit 601, configured to wake up the sub-screen when a sub-screen wake-up instruction input for the video playing interface is detected, where the woken-up sub-screen is used to trigger an event associated with the video playing interface;
a determining unit 602, configured to determine, when a first touch operation for the secondary screen is detected, a first control instruction corresponding to the first touch operation, where the first control instruction is used to trigger a first event associated with the video playing interface;
an execution unit 603, configured to execute the first control instruction.
It can be seen that, in the embodiment of the present application, the electronic device includes a main screen, a plurality of cameras, and an auxiliary screen, the plurality of cameras are arranged side by side, the auxiliary screen is covered above the plurality of cameras, and the main screen currently displays a video playing interface; when an auxiliary screen awakening instruction input aiming at the video playing interface is detected, awakening the auxiliary screen, wherein the awakened auxiliary screen is used for controlling events related to the video playing interface; when a first touch operation for the auxiliary screen is detected, determining a first control instruction corresponding to the first touch operation, wherein the first control instruction is used for controlling a first event related to a video playing interface; and executing the first control instruction. Different from the current video playing interface control method, the video playing interface is controlled through the auxiliary screen in the embodiment of the application, so that the video played currently on the main screen cannot be shielded, and the experience of a user in watching the video is improved.
In an implementation manner of the present application, the apparatus further includes an obtaining unit 604, where:
the obtaining unit 604 is configured to obtain first information through target hardware when a first touch operation on the secondary screen is detected, where the target hardware includes the multiple cameras and/or pressure sensors, and the electronic device further includes the pressure sensor;
the determining unit 602 is further configured to determine a second control instruction corresponding to the first information, where the second control instruction is used to trigger a second event associated with the video playing interface;
the execution unit 603 is further configured to execute the second control instruction.
In an implementation manner of the present application, the first event or the second event includes adjusting a video playing progress, adjusting interface display brightness, or adjusting video playing volume, and the apparatus further includes a display unit 605, configured to dynamically display an adjustment progress bar corresponding to the first event or the second event on the main screen when the first control instruction or the second control instruction is executed.
In an implementation manner of the present application, the first event or the second event includes intercepting a video or intercepting an image, and the apparatus further includes a sharing unit 606, configured to share the intercepted image or the intercepted video to a target contact after executing the first control instruction or the second control instruction.
In an implementation manner of the application, before sharing the captured image or the captured video to the target contact, the display unit 605 is further configured to display a first sub-interface on the video playing interface, where the first sub-interface displays the identifiers of the plurality of contacts;
the determining unit 602 is further configured to, when a second touch operation directed to an identifier of one of the contacts in the first sub-interface is detected, take the one of the contacts as a target contact.
In an implementation manner of the present application, after sharing the captured image or the captured video to the target contact, the apparatus further includes a communication unit 607, where:
the communication unit 607 is configured to receive second information sent by the target contact;
the display unit 605 is further configured to display the second information on the main screen in a bullet screen form when a third touch operation for the secondary screen is detected.
In an implementation manner of the present application, the sub-screen control unit 601 is further configured to, when a sub-screen sleep command input for the video playing interface is detected, or when no operation for the sub-screen is detected within a preset time, place the sub-screen in a closed state.
The sub-screen control unit 601, the determination unit 602, the execution unit 603, the acquisition unit 604, the display unit 605, and the sharing unit 606 may be implemented by a processor, and the communication unit 607 may be implemented by a communication interface.
Embodiments of the present application also provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any one of the methods as described in the above method embodiments.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the above-described division of the units is only one type of division of logical functions, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable memory if it is implemented in the form of a software functional unit and sold or used as a stand-alone product. With such an understanding, the technical solution of the present application may be embodied in the form of a software product, which is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the above methods according to the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: a flash memory disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (9)

1. An interface control method, applied to an electronic device comprising a main screen, a plurality of cameras, and a secondary screen, wherein the plurality of cameras are arranged in parallel, the secondary screen is covered above the plurality of cameras, and the main screen currently displays a video playing interface; the method comprises the following steps:
when an auxiliary screen awakening instruction input aiming at the video playing interface is detected, awakening the auxiliary screen, wherein the awakened auxiliary screen is used for triggering events related to the video playing interface;
when a first touch operation for the auxiliary screen is detected, determining a first control instruction corresponding to the first touch operation, wherein the first control instruction is used for triggering a first event associated with the video playing interface, and the first event comprises a captured video or a captured image;
executing the first control instruction, and sharing the intercepted image or the intercepted video to a target contact person, or sharing the intercepted image or the intercepted video to a target platform, wherein the target contact person comprises a contact person with a contact frequency greater than or equal to a preset frequency and a contact person selected from the target platform;
receiving second information sent by the target contact person;
when a third touch operation for the secondary screen is detected, the second information is displayed on the main screen in a bullet screen mode.
2. The method of claim 1, further comprising:
when a first touch operation aiming at the auxiliary screen is detected, acquiring first information through target hardware, wherein the target hardware comprises the multiple cameras and/or pressure sensors, and the electronic equipment further comprises the pressure sensors;
determining a second control instruction corresponding to the first information, wherein the second control instruction is used for triggering a second event associated with the video playing interface;
and executing the second control instruction.
3. The method of claim 2, wherein the first event or the second event comprises adjusting a video playing progress, adjusting an interface display brightness, or adjusting a video playing volume, the method further comprising:
and when the first control instruction or the second control instruction is executed, dynamically displaying an adjusting progress bar corresponding to the first event or the second event on the main screen.
4. The method of claim 2, wherein the second event comprises intercepting a video or intercepting an image, the method further comprising:
and after the second control instruction is executed, sharing the intercepted image or the intercepted video to the target contact person.
5. The method of claim 4, wherein prior to sharing the captured image or captured video to the target contact, the method further comprises:
displaying a first sub-interface on the video playing interface, wherein the first sub-interface displays the identification of a plurality of contacts;
when a second touch operation aiming at the identification of one contact in the first sub-interface is detected, the one contact is taken as a target contact.
6. The method according to any one of claims 1-5, further comprising:
when a secondary screen sleep instruction input aiming at the video playing interface is detected, or when no operation aiming at the secondary screen is detected within preset time, the secondary screen is in a closed state.
7. An interface control apparatus, applied to an electronic device comprising a main screen, a plurality of cameras, and a secondary screen, wherein the plurality of cameras are arranged in parallel, the secondary screen is covered above the plurality of cameras, and the main screen currently displays a video playing interface; the apparatus comprises:
the auxiliary screen control unit is used for awakening the auxiliary screen when an auxiliary screen awakening instruction input aiming at the video playing interface is detected, and the awakened auxiliary screen is used for triggering an event related to the video playing interface;
the determining unit is used for determining a first control instruction corresponding to a first touch operation when the first touch operation for the secondary screen is detected, wherein the first control instruction is used for triggering a first event associated with the video playing interface, and the first event comprises a captured video or a captured image;
the execution unit is used for executing the first control instruction, and sharing the intercepted image or the intercepted video to a target contact person, or sharing the intercepted image or the intercepted video to a target platform, wherein the target contact person comprises a contact person with a contact frequency greater than or equal to a preset frequency and a contact person selected from the target platform;
the communication unit is used for receiving second information sent by the target contact person;
and the display unit is used for displaying the second information on the main screen in a bullet screen mode when a third touch operation aiming at the auxiliary screen is detected.
8. An electronic device comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the one or more programs comprising instructions for performing the steps of the method of any one of claims 1-6.
9. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the method of any one of claims 1-6.
CN201910394436.6A 2019-05-13 2019-05-13 Interface control method and related equipment Active CN110134237B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910394436.6A CN110134237B (en) 2019-05-13 2019-05-13 Interface control method and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910394436.6A CN110134237B (en) 2019-05-13 2019-05-13 Interface control method and related equipment

Publications (2)

Publication Number Publication Date
CN110134237A CN110134237A (en) 2019-08-16
CN110134237B true CN110134237B (en) 2022-10-25

Family

ID=67573619

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910394436.6A Active CN110134237B (en) 2019-05-13 2019-05-13 Interface control method and related equipment

Country Status (1)

Country Link
CN (1) CN110134237B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110597432B (en) * 2019-09-27 2024-04-09 腾讯科技(深圳)有限公司 Interface control method, device, computer readable medium and electronic equipment
CN110753258A (en) * 2019-10-18 2020-02-04 维沃移动通信有限公司 Video operation method and device
CN112925572B (en) * 2021-03-01 2023-05-23 联想(北京)有限公司 Control method and device and electronic equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108055572A (en) * 2017-11-29 2018-05-18 努比亚技术有限公司 Control method, mobile terminal and the computer readable storage medium of mobile terminal
CN108965519A (en) * 2018-07-04 2018-12-07 孟楚雁 A kind of terminal device
CN108966032A (en) * 2018-06-06 2018-12-07 北京奇艺世纪科技有限公司 A kind of barrage social contact method and device

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102905170B (en) * 2012-10-08 2015-05-13 北京导视互动网络技术有限公司 Screen popping method and system for video
CN103294347B (en) * 2013-06-24 2016-09-07 贝壳网际(北京)安全技术有限公司 Operation control method and device for video playing of mobile terminal browser and browser
CN105163187B (en) * 2015-08-27 2017-08-01 广东欧珀移动通信有限公司 A kind of video playing control method and device
KR101949493B1 (en) * 2017-02-20 2019-02-19 네이버 주식회사 Method and system for controlling play of multimeida content
CN106951055B (en) * 2017-03-10 2019-07-12 Oppo广东移动通信有限公司 A kind of display control method of mobile terminal, device and mobile terminal
CN106991394A (en) * 2017-03-31 2017-07-28 联想(北京)有限公司 A kind of electronic equipment with fingerprint identification function
CN107979775A (en) * 2017-12-20 2018-05-01 广东欧珀移动通信有限公司 Video related information display methods and relevant device
CN108628515B (en) * 2018-05-08 2020-06-16 维沃移动通信有限公司 Multimedia content operation method and mobile terminal

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108055572A (en) * 2017-11-29 2018-05-18 努比亚技术有限公司 Control method, mobile terminal and the computer readable storage medium of mobile terminal
CN108966032A (en) * 2018-06-06 2018-12-07 北京奇艺世纪科技有限公司 A kind of barrage social contact method and device
CN108965519A (en) * 2018-07-04 2018-12-07 孟楚雁 A kind of terminal device

Also Published As

Publication number Publication date
CN110134237A (en) 2019-08-16

Similar Documents

Publication Publication Date Title
US11467715B2 (en) User interface display method, terminal and non-transitory computer-readable storage medium for splitting a display using a multi-finger swipe
CN107562316B (en) Method for showing interface, device and terminal
CN112817684B (en) User interface display method, device, terminal and storage medium
CN108024079B (en) Screen recording method, device, terminal and storage medium
CN110134237B (en) Interface control method and related equipment
US11132123B2 (en) Key display method, terminal, and non-transitory computer-readable medium
WO2019091411A1 (en) Image capturing method, device, terminal, and storage medium
US20190332640A1 (en) Method and apparatus for displaying webpage content
CN111541930B (en) Live broadcast picture display method and device, terminal and storage medium
WO2019233307A1 (en) User interface display method and apparatus, and terminal and storage medium
CN107888965B (en) Image gift display method and device, terminal, system and storage medium
CN107526591B (en) Method and device for switching types of live broadcast rooms
CN110798615A (en) Shooting method, shooting device, storage medium and terminal
WO2013176510A1 (en) Method and apparatus for multi-playing videos
US20150121301A1 (en) Information processing method and electronic device
CN110968364A (en) Method and device for adding shortcut plug-in and intelligent equipment
US20200244869A1 (en) Method for capturing images, terminal, and storage medium
US11194598B2 (en) Information display method, terminal and storage medium
CN109788333A (en) For being displayed in full screen the method and device of video
CN112698901A (en) Application program setting method and device
CN112684963A (en) Screenshot method and device and electronic equipment
US10613622B2 (en) Method and device for controlling virtual reality helmets
CN113794831B (en) Video shooting method, device, electronic equipment and medium
CN113923392A (en) Video recording method, video recording device and electronic equipment
CN109726027B (en) Message viewing method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant