WO2018161586A1 - Method and apparatus for recognizing a display scenario of a mobile terminal, storage medium and electronic device - Google Patents


Info

Publication number
WO2018161586A1
WO2018161586A1 (PCT/CN2017/106942)
Authority
WO
WIPO (PCT)
Prior art keywords
layer
display
touch
mobile terminal
target
Prior art date
Application number
PCT/CN2017/106942
Other languages
English (en)
Chinese (zh)
Inventor
彭德良
苟生俊
易永鹏
袁晓日
甘高亭
郑志勇
杨海
Original Assignee
广东欧珀移动通信有限公司
Priority date
Filing date
Publication date
Application filed by 广东欧珀移动通信有限公司
Publication of WO2018161586A1 publication Critical patent/WO2018161586A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs

Definitions

  • Embodiments of the present invention relate to the field of display technologies, and in particular, to a display scene identification method, apparatus, storage medium, and electronic device for a mobile terminal.
  • As the display screens of mobile terminals grow larger and the loaded applications and the functions they implement become richer, users spend a great deal of time on their mobile terminals every day, so display efficiency and power consumption have become important indicators for evaluating the performance of a mobile terminal.
  • the content displayed by the mobile terminal during the running process is ever-changing, and there are many corresponding display scenes.
  • the embodiment of the invention provides a display scene recognition method and device for a mobile terminal, a storage medium and an electronic device, which can identify a display scene of the mobile terminal.
  • an embodiment of the present invention provides a display scene identification method for a mobile terminal, including:
  • an embodiment of the present invention provides a display scene recognition apparatus for a mobile terminal, including:
  • a process identifier obtaining module configured to acquire a process identifier of an application running in the mobile terminal
  • a layer attribute obtaining module configured to acquire a layer attribute of each layer in the layer set drawn by the application
  • a display scene recognition module configured to identify a display scenario of the mobile terminal according to the process identifier and the layer attribute.
  • an embodiment of the present invention provides a storage medium on which a computer program is stored; when the computer program is executed on a computer, it causes the computer to perform the display scene identification method of the mobile terminal provided by the embodiments of the present invention.
  • an embodiment of the present invention provides an electronic device, including a memory and a processor, where the processor executes a computer program stored in the memory to perform the steps of the method.
  • the embodiment of the invention provides a display scene recognition method and device for a mobile terminal, a storage medium and an electronic device, which can identify a display scene of the mobile terminal.
  • FIG. 1 is a first schematic flowchart of a method for identifying a display scene of a mobile terminal according to an embodiment of the present invention;
  • FIG. 2 is a second schematic flowchart of a method for identifying a display scene of a mobile terminal according to an embodiment of the present invention;
  • FIG. 3 is a schematic diagram of a display process according to an embodiment of the present invention;
  • FIG. 4 is a schematic diagram of a Vsync display refresh mechanism according to an embodiment of the present invention;
  • FIG. 5 is a third schematic flowchart of a method for identifying a display scene of a mobile terminal according to an embodiment of the present invention;
  • FIG. 6 is a structural block diagram of a display scene recognition apparatus for a mobile terminal according to an embodiment of the present invention;
  • FIG. 7 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
  • FIG. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
  • FIG. 9 is another schematic structural diagram of an electronic device according to an embodiment of the present invention.
  • the content displayed by the mobile terminal during the running process is ever-changing, and there are many corresponding display scenarios.
  • the display scene needs to be identified.
  • the embodiment of the invention provides a display scene recognition method for a mobile terminal, which includes the following steps:
  • the method for identifying a display scene of the mobile terminal may further include the following steps: acquiring touch data;
  • the identifying the display scenario of the mobile terminal according to the process identifier and the layer attribute may include: identifying a display scenario of the mobile terminal according to the process identifier, the layer attribute, and the touch data.
  • the method for identifying a display scene of the mobile terminal may further include the following steps: acquiring sensing data by using a preset type of sensor;
  • the identifying the display scenario of the mobile terminal according to the process identifier, the layer attribute, and the touch data may include: identifying the display scene of the mobile terminal according to the process identifier, the layer attribute, the touch data, and the sensing data.
  • the layer attribute may include at least one of the following: whether the cached data is empty, the horizontal or vertical screen mode, an attribute of the visible area, an attribute of the transparent area, whether an update area exists, an attribute of the update area, and image information.
  • the attribute of the visible area may include at least one of the following: whether the visible area is empty, and the number, shape, size, and position of the visible areas;
  • the attribute of the transparent area may include at least one of the following: the number, shape, size, and position of the transparent areas, and their positions relative to the visible areas of other layers;
  • the attribute of the update area may include at least one of the following: the number, position, shape, and size of the update areas, and the ratio of the update area to the screen area;
  • the image information may include at least one of the following: whether the image is a solid color, and the image's gradation, grayscale, hue, contrast, brightness, saturation, transparency, and blur.
  • the method may further include the following steps:
  • determining, according to the identified display scene, at least one of a corresponding target drawing frame rate, target composite frame rate, target refresh rate, target display brightness, and target resolution; and performing at least one of the corresponding layer drawing, layer composition, display refresh, and display brightness adjustment according to at least one of the determined target drawing frame rate, target composite frame rate, target refresh rate, target display brightness, and target resolution.
  • the method further includes: performing statistical analysis on the touch data, and determining a corresponding touch event according to the statistical analysis result, where the touch data includes at least a touch position, a touch area, and a touch duration, and the touch event includes at least a no-touch event, a click event, a leaving event, a slow sliding event, and a fast sliding event;
  • the identifying the display scenario of the mobile terminal according to the process identifier, the layer attribute, and the touch data may include: identifying, according to the process identifier, the layer attribute, and the touch event, the mobile terminal Show the scene.
  • FIG. 1 is a first schematic flowchart of a method for identifying a display scene of a mobile terminal according to an embodiment of the present invention.
  • the method can be performed by a display scene recognition device of the mobile terminal, wherein the device can be implemented by software and/or hardware, and can generally be integrated in the mobile terminal.
  • the method includes:
  • Step 101 Obtain a process identifier of an application running in the mobile terminal.
  • the mobile terminal in the embodiment of the present invention may be a device including a display screen such as a mobile phone, a smart watch, a tablet computer, a game machine, a personal digital assistant, and a digital multimedia player.
  • the embodiment of the present invention does not limit the operating system loaded in the mobile terminal, which may be the Android system, the Windows Phone (WP) operating system, Linux, the iOS system, and so on. For ease of explanation, the embodiments below are described using the common Android system.
  • the package name is a concept in the Android system; other operating systems may not have this concept, but if a similar concept (such as the application name) is used in place of the package name, the same scheme applies.
  • a function in the application may be implemented by one or more processes, and the process identifier (such as the process name) can indicate which function the application is currently implementing, that is, can identify the display scene corresponding to this function. Therefore, in this step, the process identifier of the application running in the mobile terminal is obtained for subsequent display scene identification.
  • Step 102 Acquire a layer attribute of each layer in the layer set drawn by the application.
  • the layer attributes include a layer identifier (such as a layer name or number), whether the cached data is empty, a horizontal or vertical screen mode, an attribute of the visible area, an attribute of the transparent area, whether an update area exists, Update at least one of the attributes of the area and the image information.
  • after a layer is drawn, its cached data is stored in the corresponding cache area. If the cached data is all zeros, the cached data is empty, and the layer can be understood as a blank layer.
  • the application draws at a certain drawing frame rate. The currently received layer is compared with the previously drawn layers (the layers received within a preset duration), for example by checking whether the gray value at each coordinate position has changed, to determine whether the current layer has an update area.
  • the specific value of the preset duration is not limited in the embodiment of the present invention.
  • the attribute of the visible area includes at least one of the following: whether the visible area is empty (after the application draws a layer, the cached data of the layer includes the coordinate information of the visible area in the layer, and the visible area is generally a rectangle; if no such coordinate information exists, the visible area is considered empty), and the number, shape, size, and position of the visible areas;
  • the attributes of the transparent area include at least one of the following: the number, shape, size, and position of the transparent areas, and their positions relative to the visible areas of other layers;
  • the attributes of the update area include at least one of the following: the number, position, shape, and size of the update areas, and the ratio of the update area to the screen area;
  • the image information includes at least one of the following: whether the image is a solid color (including whether the color data is all zeros), and the image's gradation, grayscale, hue, contrast, brightness, saturation, transparency, and blur.
  • several layer attributes are listed above; when identifying a display scene, any one or more of them may be used. Of course, other layer attributes may also be combined with the items listed above.
  • Step 103 Identify a display scenario of the mobile terminal according to the process identifier and the layer attribute.
  • the corresponding layer scene category may be determined according to the set of layer attributes, and then the display scene of the mobile terminal is identified according to the process identifier and the layer scene category.
  • different process identifiers and different layer scene categories may be arranged and combined in advance to obtain different scene description items, and the correspondence between the different scene description items and different display scenes is established in advance to obtain a preset scene correspondence relationship.
  • the current scene description item is determined according to the process identifier and the layer scene type, and the preset scene correspondence relationship is queried according to the current scene description item, thereby obtaining the current display scene of the mobile terminal, so as to achieve the purpose of identifying the display scene of the mobile terminal.
  • for example, a video playback application generally includes three layers: two layers of the SurfaceView type, one for displaying the video content, which can be defined as U1, and one for displaying barrage content, which can be defined as U2; and a third layer for displaying the user interface (UI) controls (such as the playback progress bar, the volume control bar, and various control buttons) and advertisements, which can be defined as U3.
  • the layer identifiers are U1, U2, and U3 described above. When the layer set includes U1 and U2, it corresponds to the first layer scene, and when the layer set includes U1 and U3, it corresponds to the second layer scene.
  • further, more layer attributes may be introduced: for example, when the layer set includes U1 and U3 and there is only a single rectangular update area in U3, this corresponds to a third layer scene.
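  • The layer-scene classification described above can be sketched as a simple lookup. This is a hypothetical illustration only: the function name, the scene labels, and the use of a single update-area count for the finer distinction are assumptions for demonstration, not values fixed by the embodiment.

```python
def classify_layer_scene(layer_ids, u3_update_areas=None):
    """Map the set of drawn layer identifiers to a layer scene category.

    layer_ids: identifiers of the layers in the drawn layer set,
    e.g. "U1" (video), "U2" (barrage), "U3" (UI controls and ads).
    """
    layers = set(layer_ids)
    if layers == {"U1", "U2"}:
        return "layer_scene_1"        # video + barrage
    if layers == {"U1", "U3"}:
        # Finer distinction using an extra layer attribute: only a
        # single rectangular update area in U3 (e.g. the progress bar).
        if u3_update_areas == 1:
            return "layer_scene_3"
        return "layer_scene_2"        # video + UI controls / ads
    return "unknown"
```

The resulting category, combined with the process identifier, forms the scene description item used to query the preset scene correspondence relationship.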
  • the display scene identification scheme of the mobile terminal acquires the process identifier of the application running in the mobile terminal, acquires the layer attribute of each layer in the layer set drawn by the application, and identifies the display scene of the mobile terminal according to the process identifier and the layer attributes.
  • the display scene recognition method of the mobile terminal may further include: acquiring touch data.
  • the identifying the display scenario of the mobile terminal according to the process identifier and the layer attribute may include: identifying a display scenario of the mobile terminal according to the process identifier, the layer attribute, and the touch data.
  • the touch data may include data such as a touch location, a touch area, and a touch duration.
  • the touch data may be statistically analyzed, the corresponding touch event is determined according to the statistical analysis result, and the display scene of the mobile terminal is identified according to the process identifier, the layer attribute, and the touch event.
  • the touch event may include no event (eg, no touch screen is detected to be touched within a preset time period), a click event, a leaving event (eg, from a touch to a release touch), a slow sliding event, a fast sliding event, and the like.
  • the method further includes the following steps:
  • the touch data includes at least a touch position, a touch area, and a touch duration;
  • the touch event includes at least a no-touch event, a click event, a leaving event, a slow sliding event, and a fast sliding event;
  • the step of identifying the display scene of the mobile terminal according to the process identifier, the layer attribute, and the touch data may include: identifying a display scene of the mobile terminal according to the process identifier, the layer attribute, and the touch event.
  • the recognition of the display scene is performed according to three factors: process identification, layer attributes, and touch data, which can further improve the accuracy of display scene recognition.
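  • A minimal sketch of the statistical analysis that turns raw touch data into the touch events listed above. The sample format and the speed thresholds are assumptions for illustration; a real implementation would work on platform touch events and tuned thresholds.

```python
def classify_touch(samples, slow_speed=200.0, fast_speed=1000.0):
    """Classify a window of touch samples into a touch event.

    samples: list of (t_seconds, x, y) touch points; an empty list
    means no touch was detected within the window.
    Thresholds are in pixels per second (illustrative values).
    """
    if not samples:
        return "no_event"
    if len(samples) == 1:
        return "click_event"
    t0, x0, y0 = samples[0]
    t1, x1, y1 = samples[-1]
    dt = max(t1 - t0, 1e-6)                      # avoid division by zero
    speed = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
    if speed >= fast_speed:
        return "fast_slide_event"
    if speed >= slow_speed:
        return "slow_slide_event"
    return "leave_event"                         # pressed then released in place
```

The resulting event is then combined with the process identifier and layer attributes for scene recognition.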
  • the display scene recognition method of the mobile terminal may further include: acquiring the sensing data by using a preset type of sensor.
  • the identifying a display scenario of the mobile terminal according to the process identifier, the layer attribute, and the touch data includes: identifying the display scene of the mobile terminal according to the process identifier, the layer attribute, the touch data, and the sensing data.
  • the preset type of sensor includes: a distance sensor, a light sensor, a magnetic field sensor, a weight sensor, an acceleration sensor, a direction sensor, a gyro sensor, a rotation vector sensor, a barometric pressure sensor, a temperature sensor, and a humidity sensor. At least one of them.
  • in some embodiments, the preset type of sensor comprises at least two of the above sensors. The specific sensor types may be determined according to the sensors integrated in the mobile terminal, and the number of sensors of the same type is not limited in the embodiment of the present invention.
  • the selection strategy of the sensor may be preset, or may be determined according to a preset correspondence between the sensor and the application, and the running application is used to query the preset correspondence to obtain a corresponding sensor type.
  • the acquired sensing data may be the sensing data of the current time, or may be the sensing data in the preset time period before the current time, which is not limited in the embodiment of the present invention.
  • the sensor data may be statistically analyzed, the corresponding sensor event is determined according to the statistical analysis result, and the display scene of the mobile terminal is identified according to the process identifier, the layer attribute, the touch event, and the sensor event.
  • for example, statistical analysis of the sensing data from the acceleration sensor can determine whether the mobile terminal is currently in a stationary state (a first sensor event), a slow-moving state (a second sensor event), or a fast-moving state (a third sensor event). Assuming the determined sensor event is the second sensor event, it is combined with the currently acquired process identifier, layer attributes, and touch event to obtain the current scene description item, and the display scene of the mobile terminal is then identified.
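  • The acceleration-sensor analysis above can be sketched as follows. The deviation-from-gravity statistic and the thresholds (in m/s²) are assumptions chosen purely to illustrate the stationary / slow-moving / fast-moving split; they are not prescribed by the embodiment.

```python
import statistics

def classify_motion(magnitudes, gravity=9.81, slow_thr=0.5, fast_thr=3.0):
    """Classify accelerometer readings into one of three sensor events.

    magnitudes: acceleration magnitudes (m/s^2) sampled over a window.
    The mean absolute deviation from gravity approximates how much the
    device is moving; thresholds are illustrative.
    """
    deviation = statistics.mean(abs(m - gravity) for m in magnitudes)
    if deviation < slow_thr:
        return "sensor_event_1"   # stationary
    if deviation < fast_thr:
        return "sensor_event_2"   # moving slowly
    return "sensor_event_3"       # moving fast
```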
  • the process name, the sensing data, the touch data, and the layer attributes may all serve as identification elements of the display scene; the order in which these elements are obtained is not limited in the embodiment of the present invention, and the more types of elements are used, the more accurate the identified display scene is.
  • a whitelist may be pre-established: the display scene is identified only when the identification element is in the whitelist, and otherwise no identification is performed.
  • for example, "identifying the display scenario of the mobile terminal according to the process identifier and the layer attribute" may be optimized as: determining whether the process identifier is in a preset process whitelist, and if so, identifying the display scenario of the mobile terminal according to the process identifier and the layer attribute. Other steps can be optimized similarly. The advantage of this optimization is that display scenes can be identified selectively, reducing the system resources consumed by unnecessary scene recognition.
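  • The whitelist gating can be sketched as below. The whitelist contents and function names are hypothetical; the point is only that the (possibly costly) recognition step is skipped entirely for processes outside the whitelist.

```python
# Hypothetical whitelist of process identifiers worth recognizing.
PROCESS_WHITELIST = {"com.example.videoplayer", "com.example.game"}

def maybe_identify_scene(process_id, layer_attrs, identify):
    """Run `identify` only for whitelisted processes.

    identify: callable performing the actual scene recognition.
    Returns None when recognition is skipped to save system resources.
    """
    if process_id not in PROCESS_WHITELIST:
        return None
    return identify(process_id, layer_attrs)
```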
  • for example, when the identification elements include the process name, the layer attribute, and the touch data: if the touch event corresponding to the touch data is in a preset touch event whitelist, the display scene is determined according to the process name, the layer attribute, and the touch data; if it is not, the display scene is determined according to the process name and the layer attribute alone. The two cases are analogous to a local variable and a global variable: when the local variable exists, it is used preferentially for display scene recognition, and otherwise the global variable is used.
  • FIG. 2 is a second schematic flowchart of a method for identifying a display scene of a mobile terminal according to an embodiment of the present disclosure, where the method includes the following steps:
  • Step 201 Obtain a process identifier of an application running in the mobile terminal.
  • Step 202 Obtain a layer attribute of each layer in the layer set drawn by the application.
  • Step 203 Identify a display scenario of the mobile terminal according to the process identifier and the layer attribute.
  • Step 204 Determine at least one of a corresponding target drawing frame rate, a target composite frame rate, a target refresh rate, a target display brightness, and a target resolution according to the identified display scene.
  • FIG. 3 is a schematic diagram of a display process according to an embodiment of the present invention.
  • each application (hereinafter referred to as an application or APP) contains one or more layers. Each application APP1, APP2, ..., APPN performs layer drawing operations (that is, draws images on its layers) as determined by its own design (generally by the corresponding installation package, APK), and after the drawing operation is completed, each application sends all the drawn layers to the layer composition module (SurfaceFlinger), which performs the layer composition operation.
  • the layer composition module selects the visible layers from the full layer list (ListAll) to form a visible-layer list, defined as the DisplayList.
  • the layer composition module finds an idle frame buffer among the system's three recyclable frame buffers (Frame Buffer, FB), and on the free FB, according to the application configuration information (for example, which layer should be at the bottom, which layer should be on top, which area is visible, and which area is transparent), superimposes the layers contained in the DisplayList through a compose operation to obtain the final picture to be displayed.
  • the picture to be displayed can be transferred to the display hardware (including the display controller and display) so that the picture to be displayed is finally displayed on the display.
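  • The compose operation above can be modeled as a toy overlay: layers in the DisplayList are superimposed bottom-to-top onto a free frame buffer, with `None` standing in for transparent pixels. Real SurfaceFlinger composition involves hardware composers and pixel formats; this sketch only shows the stacking order and transparency handling.

```python
def compose(display_list, width, height, background=0):
    """Superimpose layers (bottom layer first) onto a fresh frame buffer.

    Each layer is a height x width grid of pixel values, where None
    marks a transparent pixel that lets lower layers show through.
    """
    frame_buffer = [[background] * width for _ in range(height)]
    for layer in display_list:                # bottom first, top last
        for y in range(height):
            for x in range(width):
                pixel = layer[y][x]
                if pixel is not None:         # skip transparent areas
                    frame_buffer[y][x] = pixel
    return frame_buffer
```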
  • the type of the display screen is not limited, and may be, for example, a liquid crystal display (LCD).
  • FIG. 4 is a schematic diagram of a Vsync display refresh mechanism according to an embodiment of the present invention.
  • the Vsync refresh mechanism inserts a "heartbeat", the vertical synchronization (Vsync) signal, into the entire display process. It is sent by the display controller to the CPU to generate a Vsync interrupt, so that each layer drawing operation and layer composition operation is performed in step with the heartbeat, bringing the key steps of the entire display process under the unified management of Vsync.
  • the frequency of the Vsync signal is currently commonly 60 Hz. As shown in FIG. 4, upon receiving the first Vsync signal Vsync1, the application begins the layer drawing operation.
  • when the CPU forwards the second Vsync signal Vsync2 to the layer composition module, the layer composition module starts the layer composition operation, composing the multiple layers drawn by each application to generate the picture to be displayed.
  • when the third Vsync signal Vsync3 reaches the CPU, the system starts the display refresh and finally shows the picture to be displayed on the display screen.
  • the frequency of the Vsync signal received by the application, the layer synthesis module, and the display screen is consistent, and is a fixed value set in advance.
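  • The three-stage pipeline described above can be modeled minimally: a frame is drawn on one Vsync tick, composed on the next, and displayed on the one after, so a frame reaches the screen two ticks after its drawing starts. The event-tuple format is an illustrative assumption.

```python
def vsync_pipeline(num_ticks):
    """Simulate the draw -> compose -> display pipeline over Vsync ticks.

    Returns (tick, stage, frame_number) events; frame N is drawn at
    tick N, composed at tick N+1, and displayed at tick N+2.
    """
    events = []
    for tick in range(1, num_ticks + 1):
        events.append((tick, "draw", tick))
        if tick >= 2:
            events.append((tick, "compose", tick - 1))
        if tick >= 3:
            events.append((tick, "display", tick - 2))
    return events
```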
  • in the process of layer drawing, composition, and display refresh on the mobile terminal, three frame rates are involved: the drawing frame rate, the composite frame rate, and the refresh rate.
  • the drawing frame rate is the rate at which layers are drawn (each completed drawing triggers the layer composition module to perform layer composition), and can be understood as the number of layer frames drawn per unit time (for example, one second).
  • the drawing frame rate includes a drawing frame rate of the application and a drawing frame rate of the layer.
  • the video player application in the above example generally includes three layers U1, U2, and U3.
  • the drawing frame rate of the application is the number of times the application performs a drawing operation per unit time; one drawing operation may draw one or more layers.
  • the drawing frame rate of a layer is the number of times the layer with a given number or name (such as U1, U2, or U3 above) is drawn per unit time.
  • the composite frame rate is the rate at which the layers drawn by each application are composed into one picture to be displayed, which can be understood as the number of picture frames composed per unit time.
  • the refresh rate is the frame rate at which the display screen of the mobile terminal is refreshed. Typically, the display will refresh at a 60Hz refresh rate.
  • the target rendering frame rate, the target composite frame rate, the target refresh rate, the target display luminance, and the target resolution are collectively referred to as target parameter values.
  • Step 205: Perform at least one of the corresponding layer drawing, layer composition, display refresh, and display brightness adjustment according to at least one of the determined target drawing frame rate, target composite frame rate, target refresh rate, target display brightness, and target resolution.
  • the target drawing frame rate may be the maximum drawing frame rate when the application draws the layer.
  • performing the layer drawing operation according to the target drawing frame rate may mean changing the frequency of the reference signal sent to the application (such as the Vsync signal described above) to the target drawing frame rate, or changing how the application's layer drawing operation responds to the reference signal, so as to limit the maximum drawing frame rate of the application to the target drawing frame rate. For example, the application may respond to the (2n+1)-th received signal and not respond to the (2n)-th; or the signals may be grouped into sets of n (such as 5), with the application responding to a first preset set of sequence numbers in each group (such as signals 1, 2, 4, and 5) and not responding to the remaining sequence numbers (such as signal 3).
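  • The response-decimation scheme can be sketched as below. The group size of 5 and the responding subset {1, 2, 4, 5} follow the example above; treating them as defaults of a helper function is an illustrative assumption.

```python
def should_respond(seq_no, group_size=5, respond_to=(1, 2, 4, 5)):
    """Decide whether to act on a Vsync signal.

    seq_no: 1-based sequence number of the received Vsync signal.
    Signals are grouped into sets of `group_size`; only signals whose
    position within the group is in `respond_to` trigger a drawing.
    """
    position = (seq_no - 1) % group_size + 1   # 1-based position in group
    return position in respond_to

# Responding to 4 of every 5 signals turns a 60 Hz Vsync into an
# effective 48 Hz maximum drawing frame rate.
effective_rate = 60 * sum(should_respond(i) for i in range(1, 6)) / 5
```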
  • the target composite frame rate may be the maximum composite frame rate at which the layer composition module (such as SurfaceFlinger) composes layers. Performing the layer composition operation according to the target composite frame rate may mean changing the frequency of the reference signal (such as the Vsync signal) sent to the layer composition module to the target composite frame rate, or changing how the layer composition operation responds to the reference signal, so as to limit the maximum composite frame rate to the target composite frame rate (as in the example above).
  • performing the display refresh operation according to the target refresh rate may mean adjusting the frequency of the reference signal (such as the Vsync signal) sent to the display screen to the target refresh rate, or changing how the display screen's refresh operation responds to the reference signal, so that the picture refresh frequency equals the target refresh rate (as in the example above).
  • the frequencies of the Vsync signals received by the application, the layer composition module, and the display screen in the embodiment of the present invention are independent, and may be the same or different. The advantage of this setting is that different combinations of signal frequencies can be selected according to different display scenarios to optimize display control.
  • performing the display brightness adjustment according to the target display brightness may mean adjusting the display brightness to the value corresponding to the target display brightness, including setting the brightness of the display to the lowest value (screen off) or the highest value.
  • the target resolution may include a target resolution of the layer and a target resolution of the display screen.
  • performing the layer drawing operation according to the target resolution of a layer may mean that the application draws the corresponding layer at that layer's target resolution.
  • performing the layer composition operation according to the target resolution of the display screen may mean that the layer composition module adjusts the resolution of each layer to the target resolution of the display screen before composing the layers, or adjusts the resolution of the picture to be displayed to the target resolution of the display screen during the layer composition operation.
  • the adjustment of the resolution can be achieved by image-processing means.
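  • As one such image-processing means, a toy nearest-neighbour rescale is sketched below; real implementations would use filtered scaling in the graphics pipeline, so this is only an assumption-level illustration of adjusting a layer to a target resolution.

```python
def rescale(layer, target_w, target_h):
    """Nearest-neighbour rescale of a pixel grid to target_w x target_h.

    layer: a list of rows, each row a list of pixel values.
    """
    src_h, src_w = len(layer), len(layer[0])
    return [
        [layer[y * src_h // target_h][x * src_w // target_w]
         for x in range(target_w)]
        for y in range(target_h)
    ]
```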
  • after identifying the current display scene, the method for identifying a display scene of a mobile terminal determines the corresponding target parameter values according to the identified display scene, and then performs the corresponding operations according to the target parameter values, so that the display of the mobile terminal can be controlled more reasonably.
  • more operations can be performed according to the identified display scene.
  • this can be achieved by adding marker bits, with different values on the marker bits corresponding to different operations.
  • FIG. 5 is a third schematic flowchart of a method for identifying a display scene of a mobile terminal according to an embodiment of the present invention. As shown in FIG. 5, the method includes:
  • Step 501 Obtain a process identifier of an application running in the mobile terminal.
  • Step 502 Acquire a layer attribute of each layer in the layer set drawn by the application, and determine a layer scene category according to the layer attribute.
  • Step 503 Acquire sensor data by using a preset type of sensor, and determine a sensor event according to the sensor data.
  • Step 504 Acquire touch data, and determine a touch event according to the touch data.
  • Step 505 Identify a display scenario of the mobile terminal according to the process identifier, the layer scene category, the sensor event, and the touch event.
  • a correspondence between the process identifier, the layer scene category, the sensor event, and the touch event on the one hand and the display scene on the other is given in Table 1 below.
  • the sensor takes an acceleration sensor as an example, and the layer attributes take the three layer identifiers (U1, U2, and U3) of a video playback application as an example:
  • Step 506 Determine a corresponding target drawing frame rate, a target composite frame rate, and a target refresh rate according to the identified display scene.
  • the target drawing frame rate in the embodiment of the present invention is described by taking the target drawing frame rate of the application as an example; the target drawing frame rate can also be refined down to a per-layer target drawing frame rate.
  • the correspondence between the display scene and the target rendering frame rate, the target composite frame rate, and the target refresh rate is given in Table 2 below:
  • Table 2 shows the correspondence between the scene and the target drawing frame rate, the target composite frame rate, and the target refresh rate.
  • Step 507 Perform corresponding layer drawing, layer composition, and display screen refresh operation according to the determined target drawing frame rate, the target composite frame rate, and the target refresh rate.
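Steps 506 and 507 amount to a lookup from the identified scene to a rate triple, followed by driving the pipeline at those rates. The sketch below illustrates that lookup; the scene names and frame-rate values are invented stand-ins, since Table 2's actual entries are not reproduced here:

```python
# Hypothetical stand-in for Table 2: each identified display scene maps to
# (target drawing frame rate, target composite frame rate, target refresh
# rate), all in Hz. Values are illustrative assumptions only.
SCENE_TARGETS = {
    "video_playback_fullscreen": (24, 24, 30),
    "static_reading":            (10, 10, 30),
    "game":                      (60, 60, 60),
}
DEFAULT_TARGETS = (60, 60, 60)  # fall back to normal frequencies

def targets_for_scene(scene):
    """Step 506: determine draw/composite/refresh rates for a scene."""
    return SCENE_TARGETS.get(scene, DEFAULT_TARGETS)
```

Step 507 would then run layer drawing at the first rate, layer composition at the second, and the display refresh at the third; choosing triples below the normal frequencies is what yields the power savings described below.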
  • the display scene recognition method of the mobile terminal identifies the display scene of the mobile terminal according to the process identifier, the layer scene category, the sensor event, and the touch event, and then performs personalized layer drawing, layer composition, and display screen refresh operations according to the display scene, so that the display of the mobile terminal can be controlled more reasonably.
  • when the target drawing frame rate, the target composite frame rate, and the target refresh rate are lower than the normal frequencies, the effect of reducing system power consumption can also be achieved.
  • the embodiment of the invention provides a display scene recognition device for a mobile terminal, which may include:
  • a process identifier obtaining module configured to acquire a process identifier of an application running in the mobile terminal
  • a layer attribute obtaining module configured to acquire a layer attribute of each layer in the layer set drawn by the application
  • a display scene recognition module configured to identify a display scenario of the mobile terminal according to the process identifier and the layer attribute.
  • the display scene recognition apparatus of the mobile terminal provided by the embodiment of the present invention may further include:
  • Touch data acquisition module for acquiring touch data.
  • the display scene recognition module is configured to: identify a display scene of the mobile terminal according to the process identifier, the layer attribute, and the touch data.
  • the display scene recognition apparatus of the mobile terminal provided by the embodiment of the present invention may further include:
  • the sensing data acquisition module is configured to acquire sensing data by using a preset type of sensor.
  • the display scene recognition module is configured to: identify a display scene of the mobile terminal according to the process identifier, the layer attribute, the touch data, and the sensing data.
  • the layer attribute may include at least one of the following: whether the cached data is empty, a horizontal or vertical screen mode, an attribute of a visible area, an attribute of a transparent area, whether an update area exists, an attribute of an update area, and image information.
  • the attribute of the visible area may include at least one of: whether the visible area is empty, the number, shape, size, and location of the visible area;
  • the attribute of the transparent area may include at least one of the following: the number, shape, size, and position of the transparent area, and its position relative to the visible areas of other layers;
  • the attribute of the update area may include at least one of the following: the number, location, shape, size, and ratio of area to screen area of the update area;
  • the image information may include at least one of the following: whether the image is a solid color, and the color scale, grayscale, hue, contrast, brightness, saturation, transparency, and blur of the image.
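The layer attributes enumerated above can be pictured as a small record per layer. The grouping below is only an illustration; the field names and the rectangle representation are assumptions, not the embodiment's actual data structures:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Region:
    """One of the visible / transparent / update areas of a layer."""
    count: int
    rects: List[Tuple[int, int, int, int]]  # (x, y, width, height) per sub-area

@dataclass
class LayerAttributes:
    cache_empty: bool                 # whether the cached data is empty
    landscape: bool                   # horizontal vs. vertical screen mode
    visible: Optional[Region]         # None when the visible area is empty
    transparent: Optional[Region]
    update: Optional[Region]          # None when no update area exists
    image_solid_color: bool           # one element of the image information
```

A scene recognizer would inspect such records for every layer in the set drawn by the application.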
  • the display scene recognition apparatus of the mobile terminal provided by the embodiment of the present invention may further include:
  • the target parameter value determining module is configured to determine at least one of a corresponding target drawing frame rate, a target composite frame rate, a target refresh rate, a target display brightness, and a target resolution according to the identified display scene.
  • the operation execution module is configured to perform at least one of the corresponding layer drawing, layer composition, display screen refresh, and display brightness adjustment operations according to at least one of the determined target drawing frame rate, the target composite frame rate, the target refresh rate, the target display brightness, and the target resolution.
  • the touch data acquiring module may be further configured to: perform statistical analysis on the touch data, and determine a corresponding touch event according to the statistical analysis result, where the touch data includes at least one of a touch position, a touch area, and a touch duration, and the touch event includes at least one of: no event, a click event, a leave event, a slow slide event, and a fast slide event.
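The statistical analysis above can be sketched as classifying a sequence of touch samples by displacement and speed. The thresholds and sample format below are illustrative assumptions, not values from the embodiment:

```python
# Each touch sample is (timestamp_seconds, x, y). Thresholds are assumed.
SLOW_PX_PER_S = 100.0   # at or below this speed a slide counts as slow
TAP_RADIUS_PX = 10.0    # movement below this radius counts as a lift-off

def classify_touch(samples):
    """Map raw touch samples to one of the touch events named above."""
    if not samples:
        return "no_event"
    if len(samples) == 1:
        return "click_event"
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    if dist < TAP_RADIUS_PX:            # finger lifted roughly in place
        return "leave_event"
    dt = t1 - t0
    speed = dist / dt if dt > 0 else float("inf")
    return "slow_slide_event" if speed <= SLOW_PX_PER_S else "fast_slide_event"
```

On a real terminal the samples would come from the touch driver; the recognizer only needs the resulting event label.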
  • the display scene recognition module may be configured to: identify a display scene of the mobile terminal according to the process identifier, the layer attribute, and the touch event.
  • FIG. 6 is a structural block diagram of a display scene recognizing apparatus of a mobile terminal according to an embodiment of the present invention.
  • the apparatus may be implemented by software and/or hardware, is generally integrated in a mobile terminal, and may perform display scene recognition by executing the display scene recognition method of the mobile terminal.
  • the device includes:
  • a process identifier obtaining module 601, configured to acquire a process identifier of an application running in the mobile terminal;
  • a layer attribute obtaining module 602 configured to acquire a layer attribute of each layer in the layer set drawn by the application
  • the display scene identification module 603 is configured to identify a display scene of the mobile terminal according to the process identifier and the layer attribute.
  • the display scene recognizing device of the mobile terminal provided by the embodiment of the present invention can accurately identify the display scenario of the mobile terminal according to the process identifier and the layer attribute of the application running by the mobile terminal.
  • the device further comprises:
  • Touch data acquisition module for acquiring touch data.
  • the display scene recognition module is configured to: identify a display scene of the mobile terminal according to the process identifier, the layer attribute, and the touch data.
  • the device further comprises:
  • the sensing data acquisition module is configured to acquire sensing data by using a preset type of sensor.
  • the display scene recognition module is configured to: identify a display scene of the mobile terminal according to the process identifier, the layer attribute, the touch data, and the sensing data.
  • the layer attribute includes at least one of the following: whether the cache data is empty, a horizontal or vertical screen mode, an attribute of a visible area, an attribute of a transparent area, whether an update area exists, an attribute of an update area, and image information.
  • the attribute of the visible area comprises at least one of the following:
  • whether the visible area is empty, and the number, shape, size, and position of the visible area.
  • the attributes of the transparent area include at least one of the following:
  • the number, shape, size, and position of the transparent area, and its position relative to the visible areas of other layers.
  • the attributes of the update area include at least one of the following:
  • the number, position, shape, and size of the update area, and the ratio of its area to the screen area.
  • the image information includes at least one of the following:
  • whether the image is a solid color, and the color scale, grayscale, hue, contrast, brightness, saturation, transparency, and blur of the image.
  • the device further comprises:
  • a target parameter value determining module configured to, after identifying a display scene of the mobile terminal according to the process identifier, the layer attribute, the touch data, and the sensing data, determine at least one of a corresponding target drawing frame rate, a target composite frame rate, a target refresh rate, a target display brightness, and a target resolution according to the identified display scene.
  • the operation execution module is configured to perform at least one of the corresponding layer drawing, layer composition, display screen refresh, and display brightness adjustment operations according to at least one of the determined target drawing frame rate, the target composite frame rate, the target refresh rate, the target display brightness, and the target resolution.
  • the touch data acquiring module may be further configured to: perform statistical analysis on the touch data, and determine a corresponding touch event according to the statistical analysis result, where the touch data includes at least one of a touch position, a touch area, and a touch duration, and the touch event includes at least one of: no event, a click event, a leave event, a slow slide event, and a fast slide event.
  • the display scene identification module may be configured to: identify a display scenario of the mobile terminal according to the process identifier, the layer attribute, and the touch event.
  • the embodiment of the present invention further provides a computer readable storage medium on which a computer program is stored; when the computer program is executed on a computer, the computer is caused to perform the steps in the display scene identification method of the mobile terminal provided by the embodiments of the present invention.
  • FIG. 7 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention.
  • the mobile terminal may include: a casing (not shown), a memory 701, a central processing unit (Central Processing Unit, CPU) 702 (also referred to as a processor, hereinafter referred to as CPU 702), a circuit board (not shown), and a power supply circuit (not shown).
  • the circuit board is disposed inside a space enclosed by the casing; the CPU 702 and the memory 701 are disposed on the circuit board; and the power circuit is configured to supply power to each circuit or device of the mobile terminal
  • the memory 701 is configured to store executable program code; the CPU 702 runs a computer program corresponding to the executable program code by reading executable program code stored in the memory 701 to implement the following steps:
  • the mobile terminal further includes: a peripheral interface 703, a radio frequency (Radio Frequency, RF) circuit 705, an audio circuit 706, a speaker 711, a power management chip 708, an input/output (I/O) subsystem 709, a touch screen 712, other input/control devices 710, and an external port 704; these components communicate through one or more communication buses or signal lines 707.
  • the illustrated mobile terminal 700 is merely one example of a mobile terminal; the mobile terminal 700 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components.
  • the various components shown in the figures can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • the mobile terminal for displaying scene recognition provided in this embodiment is described in detail below.
  • the mobile terminal takes a mobile phone as an example.
  • the memory 701 can be accessed by the CPU 702, the peripheral interface 703, and the like; the memory 701 can include high-speed random access memory, and can also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices.
  • Peripheral interface 703, which can connect the input and output peripherals of the device to CPU 702 and memory 701.
  • the I/O subsystem 709 can connect input and output peripherals on the device, such as touch screen 712 and other input/control devices 710, to peripheral interface 703.
  • the I/O subsystem 709 can include a display controller 7091 and one or more input controllers 7092 for controlling other input/control devices 710.
  • one or more input controllers 7092 receive electrical signals from other input/control devices 710 or transmit electrical signals to other input/control devices 710; other input/control devices 710 may include physical buttons (push buttons, rocker buttons, etc.), dials, slide switches, joysticks, and click wheels.
  • the input controller 7092 can be connected to any of the following: a keyboard, an infrared port, a USB interface, and a pointing device such as a mouse.
  • the touch screen 712 is an input interface and an output interface between the mobile terminal and the user, and displays visual output to the user.
  • the visual output may include graphics, text, icons, videos, and the like.
  • Display controller 7091 in I/O subsystem 709 receives electrical signals from touch screen 712 or sends electrical signals to touch screen 712.
  • Touch screen 712 detects contact on the touch screen, and display controller 7091 converts the detected contact into interaction with a user interface object displayed on touch screen 712, i.e., implements human-computer interaction; the user interface objects displayed on touch screen 712 may be operable.
  • the device may also include a light mouse, which is a touch sensitive surface that does not display a visual output, or an extension of a touch sensitive surface formed by the touch screen.
  • the RF circuit 705 is mainly used to establish communication between the mobile phone and the wireless network (i.e., the network side) and to realize data reception and transmission between the mobile phone and the wireless network, for example, sending and receiving short messages, emails, and the like. Specifically, the RF circuit 705 receives and transmits RF signals, which are also referred to as electromagnetic signals; the RF circuit 705 converts electrical signals into electromagnetic signals or converts electromagnetic signals into electrical signals, and communicates with the communication network and other devices through the electromagnetic signals.
  • RF circuitry 705 may include known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC (COder-DECoder) chipset, a Subscriber Identity Module (SIM), and so on.
  • the audio circuit 706 is mainly used to receive audio data from the peripheral interface 703, convert the audio data into an electrical signal, and transmit the electrical signal to the speaker 711.
  • the speaker 711 is configured to restore the voice signal received by the mobile phone from the wireless network through the RF circuit 705 to sound and play the sound to the user.
  • the power management chip 708 is used for power supply and power management of the hardware connected to the CPU 702, the I/O subsystem, and the peripheral interface.
  • the mobile terminal provided by the embodiment of the present invention can accurately identify the display scenario of the mobile terminal according to the process identifier and the layer attribute of the application running by the mobile terminal.
  • the display scene recognizing device and the mobile terminal of the mobile terminal provided in the above embodiments may perform the display scene recognizing method of the mobile terminal provided by any embodiment of the present invention, and have the corresponding functional modules and beneficial effects of executing the method.
  • for details, reference may be made to the display scene identification method of the mobile terminal provided by any embodiment of the present invention.
  • An embodiment of the present invention provides an electronic device, including a memory and a processor; the processor performs the steps in the display scene identification method of the mobile terminal provided by the embodiments of the present invention by calling a computer program stored in the memory.
  • the electronic device can be a mobile terminal device such as a smartphone or tablet.
  • FIG. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
  • the electronic device 800 can include components such as a memory 801, a processor 802, and a sensor 803.
  • Memory 801 can be used to store applications and data.
  • the application stored in the memory 801 contains executable code.
  • Applications can form various functional modules.
  • the processor 802 executes various functional applications and data processing by running an application stored in the memory 801.
  • the processor 802 is the control center of the electronic device; it connects the various parts of the entire electronic device using various interfaces and lines, and executes the various functions of the electronic device and processes data by running or executing applications stored in the memory 801 and calling data stored in the memory 801, thereby monitoring the electronic device as a whole.
  • the electronic device can include sensors 803 such as light sensors, motion sensors, and other sensors.
  • the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel according to the brightness of the ambient light, and the proximity sensor may turn off the display panel and/or the backlight when the electronic device moves to the ear.
  • the gravity acceleration sensor can detect the magnitude of acceleration in all directions (usually three axes) and, when stationary, can detect the magnitude and direction of gravity; it can be used for applications that recognize the posture of the electronic device (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and for vibration-recognition related functions (such as a pedometer or tapping).
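Posture recognition from the gravity reading can be sketched as checking which axis carries most of gravity when the device is roughly still. The axis conventions and the function below are illustrative assumptions only:

```python
# Assumed axis convention: x across the screen, y along it, z out of it.
# With the device still, the axis with the largest gravity component
# (roughly 9.8 m/s^2 in total) distinguishes landscape from portrait.
def orientation(ax, ay, az):
    """Classify a gravity-acceleration reading (m/s^2) into a posture."""
    if abs(az) > max(abs(ax), abs(ay)):
        return "flat"        # screen facing up or down
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```

A real implementation would low-pass filter the accelerometer stream and apply hysteresis before switching the screen, but the core decision is this axis comparison.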
  • Other sensors such as gyroscopes, barometers, hygrometers, thermometers, and infrared sensors that can be configured in electronic devices are not described here.
  • the processor 802 in the electronic device loads the executable code corresponding to the processes of one or more applications into the memory 801 according to the following instructions, and the processor 802 runs the applications stored in the memory 801, thereby implementing the steps:
  • the processor 802 can also be used to perform the step of acquiring touch data by calling a computer program stored in the memory.
  • the processor 802, when performing the step of identifying a display scenario of the electronic device according to the process identifier and the layer attribute, may include: identifying the display scene of the electronic device according to the process identifier, the layer attribute, and the touch data.
  • the processor 802 can also be configured to perform the step of acquiring sensor data by a preset type of sensor by calling a computer program stored in the memory.
  • the processor 802, when performing the step of identifying a display scenario of the electronic device according to the process identifier, the layer attribute, and the touch data, may include: identifying the display scene of the electronic device according to the process identifier, the layer attribute, the touch data, and the sensor data.
  • the layer attribute may include at least one of the following: whether the cached data is empty, a horizontal or vertical screen mode, an attribute of a visible area, an attribute of a transparent area, whether an update area exists, an attribute of an update area, and image information.
  • the attributes of the visible region may include at least one of the following: whether the visible region is empty, the number, shape, size, and location of the visible regions.
  • the properties of the transparent region may include at least one of the following: the number, shape, size, location, and relative position of the transparent regions to the visible regions of the other layers.
  • the attributes of the update area may include at least one of the following: the number, location, shape, size, and ratio of area to screen area of the update area.
  • the image information may include at least one of the following: whether the image is a solid color, and the color scale, grayscale, hue, contrast, brightness, saturation, transparency, and blur of the image.
  • the processor 802 may further be configured to perform the steps:
  • performing at least one of the corresponding layer drawing, layer composition, display picture refresh, and display brightness adjustment operations according to at least one of the determined target drawing frame rate, target composite frame rate, target refresh rate, target display brightness, and target resolution.
  • the processor 802 may further be configured to perform the steps: performing statistical analysis on the touch data, and determining a corresponding touch event according to the statistical analysis result, where the touch data includes at least one of a touch position, a touch area, and a touch duration, and the touch event includes at least one of: no event, a click event, a leave event, a slow slide event, and a fast slide event.
  • the processor 802, when performing the step of identifying a display scenario of the electronic device according to the process identifier, the layer attribute, and the touch data, may include: identifying the display scene of the electronic device according to the process identifier, the layer attribute, and the touch event.
  • the electronic device structure illustrated in FIG. 8 does not constitute a limitation to the electronic device, and may include more or less components than those illustrated, or a combination of certain components, or different component arrangements.
  • the electronic device 800 may further include an input unit 804, an output unit 805, and the like.
  • the input unit 804 can be configured to receive input digits, character information, or user characteristic information (such as fingerprints), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function controls.
  • input unit 804 can include a touch-sensitive surface as well as other input devices; a touch-sensitive surface is also known as a touch screen or touch pad.
  • the output unit 805 can be used to display information input by the user or information provided to the user and various graphical user interfaces of the mobile terminal, which can be composed of graphics, text, icons, video, and any combination thereof.
  • the output unit 805 can be a display panel.
  • the program may be stored in a computer-readable storage medium, and the storage medium may include: Read-Only Memory (ROM), Random Access Memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Disclosed are a method and apparatus for recognizing a display scene of a mobile terminal, a storage medium, and an electronic device. The method comprises the following steps: obtaining a process identifier of an application running in a mobile terminal (101); obtaining layer attributes of each layer in a layer set drawn by the application (102); and recognizing a display scene of the mobile terminal according to the process identifier and the layer attributes (103). According to the method, a display scene of a mobile terminal can be accurately recognized according to the process identifier of a running application and the layer attributes.
PCT/CN2017/106942 2017-03-10 2017-10-19 Procédé et appareil permettant de reconnaître un scénario d'affichage d'un terminal mobile, support de stockage et dispositif électronique WO2018161586A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710142061.5 2017-03-10
CN201710142061.5A CN106874017B (zh) 2017-03-10 2017-03-10 一种移动终端的显示场景识别方法、装置及移动终端

Publications (1)

Publication Number Publication Date
WO2018161586A1 true WO2018161586A1 (fr) 2018-09-13

Family

ID=59171415

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/106942 WO2018161586A1 (fr) 2017-03-10 2017-10-19 Procédé et appareil permettant de reconnaître un scénario d'affichage d'un terminal mobile, support de stockage et dispositif électronique

Country Status (2)

Country Link
CN (1) CN106874017B (fr)
WO (1) WO2018161586A1 (fr)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106874017B (zh) * 2017-03-10 2019-10-15 Oppo广东移动通信有限公司 一种移动终端的显示场景识别方法、装置及移动终端
CN109426830B (zh) * 2017-08-29 2022-05-17 武汉安天信息技术有限责任公司 一种自动识别移动终端场景的方法和装置
CN108156508B (zh) * 2017-12-28 2020-10-20 北京安云世纪科技有限公司 弹幕信息处理的方法、装置、移动终端、服务器及系统
CN109726303A (zh) * 2018-12-28 2019-05-07 维沃移动通信有限公司 一种图像推荐方法和终端
CN116324689A (zh) 2020-10-30 2023-06-23 海信视像科技股份有限公司 显示设备、几何图形识别方法及多图层叠加显示方法
CN115243094A (zh) * 2020-12-22 2022-10-25 海信视像科技股份有限公司 一种显示设备及多图层叠加方法
CN113032884B (zh) * 2021-04-07 2023-03-17 深圳大学 建筑空间量化方法、装置、设备与计算机可读存储介质
CN113672326B (zh) * 2021-08-13 2024-05-28 康佳集团股份有限公司 应用窗口录屏方法、装置、终端设备及存储介质
CN114510207A (zh) * 2022-02-28 2022-05-17 亿咖通(湖北)技术有限公司 图层合成方法、装置、设备、介质及程序产品
CN115690269B (zh) * 2022-10-31 2023-11-07 荣耀终端有限公司 一种视图对象的处理方法及电子设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103050108A (zh) * 2012-12-17 2013-04-17 华为终端有限公司 一种屏幕背光动态调整方法以及用户终端
CN105278811A (zh) * 2015-10-23 2016-01-27 三星电子(中国)研发中心 智能终端的屏幕显示装置和方法
CN105739670A (zh) * 2016-02-01 2016-07-06 广东欧珀移动通信有限公司 用于移动终端的显示方法、装置和移动终端
CN106874017A (zh) * 2017-03-10 2017-06-20 广东欧珀移动通信有限公司 一种移动终端的显示场景识别方法、装置及移动终端
CN106919400A (zh) * 2017-03-10 2017-07-04 广东欧珀移动通信有限公司 一种移动终端的显示场景识别方法、装置及移动终端

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103593155B (zh) * 2013-11-06 2016-09-07 华为终端有限公司 显示帧生成方法和终端设备
CN104731543B (zh) * 2015-03-23 2018-03-16 广东欧珀移动通信有限公司 一种屏幕刷新率的显示方法和装置
CN106020987A (zh) * 2016-05-31 2016-10-12 广东欧珀移动通信有限公司 处理器中内核运行配置的确定方法以及装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103050108A (zh) * 2012-12-17 2013-04-17 华为终端有限公司 一种屏幕背光动态调整方法以及用户终端
CN105278811A (zh) * 2015-10-23 2016-01-27 三星电子(中国)研发中心 智能终端的屏幕显示装置和方法
CN105739670A (zh) * 2016-02-01 2016-07-06 广东欧珀移动通信有限公司 用于移动终端的显示方法、装置和移动终端
CN106874017A (zh) * 2017-03-10 2017-06-20 广东欧珀移动通信有限公司 一种移动终端的显示场景识别方法、装置及移动终端
CN106919400A (zh) * 2017-03-10 2017-07-04 广东欧珀移动通信有限公司 一种移动终端的显示场景识别方法、装置及移动终端

Also Published As

Publication number Publication date
CN106874017B (zh) 2019-10-15
CN106874017A (zh) 2017-06-20

Similar Documents

Publication Publication Date Title
WO2018161586A1 (fr) Procédé et appareil permettant de reconnaître un scénario d'affichage d'un terminal mobile, support de stockage et dispositif électronique
WO2018161578A1 (fr) Procédé, dispositif, support de stockage et appareil électronique permettant de régler dynamiquement la fréquence de rafraîchissement d'écran
WO2018161571A1 (fr) Procédé, dispositif, support et appareil électronique permettant de régler dynamiquement le niveau d'économie d'énergie d'un terminal
WO2018161602A1 (fr) Procédé et appareil de commande de fréquence d'image d'un dispositif électronique, support d'informations et dispositif électronique
WO2018161585A1 (fr) Procédé et appareil de commande de fréquence de trames d'un dispositif électronique, support de stockage et dispositif électronique
WO2016060514A1 (fr) Procédé pour partager un écran entre des dispositifs et dispositif l'utilisant
WO2018161604A1 (fr) Procédé et dispositif de commande de lecture pour terminal mobile, support de stockage et dispositif électronique
WO2018182296A1 (fr) Dispositif électronique et procédé de partage d'ecran de dispositif électronique
WO2018143624A1 (fr) Procédé de commande d'affichage, support de mémoire et dispositif électronique
WO2018161572A1 (fr) Procédé et appareil de commande de débit de trame de terminal mobile, support de stockage et dispositif électronique
WO2018182287A1 (fr) Procédé de commande à faible puissance d'un dispositif d'affichage et dispositif électronique pour la mise en oeuvre de ce procédé
WO2013027908A1 (fr) Terminal mobile, dispositif d'affichage d'image monté sur véhicule et procédé de traitement de données les utilisant
WO2017052143A1 (fr) Dispositif d'affichage d'image, et procédé de commande associé
EP3403175A1 (fr) Dispositif électronique et procédé d'affichage de données d'application associé
WO2021162436A1 (fr) Dispositif électronique comprenant un dispositif d'affichage et procédé de fonctionnement associé
WO2018161603A1 (fr) Procédé et appareil de commande de dessin d'image de terminal mobile, support, dispositif électronique
WO2018155893A1 (fr) Procédé de fourniture d'interface de fonctionnement multitâche et dispositif électronique mettant en œuvre ledit procédé
WO2019050317A1 (fr) Procédé de commande de sortie audio par application à travers des écouteurs, et dispositif électronique mettant en œuvre ce procédé
WO2015180013A1 (fr) Procédé et appareil d'opération de toucher pour terminal
WO2021066293A1 (fr) Dispositif électronique pour synchroniser une modification entre des écrans et procédé de fonctionnement de celui-ci
WO2018076818A1 (fr) Procédé de sauvegarde de données, appareil, dispositif électronique, support de stockage et système
WO2019124912A1 (fr) Dispositif électronique et procédé de commande de synchronisation de sortie d'un signal correspondant à un état dans lequel un contenu peut être reçu en fonction d'un emplacement d'affichage du contenu affiché sur un affichage
WO2015178707A1 (fr) Dispositif d'affichage et procédé pour le commander
WO2016167610A1 (fr) Terminal portatif pouvant commander la luminosité de ce dernier, et son procédé de commande de luminosité
WO2022030996A1 (fr) Dispositif électronique comprenant un dispositif d'affichage et procédé de fonctionnement associé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17900021

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17900021

Country of ref document: EP

Kind code of ref document: A1