CN116560552A - Information processing method, device, electronic equipment and medium - Google Patents


Info

Publication number
CN116560552A
CN116560552A
Authority
CN
China
Prior art keywords
information
sliding
hand
event
recognition result
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210113109.0A
Other languages
Chinese (zh)
Inventor
肖望忠
陈贻东
刘畅
杨典
李丽
黄文臣
Current Assignee
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202210113109.0A
Publication of CN116560552A
Legal status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides an information processing method and apparatus, electronic equipment, and a medium. The method comprises the following steps: after a sliding event is intercepted, acquiring sliding information corresponding to the sliding event and equipment information of the electronic equipment; inputting the sliding information and the equipment information into an attribute determination model to obtain a hand recognition result corresponding to the sliding event; determining layout information of a multimedia data display interface based on the hand recognition result; and displaying the multimedia data display interface on the electronic equipment based on the layout information. Because the acquired sliding information and equipment information are input into a trained attribute determination model, the accuracy of the hand recognition result is improved; at the same time, the layout information of the multimedia data display interface can be adapted to different hand recognition results, enhancing the user experience.

Description

Information processing method, device, electronic equipment and medium
Technical Field
Embodiments of the present disclosure relate to the technical field of computer vision, and in particular to an information processing method and apparatus, electronic equipment, and a medium.
Background
With the rapid development of technology, human-computer interaction can be realized through a screen, such as a touch screen, in many emerging fields. For example, a user browses a short-video display interface via a touch screen. When the user browses a short-video display page, the display interface can be adjusted by identifying whether the hand operating the page is the left hand or the right hand.
In existing technical schemes, hand recognition is performed by calculating the slope of the sliding track of the user's hand on the touch screen to distinguish the left hand from the right hand, and human-computer interaction is then realized in response to the hand recognition result.
However, because the existing scheme only computes a single slope to distinguish the left hand from the right hand, its recognition accuracy is not high. For example, within one slide the user may perform an arc-shaped slide, so that the slope is first positive and then negative, or first negative and then positive, and misjudgment easily occurs.
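The slope-sign problem described above can be made concrete with a short sketch. The trajectory below is synthetic and purely illustrative: on an arc-shaped slide the per-segment slope changes sign partway through the gesture, so any heuristic keyed to a single slope value is ambiguous.

```python
import math

def segment_slopes(points):
    """Slope (dy/dx) of each successive segment of a touch trajectory."""
    slopes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx = x1 - x0
        slopes.append((y1 - y0) / dx if dx != 0 else float("inf"))
    return slopes

# A synthetic arc-shaped downward swipe: x first increases, then decreases.
arc = [(200 + 80 * math.sin(t), 300 + 120 * t)
       for t in [i / 10 * math.pi for i in range(11)]]  # t from 0 to pi
slopes = segment_slopes(arc)

# The slope is positive on the first half of the arc and negative on the
# second half, so a single-slope heuristic would flip its left/right verdict.
assert any(s > 0 for s in slopes) and any(s < 0 for s in slopes)
```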
Disclosure of Invention
Embodiments of the present disclosure provide an information processing method and apparatus, electronic equipment, and a medium, so as to improve the accuracy of the recognition result and enhance the user experience.
In a first aspect, an embodiment of the present disclosure provides an information processing method, including:
After a sliding event is intercepted, acquiring sliding information corresponding to the sliding event and equipment information of electronic equipment, wherein the sliding event is triggered by sliding operation on a display screen of the electronic equipment;
inputting the sliding information and the equipment information into an attribute determination model to obtain a hand recognition result corresponding to the sliding event, wherein the hand recognition result indicates whether the hand triggering the sliding event is left hand or right hand;
determining layout information of a multimedia data display interface based on the hand recognition result;
and displaying the multimedia data display interface on the electronic equipment based on the layout information.
In a second aspect, an embodiment of the present disclosure further provides an information processing apparatus, including:
the electronic equipment comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring sliding information corresponding to a sliding event and equipment information of electronic equipment after the sliding event is intercepted, and the sliding event is triggered by sliding operation on a display screen of the electronic equipment;
the input module is used for inputting the sliding information and the equipment information into an attribute determination model to obtain a hand recognition result corresponding to the sliding event, wherein the hand recognition result indicates whether the hand triggering the sliding event is left hand or right hand;
The determining module is used for determining layout information of the multimedia data display interface based on the hand recognition result;
and the display module is used for displaying the multimedia data display interface on the electronic equipment based on the layout information.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including:
one or more processing devices;
a storage means for storing one or more programs;
the one or more programs are executed by the one or more processing apparatuses, so that the one or more processing apparatuses implement the information processing method provided by the embodiments of the present disclosure.
In a fourth aspect, the embodiments of the present disclosure further provide a computer-readable medium having stored thereon a computer program which, when executed by a processing apparatus, implements the information processing method provided by the embodiments of the present disclosure.
The embodiment of the disclosure provides an information processing method, an information processing device, electronic equipment and a medium. The method comprises the following steps: after a sliding event is intercepted, acquiring sliding information corresponding to the sliding event and equipment information of electronic equipment, wherein the sliding event is triggered by sliding operation on a display screen of the electronic equipment; inputting the sliding information and the equipment information into an attribute determination model to obtain a hand recognition result corresponding to the sliding event, wherein the hand recognition result indicates whether the hand triggering the sliding event is left hand or right hand; determining layout information of a multimedia data display interface based on the hand recognition result; and displaying the multimedia data display interface on the electronic equipment based on the layout information. By utilizing the technical scheme, the acquired sliding information and equipment information are input into the attribute determination model to obtain the hand recognition result corresponding to the sliding event, so that the accuracy of the hand recognition result is improved, meanwhile, the layout information of the multimedia data display interface can be adjusted based on the difference of the hand recognition result, and the user experience is enhanced.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
Fig. 1 is a flow chart of an information processing method according to a first embodiment of the disclosure;
FIG. 2 is a schematic diagram of a sliding event in an information processing method according to a first embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an information processing method according to a first embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of another information processing method according to the first embodiment of the present disclosure;
fig. 5 is a flow chart of an information processing method according to a second embodiment of the disclosure;
FIG. 6 is a schematic diagram of an event monitoring interface according to a second embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an attribute determining model according to a second embodiment of the present disclosure;
fig. 8 is a schematic flow chart of an operation reasoning provided in the second embodiment of the disclosure;
fig. 9 is a schematic diagram of an optimized multimedia data presentation interface according to a second embodiment of the disclosure;
Fig. 10 is a schematic structural diagram of an information processing apparatus according to a third embodiment of the present disclosure;
fig. 11 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure have been shown in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but are provided to provide a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a" and "a plurality" in this disclosure are illustrative rather than limiting, and those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
In the following embodiments, optional features and examples are provided in each embodiment at the same time, and the features described in the embodiments may be combined to form multiple alternatives, and each numbered embodiment should not be considered as only one technical solution. Furthermore, embodiments of the present disclosure and features of embodiments may be combined with each other without conflict.
In the description of the present disclosure, it should be noted that orientations or positional relationships indicated by terms such as "center", "upper", "lower", "left" and "right" are based on the orientations or positional relationships shown in the drawings; they are used merely for convenience of description and do not indicate that the apparatus or element referred to must have a specific orientation, and thus should not be construed as limiting. In addition, the technical features of the different embodiments described below may be combined with each other as long as they do not conflict.
Example 1
Fig. 1 is a flow chart of an information processing method according to a first embodiment of the present disclosure. The method is applicable to the case of recognizing which hand operates an electronic device, and may be performed by an information processing apparatus, where the apparatus may be implemented by software and/or hardware and is generally integrated on an electronic device. In this embodiment, the electronic device includes, but is not limited to: a computer, a notebook computer, a tablet computer, a mobile phone, and the like.
As shown in fig. 1, an information processing method provided in a first embodiment of the present disclosure includes the following steps:
s110, after a sliding event is intercepted, acquiring sliding information corresponding to the sliding event and equipment information of the electronic equipment, wherein the sliding event is triggered by sliding operation on a display screen of the electronic equipment.
A sliding event may be an event triggered by sliding on the display screen of the electronic equipment. This embodiment does not limit the sliding operation: it may be, for example, an operation of sliding from left to right starting at any position on the display screen, or an operation of sliding from top to bottom starting at any position. The sliding information may be information characterizing the sliding operation, and may include the sliding track of the sliding operation, the sliding direction, the position information of the touch-screen points, and the like. The equipment information of the electronic equipment may include the screen width and screen height of the electronic equipment, the shape of the electronic equipment, information related to the display of the electronic equipment, and the like.
In this step, a sliding event may be intercepted; after the sliding event is intercepted, the sliding information corresponding to the sliding event and the equipment information of the electronic equipment may be acquired. This embodiment does not limit the specific acquisition steps. For example, the sliding event may be analysed to obtain the corresponding sliding track, from which the sliding information is derived, and the equipment information may be acquired after the user grants the corresponding device permission.
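As a minimal sketch of this step, the sliding information (touch-screen points and their timestamps) and the equipment information (screen width and height) could be packaged into a normalised feature record for the model. The function name and feature fields below are illustrative assumptions, not the disclosure's own definitions:

```python
def build_features(touch_points, timestamps, screen_w, screen_h):
    """Normalise a swipe trajectory into a fixed feature record.

    touch_points: list of (x, y) screen coordinates of the slide.
    timestamps:   one timestamp (ms) per touch point.
    screen_w/h:   equipment information used to normalise coordinates,
                  so the model generalises across screen sizes.
    """
    xs = [x / screen_w for x, _ in touch_points]
    ys = [y / screen_h for _, y in touch_points]
    return {
        "start": (xs[0], ys[0]),
        "end": (xs[-1], ys[-1]),
        "dx": xs[-1] - xs[0],          # net horizontal drift of the slide
        "dy": ys[-1] - ys[0],          # net vertical drift of the slide
        "duration_ms": timestamps[-1] - timestamps[0],
        "aspect": screen_w / screen_h, # shape of the electronic equipment
    }

# An upward slide on a hypothetical 1080x2340 screen.
feats = build_features([(540, 1800), (520, 1200), (480, 600)],
                       [0, 60, 130], screen_w=1080, screen_h=2340)
assert feats["dy"] < 0 and feats["duration_ms"] == 130
```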
Fig. 2 is a schematic diagram of a sliding event in an information processing method according to an embodiment of the present disclosure, where, as shown in fig. 2, the sliding event may be an event that slides from top to bottom on a display screen of an electronic device.
S120, inputting the sliding information and the equipment information into an attribute determination model to obtain a hand recognition result corresponding to the sliding event, wherein the hand recognition result indicates whether the hand triggering the sliding event is left hand or right hand.
The attribute determination model may be a model for obtaining a hand recognition result based on the sliding information and the device information, and the attribute determination model may be a model which is trained in advance, and the attribute determination model is not limited in this step, so long as a hand recognition result corresponding to the sliding event can be obtained. The hand recognition result may be a result of recognizing a hand triggering the sliding event through the attribute determination model, and the hand recognition result may indicate that the hand triggering the sliding event is a left hand or a right hand.
In one embodiment, the attribute determination model is obtained by training a neural network model based on a training set, the training set including: a first number of first sample pairs, a second number of second sample pairs, and device information;
the first pair of samples is formed from left-hand slide information and information characterizing a left hand, the second pair of samples is formed from right-hand slide information and information characterizing a right hand,
the left-hand sliding information is collected after a left-hand sliding control is triggered in the event monitoring interface, and the right-hand sliding information is collected after a right-hand sliding control is triggered in the event monitoring interface.
The training set refers to the data used to train the neural network model, and may include a first number of first sample pairs, a second number of second sample pairs, and equipment information. In this embodiment, a first sample pair can be understood as a sample pair related to the left hand, formed from left-hand sliding information and information characterizing the left hand. The left-hand sliding information refers to the sliding information corresponding to a sliding event triggered by the left hand, i.e., information based on a sliding operation performed with the left hand on the display screen of the electronic equipment; it may include the position information of the left-hand touch-screen points and the timestamp corresponding to each position. The left-hand sliding information may be collected after the left-hand sliding control is triggered in the event monitoring interface. The information characterizing the left hand may be a number or a letter, for example the numeral 1 representing the left hand.
Correspondingly, a second sample pair can be understood as a sample pair related to the right hand, formed from right-hand sliding information and information characterizing the right hand. The right-hand sliding information refers to the sliding information corresponding to a sliding event triggered by the right hand; it may include the position information of the right-hand touch-screen points and the timestamp corresponding to each position, and may be collected after the right-hand sliding control is triggered in the event monitoring interface. The information characterizing the right hand may be a number or a letter, for example the numeral 0 representing the right hand.
It will be appreciated that the first number is the number of the first pair of samples and the second number is the number of the second pair of samples, and the first number and the second number are not limited in this embodiment and may be set by a person concerned.
It should be noted that, the event monitoring interface may be an interface for monitoring left-hand sliding information or right-hand sliding information, and the user may browse the content presented by the event monitoring interface. The event monitoring interface can comprise a left-hand sliding control and a right-hand sliding control, the left-hand sliding control can be a control for triggering and collecting left-hand sliding information, the right-hand sliding control can be a control for triggering and collecting right-hand sliding information, namely, the left-hand sliding information can be collected after the left-hand sliding control is triggered in the event monitoring interface, and the right-hand sliding information can be collected after the right-hand sliding control is triggered in the event monitoring interface.
For example, after the left-handed slip control is triggered, the collected left-handed slip information, and information characterizing the left hand corresponding to the left-handed slip control, will be stored as a pair of samples into the training set.
The event monitoring interface may also include a reset control, i.e., a control for re-collecting left-hand or right-hand sliding information: triggering it allows the information collected after the last trigger of the left-hand or right-hand sliding control to be collected again.
The event monitoring interface may be an interface in a test prototype, or may be an interface in an electronic device, where the test prototype may be a device different from the electronic device, and may be used to collect a training set.
It can be understood that after the training set is collected by the test prototype, the neural network model can be trained based on the training set to obtain an attribute determination model; the test prototype can also send the training set to the server after the training set is collected, and the server trains the neural network model to obtain the attribute determination model.
In one embodiment, the attribute determination model is trained by the server based on a training set.
In one embodiment, the training set is collected by a test prototype.
In this embodiment, the attribute determination model may be obtained by training the neural network model on the training set, and the specific training procedure is not limited. For example, each sample pair in the training set may be input into the neural network model to obtain a predicted hand recognition result; the prediction is then compared with the information characterizing the left hand or the right hand, and the parameters of the neural network model are adjusted accordingly, until a training-end condition is satisfied. The training-end condition is not limited either; for example, it may require the recognition accuracy to be above 95%.
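The supervised setup described above can be sketched with a deliberately tiny stand-in for the attribute determination model: (feature vector, label) pairs with 1 = left hand and 0 = right hand (the encoding used in this embodiment), fitted with a plain logistic-regression loop. The separating feature (the sign of horizontal drift) and all data are synthetic, chosen only so the example is self-contained; the actual disclosure uses a neural network on richer features.

```python
import math, random

def train_logistic(samples, lr=0.5, epochs=200):
    """Fit w, b on (feature_vector, label) pairs; label 1 = left, 0 = right."""
    dim = len(samples[0][0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in samples:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            g = 1.0 / (1.0 + math.exp(-z)) - y   # gradient of the log-loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    """1 = left hand, 0 = right hand."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0

# Synthetic training set, purely for illustration: assume left-hand slides
# drift right (dx > 0) and right-hand slides drift left (dx < 0).
random.seed(0)
train = ([([random.uniform(0.1, 0.4), random.uniform(-0.6, -0.2)], 1)
          for _ in range(50)] +
         [([random.uniform(-0.4, -0.1), random.uniform(-0.6, -0.2)], 0)
          for _ in range(50)])
w, b = train_logistic(train)
assert predict(w, b, [0.3, -0.4]) == 1    # rightward drift -> left hand
assert predict(w, b, [-0.3, -0.4]) == 0   # leftward drift  -> right hand
```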
And S130, determining layout information of a multimedia data display interface based on the hand recognition result.
The multimedia data presentation interface may be understood as an interface for presenting multimedia data, where the multimedia data presentation interface may include a plurality of controls to present or operate on the multimedia data, for example, the plurality of controls may include controls that perform conventional operations on the multimedia data, such as forwarding controls to implement forwarding of the multimedia data; a dynamic presentation control, i.e., a control that operates on the set multimedia data, may also be included. Wherein the multimedia data may include video data, audio data, etc. Layout information may be understood as information characterizing the layout of the controls and video data in the multimedia data presentation interface.
In this embodiment, the layout information of the multimedia data display interface may be determined according to the hand recognition result. For example, when the hand recognition result is the left hand, the positions of some controls in the multimedia data display interface can be adjusted according to the user's operation habits, so that the user can reach the required controls more easily.
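A minimal sketch of this mapping, under the assumption (mine, not the disclosure's) that the adjusted control is anchored to the bottom corner on the recognised hand's side, with hypothetical pixel sizes:

```python
def layout_for_hand(hand, screen_w, control_w=160, margin=24):
    """Place a control on the side of the screen that is easier to reach
    for the recognised hand; 'left'/'right' stand for the hand recognition
    result, and control_w/margin are illustrative sizes in pixels."""
    if hand == "left":
        x = margin                          # left-hand users: bottom-left
    else:
        x = screen_w - control_w - margin   # right-hand users: bottom-right
    return {"control_x": x, "control_w": control_w}

assert layout_for_hand("left", 1080)["control_x"] == 24
assert layout_for_hand("right", 1080)["control_x"] == 1080 - 160 - 24
```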
And S140, displaying the multimedia data display interface on the electronic equipment based on the layout information.
After the layout information of the multimedia data display interface is determined through the steps, the corresponding multimedia data display interface can be displayed on the electronic equipment based on the determined layout information.
According to the information processing method provided by the first embodiment of the present disclosure, after a sliding event is intercepted, sliding information corresponding to the sliding event and equipment information of the electronic equipment are acquired, the sliding event being triggered by a sliding operation on the display screen of the electronic equipment; the sliding information and the equipment information are input into an attribute determination model to obtain a hand recognition result corresponding to the sliding event, the hand recognition result indicating whether the hand triggering the sliding event is the left hand or the right hand; layout information of a multimedia data display interface is determined based on the hand recognition result; and the multimedia data display interface is displayed on the electronic equipment based on the layout information. With this method, inputting the acquired sliding information and equipment information into the attribute determination model improves the accuracy of the hand recognition result, and the layout information of the multimedia data display interface can be adapted to different hand recognition results, enhancing the user experience.
In one embodiment, the determining layout information of the multimedia data presentation interface based on the hand recognition result includes:
and determining the display position of the dynamic display control when the multimedia data display interface displays the set multimedia data based on the hand recognition result, wherein the layout information comprises the display position.
The dynamic display control may be a control that operates on the set multimedia data, such as an interest tag control, so as to set the set multimedia data that is interested or not interested, where the set multimedia data may be multimedia data preset by the system, such as advertisement data.
The display position can be regarded as the position of the dynamic display control when the multimedia data display interface displays the set multimedia data. Specifically, the layout information may include this display position, which is determined according to the hand recognition result. This embodiment does not limit the specific determination steps; for example, when the hand recognition result is the right hand, the display position of the dynamic display control may be located in the right-hand area of the display area of the electronic equipment.
In one embodiment, the display location is located within a hot spot area formed by an area of the electronic device display area where the operating frequency is greater than a set threshold.
A hot spot area can be understood as an area of the display area of the electronic equipment that is operated relatively frequently; for example, it may be formed by the regions of the display area whose operating frequency is greater than a set threshold. The operating frequency refers to the frequency with which a region is operated, such as the click frequency; the set threshold is a critical value for the operating frequency in the display area and may be preset by the system or by relevant personnel.
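One way to make the hot-spot construction concrete, assuming the display area is partitioned into a grid of cells with per-cell operation counts (the grid, counts and threshold below are illustrative):

```python
def hot_spot_cells(freq_grid, threshold):
    """Return the grid cells whose operating frequency (e.g. tap count)
    exceeds the set threshold; their union forms the hot spot area within
    which the display position of the control should fall."""
    return {(r, c)
            for r, row in enumerate(freq_grid)
            for c, f in enumerate(row)
            if f > threshold}

# 3x3 grid of tap counts over the display area (illustrative numbers).
grid = [[2, 5, 40],
        [1, 8, 55],
        [0, 3, 60]]
hot = hot_spot_cells(grid, threshold=30)
assert hot == {(0, 2), (1, 2), (2, 2)}  # right column: right-thumb hot zone
```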
Fig. 3 is a schematic diagram of an information processing method according to an embodiment of the present disclosure. As shown in Fig. 3, the implementation process may be roughly divided into a model training stage (i.e., obtaining the attribute determination model) and an inference stage (i.e., obtaining the hand recognition result corresponding to a sliding event). In the model training stage, a first number of first sample pairs, a second number of second sample pairs, and device information are acquired through the display screen of a test prototype to form a training set; the training set is then reported to an AI server, where a neural network model is trained to obtain the attribute determination model. In the inference stage, events are first acquired through the display screen of the electronic device (i.e., sliding events are intercepted) to obtain the sliding information of the corresponding sliding event and the device information of the electronic device. The sliding information and the device information are then input into the attribute determination model through an AI engine to obtain a recognition result (i.e., the hand recognition result corresponding to the sliding event); the AI engine may be a framework that supports users in developing machine learning and deep learning model training operations. Finally, layout information of the multimedia data display interface is determined based on the recognition result, and the multimedia data display interface is displayed on the electronic device.
Fig. 4 is a schematic diagram of another information processing method provided in the first embodiment of the present disclosure. As shown in Fig. 4, touch events (i.e., sliding events) may be collected on a test prototype to build a training set; the test prototype then uploads the training set to a server, which trains a neural network model based on the training set to obtain the attribute determination model. Finally, the sliding information and device information corresponding to a sliding event may be obtained through a client on the electronic device and input into the attribute determination model for inference, yielding the hand recognition result corresponding to the sliding event.
Example two
Fig. 5 is a flow chart of an information processing method according to a second embodiment of the present disclosure. The second embodiment is implemented based on various alternatives in the foregoing embodiments. In this embodiment, the sliding information is further embodied as: position information of at least two touch screen points and a timestamp corresponding to the position information of each touch screen point.
Further, the present embodiment further embodies the device information as: the screen width of the electronic device, the screen height of the electronic device, and the screen density of the electronic device.
For details not yet described in detail in this embodiment, refer to embodiment one.
As shown in fig. 5, an information processing method provided in a second embodiment of the present disclosure includes the following steps:
s210, after a sliding event is intercepted, acquiring sliding information corresponding to the sliding event and equipment information of electronic equipment, wherein the sliding event is triggered by sliding operation on a display screen of the electronic equipment, and the sliding information comprises: the device information includes: the screen width of the electronic device, the screen height of the electronic device, and the screen density of the electronic device.
It may be understood that the sliding operation on the display screen includes at least two touch screen points, where the touch screen points may refer to points where a hand acts on the display screen, so that the sliding information may include position information of at least two touch screen points and time stamps corresponding to the position information of each touch screen point. The coordinate axis of the display screen may take the bottom edge of the display screen as the x axis, the left edge of the display screen as the y axis, and the intersection point of the bottom edge and the left edge of the display screen as the origin, which is not limited in this embodiment.
The device information may include a screen width of the electronic device, a screen height of the electronic device, and a screen density of the electronic device. The screen width may be considered as the width of the electronic device display screen, the screen height may be considered as the height of the electronic device display screen, and the screen density may refer to the number of pixels possessed by the electronic device display screen per inch.
After the sliding event is intercepted, the position information of at least two touch screen points in the sliding event, the time stamp corresponding to the position information of each touch screen point, the screen width of the electronic equipment, the screen height of the electronic equipment and the screen density of the electronic equipment can be obtained.
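The data gathered in this step can be summarized in a minimal sketch. The class and field names below are assumptions for illustration, not part of the patent; the coordinate convention and the sample values follow the text (origin at the lower-left corner of the display screen, screen width 1080, height 2206, density 2.75 as in the filename example later in this embodiment).

```python
# Hedged sketch of the information acquired after a sliding event is
# intercepted: touch screen points with timestamps, plus device info.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class DeviceInfo:
    screen_width: int      # width of the display screen
    screen_height: int     # height of the display screen
    screen_density: float  # pixels per inch

@dataclass
class SlideEvent:
    event_id: int
    # (x, y, timestamp) per touch screen point; x along the bottom edge,
    # y along the left edge, origin at their intersection, as above.
    points: List[Tuple[float, float, int]] = field(default_factory=list)

device = DeviceInfo(screen_width=1080, screen_height=2206, screen_density=2.75)
event = SlideEvent(event_id=3, points=[(287.0, 1662.0, 542805640),
                                       (285.0, 1658.0, 542805662)])
```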
It should be noted that, after the sliding information of the sliding event and the device information of the electronic device are obtained, the two may be stored in association in a file, so as to facilitate reading the information of the sliding event later. The locations where the sliding information and the device information are stored are not limited here.
In one embodiment, the sliding information is stored in a file; the device information is stored in a file name of the file.
In this embodiment, the sliding information may be stored in a file whose type is not limited; for example, it may be a csv file. In the file, different sliding events may be marked with different sliding event identifiers, and the position information of each touch screen point, together with the timestamp corresponding to that position information, may be stored; the device information of the electronic device corresponding to the sliding events is then stored in the file name of the file.
It may be understood that, in the file corresponding to the file name, the location information of each touch screen point corresponding to the same sliding event and the timestamp corresponding to the location information of each touch screen point may be included, and the location information of each touch screen point corresponding to different sliding events and the timestamp corresponding to the location information of each touch screen point may also be included, where the sliding events may be distinguished according to the sliding event identifier.
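A minimal sketch of reading such a file back and distinguishing events by their sliding event identifier follows. It assumes whitespace-separated columns matching the sample serialized format shown later in this embodiment (`index x y t`); the helper name and in-memory row format are assumptions.

```python
# Hedged sketch: group the touch screen points of a serialized file by
# sliding event identifier (the `index` column).
def group_by_event(text):
    """Map sliding event identifier -> list of (x, y, t) touch points."""
    events = {}
    lines = text.strip().splitlines()
    for line in lines[1:]:  # skip the "index x y t" header row
        idx, x, y, t = line.split()
        events.setdefault(int(idx), []).append((float(x), float(y), int(t)))
    return events

sample = """index x y t
3 287.0 1662.0 542805640
3 285.0 1658.0 542805662
4 100.0 900.0 542806000"""
events = group_by_event(sample)
```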
S220, inputting the sliding information and the equipment information into an attribute determination model to obtain a hand recognition result corresponding to the sliding event, wherein the hand recognition result indicates whether the hand triggering the sliding event is left hand or right hand.
And S230, determining layout information of a multimedia data display interface based on the hand recognition result.
And S240, displaying the multimedia data display interface on the electronic equipment based on the layout information.
An information processing method provided in the second embodiment of the present disclosure includes: after a sliding event is intercepted, acquiring sliding information corresponding to the sliding event and device information of the electronic device, where the sliding event is triggered by a sliding operation on a display screen of the electronic device, the sliding information includes position information of at least two touch screen points and a timestamp corresponding to the position information of each touch screen point, and the device information includes the screen width, screen height, and screen density of the electronic device; inputting the sliding information and the device information into an attribute determination model to obtain a hand recognition result corresponding to the sliding event, where the hand recognition result indicates whether the hand triggering the sliding event is the left hand or the right hand; determining layout information of a multimedia data display interface based on the hand recognition result; and displaying the multimedia data display interface on the electronic device based on the layout information. By further embodying the sliding information and the device information in this way, the accuracy of the hand recognition result is further improved.
An exemplary description of the information processing method is given below:
First, operation records (i.e., the training set) may be collected on a test prototype: touch events are intercepted at an interface of the test prototype (i.e., the event monitoring interface) to collect the training set, and each record is labeled as left hand or right hand (i.e., information characterizing the left hand or the right hand). The training set includes: a first number of first sample pairs formed from left-hand sliding information and information characterizing the left hand, a second number of second sample pairs formed from right-hand sliding information and information characterizing the right hand, and device information. Fig. 6 is a schematic diagram of an event monitoring interface provided in the second embodiment of the present disclosure. As shown in Fig. 6, the event monitoring interface includes a left-hand sliding control 1, a right-hand sliding control 2, and a reset control 3; left-hand sliding information is collected after the left-hand sliding control 1 is triggered in the event monitoring interface, and right-hand sliding information is collected after the right-hand sliding control 2 is triggered.
The left-hand and right-hand sliding information in the collected training set may then be serialized into a csv file, for example in the following format:
index x y t
3 287.0 1662.0 542805640
3 285.0 1658.0 542805662
3 283.0 1646.0 542805668
3 278.0 1628.0 542805673
3 273.0 1603.0 542805679
3 267.0 1577.0 542805684
The index may be a sliding event identifier: rows with the same index belong to the same sliding event. Each row of the table represents the position information of one touch screen point in the sliding event and the timestamp corresponding to that position information, where x is the abscissa of the touch screen point on the display screen, y is the ordinate, and t is the timestamp.
Meanwhile, the information indicating whether the current operator used the left hand or the right hand (i.e., information characterizing the left hand or the right hand) may be stored in the file name, whose format may be: <date>_<time>_<screen width>_<screen height>_<screen density>_<left or right hand>.csv, for example: 20211108_175755_1080_2206_2.75_left.csv.
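The file-name encoding above can be sketched as follows. The helper name is hypothetical; the format string follows the pattern and the concrete example given in the text.

```python
# Sketch of serializing device information and the operator's hand into
# the file name: <date>_<time>_<width>_<height>_<density>_<hand>.csv
from datetime import datetime

def training_file_name(screen_width, screen_height, screen_density, hand, now=None):
    now = now or datetime.now()
    return "{}_{}_{}_{}_{}_{}.csv".format(
        now.strftime("%Y%m%d"), now.strftime("%H%M%S"),
        screen_width, screen_height, screen_density, hand)

name = training_file_name(1080, 2206, 2.75, "left",
                          now=datetime(2021, 11, 8, 17, 57, 55))
# Reproduces the example file name from the text.
```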
Then, an AI model (i.e., the attribute determination model) is trained offline. Specifically, a neural network may be trained based on the training set to obtain an operator (left/right hand) recognition model (i.e., the attribute determination model); its training process and parameters are described as follows:
Fig. 7 is a schematic structural diagram of an attribute determination model according to the second embodiment of the present disclosure. As shown in Fig. 7, the input of the neural network model may be normalized to a 6×16 format: 6 dimensions with 16 data points each. The 6 dimensions are, respectively, the time interval between two samples (determined based on the corresponding timestamps), the screen density, the screen width, the screen height, the y coordinate of the touch screen point in a coordinate system with the lower-left corner of the mobile phone as the origin, and the x coordinate of the touch screen point in the same coordinate system.
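Assembling the 6×16 input from 16 sampled touch points and the device information can be sketched as below. This is a hedged illustration: the dimension order follows the list in the text, but taking the first time interval as 0, repeating density/width/height across all 16 columns, and omitting the normalization step are assumptions.

```python
# Hedged sketch: build the 6-dimension, 16-point model input described
# above from 16 sampled (x, y, t) touch screen points and device info.
def build_input(points, screen_width, screen_height, screen_density):
    """points: exactly 16 (x, y, t) tuples -> 6 rows of 16 values."""
    assert len(points) == 16
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    ts = [p[2] for p in points]
    intervals = [0] + [b - a for a, b in zip(ts, ts[1:])]  # first = 0 (assumption)
    return [
        intervals,               # time interval between two samples
        [screen_density] * 16,   # screen density
        [screen_width] * 16,     # screen width
        [screen_height] * 16,    # screen height
        ys,                      # y coordinate (origin at lower-left corner)
        xs,                      # x coordinate
    ]

pts = [(float(i), float(2 * i), 1000 + 5 * i) for i in range(16)]
inp = build_input(pts, 1080, 2206, 2.75)
```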
For each of the 6 dimensions, 16 sampling points are required per input. When the number of touch screen points is greater than 16, 16 sampling points may be acquired, for example by equidistant sampling; when the number of touch screen points included in the sliding event is less than 16, the sliding event is discarded.
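The equidistant sampling step can be sketched as follows. Keeping the first and last touch screen points and returning `None` for a discarded event are assumptions for illustration.

```python
# Sketch of equidistant sampling: pick 16 touch screen points from a
# longer sequence; events with fewer than 16 points are discarded.
def sample_16(points, n_samples=16):
    n = len(points)
    if n < n_samples:
        return None  # sliding event is discarded
    if n == n_samples:
        return list(points)
    # Equidistant indices over [0, n-1], rounded to the nearest point,
    # so the first and last touch screen points are always kept.
    idx = [round(i * (n - 1) / (n_samples - 1)) for i in range(n_samples)]
    return [points[j] for j in idx]

sampled = sample_16(list(range(40)))
```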
The 6×16 input is then fed into the neural network model for training. The first layer passes through a 3×3×1×6 convolution kernel and a ReLU activation function; the second layer is a 3×3×6×12 convolution kernel, followed by a ReLU activation function and MaxPooling downsampling; the third layer is a 3×3×12×24 convolution, also followed by a ReLU activation function, after which a Flatten operation expands the result into a one-dimensional vector. The fourth, fifth, and sixth layers are fully connected layers with different numbers of neurons and corresponding activation functions, the activation function of the last layer being Sigmoid.
The neural network model outputs the probability that the training sample is a left-hand operation; this probability is compared with the corresponding information characterizing the left hand or the right hand to adjust the parameters of the neural network model, until the accuracy of recognizing left-hand or right-hand operation meets the training end condition, at which point the attribute determination model is obtained.
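Turning the final Sigmoid output into a hand recognition result can be sketched as below. The 0.5 decision threshold is an assumption; the text only states that the model outputs the probability of a left-hand operation.

```python
# Sketch: map the last layer's pre-activation through Sigmoid to a
# left/right-hand decision. Threshold 0.5 is an assumption.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def hand_from_logit(z, threshold=0.5):
    """Return 'left' when the left-hand probability exceeds the threshold."""
    p_left = sigmoid(z)
    return "left" if p_left > threshold else "right"

result = hand_from_logit(2.0)  # sigmoid(2.0) ~ 0.88, above the threshold
```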
Then, inference may be run on the client: after the attribute determination model is obtained, it is delivered to the client over the network; when the client intercepts a sliding event, the AI engine on the client can run inference. Fig. 8 is a schematic flow chart of inference provided in the second embodiment of the present disclosure. As shown in Fig. 8, a sliding event is first intercepted at the client, and it is determined whether the AI engine has been initialized; after the AI engine is confirmed to be initialized, the inference stage is entered to obtain left hand or right hand (i.e., the hand recognition result corresponding to the sliding event). Finally, left-hand logic or right-hand logic (i.e., the layout information of the multimedia data display interface) may be determined based on that result.
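The client-side flow of Fig. 8 can be sketched as follows. The engine interface, the stub model, and the layout strings are all hypothetical; a real AI engine would load and run the attribute determination model.

```python
# Hedged sketch of the client-side flow: lazily initialize the AI
# engine, run inference on the intercepted sliding event, then select
# left-hand or right-hand layout logic.
class FakeEngine:
    """Stand-in for the on-device AI engine (hypothetical interface)."""
    def __init__(self):
        self.initialized = False
    def init(self):
        self.initialized = True
    def infer(self, sliding_info, device_info):
        # A real engine would run the attribute determination model;
        # this stub always answers 'right' for demonstration.
        return "right"

def handle_slide_event(engine, sliding_info, device_info):
    if not engine.initialized:   # "has the AI engine been initialized?"
        engine.init()
    hand = engine.infer(sliding_info, device_info)
    # The concrete layouts are placeholders for the left/right-hand logic.
    return "right-hand layout" if hand == "right" else "left-hand layout"

layout = handle_slide_event(FakeEngine(), sliding_info=[], device_info={})
```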
Finally, the UI may be optimized according to the inference result (i.e., the layout information of the multimedia data display interface): the multimedia data display interface is displayed on the electronic device based on the layout information. Fig. 9 is a schematic diagram of an optimized multimedia data display interface provided in the second embodiment of the present disclosure, for the case where the hand recognition result indicates that the hand triggering the sliding event is the right hand. As shown in Fig. 9, the left image is the multimedia data display interface before optimization, in which the dynamic display control 4 is located in the middle area below the interface; the right image is the optimized interface, in which the dynamic display control 4 is located in the right area below the interface, which facilitates the user's interactive experience.
As can be seen from the above description, the embodiments of the present disclosure use machine learning and consider not only the slope of the sliding track but also the coordinates of the touch screen points and the screen height, screen width, and screen density of the electronic device, so that the accuracy of the hand recognition result obtained through the attribute determination model can reach more than 95%. In terms of application scenarios, the embodiments of the present disclosure can adjust control positions in the multimedia data display interface so that controls are closer to the user's operating hand, making operation more convenient, and can provide multimedia data display interfaces with different styles for the left and right hands. For example, when sliding to the next video, if the next video uses a custom layout such as an advertisement, two different sets of multimedia data display interfaces can be provided so that the user's finger is closer to the detail button of the advertisement. The reverse application is also possible: the close button of the advertisement can be placed farther from the user's finger, thereby achieving a longer advertisement dwell time.
Example III
Fig. 10 is a schematic structural diagram of an information processing apparatus according to a third embodiment of the present disclosure. The apparatus is applicable to recognizing which hand operates an electronic device such as a mobile phone, may be implemented by software and/or hardware, and is generally integrated on an electronic device.
As shown in fig. 10, the apparatus includes:
the acquiring module 310 is configured to acquire, after a sliding event is intercepted, sliding information corresponding to the sliding event and device information of an electronic device, where the sliding event is triggered by a sliding operation on a display screen of the electronic device;
the input module 320 is configured to input the sliding information and the device information into an attribute determination model, so as to obtain a hand recognition result corresponding to the sliding event, where the hand recognition result indicates whether the hand triggering the sliding event is a left hand or a right hand;
a determining module 330, configured to determine layout information of a multimedia data display interface based on the hand recognition result;
and a display module 340, configured to display the multimedia data presentation interface on the electronic device based on the layout information.
In this embodiment, after intercepting a sliding event, the device acquires sliding information corresponding to the sliding event and device information of an electronic device through an acquiring module 310, where the sliding event is triggered by a sliding operation on a display screen of the electronic device; inputting the sliding information and the equipment information into an attribute determination model through an input module 320 to obtain a hand recognition result corresponding to the sliding event, wherein the hand recognition result indicates whether the hand triggering the sliding event is left hand or right hand; determining layout information of a multimedia data display interface based on the hand recognition result through a determining module 330; the multimedia data presentation interface is displayed on the electronic device based on the layout information by a display module 340. According to the device, the acquired sliding information and equipment information are input into the attribute determination model to obtain the hand recognition result corresponding to the sliding event, so that the accuracy of the hand recognition result is improved, meanwhile, the layout information of the multimedia data display interface can be adjusted based on the difference of the hand recognition results, and the user experience is enhanced.
Further, the sliding information includes:
position information of at least two touch screen points and a timestamp corresponding to the position information of each touch screen point.
Further, the device information includes:
the screen width of the electronic device, the screen height of the electronic device, and the screen density of the electronic device.
Further, the sliding information is stored in a file; the device information is stored in a file name of the file.
Further, the attribute determination model is obtained by training a neural network model based on a training set, and the training set comprises: a first number of first sample pairs, a second number of second sample pairs, and device information;
the first pair of samples is formed from left-hand slide information and information characterizing a left hand, the second pair of samples is formed from right-hand slide information and information characterizing a right hand,
the left-hand sliding information is collected after a left-hand sliding control is triggered in the event monitoring interface, and the right-hand sliding information is collected after a right-hand sliding control is triggered in the event monitoring interface.
Further, the determining module 330 is specifically configured to:
and determining the display position of the dynamic display control when the multimedia data display interface displays the set multimedia data based on the hand recognition result, wherein the layout information comprises the display position.
Further, the display position is located in a hot spot area, and the hot spot area is formed by an area with the operating frequency larger than a set threshold value in the display area of the electronic equipment.
The information processing device can execute the information processing method provided by any embodiment of the disclosure, and has the corresponding functional modules and beneficial effects of the execution method.
Example IV
Fig. 11 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present disclosure. Fig. 11 shows a schematic structural diagram of an electronic device 400 suitable for use in implementing embodiments of the present disclosure. The electronic device 400 in the embodiments of the present disclosure may include, but is not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, personal digital assistants (Personal Digital Assistant, PDA), tablet computers (Portable Android Device, PAD), portable multimedia players (Portable Media Player, PMP), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, as well as stationary terminals such as digital TVs, desktop computers, and the like. The electronic device 400 shown in fig. 11 is merely an example and should not be construed to limit the functionality and scope of use of embodiments of the present disclosure in any way.
As shown in fig. 11, the electronic device 400 may include one or more processing means (e.g., a central processing unit, a graphics processor, etc.) 401, which may perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 402 or a program loaded from a storage 408 into a random access Memory (Random Access Memory, RAM) 403. The one or more processing devices 401 implement the information processing method as provided in the present disclosure. In the RAM403, various programs and data necessary for the operation of the electronic device 400 are also stored. The processing device 401, the ROM 402, and the RAM403 are connected to each other by a bus 404. An Input/Output (I/O) interface 405 is also connected to bus 404.
In general, the following devices may be connected to the I/O interface 405: input devices 406 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 407 including, for example, a liquid crystal display (Liquid Crystal Display, LCD), a speaker, a vibrator, and the like; storage 408 including, for example, magnetic tape, hard disk, etc., storage 408 being for storing one or more programs; and a communication device 409. The communication means 409 may allow the electronic device 400 to communicate with other devices wirelessly or by wire to exchange data. While fig. 11 shows an electronic device 400 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via communications device 409, or from storage 408, or from ROM 402. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 401.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-Only Memory (ROM), an erasable programmable read-Only Memory (EPROM or flash Memory), an optical fiber, a portable compact disc read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients and servers may communicate using any currently known or future developed network protocol, such as the hypertext transfer protocol (Hyper Text Transfer Protocol, HTTP), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (LAN), a wide area network (WAN), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be contained in the electronic device 400; or may exist alone without being assembled into the electronic device 400.
The computer readable medium carries one or more programs which, when executed by the electronic device 400, cause the electronic device 400 to: after a sliding event is intercepted, acquire sliding information corresponding to the sliding event and device information of the electronic device; input the sliding information and the device information into an attribute determination model to obtain a hand recognition result corresponding to the sliding event; determine layout information of a multimedia data display interface based on the hand recognition result; and display the multimedia data display interface on the electronic device based on the layout information.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. Each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present disclosure may be implemented in software or hardware. The name of a module does not, in some cases, constitute a limitation of the module itself.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a field programmable gate array (Field Programmable Gate Array, FPGA), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), an application specific standard product (Application Specific Standard Product, ASSP), a system on chip (System On Chip, SOC), a complex programmable logic device (Complex Programmable Logic Device, CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, example 1 provides an information processing method, including:
after a sliding event is intercepted, acquiring sliding information corresponding to the sliding event and equipment information of electronic equipment, wherein the sliding event is triggered by sliding operation on a display screen of the electronic equipment;
inputting the sliding information and the equipment information into an attribute determination model to obtain a hand recognition result corresponding to the sliding event, wherein the hand recognition result indicates whether the hand triggering the sliding event is the left hand or the right hand;
determining layout information of a multimedia data display interface based on the hand recognition result;
and displaying the multimedia data display interface on the electronic equipment based on the layout information.
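The four steps of example 1 can be sketched end to end as follows. This is an illustrative sketch only: all function names are hypothetical, and the simple screen-edge heuristic stands in for the trained attribute determination model described in example 5.

```python
# Illustrative sketch of the method of example 1. All names here are
# hypothetical; classify_hand is a crude heuristic stand-in for the
# trained attribute determination model of the disclosure.

def classify_hand(slide_info, device_info):
    """Stand-in for the attribute determination model: 'left' or 'right'.

    Heuristic assumption: one-handed swipes tend to start on the same
    side of the screen as the holding hand's thumb.
    """
    first_x = slide_info["points"][0][0]
    return "right" if first_x > device_info["width"] / 2 else "left"

def choose_layout(hand):
    """Map the hand recognition result to layout information: place the
    dynamic display control on the side the thumb can reach (example 6)."""
    return {"control_side": hand}

def on_swipe_intercepted(slide_info, device_info):
    hand = classify_hand(slide_info, device_info)  # example 1, step 2
    return choose_layout(hand)                     # example 1, steps 3 and 4

# A swipe starting near the right edge of a 1080-px-wide screen.
layout = on_swipe_intercepted(
    {"points": [(980, 1500), (900, 1200)], "timestamps": [0, 16]},
    {"width": 1080, "height": 2340, "density": 440},
)
```

The swipe above starts at x = 980 on a 1080-px-wide screen, so the heuristic stand-in labels it right-handed and the layout places the control on the right.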
In accordance with one or more embodiments of the present disclosure, example 2 is in accordance with the method of example 1,
the sliding information includes:
the method comprises the steps of enabling position information of at least two touch screen points and time stamps corresponding to the position information of each touch screen point.
In accordance with one or more embodiments of the present disclosure, example 3 is in accordance with the method of example 1,
the device information includes:
the screen width of the electronic device, the screen height of the electronic device, and the screen density of the electronic device.
In accordance with one or more embodiments of the present disclosure, example 4 is in accordance with the method of example 1,
the sliding information is stored in a file; the device information is stored in a file name of the file.
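The storage scheme of example 4, with the sliding information in the file body and the equipment information encoded in the file name, might look like the following sketch. The name format `w{width}_h{height}_d{density}.json` is an assumption; the disclosure does not specify one.

```python
import json
import os
import tempfile

def save_slide_sample(slide_info, device_info, directory):
    """Write the sliding information to a file whose name carries the
    equipment information, as in example 4. Name format is hypothetical."""
    name = "w{width}_h{height}_d{density}.json".format(**device_info)
    path = os.path.join(directory, name)
    with open(path, "w") as f:
        json.dump(slide_info, f)
    return path

def load_slide_sample(path):
    """Recover both pieces: equipment info from the name, slide info from the body."""
    stem = os.path.splitext(os.path.basename(path))[0]
    w, h, d = (int(part[1:]) for part in stem.split("_"))
    device_info = {"width": w, "height": h, "density": d}
    with open(path) as f:
        return json.load(f), device_info

with tempfile.TemporaryDirectory() as tmp:
    p = save_slide_sample(
        {"points": [[10, 20], [15, 40]], "timestamps": [0, 16]},
        {"width": 1080, "height": 2340, "density": 440},
        tmp,
    )
    slide, device = load_slide_sample(p)
```

A design note on the scheme itself: packing the equipment information into the file name keeps each sample file self-describing, so samples collected from many devices can be pooled into one training directory without a separate index.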
In accordance with one or more embodiments of the present disclosure, example 5 is in accordance with the method of example 1,
the attribute determination model is obtained by training a neural network model based on a training set, and the training set comprises: a first number of first sample pairs, a second number of second sample pairs, and device information;
each first sample pair is formed from left-hand sliding information and information characterizing a left hand, and each second sample pair is formed from right-hand sliding information and information characterizing a right hand;
the left-hand sliding information is collected after a left-hand sliding control is triggered in the event monitoring interface, and the right-hand sliding information is collected after a right-hand sliding control is triggered in the event monitoring interface.
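The training setup of example 5, labeled left-hand and right-hand swipe samples fed to a model, can be illustrated with a tiny logistic classifier trained by gradient descent on synthetic features. The disclosure trains a neural network model on swipes collected through the event monitoring interface; the synthetic data, the two-feature encoding, and the classifier below are stand-ins for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the two sample pair types of example 5.
# Feature vector: (start_x / screen_width, end_x / screen_width); the
# normalization by screen width is how the equipment information enters
# the features in this sketch. Labels: 0 = left hand, 1 = right hand.
left = rng.normal(loc=[0.3, 0.4], scale=0.05, size=(100, 2))
right = rng.normal(loc=[0.8, 0.6], scale=0.05, size=(100, 2))
X = np.vstack([left, right])
y = np.array([0] * 100 + [1] * 100)

# Minimal logistic-regression "attribute determination model"; the real
# disclosure uses a neural network whose architecture is unspecified here.
w = np.zeros(2)
b = 0.0
for _ in range(500):                        # plain batch gradient descent
    p = 1 / (1 + np.exp(-(X @ w + b)))      # sigmoid probability of "right"
    w -= 1.0 * (X.T @ (p - y)) / len(y)
    b -= 1.0 * np.mean(p - y)

def predict_hand(features):
    return "right" if 1 / (1 + np.exp(-(features @ w + b))) > 0.5 else "left"
```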
In accordance with one or more embodiments of the present disclosure, example 6 is in accordance with the method of example 1,
the step of determining layout information of the multimedia data display interface based on the hand recognition result comprises the following steps:
determining, based on the hand recognition result, the display position of a dynamic display control when the multimedia data display interface displays the set multimedia data, wherein the layout information comprises the display position.
In accordance with one or more embodiments of the present disclosure, example 7 is in accordance with the method of example 6,
the display position is located in a hot spot area, the hot spot area being an area of the display area of the electronic equipment in which the operation frequency is greater than a set threshold.
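The hot spot area of example 7, regions of the display whose operation frequency exceeds a set threshold, could be computed from a grid of per-cell touch counts as in the following sketch; the grid resolution and threshold are illustrative assumptions:

```python
def hot_spot_cells(touch_counts, threshold):
    """Return the grid cells forming the hot spot area of example 7:
    cells whose operation frequency exceeds the set threshold.

    touch_counts: 2-D list where touch_counts[row][col] is the number
    of touch operations observed in that cell of the display area.
    """
    return [
        (r, c)
        for r, row in enumerate(touch_counts)
        for c, count in enumerate(row)
        if count > threshold
    ]

# 3x3 grid over the display; the lower-right corner is used most, as
# expected when the equipment is held and operated with the right hand.
counts = [
    [2, 1, 3],
    [0, 4, 9],
    [1, 8, 25],
]
hot = hot_spot_cells(counts, threshold=5)
```

With this example grid the hot spot area is the lower-right cluster of cells, so the dynamic display control would be placed within thumb reach of the right hand.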
According to one or more embodiments of the present disclosure, example 8 provides an information processing apparatus, comprising:
an acquisition module, used for acquiring, after a sliding event is intercepted, sliding information corresponding to the sliding event and equipment information of electronic equipment, wherein the sliding event is triggered by a sliding operation on a display screen of the electronic equipment;
an input module, used for inputting the sliding information and the equipment information into an attribute determination model to obtain a hand recognition result corresponding to the sliding event, wherein the hand recognition result indicates whether the hand triggering the sliding event is the left hand or the right hand;
the determining module is used for determining layout information of the multimedia data display interface based on the hand recognition result;
and the display module is used for displaying the multimedia data display interface on the electronic equipment based on the layout information.
Example 9 provides an electronic device according to one or more embodiments of the present disclosure, comprising:
One or more processing devices;
a storage means for storing one or more programs;
the one or more programs, when executed by the one or more processing devices, cause the one or more processing devices to implement the methods of any of examples 1-7.
According to one or more embodiments of the present disclosure, example 10 provides a computer-readable medium having stored thereon a computer program which, when executed by a processing device, implements a method as described in any of examples 1-7.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to the specific combinations of features described above, but also covers other embodiments formed by any combination of those features or their equivalents without departing from the spirit of the disclosure, for example, embodiments formed by interchanging the features described above with (but not limited to) technical features having similar functions disclosed herein.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (10)

1. An information processing method, characterized in that the method comprises:
after a sliding event is intercepted, acquiring sliding information corresponding to the sliding event and equipment information of electronic equipment, wherein the sliding event is triggered by sliding operation on a display screen of the electronic equipment;
inputting the sliding information and the equipment information into an attribute determination model to obtain a hand recognition result corresponding to the sliding event, wherein the hand recognition result indicates whether the hand triggering the sliding event is the left hand or the right hand;
determining layout information of a multimedia data display interface based on the hand recognition result;
and displaying the multimedia data display interface on the electronic equipment based on the layout information.
2. The method of claim 1, wherein the sliding information comprises:
position information of at least two touch screen points, and a timestamp corresponding to the position information of each touch screen point.
3. The method of claim 1, wherein the device information comprises:
the screen width of the electronic device, the screen height of the electronic device, and the screen density of the electronic device.
4. The method of claim 1, wherein,
the sliding information is stored in a file; the device information is stored in a file name of the file.
5. The method of claim 1, wherein,
the attribute determination model is obtained by training a neural network model based on a training set, and the training set comprises: a first number of first sample pairs, a second number of second sample pairs, and device information;
each first sample pair is formed from left-hand sliding information and information characterizing a left hand, and each second sample pair is formed from right-hand sliding information and information characterizing a right hand;
the left-hand sliding information is collected after a left-hand sliding control is triggered in the event monitoring interface, and the right-hand sliding information is collected after a right-hand sliding control is triggered in the event monitoring interface.
6. The method of claim 1, wherein the determining layout information of a multimedia data presentation interface based on the hand recognition result comprises:
determining, based on the hand recognition result, the display position of a dynamic display control when the multimedia data display interface displays the set multimedia data, wherein the layout information comprises the display position.
7. The method of claim 6, wherein,
the display position is located in a hot spot area, the hot spot area being an area of the display area of the electronic equipment in which the operation frequency is greater than a set threshold.
8. An information processing apparatus, characterized in that the apparatus comprises:
an acquisition module, used for acquiring, after a sliding event is intercepted, sliding information corresponding to the sliding event and equipment information of electronic equipment, wherein the sliding event is triggered by a sliding operation on a display screen of the electronic equipment;
the input module is used for inputting the sliding information and the equipment information into an attribute determination model to obtain a hand recognition result corresponding to the sliding event, wherein the hand recognition result indicates whether the hand triggering the sliding event is left hand or right hand;
The determining module is used for determining layout information of the multimedia data display interface based on the hand recognition result;
and the display module is used for displaying the multimedia data display interface on the electronic equipment based on the layout information.
9. An electronic device, comprising:
one or more processing devices;
a storage means for storing one or more programs;
the one or more programs, when executed by the one or more processing devices, cause the one or more processing devices to implement the method of any of claims 1-7.
10. A computer readable medium, on which a computer program is stored, characterized in that the program, when being executed by a processing device, implements the method according to any of claims 1-7.
CN202210113109.0A 2022-01-29 2022-01-29 Information processing method, device, electronic equipment and medium Pending CN116560552A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210113109.0A CN116560552A (en) 2022-01-29 2022-01-29 Information processing method, device, electronic equipment and medium

Publications (1)

Publication Number Publication Date
CN116560552A true CN116560552A (en) 2023-08-08

Family

ID=87492049


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117170982A (en) * 2023-11-02 2023-12-05 建信金融科技有限责任公司 Man-machine detection method, device, electronic equipment and computer readable medium
CN117170982B (en) * 2023-11-02 2024-02-13 建信金融科技有限责任公司 Man-machine detection method, device, electronic equipment and computer readable medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination