CN110858230B - Data processing method, apparatus and machine readable medium

Data processing method, apparatus and machine readable medium

Info

Publication number
CN110858230B
CN110858230B (application CN201810892250.9A)
Authority
CN
China
Prior art keywords
interaction
distance
content
information
target
Prior art date
Legal status
Active
Application number
CN201810892250.9A
Other languages
Chinese (zh)
Other versions
CN110858230A (en)
Inventor
李佳文
崔威
Current Assignee
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201810892250.9A
Publication of CN110858230A
Application granted
Publication of CN110858230B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0251Targeted advertisements
    • G06Q30/0269Targeted advertisements based on user profile or attribute
    • G06Q30/0271Personalized advertisement

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Strategic Management (AREA)
  • Finance (AREA)
  • Game Theory and Decision Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application disclose a data processing method, a data processing apparatus, and a device. The method includes: providing a plurality of kinds of interactive content, each kind corresponding to respective interaction distance condition information; determining first distance information between a user and a display device; determining target interactive content according to the first distance information and the interaction distance condition information; and controlling the display device to display the target interactive content. The embodiments of the present application can improve interactivity during content pushing and improve user experience.

Description

Data processing method, apparatus and machine readable medium
Technical Field
The present application relates to the field of information delivery technologies, and in particular, to a data processing method, apparatus, and machine readable medium.
Background
With the increasing popularity of large-screen terminals, a wide variety of outdoor media has grown rapidly. Pushing content such as advertisements to users through large-screen terminals has become a very common delivery method.
Current pushing methods broadcast content such as text, pictures, audio, and video on a fixed large-screen terminal so that users can watch the played content.
However, with current pushing methods the user can generally only passively watch the content played by the large-screen terminal, which makes the pushing effect poor. In particular, when the user is not interested in the played content, a good pushing effect cannot be obtained.
Disclosure of Invention
In view of the foregoing, an embodiment of the present application provides a data processing method, a data processing apparatus, and a device, so as to solve the problems of the related art.
In order to solve the above-mentioned problems, an embodiment of the present application discloses a data processing method, including: providing a plurality of kinds of interactive content, where each kind of interactive content corresponds to respective interaction distance condition information; determining first distance information between a user and a display device; determining target interactive content according to the first distance information and the interaction distance condition information; and controlling the display device to display the target interactive content.
To solve the above problems, an embodiment of the present application further discloses a data processing system, including: a display device, a distance information detection device and a processing device;
wherein the processing device is coupled with the display device, and the processing device is coupled with the distance information detection device;
the distance information detection device detects first distance information between a user and the display device;
the processing device provides a plurality of interactive contents, the plurality of interactive contents respectively correspond to the interactive distance condition information, the target interactive contents are determined according to the first distance information and the interactive distance condition information, and the target interactive contents are sent to the display device so that the display device displays the target interactive contents.
In order to solve the above problem, an embodiment of the present application further discloses a data processing apparatus, including:
the interactive content providing module is used for providing various interactive contents, and the various interactive contents respectively correspond to the interactive distance condition information;
a distance information determining module for determining first distance information between the user and the display device;
the target interaction content determining module is used for determining target interaction content according to the first distance information and the interaction distance condition information; and
the display control module is used for controlling the display device to display the target interactive content.
To solve the above problem, an embodiment of the present application further discloses an apparatus, including:
One or more processors; and
one or more machine-readable media having instructions stored thereon, which when executed by the one or more processors, cause the apparatus to perform one or more of the methods described previously.
To address the above issues, an embodiment of the present application also discloses one or more machine readable media having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform one or more of the methods described previously.
As can be seen from the foregoing, the data processing method, apparatus and machine readable medium according to the embodiments of the present application have at least the following advantages:
According to the embodiments of the present application, target interactive content adapted to the first distance information between the user and the display device is pushed to the user. Because the target interactive content can be used for human-computer interaction, its interaction distance condition information can characterize the distance suitable for the corresponding human-computer interaction. Therefore, the method and the device display the target interactive content when the first distance information between the user and the display device is suitable for that human-computer interaction, which can improve interactivity during pushing and improve user experience.
In addition, the embodiment of the application can enable the user to select and view the interested content through the target interactive content, so that the conversion effect corresponding to the interested content can be improved, and the pushing effect can be further improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a data processing system in accordance with an embodiment of the present application;
FIG. 2 is a flowchart illustrating steps of a first embodiment of a data processing method according to the present application;
FIG. 3 is a flowchart illustrating steps of a second embodiment of a data processing method according to the present application;
FIG. 4 is a flowchart illustrating the steps of a third embodiment of a data processing method according to the present application;
FIG. 5 is a flowchart illustrating the steps of a fourth embodiment of a data processing method according to the present application;
FIG. 6 is an illustration of displaying content based on first distance information between a user and a display device according to an embodiment of the present application;
FIG. 7 is a block diagram of an embodiment of a data processing apparatus of the present application; and
FIG. 8 is a schematic structural diagram of an apparatus according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which are derived by a person skilled in the art based on the embodiments of the application, fall within the scope of protection of the application.
The inventive concept is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that the description herein of specific embodiments is not intended to limit the inventive concepts to the particular forms disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the scope of the application.
Reference in the specification to "one embodiment," "an embodiment," "one particular embodiment," etc., means that a particular feature, structure, or characteristic may be included in the described embodiments, but every embodiment may or may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, where a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the purview of one skilled in the art to effect such feature, structure, or characteristic in connection with other ones of the embodiments whether or not explicitly described. In addition, it should be understood that the items in the list included in this form of "at least one of A, B and C" may include the following possible items: (A); (B); (C); (A and B); (A and C); (B and C); or (A, B and C). Likewise, an item listed in this form of "at least one of A, B or C" may mean (A); (B); (C); (A and B); (A and C); (B and C); or (A, B and C).
In some cases, the disclosed embodiments may be implemented as hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried on or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be executed by one or more processors. A machine-readable storage medium may be implemented as a storage device, mechanism, or other physical structure (e.g., volatile or non-volatile memory, a media disc, or another physical storage device) for storing or transmitting information in a form readable by a machine.
In the drawings, some structural or methodological features may be shown in a particular arrangement and/or ordering. However, such specific arrangement and/or ordering may not be required. Rather, in some embodiments, such features may be arranged in a different manner and/or order than shown in the drawings. Furthermore, the inclusion of a structural or methodological feature in a particular figure is not meant to imply that such feature is required in all embodiments; in some embodiments, it may not be included or may be combined with other features.
The embodiment of the application provides a data processing method, which specifically comprises the following steps: providing a plurality of interactive contents, wherein the plurality of interactive contents can respectively correspond to the interactive distance condition information; determining first distance information between a user and a display device; determining target interaction content according to the first distance information and the interaction distance condition information; and controlling the display device to display the target interactive content.
In the embodiment of the present application, the display device is used for displaying content. The embodiment of the present application can provide a plurality of kinds of interactive content, where one kind of interactive content can be used for human-computer interaction. The plurality of kinds of interactive content can respectively correspond to interaction distance condition information. For one kind of interactive content, the corresponding interaction distance condition information may characterize a distance suitable for the interaction corresponding to that interactive content.
According to an embodiment, the interaction distance condition information may include first interaction distance condition information, and the interactive content corresponding to the first interaction distance condition information may include: somatosensory interaction content. The first interaction distance condition information may characterize a distance suitable for somatosensory interaction.
According to another embodiment, the interaction distance condition information may include second interaction distance condition information, and the interactive content corresponding to the second interaction distance condition information may include: touch interaction content. The second interaction distance condition information may characterize a distance suitable for touch interaction.
According to the embodiment of the application, the target interaction content matched with the first distance information is pushed to the user according to the first distance information between the user and the display device.
Because the target interactive content can be used for human-computer interaction, its interaction distance condition information can characterize the distance suitable for the corresponding interaction. Therefore, the method and the device display the target interactive content when the first distance information between the user and the display device is suitable for that human-computer interaction, which can improve interactivity during pushing and improve user experience. In addition, the embodiment of the present application enables the user to select and view content of interest through the target interactive content, so that the conversion effect corresponding to the content of interest can be improved, and the pushing effect can be further improved.
The embodiments of the present application can be applied to outdoor media scenarios. Outdoor media may refer to propagation media that exist in public spaces, in particular advertising content placed in outdoor venues such as building rooftops, storefronts, and roadsides; its main forms may include, but are not limited to: balloons, airships, vehicle bodies, large inflatable models, corridors of residential communities and university campuses, elevator cabs, and the like. Depending on the propagation medium, outdoor media can be further classified into flat outdoor copper lettering, iron lettering, acrylic (organic glass) lettering, solid wood plaques, neon lights, corporate image sculptures in public spaces, LED (Light Emitting Diode) electronic outdoor screens, outdoor LCD (Liquid Crystal Display) advertising machines, outdoor electronic newspaper reading columns, and the like.
In an outdoor media scenario, traditional technology generally broadcasts non-interactive content such as text, pictures (for example, static pictures and dynamic pictures), audio, and video on a display device, where the non-interactive content cannot support human-computer interaction; the user can only passively watch the non-interactive content played by the display device, which makes the pushing less effective.
The method and the device can display the target interaction content corresponding to the interaction distance condition information under the condition that the first distance information between the user and the display device is suitable for the man-machine interaction corresponding to the interaction distance condition information, so that the interactivity in the pushing process can be improved, and the user experience is improved.
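To make the above flow concrete, the sketch below (in Python, with hypothetical names; the 0-2 meter and 2-5 meter boundaries follow example distances given later in this description) shows how a processing device might match the first distance information against each kind of interactive content's interaction distance condition and hand the result to a display device. It is a minimal illustration under those assumptions, not the claimed implementation.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class InteractiveContent:
    name: str                                    # e.g. "somatosensory interaction content"
    distance_condition: Callable[[float], bool]  # interaction distance condition information

# Each kind of interactive content carries its own interaction distance condition.
contents = [
    InteractiveContent("touch interaction content",         lambda d: 0.0 <= d < 2.0),
    InteractiveContent("somatosensory interaction content",  lambda d: 2.0 <= d < 5.0),
]

def select_target_content(first_distance: float) -> Optional[InteractiveContent]:
    """Determine target interactive content from the first distance information."""
    for content in contents:
        if content.distance_condition(first_distance):
            return content
    return None  # no interaction condition matched; non-interactive content may be shown instead

def push_to_display(first_distance: float, display) -> None:
    """Send whatever matched (or fall back to non-interactive content) to the display device."""
    target = select_target_content(first_distance)
    display.show(target.name if target else "non-interactive content")
```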
Referring to FIG. 1, a schematic structural diagram of a data processing system according to an embodiment of the present application is shown. The system may specifically include: a display device 101, a distance information detection device 102, and a processing device 103;
wherein, the processing device 103 is coupled with the display device 101, and the processing device 103 is coupled with the distance information detection device 102;
the distance information detecting device 102 detects first distance information between the user and the display device 101, the processing device 103 provides a plurality of kinds of interaction content, the plurality of kinds of interaction content respectively correspond to the interaction distance condition information, determines target interaction content according to the first distance information and the interaction distance condition information, and sends the target interaction content to the display device 101 so that the display device 101 displays the target interaction content.
In the embodiment of the present application, the display device 101 is used for displaying content. Specifically, the display device 101 may be used to display the content transmitted by the processing device 103. The display device 101 may be any device having a display function, such as an LED or an LCD, and the embodiment of the present application is not limited to the specific display device 101.
Distance information detecting means 102 for detecting first distance information between the user and the display means 101. In practical applications, the front plane of the distance information detecting device 102 and the front plane of the display device 101 may be located on the same vertical plane, so that the first distance information between the user and the display device 101 may be identical to the first distance information between the user and the distance information detecting device 102. Alternatively, the distance information detecting means 102 may be located on the upper plane or the front plane of the display means 101.
Of course, the first distance information between the user and the display device 101 may be different from the first distance information between the user and the distance information detecting device 102. For example, the distance d between the user and the display device 101 may be determined according to the distance d1 between the user and the distance information detecting device 102 and the distance d2 between the distance information detecting device 102 and the display device 101. The embodiment of the present application is not limited to a specific azimuth relationship between the distance information detecting device 102 and the display device 101.
The embodiment of the present application does not limit the ranging mode corresponding to the distance information detecting device 102. The ranging modes corresponding to the distance information detecting device 102 may include, but are not limited to: ultrasound, radar, WIFI (Wireless Fidelity) probes, cameras, microphones, infrared sensors, and the like.
One skilled in the art can determine the number of distance information detecting devices 102 corresponding to one display device 101 according to actual application requirements. The basis for determining this number may include: the environment information corresponding to the display device 101, and/or the detection range of the distance information detecting device 102. The environment information corresponding to the display device 101 may include: parameters of the space corresponding to the display device 101, such as length, width, and height. The environment information corresponding to the display device 101 may further include: the orientation parameters of the display device 101 in the space. For example, if the display device 101 is located at an intermediate position in the length/width direction of the space, a plurality of distance information detecting devices 102 may be arranged symmetrically with respect to the display device 101. As another example, if the display device 101 is located at a non-intermediate position in the length/width direction of the space, a plurality of distance information detecting devices 102 may be arranged asymmetrically with respect to the display device 101; for example, if the display device 101 is located toward the left in the length/width direction of the space, the number of distance information detecting devices 102 arranged to the right of the display device may be greater than the number arranged to its left. It will be appreciated that the embodiment of the present application is not limited to the specific number of distance information detecting devices 102 corresponding to one display device 101.
In addition, it is understood that one skilled in the art may set one display device 101 or a plurality of display devices 101 in one space according to actual application requirements, and the embodiment of the present application does not limit the specific number of display devices 101 in one space.
The processing device 103 may receive the first distance information sent by the distance information detecting device 102, and determine, according to the first distance information, the content that needs to be displayed by the display device 101, so that the content displayed by the display device matches with the first distance information.
In the embodiment of the present application, the coupling between the devices may specifically include: contact connection or non-contact connection between devices. The contact connection may include: data line connections, etc. The non-contact connection may include: WIFI connection, bluetooth connection, infrared connection, etc. It will be appreciated that embodiments of the application are not limited to particular coupling arrangements between devices.
The distance information detecting means 102 detects first distance information between the user and the display means 101. The first distance information is used to characterize the distance between the user and the display device 101. The first distance information may directly or indirectly characterize the distance between the user and the display device 101.
According to one embodiment, the first distance information may be a distance value. According to another embodiment, the first distance information may be a distance parameter from which the processing device 103 can determine the distance value. Taking ultrasonic ranging as an example: an ultrasonic transmitter emits ultrasonic waves in a certain direction and a timer is started at the moment of emission; the ultrasonic waves propagate through the air and are reflected back as soon as they hit an obstacle (the user, in the embodiment of the present application); the ultrasonic receiver stops the timer as soon as it receives the reflected waves. The distance parameter detected by the ultrasonic device may include the time t recorded by the timer. Since the propagation speed of ultrasonic waves in air is 340 m/s, the processing device 103 can calculate the distance s between the emission point and the obstacle, namely: s = 340·t/2. It will be appreciated that embodiments of the present application are not limited to a particular process of determining a distance value based on a distance parameter.
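As a minimal numerical illustration of the ultrasonic relationship above (assuming the echo time t has already been read from the timer):

```python
SPEED_OF_SOUND_IN_AIR = 340.0  # m/s, as stated above

def ultrasonic_distance(echo_time_s: float) -> float:
    """Distance from emission point to obstacle: the wave travels out and back, so s = 340 * t / 2."""
    return SPEED_OF_SOUND_IN_AIR * echo_time_s / 2.0

# Example: an echo recorded after 0.02 s corresponds to 340 * 0.02 / 2 = 3.4 m.
print(ultrasonic_distance(0.02))  # 3.4
```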
According to a further embodiment, the first distance information may include: user image information. The user image information can be captured by a camera. Because the position of the camera is relatively fixed, the distance corresponding to the user image information can be determined according to whether user image information is captured, or according to whether the captured user image information meets preset conditions.
The processing device 103 may send the target interactive content to the display device 101, so that the display device 101 displays the target interactive content.
The processing device 103 may be a device having a processing function. The processing device 103 may include, but is not limited to: a CPU, an MPU (Micro Processor Unit), an MCU (Micro Control Unit), an SoC (System on Chip), and the like. Examples of the processing device 103 may include: computers, etc.
In practical applications, the display device 101 and the processing device 103 may be integrated in the same device, or the display device 101 and the processing device 103 may be separately disposed, and it is understood that the specific arrangement of the display device 101 and the processing device 103 is not limited in the embodiments of the present application.
In the embodiment of the present application, the interactive content can be used for human-computer interaction. Optionally, the interactive content may include: an interaction interface or interaction prompt information, which is used to enable the user to perform human-computer interaction through the interactive content.
In an alternative embodiment of the present application, the interactive content may be provided by an APP (application). The APP may provide an interface that includes interface elements; an interface element may refer to one of a series of elements included in a software or system interface that can meet the interaction requirements of a user. The interface elements may include: controls, etc. A control may refer to a component for providing or implementing a user interface function; a control is an encapsulation of data and methods and can have its own properties and methods. The presentation forms of a control may include: buttons, dialog boxes, windows, etc.
In an alternative embodiment of the present application, the interactive content may include: somatosensory interaction content. Somatosensory interaction can support the user in using natural body movements (limb movements and/or gestures), such as simple actions of waving or swinging a hand, turning around, moving forward, moving backward, raising a hand, jumping, squatting, and the like, to control the content of the display device from a distance without contact, thereby enabling a new human-computer interaction experience and a new way of inputting information. According to the embodiment of the present application, the somatosensory interaction content is displayed when the first distance information between the user and the display device is suitable for somatosensory interaction, so that interactivity during pushing can be improved and user experience can be improved.
In an optional embodiment of the present application, the somatosensory interaction content may include: information (such as name or category, introduction, etc.) of at least one content, and somatosensory action information corresponding to at least one content. The somatosensory interaction content can respond to somatosensory actions, and the information of the content and the somatosensory action information can play roles of prompting and guiding so that a user can trigger the required content through the somatosensory action information. The somatosensory motion information may include: information of limb movements and/or gestures. Further alternatively, if the somatosensory action of the user is received, the target content corresponding to the somatosensory action can be determined and displayed.
In an application example of the present application, the somatosensory interaction content may include: the name or category of the content, so that the user selects the content of interest according to the name or category; further, in response to the somatosensory action of the user, a target content corresponding to the somatosensory action may be determined and displayed, and the target content may be a content corresponding to a certain name or a content under a certain category.
The embodiment of the present application can recognize the somatosensory actions of the user through a somatosensory recognition device such as a somatosensory camera. In one embodiment, the principle by which the somatosensory recognition device recognizes the user's somatosensory actions may be: emitting infrared rays to perform three-dimensional positioning of the space; a motion sensor tracks the movement of the user, and a digital skeleton of the user is established from this data, so that M (M is a natural number, for example, M is 20) parts of the human body can be tracked in real time. Of course, the embodiment of the present application does not limit the specific process by which the somatosensory recognition device recognizes the user's somatosensory actions.
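A rough sketch of how somatosensory interaction content might dispatch on recognized actions follows; the gesture names and content entries are invented for illustration and are not taken from the embodiments.

```python
# Hypothetical somatosensory interaction content: names/categories of content
# plus the somatosensory action assumed to trigger each of them.
somatosensory_content = {
    "raise_hand": "sports category",
    "wave_left":  "previous item",
    "wave_right": "next item",
}

def on_somatosensory_action(action: str, display) -> None:
    """If a somatosensory action is received, determine and display the corresponding target content."""
    target = somatosensory_content.get(action)
    if target is not None:
        display.show(target)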
In an alternative embodiment of the present application, the interactive content may include: touch interaction content. Touch interaction is a contact-based interaction mode. The user can touch the content on the display device with a finger or a stylus to perform touch operations, which eliminates keyboard and mouse operations and makes human-computer interaction more direct. Display devices supporting touch interaction may include, but are not limited to: capacitive touch screens, resistive touch screens, surface acoustic wave touch screens, and the like. According to the embodiment of the present application, the touch interaction content is displayed when the first distance information between the user and the display device is suitable for touch interaction, so that interactivity during pushing can be improved and user experience can be improved.
The touch interaction content may include: information (e.g., name or category, introduction, etc.) of at least one content. The touch interaction content can respond to touch events, so that the display can jump to the interface corresponding to the touch event; that is, the target content corresponding to the touch event can be displayed. The touch events described above may include, but are not limited to: click events, slide events, etc.
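The following sketch shows one possible way touch interaction content could respond to the click and slide events mentioned above; the event names and display helper methods are assumptions made for illustration.

```python
def on_touch_event(event_type: str, touched_item: str, display) -> None:
    """Jump to the interface (target content) corresponding to a touch event."""
    if event_type == "click":
        display.navigate_to(touched_item)   # e.g. open the content behind a name or category
    elif event_type == "slide":
        display.scroll_content_list()       # e.g. browse the list of displayed contents
```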
In the embodiment of the present application, optionally, the interaction distance condition information may specifically include:
presetting a distance value range; and/or
Monitoring to obtain user image information; and/or
The monitored user image information accords with preset conditions.
In this embodiment of the present application, optionally, the processing device 103 determines the shortest distance among the first distance information of multiple users as the target distance information, and determines the target interactive content according to the target distance information and the interaction distance condition information. This can satisfy the interaction requirement of the user closest to the display device.
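A short sketch of this nearest-user rule, assuming the detection devices report one distance value per detected user:

```python
from typing import Optional

def target_distance(user_distances: list[float]) -> Optional[float]:
    """Use the shortest first distance among multiple users as the target distance information."""
    return min(user_distances) if user_distances else None

# Usage: the result is then matched against the interaction distance conditions,
# e.g. target_distance([3.8, 1.2, 6.0]) -> 1.2, the distance of the closest user.
```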
In this embodiment of the present application, optionally, the processing device 103 may respond to a triggering operation of the user on the target interactive content, and send target content corresponding to the triggering operation to the display device 101, so that the display device 101 displays the target content; the target content may specifically include: next level interactive content, or non-interactive content.
In the embodiment of the present application, the i-th level interactive content (i is a natural number) may correspond to the i-th level interface entered after human-computer interaction. The primary interactive content may correspond to a primary interface after human-computer interaction, the secondary interactive content may correspond to a secondary interface after human-computer interaction, and so on. The secondary interactive content may be linked to the primary interactive content. For example, in the case of touch interaction, the display may jump from the primary interactive content to the secondary interactive content in response to a click operation by the user, or return from the secondary interactive content to the primary interactive content in response to a return operation by the user.
The user can jump to non-interactive content by triggering the i-th level of interactive content. For example, the user may jump to the non-interactive content by triggering the name of the non-interactive content corresponding to the i-th level of interactive content.
In this embodiment of the present application, optionally, in a preset period of time after receiving a last trigger operation of the first user on the target interactive content, the display device 101 continuously displays the target content corresponding to the last trigger operation.
In practical applications, each triggering operation of the first user may have a corresponding receiving time, and the last triggering operation may refer to the triggering operation with the latest receiving time. It should be noted that the last trigger operation may change as the first user performs new trigger operations; when the last trigger operation changes, the preset time period changes accordingly.
Within the preset period of time after receiving the last trigger operation of the first user on the target interactive content, the processing device 103 may, after transmitting the target content to the display device 101, refrain from transmitting content other than the target content to the display device 101; that is, the content displayed by the display device 101 does not change within the preset period of time. The embodiment of the present application can thus reduce the influence of other users on the content displayed by the display device while the first user is interacting, and improve the interaction experience of the first user.
If, within the preset time period after the last trigger operation of the first user on the target interactive content is received, second distance information between a second user and the display device corresponds to second interactive content, the processing device 103 may refrain from sending the second interactive content to the display device 101, so as to reduce the influence of the second user on the target content of the first user. The first user and the second user may be any two different users.
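The lock-in behaviour described in the preceding paragraphs can be sketched as follows; the class name, method names, and the 30-second window are illustrative assumptions (the description later mentions 30 seconds, 1 minute, or 2 minutes as example durations).

```python
import time

class PushController:
    def __init__(self, lock_window_s: float = 30.0):
        self.lock_window_s = lock_window_s
        self.last_trigger_time = None  # receiving time of the first user's last trigger operation

    def on_first_user_trigger(self, target_content, display) -> None:
        """Record the latest trigger operation of the first user and show its target content."""
        self.last_trigger_time = time.monotonic()
        display.show(target_content)

    def maybe_push_for_other_user(self, content, display) -> None:
        """Suppress content selected for other users while the first user's lock window is active."""
        locked = (self.last_trigger_time is not None and
                  time.monotonic() - self.last_trigger_time < self.lock_window_s)
        if not locked:
            display.show(content)
```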
In this embodiment of the present application, optionally, the first distance information may also correspond to non-interactive content, where the processing device 103 may send the non-interactive content to the display device 101, so that the display device 101 displays the non-interactive content.
Method embodiment one
Referring to fig. 2, a flowchart illustrating steps of a first embodiment of a data processing method according to the present application may specifically include the following steps:
step 201, providing a plurality of interactive contents, wherein the plurality of interactive contents can respectively correspond to the interactive distance condition information;
step 202, determining first distance information between a user and a display device;
step 203, determining target interaction content according to the first distance information and the interaction distance condition information;
and 204, controlling the display device to display the target interactive content.
At least one step included in the method of the embodiment of the present application may be executed by a server and/or a client, and the embodiment of the present application is not limited to the specific execution body corresponding to the step. The server and the client are located in a wired or wireless network, and data interaction is carried out between the server and the client through the wired or wireless network.
In step 201, a server may provide a variety of interactive contents. Wherein the above-mentioned various interactive contents may be provided by an operator, and in particular, at least one interactive content input by the operator may be received. The server may issue the plurality of kinds of interactive contents to the client, so that the client stores the plurality of kinds of interactive contents.
The interaction distance condition information may characterize a distance suitable for human-machine interaction. For one type of interactive content, the corresponding interaction distance condition information may characterize a distance suitable for the interaction corresponding to the interactive content.
According to the embodiment of the application, under the condition that the first distance information between the user and the display device is suitable for man-machine interaction, the interactive content corresponding to the interactive distance condition information is displayed, so that the interactivity in the pushing process can be improved, and the user experience is improved.
The person skilled in the art can determine the interaction distance condition information according to the actual application requirement, so as to display the interaction content corresponding to the interaction distance condition information under the condition that the distance information is suitable for man-machine interaction.
In an alternative embodiment of the present application, the interaction distance condition information may include: the first interaction distance condition information may include interaction content corresponding to the first interaction distance condition information: and the interactive content is felt. The first interaction distance condition information may characterize a distance suitable for somatosensory interaction.
In another alternative embodiment of the present application, the interaction distance condition information may include: the second interaction distance condition information may include interaction content corresponding to the second interaction distance condition information: the interactive contents are touched. The second interaction distance condition information may characterize a distance suitable for touch interaction.
In practical applications, the distance suitable for somatosensory interaction is generally greater than the distance suitable for touch interaction, and therefore, the distance corresponding to the first interaction distance condition information is generally greater than the distance corresponding to the second interaction distance condition information.
The embodiments of the present application can provide the following technical solutions for the interaction distance condition information:
technical solution 1
In technical solution 1, the interaction distance condition information may specifically include: a preset distance value range.
The preset distance value range can be determined by a person skilled in the art according to actual application requirements. The basis for determining the preset distance value range may include: the environment information corresponding to the display device, and/or the detection conditions of the distance information detection device.
In an optional embodiment of the present application, a detection test may be performed using the distance information detection device in the environment corresponding to the display device, so as to obtain the preset distance value range.
Optionally, the preset distance value range may include: the first preset distance value range may correspond to first interaction distance condition information, and the first preset distance value range may be a first distance threshold to a second distance threshold, where the second distance threshold may be greater than the first distance threshold.
Optionally, the preset distance value range may include: the second preset distance value range may correspond to second interaction distance condition information, and the second preset distance value range may be a third distance threshold to a first distance threshold, where the first distance threshold may be greater than the third distance threshold.
In an application example of the present application, assuming that the size of the display device is 1.2 m×1.8 m, referring to table 1, an illustration of an upper boundary and a lower boundary corresponding to a preset distance value range according to an embodiment of the present application is shown. It will be understood that the upper boundaries and the lower boundaries shown in table 1 are merely illustrative, and in fact, those skilled in the art may determine the upper boundaries and the lower boundaries corresponding to the preset distance value ranges according to the environmental information corresponding to the display device and/or the detection situation of the distance information detection device. It will be appreciated that the above 1.2 m×1.8 m is merely an example of the size of the display device, and those skilled in the art may actually determine the size of the display device according to practical application requirements, and the embodiment of the present application is not limited to the specific size of the display device.
TABLE 1
Preset distance value range           Upper boundary   Lower boundary
First preset distance value range     5-7 meters       2 meters
Second preset distance value range    2 meters         0 meters
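As an illustrative sketch of technical solution 1, the function below matches a measured distance against example boundaries consistent with Table 1 (0-2 meters for touch interaction, 2 meters up to the 5-7 meter upper boundary for somatosensory interaction); fixing the upper boundary at 5 meters is an assumption made only for this sketch.

```python
from typing import Optional

FIRST_RANGE = (2.0, 5.0)   # first preset distance value range: suits somatosensory interaction
SECOND_RANGE = (0.0, 2.0)  # second preset distance value range: suits touch interaction

def match_interaction_mode(distance_m: float) -> Optional[str]:
    """Return which interaction distance condition the first distance information meets, if any."""
    lo, hi = SECOND_RANGE
    if lo <= distance_m < hi:
        return "touch interaction content"
    lo, hi = FIRST_RANGE
    if lo <= distance_m < hi:
        return "somatosensory interaction content"
    return None  # outside both ranges; non-interactive content may be displayed instead
```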
Technical solution 2
In technical solution 2, the interaction distance condition information may specifically include: user image information being obtained through monitoring.
In the embodiment of the application, the user image information can be monitored through the camera, and as the position of the camera is relatively fixed, whether the first distance information between the user and the display device is suitable for human-computer interaction can be judged according to whether the user image information is monitored; in particular, in case of monitoring the user image information, it may be determined that the first distance information between the user and the display device is suitable for human-computer interaction.
In an optional embodiment of the present application, when user image information is first obtained through monitoring, the distance information may be considered to conform to the first interaction distance condition information. Specifically, if the user image information was not detected at the previous moment but is detected at the current moment, the distance information can be considered to conform to the first interaction distance condition information.
Technical solution 3
In the technical solution 3, the interaction distance condition information may specifically include: the monitored user image information accords with preset conditions. Specifically, limb information and/or facial range of the user in the user image information can be determined according to the monitored user image information.
Alternatively, the first interaction distance condition information may include: the limb information of the user in the user image information is first limb information. The first limb information may include a relatively complete limb, in which case the user may be considered to be farther from the display device. For example, the first limb information may include: an upper limb (forearm, upper arm, hand), and/or a lower limb (thigh, lower leg, knee), etc.
Optionally, the second interaction distance condition information may include: the limb information of the user in the user image information is second limb information, and the size corresponding to the second limb information accords with the size condition. The second limb information may include an incomplete limb, in which case, if the limb is larger (e.g., thicker), the user may be considered to be closer to the display device. For example, the second limb information includes: incomplete upper limbs, etc.
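A rough sketch of technical solution 3, assuming an upstream vision component already reports which limb parts are visible and how large the detected limb appears in the frame; the part names and the size threshold are illustrative assumptions.

```python
UPPER_LIMB_PARTS = {"upper_arm", "forearm", "hand"}

def meets_first_condition(detected_parts: set[str]) -> bool:
    """First interaction distance condition: a fairly complete limb is visible (user is farther away)."""
    return UPPER_LIMB_PARTS.issubset(detected_parts)

def meets_second_condition(detected_parts: set[str], limb_width_px: float,
                           width_threshold_px: float = 200.0) -> bool:
    """Second condition: only an incomplete limb is visible but it appears large (user is close)."""
    incomplete = not UPPER_LIMB_PARTS.issubset(detected_parts)
    return incomplete and limb_width_px >= width_threshold_px
```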
The interaction distance condition information has been described in detail through technical solutions 1 to 3. It can be understood that a person skilled in the art may adopt any one or a combination of technical solutions 1 to 3 according to actual application requirements, or may adopt other technical solutions according to actual application requirements; the embodiment of the present application does not limit the specific interaction distance condition information.
In step 202, the first distance information is used to characterize a distance between a user and a display device. The first distance information may directly or indirectly characterize a distance between the user and the display device.
According to one embodiment, the first distance information may be a distance value. According to another embodiment, the first distance information may be a distance parameter, and the distance value may be determined according to the distance parameter. According to a further embodiment, the first distance information may include: user image information. The user image information can be captured by a camera. Because the position of the camera is relatively fixed, the distance corresponding to the user image information can be determined according to whether user image information is captured, or according to whether the captured user image information meets preset conditions.
In step 203, the target interactive content may be determined according to the first distance information and the interactive distance condition information. Alternatively, the first distance information may be matched with interaction distance condition information corresponding to the target interaction content. For example, the first distance information may be within a preset distance range corresponding to the target interactive content. As another example, the first distance information may include user image information. For another example, the user image information corresponding to the first distance information accords with the preset condition corresponding to the target interactive content.
In an alternative embodiment of the present application, step 203 may specifically include: and determining target interaction distance condition information matched with the first distance information, and determining target interaction content according to the target interaction distance condition information. The first distance information and the interaction distance condition information can be matched to obtain target interaction distance condition information matched with the first distance information. For example, if the target interaction distance condition information matched with the first distance information is the first interaction distance condition information, the target interaction content may be somatosensory interaction content. For another example, if the target interaction distance condition information matched with the first distance information is the second interaction distance condition information, the target interaction content may be touch interaction content.
In step 204, the target interactive content may be sent to the display device so that the display device displays the target interactive content.
In an alternative embodiment of the present application, step 204 may specifically include: and controlling the display device to display somatosensory interaction content under the condition that the first distance information is matched with the first interaction distance condition information. The first interaction distance condition information can represent a distance suitable for somatosensory interaction, such as a distance between 2 meters and 5 meters, in which case, the somatosensory interaction content is displayed, so that the somatosensory interaction between a user and a machine can be supported, the user can input feedback of the user or select interesting content through the somatosensory interaction, and the interactivity of the user can be improved.
In an alternative embodiment of the present application, step 204 may specifically include: and controlling the display device to display touch interaction content under the condition that the first distance information accords with the second interaction distance condition information. The second interaction distance condition information may represent a distance suitable for touch interaction, for example, a distance between 0 m and 2 m, in which case, touch interaction content is displayed, so that a user and a machine can be supported to perform touch interaction, and feedback of the user or interested content is input or selected through the touch interaction, and interactivity of the user can be improved.
In an alternative embodiment of the present application, the targeted interaction content described above may be provided by an APP. For example, the somatosensory interaction content may be provided by a somatosensory interaction APP. As another example, touch interaction content may be provided through a touch interaction APP.
In an alternative embodiment of the present application, the method of the embodiment of the present application may further include: controlling the display device to display non-interactive content when the first distance information matches non-interactive distance condition information. The non-interactive distance condition information may characterize a distance at which interaction is not suitable, such as a distance beyond the 5 to 7 meter range; in this case non-interactive content may be displayed. The non-interactive content does not respond to interaction events and may include: text, pictures, audio, video, and the like.
It should be noted that, at least one user may move in a space corresponding to the display device, in this case, the first distance information between the user and the display device may be changed. For example, in a first period of time, the first distance information corresponding to the user a matches the non-interactive distance condition information, and non-interactive content is displayed on the display device. And in the second time period, the first distance information corresponding to the user A is matched with the first interaction distance condition information, so that the content displayed by the display device can be switched from the non-interaction content to the somatosensory interaction content. Further, in the third time period, the first distance information corresponding to the user a is matched with the second interaction distance condition information, so that the content displayed by the display device can be switched from the somatosensory interaction content to the touch interaction content. For the same user, he can enjoy a content switching experience that matches the first distance information.
In summary, according to the data processing method of the embodiment of the present application, target interactive content adapted to the first distance information between the user and the display device is pushed to the user. Because the target interactive content can be used for human-computer interaction, its interaction distance condition information can characterize the distance suitable for the corresponding human-computer interaction. Therefore, the method and the device display the target interactive content when the first distance information between the user and the display device is suitable for that human-computer interaction, which can improve interactivity during pushing and improve user experience.
In addition, the embodiment of the application can enable the user to select and view the interested content through the interactive content, so that the conversion effect corresponding to the interested content can be improved, and the pushing effect can be further improved.
Method embodiment II
Referring to fig. 3, a flowchart illustrating steps of a second embodiment of a data processing method according to the present application may specifically include the following steps:
step 301, providing a plurality of interactive contents, wherein the plurality of interactive contents can respectively correspond to the interactive distance condition information;
step 302, determining first distance information between a user and a display device;
step 303, determining target interaction content according to the first distance information and the interaction distance condition information;
and 304, controlling the display device to display the target interactive content.
Compared with the first method embodiment shown in FIG. 2, the method of this embodiment may further include:
step 305, responding to a triggering operation of a user on the target interactive content, and controlling the display device to display target content corresponding to the triggering operation, where the target content specifically may include: next level interactive content, or non-interactive content.
According to the embodiment of the application, on the basis of displaying the target interactive content on the display device, the triggering operation of the user on the target interactive content can be received. For somatosensory interaction content, the triggering operation can be somatosensory action; alternatively, for the touch interaction content, the triggering operation may be a touch operation.
The embodiment of the application can respond to the triggering operation, and particularly can control the display device to display the target content corresponding to the triggering operation. The target content may be the next-level interactive content, so that the user continues to interact with the next-level interactive content. Alternatively, the target content may be non-interactive content, so that the user views the non-interactive content of interest.
In an alternative embodiment of the present application, the method may further include: and controlling the display device to continuously display the target content corresponding to the last triggering operation in a preset time period after the last triggering operation of the first user on the target interactive content is received.
The first user may be any user. In order to improve the interaction experience of the first user during the first user's interaction, the influence of other users on the content displayed by the display device is reduced; specifically, within a preset time period after the last trigger operation of the first user, the target content corresponding to the last trigger operation continues to be displayed on the display device.
The starting time of the preset time period may be the receiving time of the last triggering operation, and the time length of the preset time period may be determined by a person skilled in the art according to practical application requirements, for example, the time length may be 30 seconds, 1 minute, 2 minutes, etc., and the longer the time length is, the smaller the influence of other users on the content displayed by the display device is.
According to the embodiment of the application, within the preset time period after the last trigger operation of the first user, even if second distance information between a second user and the display device matches the interaction distance condition information corresponding to some second interactive content, that second interactive content does not need to be displayed, so the interaction experience of the first user can be preserved.
In an alternative embodiment of the present application, the method may further include: outside the preset time period after the last trigger operation of the first user on the target interactive content, if the first distance information between the first user and the display device still matches the interaction distance condition information, controlling the display device to continue displaying the target content corresponding to the last trigger operation. In other words, even after the preset time period has elapsed, as long as the first distance information between the first user and the display device still matches the interaction distance condition information, the first user is regarded as still having an interaction requirement, and that requirement is met while the influence of other users on the content displayed by the display device is reduced.
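The following Python sketch shows, under assumed names and an assumed 60-second preset time period, one way that step 305 and the preset-period behaviour described above could fit together: the display keeps showing the target content of the first user's last trigger operation and ignores distance matches from other users during that period.

import time

PRESET_PERIOD_S = 60  # assumed length; the embodiment mentions 30 seconds, 1 minute, 2 minutes, etc.

class DisplayController:
    def __init__(self):
        self.current_content = "non-interactive content"
        self.locked_until = 0.0
        self.lock_owner = None

    def on_trigger(self, user_id: str, target_content: str) -> None:
        # Step 305: display the target content for the trigger operation, and start
        # the preset period at the receiving time of this (last) trigger operation.
        self.current_content = target_content
        self.lock_owner = user_id
        self.locked_until = time.monotonic() + PRESET_PERIOD_S

    def on_distance_match(self, user_id: str, matched_content: str) -> None:
        # Within the preset period, a match from another user does not change the
        # displayed content, preserving the first user's interaction experience.
        if time.monotonic() < self.locked_until and user_id != self.lock_owner:
            return
        self.current_content = matched_content

ctrl = DisplayController()
ctrl.on_trigger("user_A", "next-level interactive content")
ctrl.on_distance_match("user_B", "touch interaction content")  # ignored during the period
print(ctrl.current_content)  # next-level interactive content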
In summary, in the data processing method of the embodiment of the present application, in response to a trigger operation of the user on the interactive content, the target content corresponding to the trigger operation is displayed on the display device. The target content may be content the user is interested in, or may include an entry to such content; therefore, the embodiment of the application enables the user to select and view content of interest through interaction, so that the conversion effect corresponding to that content can be improved, and the pushing effect can be improved.
Method embodiment III
Referring to fig. 4, a flowchart illustrating steps of a third embodiment of a data processing method according to the present application may specifically include the following steps:
step 401, providing a plurality of interactive contents, wherein the plurality of interactive contents can respectively correspond to the interactive distance condition information;
step 402, determining first distance information between a user and a display device;
step 403, determining shortest target distance information from the first distance information of the plurality of users;
step 404, determining target interaction content according to the target distance information and the interaction distance condition information;
step 405, controlling the display device to display the target interactive content.
The embodiment of the application can determine the shortest target distance information from the first distance information of a plurality of users, and display on the display device the target interactive content corresponding to the interaction distance condition information matched by that shortest target distance information, so that the interaction requirement of the user closest to the display device can be met.
In an application example of the present application, the distance information of a plurality of users is matched with the interaction distance condition information. Suppose the distance information of a first group of users matches the first interaction distance condition information and the distance information of a second group of users matches the second interaction distance condition information; because the distance corresponding to the second interaction distance condition information is shorter than the distance corresponding to the first interaction distance condition information, the touch interaction content corresponding to the second interaction distance condition information can be displayed on the display device, so that the interaction requirement of the second group of users can be met.
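A short Python sketch of steps 403 and 404 follows; the 2-meter and 6-meter boundaries and the content labels are assumptions used only for illustration.

def target_content_for_group(first_distances_m: list) -> str:
    # Step 403: keep only the shortest target distance information.
    if not first_distances_m:
        return "non-interactive content"
    nearest = min(first_distances_m)
    # Step 404: match it against the interaction distance condition information.
    if nearest < 2.0:
        return "touch interaction content"
    if nearest < 6.0:
        return "somatosensory interaction content"
    return "non-interactive content"

# A first group standing around 5-6 meters and a second group at about 1 meter:
# the nearer group's touch interaction requirement wins, as in the example above.
print(target_content_for_group([5.4, 5.9, 1.1, 0.8]))  # touch interaction content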
Method embodiment IV
For a better understanding of the embodiments of the present application by those skilled in the art, the embodiments are described here by way of a specific example. In this example, N distance information detection devices, such as radar probes, are provided at the upper part and/or the front and/or the side of the display device and are each coupled to the processing device, where N is a natural number, e.g. N is 4.
Referring to fig. 5, a flowchart illustrating steps of a fourth embodiment of a data processing method according to the present application may specifically include the following steps:
step 501, establishing a mapping relation between distance information and distance condition information;
Optionally, the interaction distance condition information may include a preset distance value range. Referring to table 2, an illustration of the upper and lower boundaries corresponding to the preset distance value ranges in an embodiment of the present application is shown. The first preset distance value range corresponds to the first interaction distance condition, the second preset distance value range corresponds to the second interaction distance condition, and the third preset distance value range corresponds to the non-interaction distance condition.
TABLE 2
Preset distance value range          Upper boundary    Lower boundary
First preset distance value range    5-7 meters        2 meters
Second preset distance value range   2 meters          0 meters
Third preset distance value range    none              7 meters
Step 502, establishing a mapping relation between the distance parameter and the distance information according to the distance parameter of the distance information detection device;
step 503, adding a label corresponding to the distance condition information to the content material according to the mapping relation between the content material and the distance condition information;
the server may provide a variety of content materials that may originate from or be customized by the carrier.
Referring to table 3, an illustration of a mapping relationship between content material and interaction distance conditions is shown in an embodiment of the present application.
TABLE 3
Step 504, receiving real-time distance parameters sent by the distance information detection device;
step 505, determining first distance information corresponding to the real-time distance parameter according to a mapping relation between the distance parameter and the distance information;
step 506, determining target distance condition information corresponding to the first distance information according to a mapping relation between the distance information and the distance condition;
step 507, determining a target content material corresponding to the target distance condition information according to the label of the content material;
step 508, controlling the display device to display the target content material.
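The pipeline of steps 501 to 508 can be sketched in Python as follows. The conversion from the raw distance parameter to meters, the content material titles and their tags are assumptions; the range boundaries follow table 2, taking 6 meters as an assumed upper boundary for the first preset distance value range (table 2 allows 5-7 meters).

RANGE_TABLE = [  # (distance condition label, lower bound in meters, upper bound in meters)
    ("second interaction distance condition", 0.0, 2.0),
    ("first interaction distance condition", 2.0, 6.0),
    ("non-interaction distance condition", 6.0, float("inf")),
]

CONTENT_MATERIALS = [  # content materials labeled with the condition they are tagged with (step 503)
    {"title": "product video", "tag": "non-interaction distance condition"},
    {"title": "gesture mini-game", "tag": "first interaction distance condition"},
    {"title": "touch product catalog", "tag": "second interaction distance condition"},
]

def distance_from_parameter(raw: float) -> float:
    # Step 505: assumed mapping from the detection device's raw reading
    # (here taken to be millimeters) to first distance information in meters.
    return raw / 1000.0

def condition_for_distance(distance_m: float) -> str:
    # Step 506: determine the target distance condition for this distance.
    for label, lower, upper in RANGE_TABLE:
        if lower <= distance_m < upper:
            return label
    return "non-interaction distance condition"

def material_for_condition(label: str) -> dict:
    # Step 507: pick a content material whose tag matches the condition.
    return next(m for m in CONTENT_MATERIALS if m["tag"] == label)

raw_reading = 1500.0  # step 504: real-time distance parameter from the radar probe
cond = condition_for_distance(distance_from_parameter(raw_reading))
print(material_for_condition(cond)["title"])  # step 508: display this target content material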
In summary, according to the data processing method of the embodiment of the present application, the content material adapted to the first distance information between the user and the display device is pushed to the user. Specifically, when the first distance information between the user and the display device is not suitable for interaction, non-interactive content material is pushed to the user so that the user views it; when the first distance information is suitable for somatosensory interaction, somatosensory interaction content material is pushed so that the user can give feedback to the machine or obtain content of interest through somatosensory interaction; and when the first distance information is suitable for touch interaction, touch interaction content material is pushed so that the user can give feedback to the machine or obtain content of interest through touch interaction. The same user can thus enjoy a content-switching experience matched to the distance information, which raises the probability that the user interacts and further improves the user experience.
Referring to fig. 6, an illustration of displaying content based on the first distance information between the user and the display device according to an embodiment of the present application is shown. When the first distance information is within the third preset distance value range, non-interactive content such as pictures, text, audio or video may be displayed, so that the user views the non-interactive content from a long distance; when the first distance information is within the first preset distance value range, somatosensory interaction content may be displayed, so that the user obtains content of interest through somatosensory interaction; and when the first distance information is within the second preset distance value range, touch interaction content may be displayed, so that the user obtains content of interest through touch interaction.
It should be noted that, for simplicity of description, the method embodiments are presented as a series of acts, but those skilled in the art should understand that the embodiments are not limited by the described order of acts, as some steps may be performed in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are preferred embodiments, and that the acts involved are not necessarily required by the embodiments of the application.
The embodiment of the application also provides a data processing device.
Referring to FIG. 7, there is shown a block diagram of an embodiment of a data processing apparatus of the present application, which may include the following modules:
an interactive content providing module 701, configured to provide a plurality of interactive contents, where the plurality of interactive contents may respectively correspond to the interactive distance condition information;
a distance information determining module 702 for determining first distance information between a user and a display device;
a target interactive content determining module 703, configured to determine target interactive content according to the first distance information and the interactive distance condition information; and
and a first display control module 704, configured to control the display device to display the target interactive content.
Optionally, the interaction distance condition information may include: first interaction distance condition information, and the interaction content corresponding to the first interaction distance condition information may include: somatosensory interaction content.
Optionally, the interaction distance condition information may include: second interaction distance condition information, and the interaction content corresponding to the second interaction distance condition information may include: touch interaction content.
Optionally, the first distance information is matched with the interaction distance condition information corresponding to the target interaction content.
Optionally, the interaction distance condition information may specifically include:
presetting a distance value range; and/or
Monitoring to obtain user image information; and/or
The monitored user image information accords with preset conditions.
Optionally, the target interactive content determining module 703 may include:
the target distance information determining sub-module is used for determining shortest target distance information from first distance information of a plurality of users;
and the target interaction content determining sub-module is used for determining target interaction content according to the target distance information and the interaction distance condition information.
Optionally, the apparatus may further include:
the second display control module is used for responding to the triggering operation of the user on the target interactive content and controlling the display device to display target content corresponding to the triggering operation, and the target content comprises: next level interactive content, or non-interactive content.
Optionally, the apparatus may further include:
and the third display control module is used for controlling the display device to continuously display the target content corresponding to the last trigger operation in a preset time period after the last trigger operation of the first user on the target interactive content is received.
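As a rough illustration of the block structure in fig. 7, the Python sketch below wires modules corresponding to 701-704 together; the class names, method names and distance values are assumptions, and the matching logic is the same simple range check used in the earlier sketches.

class InteractiveContentProvidingModule:  # module 701
    def provide(self):
        # (lower bound m, upper bound m, interactive content) - assumed values
        return [
            (0.0, 2.0, "touch interaction content"),
            (2.0, 6.0, "somatosensory interaction content"),
            (6.0, float("inf"), "non-interactive content"),
        ]

class DistanceInformationDeterminingModule:  # module 702
    def determine(self, detector_reading_m: float) -> float:
        return detector_reading_m  # first distance information, already in meters here

class TargetInteractiveContentDeterminingModule:  # module 703
    def determine(self, distance_m: float, conditions) -> str:
        for lower, upper, content in conditions:
            if lower <= distance_m < upper:
                return content
        return "non-interactive content"

class DisplayControlModule:  # module 704
    def display(self, content: str) -> None:
        print("displaying:", content)

# Wiring the four modules together for one detector reading of 3.2 meters:
conditions = InteractiveContentProvidingModule().provide()
distance = DistanceInformationDeterminingModule().determine(3.2)
content = TargetInteractiveContentDeterminingModule().determine(distance, conditions)
DisplayControlModule().display(content)  # somatosensory interaction content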
In summary, the data processing device of the embodiment of the present application pushes to the user the target interactive content adapted to the first distance information between the user and the display device. Because the target interactive content is used for man-machine interaction, the interaction distance condition information corresponding to the target interactive content can represent the distance suitable for that man-machine interaction, so the device displays the target interactive content when the first distance information between the user and the display device is suitable for the corresponding man-machine interaction, which can improve the interactivity of the pushing process and the user experience.
In addition, the embodiment of the application enables the user to select and view content of interest through the interactive content, so that the conversion effect corresponding to that content can be improved, and the pushing effect can be further improved.
For the device embodiment, since it is substantially similar to the method embodiment, the description is relatively simple, and similar references are made to each other.
Embodiments of the application may be implemented as a system or apparatus configured as desired using any suitable hardware and/or software. Fig. 8 schematically illustrates an exemplary apparatus 1100 that may be used to implement various embodiments described in the present disclosure.
For one embodiment, fig. 8 illustrates an exemplary apparatus 1100, the apparatus 1100 may include: one or more processors 1102, a system control module (chipset) 1104 coupled to at least one of the processors 1102, a system memory 1106 coupled to the system control module 1104, a non-volatile memory (NVM)/storage 1108 coupled to the system control module 1104, one or more input/output devices 1110 coupled to the system control module 1104, and a network interface 1112 coupled to the system control module 1104. The system memory 1106 may include: instructions 1162, the instructions 1162 being executable by the one or more processors 1102.
The processor 1102 may include one or more single-core or multi-core processors, and the processor 1102 may include any combination of general-purpose or special-purpose processors (e.g., graphics processors, application processors, baseband processors, etc.). In some embodiments, the apparatus 1100 can function as a server, a target device, a wireless device, etc., as described in embodiments of the present application.
In some embodiments, the apparatus 1100 may include one or more machine-readable media (e.g., system memory 1106 or NVM/storage 1108) having instructions and one or more processors 1102, in combination with the one or more machine-readable media, configured to execute the instructions to implement the modules included in the foregoing apparatus to perform the actions described in embodiments of the present application.
The system control module 1104 of an embodiment may include any suitable interface controller for providing any suitable interface to at least one of the processors 1102 and/or any suitable device or component in communication with the system control module 1104.
The system control module 1104 for one embodiment may include one or more memory controllers to provide interfaces to the system memory 1106. The memory controller may be a hardware module, a software module, and/or a firmware module.
The system memory 1106 for one embodiment may be used for loading and storing data and/or instructions 1162. For one embodiment, the system memory 1106 may include any suitable volatile memory, such as a suitable DRAM (dynamic random access memory). In some embodiments, the system memory 1106 may comprise double data rate fourth-generation synchronous dynamic random access memory (DDR4 SDRAM).
The system control module 1104 for one embodiment may include one or more input/output controllers to provide interfaces to the NVM/storage 1108 and the input/output device(s) 1110.
NVM/storage 1108 of one embodiment may be used to store data and/or instructions 1182. NVM/storage 1108 may include any suitable nonvolatile memory (e.g., flash memory, etc.) and/or may include any suitable nonvolatile storage device(s), such as one or more Hard Disk Drives (HDDs), one or more Compact Disc (CD) drives, and/or one or more Digital Versatile Disc (DVD) drives, etc.
NVM/storage 1108 may include storage resources that are physically part of the device on which device 1100 is installed or which may be accessed by the device without being part of the device. For example, NVM/storage 1108 may be accessed over a network via network interface 1112 and/or through input/output devices 1110.
Input/output device(s) 1110 for one embodiment may provide an interface for the apparatus 1100 to communicate with any other suitable device; the input/output device 1110 may include a communication component, an audio component, a sensor component, and the like.
The network interface 1112 of an embodiment may provide an interface for the device 1100 to communicate over one or more networks and/or with any other suitable device; the device 1100 may communicate wirelessly with one or more components of a wireless network in accordance with any of one or more wireless network standards and/or protocols, for example accessing a wireless network based on a communication standard such as WiFi, 2G, or 3G, or a combination thereof.
For one embodiment, at least one of the processors 1102 may be packaged together with logic of one or more controllers (e.g., memory controllers) of the system control module 1104. For one embodiment, at least one of the processors 1102 may be packaged together with logic of one or more controllers of the system control module 1104 to form a System in Package (SiP). For one embodiment, at least one of the processors 1102 may be integrated on the same die as logic of one or more controllers of the system control module 1104. For one embodiment, at least one of the processors 1102 may be integrated on the same chip as logic of one or more controllers of the system control module 1104 to form a system on a chip (SoC).
In various embodiments, the apparatus 1100 may include, but is not limited to: a desktop computing device or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet, a netbook, etc.), among others. In various embodiments, the apparatus 1100 may have more or fewer components and/or different architectures. For example, in some embodiments, the apparatus 1100 may include one or more cameras, keyboards, Liquid Crystal Display (LCD) screens (including touch screen displays), non-volatile memory ports, multiple antennas, graphics chips, Application Specific Integrated Circuits (ASICs), and speakers.
Wherein if the display comprises a touch panel, the display screen may be implemented as a touch screen display to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation.
The embodiment of the application also provides a non-volatile readable storage medium, in which one or more modules (programs) are stored; when the one or more modules are applied to an apparatus, the apparatus can be caused to execute the instructions of each of the methods in the embodiments of the application.
In one example, an apparatus is provided, comprising: one or more processors; and instructions in one or more machine-readable media stored thereon, which when executed by the one or more processors, cause the apparatus to perform a method as in an embodiment of the application, the method may comprise: the method shown in fig. 2 or fig. 3 or fig. 4 or fig. 5 or fig. 6.
One or more machine-readable media are also provided in one example, having instructions stored thereon that, when executed by one or more processors, cause an apparatus to perform a method as in an embodiment of the application, the method may comprise: the method shown in fig. 2 or fig. 3 or fig. 4 or fig. 5 or fig. 6.
The foregoing has described in detail a data processing method, a data processing device and an apparatus provided by the present application. Specific examples are used herein to illustrate the principles and embodiments of the present application, and the above descriptions are intended only to help understand the method and its core idea. Meanwhile, those skilled in the art may make changes to the specific embodiments and the application scope according to the idea of the present application; in view of the above, the content of this specification should not be construed as limiting the present application.

Claims (16)

1. A method of data processing, comprising:
providing a plurality of interactive contents, wherein the plurality of interactive contents respectively correspond to the interactive distance condition information;
determining first distance information between a user and a display device;
determining target interaction content according to the first distance information and the interaction distance condition information; the interactive content and the target interactive content are used for man-machine interaction;
controlling the display device to display the target interactive content; according to the change of the first distance information, changing the target interaction content displayed by the display device;
the interaction distance condition information includes: the first interaction distance condition information comprises interaction contents corresponding to the first interaction distance condition information: somatosensory interaction content; the interaction distance condition information includes: the second interaction distance condition information includes interaction content corresponding to the second interaction distance condition information: touching the interactive content; the distance corresponding to the first interaction distance condition information is larger than the distance corresponding to the second interaction distance condition information.
2. The method of claim 1, wherein the first distance information matches interaction distance condition information corresponding to the target interaction content.
3. The method of claim 1, wherein the interaction distance condition information comprises:
presetting a distance value range; and/or
Monitoring to obtain user image information; and/or
The monitored user image information accords with preset conditions.
4. The method of claim 1, wherein determining target interaction content based on the first distance information and the interaction distance condition information comprises:
determining shortest target distance information from first distance information of a plurality of users;
and determining target interaction content according to the target distance information and the interaction distance condition information.
5. The method according to claim 1, wherein the method further comprises:
responding to the triggering operation of the user on the target interaction content, controlling the display device to display target content corresponding to the triggering operation, wherein the target content comprises: next level interactive content, or non-interactive content.
6. The method of claim 5, wherein the method further comprises:
and controlling the display device to continuously display the target content corresponding to the last triggering operation in a preset time period after the last triggering operation of the first user on the target interactive content is received.
7. A data processing system, comprising: a display device, a distance information detection device and a processing device;
wherein the processing device is coupled with the display device, and the processing device is coupled with the distance information detection device;
the distance information detection device detects first distance information between a user and the display device;
the processing device provides a plurality of interactive contents, the plurality of interactive contents respectively correspond to the interactive distance condition information, a target interactive content is determined according to the first distance information and the interactive distance condition information, and the target interactive content is sent to the display device so that the display device displays the target interactive content; the interactive content and the target interactive content are used for man-machine interaction; according to the change of the first distance information, changing the target interaction content displayed by the display device;
the interaction distance condition information includes: the first interaction distance condition information comprises interaction contents corresponding to the first interaction distance condition information: somatosensory interaction content; the interaction distance condition information includes: the second interaction distance condition information includes interaction content corresponding to the second interaction distance condition information: touching the interactive content; the distance corresponding to the first interaction distance condition information is larger than the distance corresponding to the second interaction distance condition information.
8. The system of claim 7, wherein the first distance information matches interaction distance condition information corresponding to the target interaction content.
9. The system of claim 7, wherein the interaction distance condition information comprises:
presetting a distance value range; and/or
Monitoring to obtain user image information; and/or
The monitored user image information accords with preset conditions.
10. The system of claim 7, wherein the processing means determines shortest target distance information from first distance information of a plurality of users, and determines target interactive contents based on the target distance information and the interactive distance condition information.
11. The system according to claim 7, wherein the processing device responds to a triggering operation of a user on the target interactive content, and sends target content corresponding to the triggering operation to the display device, so that the display device displays the target content; the target content includes: next level interactive content, or non-interactive content.
12. The system of claim 11, wherein the display device continuously displays the target content corresponding to the last trigger operation for a preset period of time after receiving the last trigger operation of the first user on the target interactive content.
13. A data processing apparatus, comprising:
the interactive content providing module is used for providing various interactive contents, and the various interactive contents respectively correspond to the interactive distance condition information;
a distance information determining module for determining first distance information between the user and the display device;
the target interaction content determining module is used for determining target interaction content according to the first distance information and the interaction distance condition information; and
the display control module is used for controlling the display device to display the target interaction content; the interactive content and the target interactive content are used for man-machine interaction; according to the change of the first distance information, changing the target interaction content displayed by the display device;
the interaction distance condition information includes: the first interaction distance condition information comprises interaction contents corresponding to the first interaction distance condition information: somatosensory interaction content; the interaction distance condition information includes: the second interaction distance condition information includes interaction content corresponding to the second interaction distance condition information: touching the interactive content; the distance corresponding to the first interaction distance condition information is larger than the distance corresponding to the second interaction distance condition information.
14. The apparatus of claim 13, wherein the interaction distance condition information comprises:
presetting a distance value range; and/or
Monitoring to obtain user image information; and/or
The monitored user image information accords with preset conditions.
15. An apparatus, comprising:
one or more processors; and
one or more machine readable media having instructions stored thereon, which when executed by the one or more processors, cause the apparatus to perform the method of one or more of claims 1-6.
16. A machine readable medium having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform the method of one or more of claims 1 to 6.
CN201810892250.9A 2018-08-07 2018-08-07 Data processing method, apparatus and machine readable medium Active CN110858230B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810892250.9A CN110858230B (en) 2018-08-07 2018-08-07 Data processing method, apparatus and machine readable medium


Publications (2)

Publication Number Publication Date
CN110858230A CN110858230A (en) 2020-03-03
CN110858230B true CN110858230B (en) 2023-12-01

Family

ID=69634726

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810892250.9A Active CN110858230B (en) 2018-08-07 2018-08-07 Data processing method, apparatus and machine readable medium

Country Status (1)

Country Link
CN (1) CN110858230B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102841733A (en) * 2011-06-24 2012-12-26 株式会社理光 Virtual touch screen system and method for automatically switching interaction modes
JP2013218379A (en) * 2012-04-04 2013-10-24 Sharp Corp Display device and program
JP2017125901A (en) * 2016-01-13 2017-07-20 株式会社 寺 Information providing system
CN107179827A (en) * 2017-04-01 2017-09-19 深圳怡化电脑股份有限公司 The intelligent interactive method and system of a kind of finance device
CN108322885A (en) * 2017-01-12 2018-07-24 腾讯科技(深圳)有限公司 Interactive information acquisition methods, interactive information setting method and user terminal, system


Also Published As

Publication number Publication date
CN110858230A (en) 2020-03-03

Similar Documents

Publication Publication Date Title
US20240137462A1 (en) Display apparatus and control methods thereof
KR102244925B1 (en) Manipulation of content on a surface
US20120169774A1 (en) Method and apparatus for changing a size of screen using multi-touch
US20090021494A1 (en) Multi-modal smartpen computing system
US10802622B2 (en) Electronic device and method for controlling same
US11256463B2 (en) Content prioritization for a display array
WO2020057257A1 (en) Application interface switching method and mobile terminal
WO2016155446A1 (en) Information display method, channel management platform, and terminal
EP2743816A2 (en) Method and apparatus for scrolling screen of display device
KR20140111790A (en) Method and apparatus for inputting keys using random valuable on virtual keyboard
CN111124223A (en) Application interface switching method and electronic equipment
US20180196782A1 (en) Methods and devices for providing optimal viewing displays
CN114926242B (en) Live commodity processing method and device, electronic equipment and storage medium
CN107609146B (en) Information display method and device, terminal and server
EP2660701A1 (en) Method for inputting touch and touch display apparatus
WO2021088707A1 (en) Interaction method for application program interface, and electronic device
CN110858230B (en) Data processing method, apparatus and machine readable medium
CN110032320B (en) Page rolling control method and device and terminal
CN111124236B (en) Data processing method, device and machine-readable medium
CN111145604A (en) Method and device for recognizing picture books and computer readable storage medium
KR102462054B1 (en) Method and device for implementing user interface of live auction
CN105760092B (en) A kind of application control method, apparatus and electronic equipment for touch panel device
KR20140131051A (en) electro device comprising pressure sensor and method for controlling thereof
US20150019962A1 (en) Method and apparatus for providing electronic document
CN112181129B (en) Device control method, device and machine-readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant