CN113485604B - Interactive terminal, interactive system, interactive method and computer readable storage medium - Google Patents


Info

Publication number
CN113485604B (application CN202110872022.7A)
Authority
CN (China)
Prior art keywords
interactive content, content window, display, interactive, user
Legal status
Active
Application number
CN202110872022.7A
Other languages
Chinese (zh)
Other versions
CN113485604A (en)
Inventor
冯朋朋
踪家双
马明园
Current Assignee
BOE Technology Group Co Ltd
BOE Intelligent IoT Technology Co Ltd
Original Assignee
BOE Technology Group Co Ltd
BOE Intelligent IoT Technology Co Ltd
Application filed by BOE Technology Group Co Ltd and BOE Intelligent IoT Technology Co Ltd
Priority to CN202110872022.7A
Publication of CN113485604A
Application granted
Publication of CN113485604B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Abstract

The application provides an interactive terminal, an interactive system, an interactive method and a computer readable storage medium. The interactive terminal comprises: a display for displaying a plurality of interactive content windows divided into a plurality of interactive content window groups; a detector for detecting a real-time position and/or a user's selection operation on an interactive content window, wherein the real-time position comprises the real-time position of a user located within the interactive area, or the real-time position of a mouse pointer in a display area of the interactive content window; and a processor for controlling the display to adjust the display effect of the interactive content window related to the real-time position according to the detected real-time position, and/or controlling the display to adjust the display effect of the interactive content window group to which the selected interactive content window belongs according to the detected selection operation. Through this human-computer interaction, the application can increase users' interest in exploring products and promote product publicity.

Description

Interactive terminal, interactive system, interactive method and computer readable storage medium
Technical Field
The present disclosure relates to the field of computers, and in particular, to an interactive terminal, an interactive system, an interactive method, and a computer readable storage medium.
Background
Banking sites are important places for displaying a bank's image, selling products and providing services. In recent years, the informatization of banking sites has accelerated, and various display terminals have been widely deployed; research on improving their user experience is ongoing.
Disclosure of Invention
The following is a summary of the subject matter described in detail herein. This summary is not intended to limit the scope of the claims.
The embodiment of the application provides an interactive terminal, an interactive system, an interactive method and a computer readable storage medium, which can display products to a user more conveniently and rapidly, and increase interaction with the user.
The embodiment of the application provides an interactive terminal, which comprises:
a display for displaying a plurality of interactive content windows, the plurality of interactive content windows being divided into a plurality of interactive content window groups;
a detector for detecting a real-time position and/or a user selection operation of the interactive content window; wherein the real-time location comprises: the real-time position of the user in the preset interaction area or the real-time position of the mouse pointer in the display area of the interaction content window;
and a processor for controlling the display to adjust the display effect of the interactive content window related to the real-time position according to the detected real-time position, and/or controlling the display to adjust the display effect of the interactive content window group to which the selected interactive content window belongs according to the detected selection operation of the interactive content window by the user.
In an exemplary embodiment, the processor controlling the display to adjust the display effect of the interactive content window associated with the real-time location according to the detected real-time location includes:
the processor controls the display, according to the real-time position detected by the detector, to display the interactive content window corresponding to the real-time position with a preset first display effect, and/or to display a first number of interactive content windows located on either side of that window with a preset second display effect.
In an exemplary embodiment, the display displaying a plurality of interactive content windows includes:
the display displays the interactive content windows in a mode of being arranged along a first direction;
the detector comprises a radar or an infrared detector, configured to locate the user within the preset interaction area and obtain the user's real-time position.
In an exemplary embodiment, the processor controls the display to adjust the display effect of the interactive content window group to which the selected interactive content window belongs according to the detected selection operation of the interactive content window by the user, and the method includes:
the processor controls the display, according to the selection operation detected by the detector, to expand the next-level interactive content window of the selected interactive content window for display; the next-level interactive content window presents interactive information about the product corresponding to the selected interactive content window.
In an exemplary embodiment, the processor is further configured to detect, before controlling the display to expand the next-level interactive content window of the selected interactive content window, whether the selected interactive content window is located at the center of the interactive content window group to which it belongs;
if so, the display is controlled to expand the next-level interactive content window of the selected interactive content window for display;
if not, the selected interactive content window is first moved to the center position, and the display is then controlled to expand its next-level interactive content window for display.
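A minimal sketch of this "center first, then expand" behaviour in Python (the list-of-ids group model and the action tuples are illustrative assumptions, not the patent's implementation):

```python
# Hypothetical sketch of the "center first, then expand" rule described above.
# The group is modelled as a list of window ids; action tuples stand in for
# the display commands the processor would issue.

def select_window(group, index):
    """Handle selection of the window at `index` in `group`. If the selected
    window is not at the group's center, it is first moved there; its
    next-level window is then expanded."""
    center = len(group) // 2
    actions = []
    if index != center:
        # Move the selected window to the center position of its group.
        window = group.pop(index)
        group.insert(center, window)
        actions.append(("move_to_center", window))
    # Expand the next-level interactive content window of the (now
    # centered) selected window.
    actions.append(("expand_next_level", group[center]))
    return actions
```

Centering the window first ensures the expanded next-level content appears at the visual middle of the group.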
In an exemplary embodiment, the processor controlling the display to expand a next level interactive content window of the selected interactive content window for display includes:
the processor controls the display to display as follows: in the interactive content window group to which the selected interactive content window belongs, the next-level interactive content window of the selected window partially occludes the windows adjacent to the selected window; and, for any two adjacent windows other than the selected one, the window closer to the center partially occludes the window farther from the center.
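The occlusion rule above can be sketched as a z-index assignment (a hypothetical illustration; the function name and index scheme are assumptions):

```python
# Hypothetical z-index assignment for the occlusion rule above: the selected
# (centered) window is topmost, and windows closer to it occlude windows
# farther away. Higher values are drawn on top.

def z_order(n, selected):
    """Return a z-index for each of n windows in a group, decreasing with
    distance from the selected window's position."""
    return [n - abs(i - selected) for i in range(n)]
```

For a five-window group with the center window selected this yields [3, 4, 5, 4, 3]: the center is topmost and each neighbour overlaps the one beyond it.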
In an exemplary embodiment, the processor controlling the display to expand a next level interactive content window of the selected interactive content window for display further comprises:
and the processor controls the display to display the interactive content windows except the selected interactive content window according to a preset third display effect in the interactive content window group affiliated to the selected interactive content window.
In an exemplary embodiment, the processor is further configured to:
after controlling the display to expand the next-level interactive content window of the selected interactive content window, in response to the detector detecting another user's selection of another interactive content window, detect whether that interactive content window belongs to the same interactive content window group as the selected interactive content window;
if so, control the display to issue a prompt, and do not expand the next-level interactive content window of the other interactive content window;
if not, control the display to expand the next-level interactive content window of the other interactive content window for display.
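The same-group check for a second user's selection might be sketched as follows (identifiers are hypothetical; the patent does not prescribe an implementation):

```python
# Hypothetical sketch of the multi-user conflict rule above: a second
# selection inside the group that is already expanded yields a prompt,
# while a selection in a different group expands in parallel.

def handle_second_selection(expanded, new_sel, group_of):
    """expanded: window id currently expanded; new_sel: window id another
    user selected; group_of: mapping of window id -> group id."""
    if group_of[new_sel] == group_of[expanded]:
        return ("prompt", expanded)   # same group: warn, do not expand
    return ("expand", new_sel)        # different group: expand as well
```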
In an exemplary embodiment, the display includes a plurality of display panels, the plurality of display panels and the plurality of interactive content window groups being in one-to-one correspondence.
In an exemplary embodiment, the interactive terminal further includes:
the camera is used for collecting images of users in the preset interaction area;
the processor is further configured to determine attribute information of the user from the captured image and determine a product to be recommended from that attribute information; or to send the captured image of the user to a management platform and/or a customer relationship management (CRM) system to obtain the product to be recommended; and to control the display to adjust the interactive content window displayed to the user according to the product to be recommended.
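As a hypothetical illustration only (the patent leaves the recommendation logic to the processor, management platform or CRM system), attribute-based product selection could look like this; the attribute keys and catalogue structure are assumptions:

```python
# Hypothetical attribute-based product selection: pick the first product
# whose target-audience predicate matches the user's attribute information.

def recommend(attributes, catalogue):
    """attributes: dict of user attribute information (e.g. age group);
    catalogue: list of (product, predicate) pairs. Returns the first
    matching product, or None."""
    for product, matches in catalogue:
        if matches(attributes):
            return product
    return None
```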
The embodiment of the application also provides an interaction system, which comprises: the interactive terminal of any embodiment above, and a management platform;
the management platform is used for providing an interactive content window to be displayed for the interactive terminal to display.
In an exemplary embodiment, the interactive terminal further includes a camera, configured to collect an image of a user in a preset interactive area and send the image to the management platform;
the management platform is further configured to determine attribute information of the user from the user's image, either by itself or through interaction with a customer relationship management system, determine a product to be recommended according to that attribute information, and send it to the interactive terminal;
the processor in the interactive terminal is further configured, in response to detecting a selection operation on the interactive content window corresponding to the product to be recommended, to control the display to expand that window's next-level interactive content window.
The embodiment of the application also provides an interaction method, which comprises the following steps:
displaying a plurality of interactive content windows;
detecting a real-time position and/or a selection operation of a user on an interactive content window; the real-time position is the real-time position of a user in a preset interaction area or the real-time position of a mouse pointer in a display area of an interaction content window;
and adjusting the display effect of the interactive content window related to the real-time position according to the detected real-time position, and/or adjusting the display effect of the interactive content window group to which the selected interactive content window belongs according to the detected selection operation of the user on the interactive content window.
In an exemplary embodiment, the adjusting the display effect of the interactive content window related to the real-time position according to the detected real-time position includes:
displaying an interactive content window corresponding to the real-time position according to a preset first display effect according to the detected real-time position;
and/or displaying the first number of interactive content windows positioned at the two sides of the interactive content window corresponding to the real-time position according to a preset second display effect.
In an exemplary embodiment, the adjusting the display effect of the interactive content window group to which the selected interactive content window belongs according to the detected selection operation of the interactive content window by the user includes:
according to the detected selection operation on the interactive content window, expanding the next-level interactive content window of the selected interactive content window for display; the next-level interactive content window presents interactive information about the product corresponding to the selected interactive content window.
In an exemplary embodiment, before the expanding the next interactive content window of the selected interactive content window for display, the method further includes:
detecting whether the selected interactive content window is located at the center of the interactive content window group to which it belongs;
if so, directly expanding the next-level interactive content window of the selected interactive content window for display;
if not, first moving the selected interactive content window to the center position, and then expanding its next-level interactive content window for display.
In an exemplary embodiment, the expanding the next level of interactive content window of the selected interactive content window for display includes:
the display is performed as follows: in the interactive content window group to which the selected interactive content window belongs, the next-level interactive content window of the selected window partially occludes the windows adjacent to the selected window; and, for any two adjacent windows other than the selected one, the window closer to the center partially occludes the window farther from the center.
In an exemplary embodiment, the expanding the next interactive content window of the selected interactive content window for display further includes:
and displaying, in the interactive content window group to which the selected interactive content window belongs, the interactive content windows other than the selected one with a preset third display effect.
In an exemplary embodiment, after the expanding the next interactive content window of the selected interactive content window for display, the method further includes:
in response to detecting another user's selection of another interactive content window, detecting whether that interactive content window belongs to the same interactive content window group as the expanded one;
if so, issuing a prompt and not expanding the next-level interactive content window of the other interactive content window;
if not, expanding the next-level interactive content window of the other interactive content window for display.
In an exemplary embodiment, the interaction method further includes:
acquiring images of users in the preset interaction area;
sending the captured image of the user to a management platform, so that the management platform determines the user's attribute information from the image, either by itself or by sending the image to a customer relationship management system for customer identification;
Receiving a product to be recommended, which is determined by the management platform according to the attribute information of the user;
and adjusting the interactive content window displayed to the user according to the product to be recommended.
The embodiment of the application also provides a non-transitory computer readable storage medium, wherein the non-transitory computer readable storage medium stores computer executable instructions for executing the interaction method of any embodiment.
Through human-computer interaction, the method and the device can increase users' interest in exploring products and promote product publicity.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings are included to provide a further understanding of the technical solutions of the present application and constitute a part of this specification; together with the embodiments, they serve to explain the technical solutions of the present application and do not constitute a limitation thereof.
Fig. 1 is a schematic diagram of an interactive terminal provided in an embodiment of the present application;
FIG. 2a is a schematic diagram of an example of an interactive content window displayed in enlarged form based on a user's real-time location;
FIG. 2b is a schematic diagram of an example of adjusting a selected interactive content window position;
FIG. 2c is a schematic diagram of an example of expanding a next level interactive content window;
FIG. 3 is a schematic diagram of an interactive system provided by an embodiment of the present application;
FIG. 4 is a flow chart of an interaction method provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of the interactive system of example 1;
FIG. 6 is a schematic diagram of the interactive system in example 2;
FIG. 7 is a schematic diagram of an interactive system in example 3;
FIG. 8 is a schematic illustration of the interactive content window horizontally arranged display of example 4;
FIG. 9a is a schematic diagram of an interactive content window of example 4 displayed correspondingly enlarged according to real-time location;
FIG. 9b is a schematic diagram of the effect of the enlarged display of example 4 as a function of user movement;
FIG. 10 is a schematic diagram of a user sliding interactive content window in example 4;
FIG. 11 is a schematic diagram of a user clicking on an interactive content window in example 4;
fig. 12 is a schematic diagram of multiple users operating on different screens in example 5.
Detailed Description
Embodiments of the present application will be described in detail below with reference to the accompanying drawings. Embodiments may be implemented in a number of different forms; one of ordinary skill in the art will readily appreciate that their forms and contents may be altered without departing from the spirit and scope of the present application. Therefore, the present application should not be construed as being limited to the embodiments set forth below. The embodiments and features of the embodiments herein may be combined with each other arbitrarily provided there is no conflict.
In the drawings, the size of one or more constituent elements, thicknesses of layers or regions may be exaggerated for clarity. Accordingly, one aspect herein is not necessarily limited to that size, and the shapes and sizes of various components in the drawings do not reflect actual proportions. Furthermore, the drawings schematically illustrate ideal examples, and the implementations herein are not limited to the shapes or the numerical values and the like shown in the drawings.
The ordinal numbers of "first", "second", "third", etc. in this document are provided to avoid intermixing of constituent elements and are not intended to be limiting in terms of number. Herein, "plurality" means two or more than two numbers.
In this document, terms such as "middle", "upper", "lower", "front", "rear", "vertical", "horizontal", "top", "bottom", "inner" and "outer" describe positional relationships of the constituent elements with reference to the accompanying drawings. They are used only for convenience of description and do not indicate or imply that the apparatus or elements referred to must have a specific orientation or be constructed and operated in a specific orientation, and thus should not be construed as limiting. The positional relationships of the constituent elements change as appropriate according to the direction in which they are described, so the terms may be replaced as appropriate according to circumstances.
In this document, the terms "mounted", "connected" and "coupled" are to be construed broadly unless otherwise specifically indicated and defined. For example, a connection may be fixed, removable or integral; it may be mechanical or electrical; it may be direct, indirect through intermediate members, or internal communication between two elements. The meaning of these terms will be understood by those of ordinary skill in the art as appropriate.
In this context, "electrically connected" includes the case where constituent elements are connected together by an element having some electric action. The "element having a certain electric action" is not particularly limited as long as it can transmit an electric signal between the connected constituent elements. Examples of the "element having some electric action" include not only an electrode and a wiring but also a switching element such as a transistor, a resistor, an inductor, a capacitor, other elements having one or more functions, and the like.
As used herein, "parallel" refers to a state in which two straight lines form an angle of-10 ° or more and 10 ° or less, and thus, may include a state in which the angle is-5 ° or more and 5 ° or less. Further, "vertical" refers to a state in which an angle formed by two straight lines is 80 ° or more and 100 ° or less, and thus may include a state in which an angle is 85 ° or more and 95 ° or less.
By "about" herein is meant not strictly limited to numerical values which are within the limits of permitted process and measurement errors.
The embodiment of the application provides an interactive terminal, as shown in fig. 1, including:
a display 11, a detector 12 and a processor 13;
the display 11 is for displaying a plurality of interactive content windows divided into a plurality of interactive content window groups;
The detector 12 is used for detecting the real-time position and/or the selection operation of the interactive content window by the user; wherein, the real-time location includes: the real-time position of the user in the preset interaction area or the real-time position of the mouse pointer in the display area of the interaction content window;
the processor 13 is configured to control the display to adjust a display effect of the interactive content window related to the real-time position according to the detected real-time position, and/or control the display to adjust a display effect of the interactive content window group to which the selected interactive content window belongs according to the detected user selection operation of the interactive content window.
The interactive content window displayed by the display 11 may come from a management platform, for example, may be an interactive content window corresponding to a currently mainly recommended hot product; the management platform may periodically update the displayed interactive content window.
The preset interaction area may be an area located in front of the display 11 or the detector 12, and a distance between the preset interaction area and the display 11 or the detector 12 is within a predetermined value; users outside the interaction area may be considered to be unable to interact with the interactive terminal due to the large distance.
The content in the interactive content window can be an icon, a moving picture, a small view screen, a control and the like corresponding to the product.
According to the embodiment of the application, the search interest of the user for the product can be increased through a man-machine interaction mode, and the propaganda of the product is promoted; after a user selects a certain interactive content window, the display effect can be adjusted by taking the interactive content window group to which the interactive content window belongs as a whole, so that the interactive effect is more obvious, more convenient and better in interactive experience.
In an exemplary embodiment, the processor 13 controls the display to adjust the display effect of the interactive content window associated with the real-time location based on the detected real-time location, including:
the processor 13 controls the display 11, according to the real-time position detected by the detector 12, to display the interactive content window corresponding to the real-time position with a preset first display effect, and/or to display a first number of interactive content windows on both sides of that window with a preset second display effect.
In this embodiment, the first and second display effects may be any display effects different from the normal display. For example, the first display effect enlarges the window to a preset size, and the second display effect gradually reduces the display size from the middle toward both sides; when both are used, the interactive content window corresponding to the real-time position is enlarged to the preset size, and the windows on either side of it shrink one by one from that size until the normal display size is reached. The first display effect may also be a breathing effect that slowly alternates between zooming in and out, a jumping effect in which the position moves up and down, a rotating effect, and the like. Changing the display effect of the interactive content window corresponding to the real-time position of the user, or of the user-operated mouse pointer, makes the interaction more engaging and reminds the user to interact with or view that window.
In an example of this embodiment, as shown in fig. 2a, the local display area in the display includes five interactive content windows arranged in a horizontal direction in one interactive content window group, and in this example, each interactive content window group includes the same interactive content window, and each interactive content window group is also arranged in a horizontal direction. In fig. 2a only one complete set of interactive content windows 50 is shown, wherein interactive content windows 51, 52, 53, 54, 55 are in sequence from left to right; the remaining interactive content window groups are not fully shown; the dashed box in fig. 2a is used to indicate the extent of the set of interactive content windows, which are not actually displayed.
Assuming that the user is currently standing in front of the interactive content windows 52 in the set of interactive content windows 50, the interactive content windows 52 in the set are enlarged for display, and the interactive content windows 51 and 53 in the set are correspondingly enlarged, but do not exceed the display size of the interactive content windows 52.
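As a hypothetical illustration of the enlargement pattern in fig. 2a, the sizes of the windows in a group could be computed as follows (the concrete pixel values and the shrink step are assumptions, not values from the patent):

```python
# Hypothetical sizing for the first/second display effects described above:
# the focused window gets a preset enlarged size, and its neighbours shrink
# one step per position until reaching the normal display size.

def window_sizes(n, focus, normal=100, enlarged=180, step=40):
    """Return a display size for each of n windows in a group when the
    window at index `focus` is in front of the user. Sizes never drop
    below the normal display size."""
    return [max(normal, enlarged - step * abs(i - focus)) for i in range(n)]
```

With five windows and the second window focused, as in the example, this gives [140, 180, 140, 100, 100]: the neighbours are enlarged too, but never beyond the focused window.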
In one implementation of this embodiment, the displaying the plurality of interactive content windows by the display 11 includes:
the display 11 displays a plurality of interactive content windows in a manner of being arranged along a first direction;
The detector 12 may comprise a radar or infrared detector for locating the user in the interaction area to obtain the real-time position of the user.
The first direction may be a horizontal direction, a vertical direction, a diagonal direction, or the like.
The detector 12 may also use a camera, an ultrasonic detector, or the like to obtain the real-time position of the user.
In this embodiment, the display may include a display panel for displaying the interactive content windows. Arrangement along the horizontal direction may mean that the interactive content windows are distributed in sequence along a transverse axis (generally a horizontal line) of the display panel. The windows may be arranged in a straight line, i.e., each interactive content window has the same longitudinal position; they may also be arranged along a curve, with some windows positioned vertically higher and some lower. Similarly, arrangement along a vertical direction may mean that the interactive content windows are distributed in sequence along the longitudinal axis of the display panel.
In this embodiment, taking horizontally arranged interactive content windows as an example, the interactive content window corresponding to the real-time position of the user is determined as follows: take the transverse axis of the display panel (or the extension line of either of its upper and lower edges) as the horizontal axis of a plane rectangular coordinate system, take the longitudinal axis of the display panel (or the extension line of either of its left and right edges) as the vertical axis, and take any corner of the display panel as the origin. Determine the abscissa (when the real-time position of the user is treated as a point) or the abscissa range (when it is treated as an area) of the projection of the detected real-time position in this coordinate system; the interactive content window whose display area contains that abscissa, or lies within that abscissa range, is the interactive content window corresponding to the real-time position of the user. If the abscissa range covers two or more interactive content windows, the window with the largest display area within the range may be taken as the corresponding window. For example, if the display panel is mounted on a wall perpendicular to the ground, the interactive content window corresponding to the real-time position of the user is the window in the display area directly facing the user.
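The coordinate-based lookup described above can be sketched as follows; the (left, right) abscissa ranges and the helper names are hypothetical, not part of the embodiment.

```python
def window_for_abscissa(windows, x):
    """windows: list of (left, right) abscissa ranges of each window's
    display area in the panel's coordinate system. Returns the index of
    the window whose range contains the projected abscissa x."""
    for i, (left, right) in enumerate(windows):
        if left <= x < right:
            return i
    return None


def window_for_abscissa_range(windows, lo, hi):
    """When the user's real-time position projects to an abscissa range
    [lo, hi], pick the window with the largest display area inside that
    range (here approximated by the largest horizontal overlap)."""
    best, best_overlap = None, 0.0
    for i, (left, right) in enumerate(windows):
        overlap = min(right, hi) - max(left, lo)
        if overlap > best_overlap:
            best, best_overlap = i, overlap
    return best
```

With three windows spanning abscissas 0-100, 100-200 and 200-300, a point position at 150 maps to the middle window, and an area projecting to 90-160 also selects the middle window because most of that range falls inside it.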
In another implementation of this embodiment, the detector 12 may include a touch pad, mouse, keyboard, track point, or the like.
In this embodiment, the display 11 may be displayed in any arrangement, for example, in a manner of being arranged along the first direction, in a checkerboard arrangement, in a circular arrangement, or the like.
In one example of the present embodiment, the detector 12 is a touch pad, and the display 11 may include a display panel and a display driving unit. The touch pad may be disposed on the surface of the display panel to form a touch screen; alternatively, the touch pad may be configured so that the display area and the touch area are independent, in which case the track of the user's finger or stylus moving on the touch pad is acquired and sent to the display driving unit, which displays it correspondingly as the movement of the mouse pointer on the display panel.
Similarly, when the detector 12 is a mouse, a keyboard, a pointing stick, or the like, the moving direction and distance of the mouse pointer can be detected, so as to determine the real-time position of the mouse pointer.
In this embodiment, the interactive content window corresponding to the real-time position of the mouse pointer may refer to the interactive content window whose occupied display area coincides with the real-time position of the mouse pointer.
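Extended to two dimensions for a mouse pointer, the lookup becomes a simple hit test against each window's display rectangle; the (x, y, width, height) tuples below are a hypothetical representation.

```python
def window_at_pointer(rects, px, py):
    """rects: list of (x, y, w, h) display rectangles, one per window.
    Returns the index of the first window whose occupied display area
    contains the pointer position (px, py), or None if no window does."""
    for i, (x, y, w, h) in enumerate(rects):
        if x <= px < x + w and y <= py < y + h:
            return i
    return None
```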
In an exemplary embodiment, the processor 13 controls the display to adjust the display effect of the interactive content window group to which the selected interactive content window belongs according to the detected selection operation of the interactive content window by the user, including:
the processor 13 controls the display 11 to expand the next-level interactive content window of the selected interactive content window for display according to the selection operation detected by the detector 12; the content of the next-level interactive content window is the interactive information of the product corresponding to the selected interactive content window.
In this embodiment, the content of the initially displayed interactive content window may be thumbnail information of a product, such as a representative picture of the product; the interactive information of the product may be detailed information with an interactive function, such as a 360-degree panoramic photo of the product whose display angle can be changed and which can be enlarged, reduced, or repositioned following the user's operation. Alternatively, the initially displayed interactive content window may correspond to a major product category containing a plurality of products; when that window is selected, the next-level interactive content window contains the interactive content windows of all or some of those products. For example, credit cards may be a product category: after the interactive content window corresponding to this category is selected, the next-level interactive content window displays an interactive content window for each of a plurality of different credit cards, and the user can select one of them to view its own next-level window, whose content may be the detailed information of that credit card with an interactive function.
In this embodiment, optionally, the processor is further configured to detect, before controlling the display to expand a next level of the selected interactive content window to display, whether the selected interactive content window is located in a central position of the group of interactive content windows to which the selected interactive content window belongs;
if yes, controlling the display to expand the next interactive content window of the selected interactive content window for display;
if not, firstly adjusting the selected interactive content window to the central position, and then controlling the display to expand the next interactive content window of the selected interactive content window for display.
In the alternative scheme, if the interactive content window selected by the user is not positioned at the middle position in the group, the selected interactive content window is firstly adjusted to the middle position in the group, and then the next interactive content window is unfolded for display; if the interactive content window is located at the middle position in the group, the next-level interactive content window can be directly unfolded for display. In one example, assuming that the user selects an interactive content window 52 in the interactive content window group 50 on the basis of the situation shown in fig. 2a, the interactive content window 52 is moved to the position of the original interactive content window 53 in the group, i.e. to the middle position of the interactive content window group 50, as shown in fig. 2b, and then the next interactive content window of the interactive content window 52 is presented, as shown in fig. 2 c.
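The centre-first behaviour can be sketched as a list operation: pop the selected window and reinsert it at the middle index, shifting the windows in between by one. The function name and window ids are illustrative.

```python
def move_to_center(group, selected_idx):
    """Move the selected window to the centre of its group before its
    next-level window is expanded; windows in between shift by one."""
    center = len(group) // 2
    group = list(group)  # leave the caller's list untouched
    if selected_idx != center:
        group.insert(center, group.pop(selected_idx))
    return group
```

With the group [51, 52, 53, 54, 55] of fig. 2a and window 52 selected, this yields [51, 53, 52, 54, 55]: window 52 now occupies the middle position, as in fig. 2b, while a window already at the centre is left where it is.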
In this embodiment, after the next-level interactive content window is selected, if it in turn has a next-level interactive content window, that window may continue to be displayed, and so on. If the content of the next-level interactive content window is detailed information with an interactive function, such as a 360-degree panoramic photo, the detected operations may include rotating the photo, moving it in different directions, modifying the display angle or position, zooming, and the like; for example, if the user swipes down, the next page of the detailed information can be displayed.
In this embodiment, the next-level interactive content window may be located in a display area where no other interactive content window is located, and if the next-level interactive content window is located in a display area where another interactive content window is located, the interactive content window blocked by the next-level interactive content window may be moved to another position.
In one implementation of this embodiment, the processor controlling the display to expand a next level of the selected interactive content window for display includes:
the processor controls the display to display as follows: in the group of interactive content windows to which the selected interactive content window belongs, the next interactive content window part of the selected interactive content window shields the interactive content window adjacent to the selected interactive content window, and the interactive content window part close to the middle position shields the interactive content window far away from the middle position in any two adjacent interactive content windows except the selected interactive content window.
As shown in fig. 2c, in an example of this embodiment, the next-level interactive content window 521 of the selected interactive content window 52 is displayed to the user as the only fully visible window in the interactive content window group 50; in this example, the next-level interactive content window 521 may include interactive content windows 522, 523 and 524, so that the user can see the next level at a glance. The other interactive content windows in the group 50 are partially displayed in a sequentially stacked manner: interactive content windows 53 and 54 are partially blocked by the next-level window of window 52, and they in turn partially block interactive content windows 51 and 55, respectively, so that the user can still conveniently select the other windows in the group.
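The stacking rule can be expressed as a back-to-front drawing order: windows farther from the middle are drawn first (and are therefore blocked by nearer ones), and the selected window, with its expanded next level, is drawn last so it stays fully visible. A sketch under these assumptions:

```python
def stacking_order(group, selected_idx):
    """Return the group's windows sorted back-to-front. Windows far
    from the centre are drawn first and thus partially blocked by the
    windows nearer the centre; the selected window is drawn last, so
    its next-level window is the only fully visible one."""
    center = len(group) // 2
    order = sorted((i for i in range(len(group)) if i != selected_idx),
                   key=lambda i: -abs(i - center))
    order.append(selected_idx)
    return [group[i] for i in order]
```

For the group of fig. 2c, [51, 53, 52, 54, 55] with window 52 at the centre, the outermost windows 51 and 55 are drawn first, then 53 and 54, and finally window 52 on top.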
In this embodiment, optionally, the processor controlling the display to expand the next interactive content window of the selected interactive content window to display further includes:
the processor controls the display to display the interactive content windows except the selected interactive content window according to a preset third display effect in the interactive content window group affiliated to the selected interactive content window.
In this alternative, a contrasting change is used to highlight the next-level interactive content window of the window selected by the user: for example, the unselected interactive content windows are shrunk or dimmed, so that the next-level window stands out more clearly and is easier for the user to view.
In this embodiment, if other interactive content windows were selected earlier and one or more next-level interactive content windows were correspondingly expanded, then before displaying the next-level window of the currently selected interactive content window, the previously displayed next-level windows may first be closed, i.e., only one next-level interactive content window is kept in the display area. Alternatively, the previously displayed next-level windows may be retained, i.e., the next-level windows of several different interactive content windows may coexist in the display area, displayed either tiled or stacked. When stacked, an operation at a given position is treated as an operation on the frontmost next-level interactive content window; when tiled, different windows can interact with different users respectively, so that multiple users can perform interactive operations at the same time, improving usage efficiency.
In an alternative of this embodiment, a window may be fixed as the next interactive content window of each selected interactive content window, that is, after each interactive content window is selected, the display content in this fixed window may be changed accordingly, and become the display content of the next interactive content window of the selected interactive content window.
In this embodiment, the opened next-level interactive content window may be closed after the user performs no operation for a period of time, may be closed after the user performs another operation, or may be closed manually by the user. Optionally, it is detected whether a user is present at the position in front of the previously opened next-level interactive window; if no user is continuously detected for longer than a threshold time, the next-level window is closed, while if a user is continuously detected, or the duration without a detected user is less than the threshold time, the window is kept open.
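The presence-based auto-close can be sketched as a small tracker fed with detection samples; the class interface and the threshold value are assumptions for illustration.

```python
class IdleCloser:
    """Decide when to close an opened next-level window: close only
    after no user has been detected in front of it for longer than the
    threshold time. The default threshold is an illustrative value."""

    def __init__(self, threshold_s=10.0):
        self.threshold_s = threshold_s
        self.absent_since = None  # timestamp when the user disappeared

    def update(self, user_present, now_s):
        """Feed one detection sample; returns True when the window
        should be closed."""
        if user_present:
            self.absent_since = None  # user is back: keep the window open
            return False
        if self.absent_since is None:
            self.absent_since = now_s
        return (now_s - self.absent_since) > self.threshold_s
```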
In one implementation of this embodiment, the detector 12 includes a touch pad, which may be disposed on the surface of the display panel to form a touch screen, or may be a touch pad independent of the display panel. The user can select an interactive content window directly by tapping or long-pressing it on the touch screen, or select the interactive content window corresponding to the real-time position of the mouse pointer by tapping or long-pressing on the independent touch pad. Alternatively, the detector 12 may include an infrared detector, a camera, or the like, so that the user can select an interactive content window by mid-air gesture control, eye control, or the user's real-time position; for example, when the real-time position of the user is detected to remain unchanged for a certain time, the interactive content window corresponding to that position may be taken as the one selected by the user.
In another implementation of the present embodiment, the detector 12 includes a mouse, keyboard, track point, or the like; the user can select the interactive content window corresponding to the real-time position of the mouse pointer by clicking or double clicking the mouse key, pressing the carriage return on the keyboard, pressing the pointing stick and the like.
In one implementation of this embodiment, the processor may be further configured to:
after controlling the display to expand the next-stage interactive content window of the selected interactive content window for display, responding to the detector to detect the selection operation of other users on other interactive content windows, and detecting whether the other interactive content windows and the selected interactive content window belong to the same interactive content window group;
if yes, controlling the display to send out a prompt, and not controlling the display to expand the next interactive content window of the other interactive content windows;
and if not, controlling the display to expand the next interactive content window of the other interactive content windows for displaying.
In this embodiment, it can be ensured that the current user's operation is not disturbed: if another user selects an interactive content window in the same interactive content window group as the current user, the selection operation is ignored; if the window is selected in a different group, its next-level interactive content window may be opened.
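This conflict rule can be sketched as a group-membership check; the window and group identifiers, function name, and string results are illustrative assumptions.

```python
def handle_second_selection(groups, active_window, other_window):
    """groups: list of lists of window ids. While active_window's next
    level is expanded, a selection of other_window in the same group
    only triggers a prompt; in a different group it is expanded."""
    def group_index(window):
        for gi, group in enumerate(groups):
            if window in group:
                return gi
        return None

    if group_index(other_window) == group_index(active_window):
        return "prompt"   # same group: alert, do not expand
    return "expand"       # different group: expand its next level
```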
The display may issue the prompt by rendering the interactive content window selected by the other user with a display effect such as shaking, or a prompt sound may be issued through a built-in speaker.
Wherein it can be identified by a camera or other detector whether the current user or other user is selecting other interactive content windows.
In an exemplary embodiment, the processor 13 may be further configured to control the display 11 to adjust the display position of the corresponding interactive content window according to the moving operation of the interactive content window detected by the detector 12.
In this embodiment, if the interactive content windows are displayed in a certain order, adjusting the display position of the interactive content windows may change the display order of the interactive content windows; adjusting the display position of the interactive content window may also refer to moving the interactive content window to other positions of the display area, such as from the center of the display area to the upper left corner.
In one implementation of the present embodiment, the detector 12 includes a touch pad; the user can directly press the interactive content window on the touch screen and slide, or slide after long pressing the interactive content window, or click other positions after long pressing the interactive content window, or select the interactive content window on the independent touch pad first and then slide to move the selected interactive content window.
In another implementation of the present embodiment, the detector 12 includes a mouse, keyboard, track point, or the like; the user may first select an interactive content window and then move the selected interactive content window by dragging a mouse, or pressing a directional key on a keyboard, or pushing a track point, etc.
In an exemplary embodiment, the interactive terminal further includes:
the camera is used for acquiring images of users in a preset interaction area;
the processor 13 is further configured to determine a product to be recommended according to the acquired image of the user; the control display 11 adjusts the interactive content window displayed to the user according to the product to be recommended.
The embodiment can display different products by identifying different users, so that the product recommendation is more accurate.
In this embodiment, the captured image of the user may be a photograph, or may be a video or a screenshot in a video. The products to be recommended may be in the form of entities or services.
In this embodiment, if there are multiple users in the interaction area, when determining the product to be recommended, the recommendation may be performed according to the image of the user that enters the interaction area first among the multiple users, or the recommendation may be performed according to the image of the user that operates the interaction content window first among the multiple users. Or, determining a product to be recommended for each user, and adjusting the displayed interactive content window according to the product to be recommended corresponding to the user when the user is currently detected to operate.
In this embodiment, the processor may determine the attribute information of the user from the user's image and determine the product to be recommended according to that attribute information, or it may obtain the product to be recommended by sending the user's image to the management platform and/or a CRM (Customer Relationship Management) system.
The management platform can process the image of the user, such as performing operations of feature extraction, face recognition and the like on the image of the user, and obtain attribute information of the user, such as age, gender and the like of the user according to the extracted or recognized result. The management platform can directly determine the product corresponding to the attribute information of the user according to the corresponding relation between the attribute information and the product which are stored in advance locally or the corresponding rule, and the product is used as the product to be recommended and sent to the interactive terminal; or the management platform can send the obtained attribute information of the user to the CRM system, the CRM system determines the product to be recommended according to the pre-stored corresponding relation or corresponding rule, and the product to be recommended is sent to the interaction terminal by the management platform.
In one implementation of this embodiment, the processor 13 controls the display 11 to adjust the interactive content window displayed to the user according to the product to be recommended, including one or more of the following ways:
the display 11 is controlled to replace all or part of the displayed interactive content windows with the interactive content window corresponding to the product to be recommended;
in response to a user selection operation of the interactive content window, the display 11 is controlled to expand the interactive content window corresponding to the next level of the interactive content window corresponding to the product to be recommended for display.
In this embodiment, the next-level interactive content windows may comprise multiple groups; when there is a product to be recommended, only the group or groups of next-level interactive content windows corresponding to that product may be displayed. The processor 13 may pre-store the correspondence between products and interactive content windows. For example, if interactive content window A contains the next-level interactive content windows of multiple credit cards, then after the user selects window A, only the next-level windows of the credit cards to be recommended to this user are expanded and displayed.
In an exemplary embodiment, the display 11 may include a plurality of display panels, which correspond one-to-one with a plurality of interactive content window groups.
In this embodiment, for example, if the display 11 is a 1×6 spliced screen, one interactive content window group may be displayed on each of the 6 screens. The number of interactive content window groups can be set equal to the number of display panels in the display, so that the display panels and the interactive content window groups correspond one-to-one, making it easier for the user to distinguish the different groups clearly and conveniently during interaction.
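Dividing the windows to display into one group per panel can be sketched as simple contiguous chunking; this assumes the window count divides evenly across the panels, and the function name is hypothetical.

```python
def split_into_groups(windows, num_panels):
    """Split the windows to display into num_panels contiguous groups
    so that display panels and interactive content window groups
    correspond one-to-one."""
    size = -(-len(windows) // num_panels)  # ceiling division
    return [windows[i:i + size] for i in range(0, len(windows), size)]
```

Twelve windows on a 1×6 spliced screen yield six groups of two windows each, one group per panel.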
In one implementation of the present embodiment, the display 11 may include a plurality of spliced screens;
the processor 13 may control the display 11 to adjust the display effect of the interactive content window related to the real-time position in the screen in response to the real-time position detected by the detector 12 for any one of the screens, and/or control the display to adjust the display effect of the interactive content window group to which the selected interactive content window in the screen belongs according to the detected selection operation of the interactive content window in any one of the screens by the user.
In this embodiment, the interactive content window groups displayed on the multiple screens may be completely identical, partially identical, or completely different. For example, with three screens, the interactive content windows to be displayed can be taken as one group and that same group displayed on each of the three screens, i.e., the three screens display the same windows; alternatively, the windows can be divided into three groups displayed on the three screens respectively, with the windows displayed continuously along a certain direction, for example along the direction in which the screens are arranged; at this point the three screens can be regarded as one large screen spliced into a whole that jointly displays the interactive content windows.
In an example of this embodiment, the interactive content window corresponding to the real-time position is enlarged according to the real-time position of the user. For example, the display 11 includes a screen A, a screen B, and a screen C: if user A stands in front of screen A and user B stands in front of screen B, the interactive content window in screen A corresponding to user A's real-time position is enlarged and displayed, and the interactive content window in screen B corresponding to user B's real-time position is enlarged and displayed; no user stands in front of screen C, so the interactive content windows in it remain unchanged.
In yet another example of the present embodiment, the display effect is changed accordingly according to the operation on the interactive content window, for example, in the above example, when the user a selects one interactive content window in the screen a, the next interactive content window of the interactive content window is expanded in the screen a; the user B slides an interactive content window in the screen B, and the position of the slid interactive content window in the screen B is correspondingly changed; screen C has no user operation and the interactive content window therein remains unchanged.
In an exemplary embodiment, the interactive terminal further includes: the camera is used for collecting videos of users in a preset interaction area;
the processor 13 is further configured to determine, as the first user, a user who first enters the interactive area or performs an operation on the interactive content window according to the acquired video;
the processor 13 controls the display to adjust the display effect of the interactive content window related to the real-time position only in response to the real-time position of the first user detected by the detector 12; the real-time positions of other users either do not affect the display effect, or trigger an adjustment to a further display effect different from the adjustment made for the first user's real-time position;
and/or the processor 13 controls the display to adjust the display effect of the interactive content window group to which the selected interactive content window belongs only in response to the detected selection operation of the first user; the selection operations of other users either do not affect the display effect, or trigger an adjustment different from the adjustment made for the first user's selection operation.
In this embodiment, if only one user is in the interaction area, the user may be directly used as the first user, and then the user entering the interaction area is used as the other user; alternatively, once the user's operation on the interactive content window is detected, the user is taken as the first user.
In this embodiment, if the determined first user leaves the interaction area or stops operating, and the interaction area still has a plurality of users remaining, the first user of the plurality of users entering the interaction area or the first user operating the interactive content window may be determined as the first user.
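Tracking which user currently holds priority can be sketched with an arrival-ordered list: the head of the list is the first user, and when that user leaves, priority passes to the earliest remaining user. The class and method names are assumptions.

```python
class FirstUserTracker:
    """Keep the users currently in the interaction area in order of
    arrival; the head of the list is the 'first user' whose position
    and operations the display responds to."""

    def __init__(self):
        self.arrivals = []  # user ids, in order of entering the area

    def enter(self, user_id):
        if user_id not in self.arrivals:
            self.arrivals.append(user_id)

    def leave(self, user_id):
        if user_id in self.arrivals:
            self.arrivals.remove(user_id)

    def first_user(self):
        return self.arrivals[0] if self.arrivals else None
```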
This embodiment can prevent users from interfering with one another in the multi-user case. For example, after user A enters the interaction area, the corresponding interactive content window can be enlarged following user A's real-time position; when user B then enters the interaction area, the interactive content windows are not correspondingly enlarged according to user B's real-time position. As another example, if user A clicks one interactive content window, its next-level window is expanded for display; if user B clicks another interactive content window, either no response is made, or that window merely shakes without its next-level window being opened.
In one implementation of this embodiment, the display 11 includes a plurality of spliced screens, each having a respective camera; one first user may be individually determined for each screen that is responsive only to real-time locations or operations detected for the first user, and is not responsive or responds with additional display effects to real-time locations or operations detected for other users.
The embodiment of the application also provides an interaction system, as shown in fig. 3, including the interaction terminal 1 in any of the embodiments above, and the management platform 2.
The management platform 2 is used for providing an interactive content window to be displayed for the interactive terminal 1 to display.
The management platform 2 may organize and pick the interactive content windows to be displayed, and perform operations such as adding, modifying, deleting, and sorting on the interactive content windows. In addition, the management platform 2 may also manage a hierarchical relationship between the interactive content windows, so that when the user selects an interactive content window, a next interactive content window of the selected interactive content window may be provided to the interactive terminal 1 for display. Alternatively, the hierarchical relationship and the interactive content windows of each level may be stored locally in the interactive terminal 1, so that the interactive terminal 1 may automatically expand the interactive content window of the next level of the selected interactive content window.
In an exemplary embodiment, the interactive terminal 1 further includes a camera, which is used for acquiring an image of a user in the interactive area and sending the image to the management platform 2;
the management platform 2 is also used for determining attribute information of the user by self or through interaction with a customer relationship management system according to the acquired image of the user; determining a product to be recommended according to the attribute information of the user, and sending the product to the processor 13;
The processor 13 is further configured to control the display to expand a next interactive content window of the selected interactive content window corresponding to the product to be recommended for display in response to detecting the selection operation of the interactive content window.
In this embodiment, the corresponding relationship between different attribute information and the product to be recommended may be reserved in advance in the management platform 2, or a rule for determining the product to be recommended according to the attribute information may be preset, and the management platform 2 determines the product to be recommended according to the attribute information of the user based on the corresponding relationship or rule.
The embodiment of the application also provides an interaction method, as shown in fig. 4, comprising steps S110-S130:
S110, displaying a plurality of interactive content windows;
S120, detecting a real-time position and/or a selection operation of a user on an interactive content window; the real-time position is the real-time position of a user in a preset interaction area or the real-time position of a mouse pointer in a display area of an interactive content window;
S130, adjusting the display effect of the interactive content window related to the real-time position according to the detected real-time position, and/or adjusting the display effect of the interactive content window group to which the selected interactive content window belongs according to the detected selection operation of the user on the interactive content window.
The embodiment of the application provides an interaction method that attracts users to actively explore products through human-machine interaction, improving the promotional effect of the products. After a user selects an interactive content window, the display effect can be adjusted by treating the interactive content window group to which that window belongs as a whole, making the interactive effect more noticeable and the interactive experience more convenient and engaging.
In this embodiment, steps S120 and S130 may be performed multiple times, for example, after adjusting the display effect of the interactive content window according to the real-time position, if the selection operation for the interactive content window is detected again, the adjustment may be continued on the display effect of the interactive content window group to which the selected interactive content window belongs.
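Steps S110–S130 can be sketched as a simple event dispatch; the event and action names below are hypothetical stand-ins for the detector hardware and display driver, not from the patent:

```python
# Illustrative, simplified dispatch for steps S110-S130. A detected
# event (S120) is mapped to a display adjustment (S130); actual
# radar/touch-pad/display interfaces would replace these dicts.
def dispatch(event: dict) -> dict:
    """Map a detected event (S120) to a display adjustment (S130)."""
    if event["kind"] == "position":
        # Real-time position detected: adjust the display effect of the
        # interactive content window related to that position.
        return {"action": "adjust_window", "window": event["window"]}
    if event["kind"] == "select":
        # Selection detected: adjust the display effect of the whole
        # group to which the selected window belongs.
        return {"action": "adjust_group", "group": event["group"],
                "selected": event["window"]}
    return {"action": "none"}
```

Running this dispatch in a loop matches the note that S120 and S130 may be performed multiple times, each new detection producing a new adjustment.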
In an exemplary embodiment, the adjusting the display effect of the interactive content window related to the real-time position according to the detected real-time position includes:
displaying, according to the detected real-time position, the interactive content window corresponding to the real-time position with a preset first display effect;
and/or displaying a first number of interactive content windows located on both sides of the interactive content window corresponding to the real-time position with a preset second display effect.
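One way to realize the first and second display effects (enlarging the focused window while a first number of windows on each side taper back to the default size, as example 4 later describes) is sketched below; the function name, scale values, and taper rule are illustrative assumptions:

```python
# Hypothetical sketch: per-window scale factors for the first display
# effect (enlarged focus window) and second display effect (windows on
# both sides tapering back to default size). Values are illustrative.
def display_scales(num_windows: int, focus: int,
                   side_count: int = 2, focus_scale: float = 1.5) -> list:
    """Return a scale factor per window (1.0 = default size)."""
    scales = [1.0] * num_windows
    scales[focus] = focus_scale
    step = (focus_scale - 1.0) / (side_count + 1)
    for offset in range(1, side_count + 1):
        scale = focus_scale - offset * step  # taper toward default size
        for i in (focus - offset, focus + offset):
            if 0 <= i < num_windows:
                scales[i] = scale
    return scales
```

As the detected real-time position moves, recomputing the scales with a new `focus` index makes the enlarged window follow the user, as described in step 304 of example 4.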
In an exemplary embodiment, the adjusting the display effect of the interactive content window group to which the selected interactive content window belongs according to the detected selection operation of the interactive content window by the user includes:
according to the detected selection operation of the interactive content window, expanding the next-stage interactive content window of the selected interactive content window for display; the next-stage interactive content window is the interactive information of the product corresponding to the selected interactive content window.
In this embodiment, optionally, before the expanding the next interactive content window of the selected interactive content window for display, the method further includes:
detecting whether the selected interactive content window is positioned at the center of the affiliated interactive content window group;
if yes, directly performing the operation of expanding the next-stage interactive content window of the selected interactive content window for display;
if not, firstly adjusting the selected interactive content window to the central position, and then performing the operation of expanding the next-stage interactive content window of the selected interactive content window for display.
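The center-check-then-expand logic above can be sketched as follows; the operation tuples and function name are hypothetical:

```python
# Hypothetical sketch of the centering check before expansion: if the
# selected window is not at the center of its group, move it there
# first, then expand its next-stage window.
def expand_with_centering(group: list, selected) -> list:
    """Return the ordered operations for expanding `selected`.

    `group` is the list of window ids in the group the selected
    window belongs to; `selected` is one of those ids.
    """
    center = len(group) // 2
    ops = []
    if group.index(selected) != center:
        # Not at the center position: adjust it to the center first.
        ops.append(("move_to_center", selected))
    ops.append(("expand_next_level", selected))
    return ops
```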
In one implementation manner of this embodiment, the expanding the next interactive content window of the selected interactive content window for display may include:
The display is performed as follows: in the interactive content window group to which the selected interactive content window belongs, the next-stage interactive content window of the selected interactive content window partially occludes the interactive content windows adjacent to the selected interactive content window; and, of any two adjacent interactive content windows other than the selected one, the window closer to the middle position partially occludes the window farther from the middle position.
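This occlusion order amounts to a stacking order where z-index decreases with distance from the middle and the expanded window sits on top; a minimal sketch, with illustrative z-index values:

```python
# Hypothetical sketch of the occlusion (stacking) order described
# above: windows nearer the middle occlude those farther away, and the
# expanded next-stage window of the selected (centered) window is on
# top of everything. Larger z-index draws on top.
def z_order(num_windows: int, selected: int) -> list:
    """Return a z-index per window; larger values draw on top."""
    middle = selected  # the selected window sits at the middle position
    z = [0] * num_windows
    for i in range(num_windows):
        # Windows closer to the middle get larger z-indices.
        z[i] = num_windows - abs(i - middle)
    z[selected] = num_windows + 1  # expanded window occludes its neighbors
    return z
```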
In this embodiment, optionally, expanding the next interactive content window of the selected interactive content window for display further includes:
and displaying, in the interactive content window group to which the selected interactive content window belongs, the interactive content windows other than the selected one according to a preset third display effect.
In one implementation manner of this embodiment, after the next level interactive content window of the selected interactive content window is expanded for display, the method may further include:
in response to detecting selection operations of other users on other interactive content windows, detecting whether the other interactive content windows and the expanded interactive content windows belong to the same interactive content window group;
If yes, sending out a prompt, and not expanding the next interactive content window of the other interactive content windows;
if not, the next-stage interactive content window of the other interactive content window is expanded for display.
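The same-group check above reduces to a small decision function; group identifiers and action names here are hypothetical:

```python
# Hypothetical sketch of the multi-user conflict rule: if another user
# selects a window in the group that is already expanded, prompt and do
# not expand; if the window is in a different group, expand it.
def handle_other_selection(expanded_group: str, other_window_group: str) -> str:
    """Decide how to respond when another user selects another window."""
    if other_window_group == expanded_group:
        # Same group as the already-expanded window: send out a prompt,
        # and do not expand the other window's next-stage window.
        return "prompt"
    # Different group: expand the other window's next-stage window.
    return "expand"
```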
In an exemplary embodiment, the interaction method further comprises:
acquiring images of users in the preset interaction area;
sending the acquired image of the user to a management platform, so that the management platform can determine the attribute information of the user according to the image, or send the image to a customer relationship management system for customer identification to determine the attribute information of the user;
receiving a product to be recommended, which is determined by the management platform according to the attribute information of the user;
and adjusting the interactive content window displayed to the user according to the product to be recommended.
In one implementation of this embodiment, adjusting the interactive content window displayed to the user according to the product to be recommended may include one or more of the following:
replacing all or part of the displayed interactive content window with the interactive content window corresponding to the product to be recommended;
and in response to the selection operation of the user on the interactive content window, expanding a next interactive content window of the selected interactive content window, which corresponds to the product to be recommended.
The following describes embodiments of the present application with five examples.
Example 1
The present example provides an interactive system for a financial journey that may be deployed at a banking outlet or other financial institution. A financial journey may include a plurality of financial products that are associated with one another, so that while learning about one financial product a user may be made aware of the others; for example, a quick loan service may be mentioned in the details of a deposit card, guiding the user to learn about the quick loan service while viewing the deposit card product.
The financial product may be in the form of either an entity, such as a credit card, or a service, such as an account security service.
As shown in fig. 5, the interactive system includes: an interactive terminal 1 and a management platform 2.
In this example, the interactive terminal 1 may include a display 11, a radar 121, a touch pad 122, a processor 13, and a camera 14.
The display 11 is configured to display a plurality of interactive content windows of the financial journey provided by the management platform 2, the plurality of interactive content windows being divided among different groups of interactive content windows; the display 11 may include a display panel and a display driving unit, where the display driving unit is used to drive the display panel to display. The display panel can be an independent screen or a spliced screen; such as but not limited to LCD tiled screens, or OLED tiled screens, or LED screens, etc. The interactive content window groups are in one-to-one correspondence with the screens, namely, each screen displays one interactive content window group respectively.
A horizontal 1×6 spliced screen is adopted in this example; in practical applications, other screens or other splicing arrangements may be adopted according to the actual environment, requirements, and the like. The display panel may be mounted on a wall, display board, or other object perpendicular to the ground, so that a user facing the screen can see the displayed interactive content windows.
The radar 121 is used for detecting the real-time position of the user in front of the screen based on radar technology; the radar 121 may locate a user in the interaction region by transmitting a probe wave to the interaction region and detecting the reflected echo.
The touch pad 122 is used to detect user operations on the interactive content window, such as, but not limited to, selection, movement, etc. In this example, the touch pad 122 may be a transparent plate, and is disposed on the surface of the display panel, and forms a touch screen together with the display panel; in response to touch operations such as clicking and sliding performed by the user on the touch pad 122, the touch pad 122 may sample the touch operation of the user, obtain a track or an operation instruction corresponding to the touch operation of the user, and send the track or the operation instruction to the display driving unit.
The processor 13 is configured to correspondingly control the display 11 to adjust the displayed interactive content window according to the detection result of the radar 121 and/or the touch operation detected by the touch pad 122. For example, among a plurality of interactive content windows currently displayed, an interactive content window corresponding to the real-time position is determined according to the real-time position of the user, the determined interactive content window is enlarged, or the determined interactive content window is displayed by adopting other display effects. For another example, when it is detected that the user clicks a certain displayed interactive content window, a next interactive content window of the interactive content window is expanded to display for viewing and touch operation by the user.
The camera 14 is used for image acquisition of the user in the interaction area and is provided to the management platform 2.
In this example, the management platform 2 includes a content management module 21, an image processing module 22, and a content recommendation module 23.
The content management module 21 is used for storing the interactive content window of the financial journey to be displayed and determining the interactive content window to be displayed, which is sent to the interactive terminal 1; it may also be used to add, delete, modify, sort, etc. interactive content windows for financial tours. The interactive content window corresponding to a financial product can comprise a representative picture of the product, and the interactive content window can be unfolded to display interactive information of the product after being selected; the interactable information of the product can be details of the financial product (such as detailed graphic introduction, product video, AR content of the product, etc.), or can be interactive content windows of a plurality of types of sub-products covered by the financial product. In this example, after the user selects a certain interactive content window (i.e., after the picture of the corresponding product is clicked), the details of the product or a more specific classification of the product may be displayed for the user to further learn about the product or select a more specific product.
The image processing module 22 is used for determining attribute information of the user according to the image of the user acquired by the camera 14; in this example, the user may be identified from an image of the user, and attribute information such as age and sex of the user may be determined.
The content recommendation module 23 is configured to determine a product to be recommended according to the attribute information of a user, and add the interactive content window corresponding to the product to be recommended to the interactive content windows to be displayed by the display 11; or, when the user clicks a certain interactive content window and the various sub-products subordinate to that product are displayed, the product to be recommended is displayed preferentially. For example, if the user clicks the interactive content window corresponding to credit cards, and the interactive terminal has a face recognition function, the attribute information of the user can be determined according to the image of the user captured by the camera 14, and which credit card or cards to recommend to the user can be determined according to that attribute information; the interactive content windows corresponding to the one or more recommended credit cards are then preferentially displayed when the expanded next-stage interactive content windows are shown.
In this example, the content recommendation module 23 may store in advance the correspondence between different attribute information and products to be recommended, or preset a rule for determining the product to be recommended according to the attribute information, so that the product to be recommended can be determined according to the attribute information of the user based on the correspondence or the rule.
In an alternative of this example, a camera, an ultrasonic detector, an infrared detector, or the like may be used in place of the radar 121 for positioning, or in place of the touch pad 122 for operation detection; for example, the user may be positioned through a captured image or an infrared imaging result, and the behavior characteristics of the user may be extracted from the acquired video, with a click or slide determined when those behavior characteristics meet preset conditions.
Example 2
The present example provides an interactive system for a financial journey, as shown in figure 6, comprising: an interactive terminal 1, a management platform 2 and a bank CRM system 3.
The interactive terminal 1 and the management platform 2 are the same as in example 1.
In this example, when the image processing module determines the attribute information of the user according to the image collected by the camera 14, it specifically transmits the collected image of the user, or features extracted from that image, to the bank CRM system 3 for customer identification; the identification result returned by the bank CRM system 3 is also included in the attribute information of the user.
Example 3
The present example provides an interactive system for financial trips, as shown in FIG. 7, the management platform 2 can be seen in example 1; the interactive terminal 1 comprises a display 11, an input device 123, a processor 13, a camera 14. Among them, the display 11, the processor 13, and the camera 14 can be referred to example 1; the input device 123 serves as a detector in this example.
Optionally, the interaction system further comprises a CRM system 3, and the interaction of the CRM system 3 and the management platform 2 can be seen in example 2.
The input device 123 may include a touch pad, or an input device such as a mouse, keyboard, or pointing stick together with its interface, and may be disposed independently in a control area outside the display panel, for example on a console beside the screen. By operating the input device 123, a user may move the position of the mouse pointer on the screen, or select or move an interactive content window.
The input device 123 is used to detect real-time position and/or operation of the mouse pointer in the display area of the display.
The processor 13 may correspondingly enlarge the interactive content window displayed at the real-time position according to the real-time position of the mouse pointer; the next level interactive content window of the interactive content window at the real-time position of the mouse pointer can be displayed according to clicking, double clicking, long pressing and other selection operations; the display position of the interactive content window at the real-time position of the mouse pointer can be changed correspondingly according to the moving operation.
In this example, if the display 11 adopts a spliced screen, an input device 123 may be configured for each screen separately, for controlling the movement of the mouse pointer on that screen or performing operations such as selection and movement; alternatively, the spliced screen may be treated as one integral screen, with only one input device 123 provided.
Example 4
The present example provides a method of interaction for a financial journey, which may be applied in the interaction system of example 1 or 2.
The interaction method of the present example comprises the following steps 301-306:
301. For the financial products promoted by the bank, the corresponding interactive content windows (for example, displayed as representative pictures of the financial products) and the hierarchical relations among the interactive content windows are stored in the management platform 2; all or some of the saved interactive content windows are sent to the interactive terminal 1 as the first interactive content windows to be displayed, and the interactive content windows to be displayed are shown on the screen through the display 11.
In this example, the interactive contents window is displayed in a horizontally arranged manner, as shown in fig. 8.
302. The radar 121 in the interactive terminal 1 detects whether there is a user in the interaction area; if there is no user, the processor 13 may control the display 11 to show standby special effects that attract users to participate in the interaction, such as enlarging and shrinking the interactive content windows of the financial products one by one, circularly flowing them in the horizontal direction, and the like.
303. In response to the radar 121 detecting that a user enters the interactive area, the processor 13 controls the display 11 to enlarge and display an interactive content window corresponding to the real-time position of the user according to the real-time position of the user detected by the radar 121 so as to attract the user to click; specifically, the interactive content windows corresponding to the real-time positions may be displayed by using an enlarged effect, and a certain number of interactive content windows located on the left and right sides of the interactive content windows maintain a default size or are displayed by using enlarged effects sequentially reduced in size until the size is reduced to the default size, as shown in fig. 9 a.
In this example, in addition to the zooming in, other display effects such as a respiratory special effect may be used to display the interactive content window corresponding to the real-time position.
In this example, after the user is detected entering the interaction area, the camera 14 captures an image of the user, the management platform 2 performs user identification to determine the attribute information of the user, the interactive content windows corresponding to the financial products to be recommended are determined according to that attribute information, and those windows are displayed by the display 11. For example, if the interactive content windows of financial products 1, 2, and 3 were originally displayed on the screen, and financial products 4 and 5 are determined to be recommended according to the user's attribute information, one of the three originally displayed interactive content windows can be retained while the other two are replaced by the interactive content windows of financial products 4 and 5. In this alternative, the management platform 2 may perform user identification by itself or through a CRM system.
304. In response to the radar 121 detecting a movement of the user, the processor 13 controls the display 11 to correspondingly adjust the enlarged interactive content window according to the real-time position of the user, so that the enlarged interactive content window changes along with the real-time position change of the user, and as shown in fig. 9b, the user starts to be positioned at the left position of the display panel, and the interactive content window which the user is facing on the display panel is enlarged; when the user moves to the right of the display panel, the enlarged interactive content window is correspondingly changed into the interactive content window which is opposite to the user at the moment.
Similarly, if other display effects are used to display the interactive content window corresponding to the user position, the interactive content window using the display effect is changed correspondingly along with the movement of the user.
305. In response to the touch pad 122 detecting the user sliding an interactive content window, the processor 13 controls the display 11 to adjust the display position of the corresponding interactive content window, as shown in fig. 10. For example, if the original arrangement from right to left is interactive content window a, interactive content window b, interactive content window c, interactive content window d, interactive content window e, interactive content window f…, and it is detected that the user slides interactive content window d leftwards and stops at the position of interactive content window e, or at a position between interactive content windows e and f, then interactive content window d is moved to the left of interactive content window e and displayed to the right of interactive content window f.
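The drag-to-reorder behavior of step 305 can be sketched as a list operation; the function name is hypothetical, and the windows are listed right-to-left as in the example above:

```python
# Hypothetical sketch of step 305's drag-to-reorder: the dragged window
# is removed from its current slot and re-inserted where the slide
# stopped. Windows are listed right-to-left, matching the example.
def reorder(windows: list, dragged: str, drop_index: int) -> list:
    """Move `dragged` so it ends up at `drop_index` in the list."""
    result = [w for w in windows if w != dragged]
    result.insert(drop_index, dragged)
    return result
```

With windows `["a", "b", "c", "d", "e", "f"]` (right to left), dragging d to index 4 leaves it between e and f, i.e. to the left of e and to the right of f, as in the example.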
306. In response to the touch pad 122 detecting a click on an interactive content window, the processor 13 controls the display 11 to present the next-stage interactive content window of that window. As shown in fig. 11, if it is detected that the user clicks the interactive content window of a certain automobile, a next-stage interactive content window containing details of the automobile (for example, a 360° panoramic photograph of the automobile) is expanded and displayed for the user to view; the user can also operate on the photograph, for example rotating it to change the display angle, zooming in on it, or changing its display location. The other interactive content windows may be moved to both sides at this time so as not to be obscured by the expanded next-stage interactive content window. If the user then clicks another interactive content window, the next-stage interactive content window displaying the automobile details can be collapsed, and the next-stage interactive content window of the newly clicked window is expanded for display.
In this example, one interactive content window may correspond to a product category, which may contain multiple products that are subordinate to the interactive content window; after the interactive content window is selected, for various products subordinate to the interactive content window, only the interactive content window corresponding to the product to be recommended is displayed as the next-stage interactive content window; for example, the interactive content window "transacting credit cards" contains various credit cards, and in response to the interactive content window being selected, the user image can be collected and sent to the management platform 2 to determine attribute information of the user, and the management platform 2 determines two types of credit cards to be recommended according to the attribute information of the user and returns the two types of credit cards to the interactive terminal 1; the interactive terminal 1 preferentially displays the next interactive window corresponding to the two recommended credit cards in the multiple credit cards corresponding to the interactive content window.
Steps 304, 305, and 306 do not have to be performed consecutively or in the order listed.
In this method, representative pictures of the bank's popular financial products are first arranged horizontally and displayed on the screen, and a dynamic interaction design attracts users to actively participate in the interaction; through the interaction process, users fully get to know the related products and thereby develop the willingness to purchase them, achieving the purpose of marketing. The method can address the problems of a single marketing mode and poor user experience at current banks or other financial institutions, realize engaging interaction for the bank's financial journey, and bring users an easy and interesting experience, thus improving both the user experience and the promotional and marketing effects of the products.
Example 5
The present example provides an interaction method for a financial journey applicable to a multi-user scenario, which can be applied to the interaction system provided in example 1 or 2.
In addition to enabling steps 301-306 of example 4, the present example can also support a scenario where multiple people interact. In this example, the display panel adopts a spliced screen, and when the interactive content windows are displayed they are divided into the same number of groups as the screens; for example, with a 1×6 spliced screen, the interactive content windows are divided into 6 groups matching the physical screens. In this example, the interactive content windows on each screen are the same, that is, each screen shows the same number of windows with the same content in the same order; in practical applications, the interactive content window groups displayed on different screens may differ. The number of spliced screens and the number of interactive content windows on each screen can be adjusted as needed.
In this example, each screen as a display panel may have a corresponding touch pad 122; in response to the touch pad 122 corresponding to one screen detecting a touch operation of the user on the screen, the processor 13 correspondingly controls the display 11 to adjust the interactive content window displayed on the screen, that is, adjust the display effect of the corresponding interactive content window group; thus, each screen can independently support touch operation and independently adjust the displayed interactive content window. As shown in fig. 12, the user in front of the three screens can click on different interactive content windows on the facing screen respectively, and the next interactive content windows of the different interactive content windows are displayed in the three newly opened windows respectively; for example, if the user clicks the interactive content window a before the first screen, displaying the next interactive content window of the interactive content window a in a large size at the central position of the first screen; similarly, when the user before the second and third screens clicks on the interactive content window b and the interactive content window c, respectively, the second and third screens will display the interactive content window b and the next interactive content window of the interactive content window c in a large size at the center position, respectively. If a user in front of a screen slides the interactive content window displayed on the screen, only the position of the interactive content window displayed on the screen correspondingly changes, and the display of other screens is not affected.
In this example, optionally, user identification may be performed through the camera 14 to avoid multi-user interference. Multi-user interference means that more than one user performs touch operations on the same screen within the same period: for example, while one user is interacting with one interactive content window on a screen, another user clicks a different interactive content window on the same screen; the terminal cannot determine from the touch pad 122 alone which user performed the click, so the next-stage interactive content window of the newly clicked window would be expanded, replacing the next-stage interactive content window of the previously clicked one.
In an alternative solution of this example, each screen may be equipped with a camera 14 to perform real-time image acquisition or video acquisition on an interaction area of the screen (i.e. an area where a user can perform touch operation on the screen), and the processor 13 may include a function of performing face recognition and motion recognition, and may identify, according to a result of the acquisition of the camera 14, that several users are in front of a screen corresponding to the camera 14, and identify which user is currently performing touch operation on the screen.
If the processor 13 identifies more than one face in the image or video acquired by the camera 14, it judges that there is more than one user in front of the screen corresponding to that camera, and determines through motion recognition which of the users is the first user performing touch operations on the screen. If the first user performs a touch operation on the screen, the processor 13 controls the display 11 to adjust the interactive content windows displayed on the screen accordingly, for example expanding the next-stage interactive content window or changing a window's display position. If another user performs a touch operation on the screen, the processor 13 ignores the touch operation detected by the touch pad 122, or controls the display 11 to respond with a display effect that does not affect other display areas, such as showing the interactive content window clicked by the other user with a shaking effect without moving it or expanding its next-stage window; this avoids interrupting the experience of the user who was already interacting. If the processor 13 determines from the camera 14 that the first user has left the acquisition area or stopped operating, the first user performing touch operations on the screen can be determined anew from the remaining users in front of the screen.
If the processor 13 identifies only one face in the image acquired by the camera 14, it determines that there is a single user in front of the corresponding screen, and controls the display 11 to adjust the interactive content windows displayed on that screen according to the detected touch operation, for example expanding the next-stage interactive content window or changing a window's display position.
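The arbitration logic of example 5 can be summarized in a small decision function; the parameter and action names are hypothetical:

```python
# Hypothetical sketch of example 5's multi-user arbitration: with one
# face in view, any touch is honored; with several faces, only the
# locked "first user" may move or expand windows, and other users get
# a non-intrusive effect (e.g. a shake) that changes nothing else.
def arbitrate(face_count: int, toucher_is_first_user: bool) -> str:
    """Decide how to respond to a touch on one screen."""
    if face_count <= 1:
        return "adjust_window"   # sole user in front: honor the touch
    if toucher_is_first_user:
        return "adjust_window"   # locked first user: honor the touch
    # Another user: ignore, or respond with a shake effect that does
    # not move any window or expand a next-stage window.
    return "shake_only"
```

When the first user leaves or stops operating, re-running the lock selection with the remaining users (as described above) restores normal arbitration.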
The drawings herein relate only to the implementations described herein; for other implementations, reference may be made to common designs. In the absence of conflict, the embodiments of the present application and the features of the embodiments may be combined with each other to form new embodiments.
It will be understood by those skilled in the art that various modifications and equivalent substitutions may be made to the embodiments herein without departing from their spirit and scope, and all such modifications and substitutions are intended to be encompassed by the scope of the claims herein.

Claims (17)

1. An interactive terminal, comprising:
a display for displaying a plurality of interactive content windows, the plurality of interactive content windows being divided into a plurality of interactive content window groups;
a detector for detecting a user's selection operation on the interactive content window, or for detecting a real-time position and a user's selection operation on the interactive content window; wherein the real-time position comprises: the real-time position of the user in a preset interaction area, or the real-time position of a mouse pointer in the display area of the interactive content window;
a processor for controlling the display to adjust the display effect of the interactive content window associated with the real-time position according to the detected real-time position, and for controlling the display to adjust the display effect of the interactive content window group to which the selected interactive content window belongs according to the detected selection operation of the user on the interactive content window;
wherein controlling the display to adjust the display effect of the interactive content window group to which the selected interactive content window belongs, according to the detected selection operation of the user on the interactive content window, comprises: the processor controls the display to expand the next-level interactive content window of the selected interactive content window for display according to the selection operation on the interactive content window detected by the detector; the next-level interactive content window is interactive information of a product corresponding to the selected interactive content window;
the processor is further configured to detect, before controlling the display to expand the next-level interactive content window of the selected interactive content window for display, whether the selected interactive content window is located at the central position of the interactive content window group to which it belongs; if yes, controlling the display to expand the next-level interactive content window of the selected interactive content window for display; if not, first adjusting the selected interactive content window to the central position, and then controlling the display to expand the next-level interactive content window of the selected interactive content window for display.
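The centre-position check in claim 1 can be sketched as follows; the group is modelled as a simple list of window identifiers, which is a hypothetical simplification of the claimed arrangement.

```python
# Sketch of claim 1's centre-position check: before expanding the
# next-level window, the selected window is moved to the centre of
# its group if it is not already there.

def select_window(group, selected):
    """Return (group after any adjustment, expanded next-level window id)."""
    center = len(group) // 2
    idx = group.index(selected)
    if idx != center:
        # Adjust the selected window to the central position first.
        group = list(group)
        group.remove(selected)
        group.insert(center, selected)
    # Then expand the next-level window of the selected window.
    return group, f"next_level_of_{selected}"
```

For example, selecting the leftmost window of a three-window group first re-centres it, then expands its next-level window.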
2. The interactive terminal of claim 1, wherein the processor controlling the display to adjust a display effect of an interactive content window associated with the real-time location based on the detected real-time location comprises:
the processor controls the display to display, according to the real-time position detected by the detector, the interactive content window corresponding to the real-time position with a preset first display effect, and/or to display a first number of interactive content windows located on both sides of the interactive content window corresponding to the real-time position with a preset second display effect.
3. The interactive terminal of claim 2, wherein the display displaying a plurality of interactive content windows comprises:
the display displays the plurality of interactive content windows arranged along a first direction;
the detector comprises a radar or an infrared detector, and is used for locating the user in the preset interaction area to obtain the real-time position of the user.
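The mapping from the user's real-time position to display effects in claims 2-3 can be sketched as below; the one-dimensional layout and the effect names are hypothetical illustration choices, not taken from the patent.

```python
# Sketch of claims 2-3: windows are laid out along one direction; the
# window in front of the user's real-time position gets a first display
# effect, and `n` neighbours on each side get a second display effect.

def display_effects(user_x, window_width, window_count, n=1):
    """Return {window_index: effect} for a user standing at user_x."""
    focus = min(int(user_x // window_width), window_count - 1)
    effects = {focus: "first_effect"}      # e.g. highlight / enlarge
    for d in range(1, n + 1):              # neighbours on both sides
        for i in (focus - d, focus + d):
            if 0 <= i < window_count:
                effects[i] = "second_effect"  # e.g. slight enlarge
    return effects
```

As the radar or infrared detector reports new positions, re-running this mapping moves the highlighted window with the user.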
4. The interactive terminal of claim 1, wherein the processor controlling the display to expand a next level interactive content window of the selected interactive content window for display comprises:
the processor controls the display to display in the following manner: in the interactive content window group to which the selected interactive content window belongs, the next-level interactive content window of the selected interactive content window partially occludes the interactive content windows adjacent to the selected interactive content window, and, for any two adjacent interactive content windows other than the selected one, the interactive content window closer to the middle position partially occludes the one farther from the middle position.
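The occlusion rule of claim 4 amounts to a z-ordering within the group, which can be sketched as follows (a hypothetical model where a larger z value is drawn on top).

```python
# Sketch of claim 4's occlusion rule: windows closer to the middle of
# the group occlude those farther from it, and the expanded (selected)
# window is placed on top of its adjacent windows.

def z_order(group_size, selected_index):
    """Return {window_index: z}; larger z is drawn on top."""
    center = (group_size - 1) / 2
    z = {}
    for i in range(group_size):
        # Closer to the middle position -> larger z.
        z[i] = group_size - abs(i - center)
    # The selected window's expanded next-level window occludes neighbours.
    z[selected_index] = group_size + 1
    return z
```

Sorting the group by this z value before painting reproduces the claimed partial-occlusion order.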
5. The interactive terminal of claim 4, wherein the processor controlling the display to expand a next level interactive content window of the selected interactive content window for display further comprises:
and the processor controls the display to display, in the interactive content window group to which the selected interactive content window belongs, the interactive content windows other than the selected interactive content window according to a preset third display effect.
6. The interactive terminal of claim 1, wherein the processor is further configured to:
after controlling the display to expand the next-level interactive content window of the selected interactive content window for display, in response to the detector detecting a selection operation of another user on another interactive content window, detect whether the other interactive content window and the selected interactive content window belong to the same interactive content window group;
if yes, control the display to issue a prompt, without controlling the display to expand the next-level interactive content window of the other interactive content window;
and if not, control the display to expand the next-level interactive content window of the other interactive content window for display.
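The same-group conflict check of claim 6 can be sketched as a one-line decision; the `group_of` mapping from window to group identifier is a hypothetical data model.

```python
# Sketch of claim 6: while one window is expanded, a second user's
# selection is honoured only if the selected windows lie in different
# interactive content window groups.

def handle_second_selection(group_of, expanded, other):
    """Decide how to respond to a second user's selection."""
    if group_of[other] == group_of[expanded]:
        return "prompt"            # same group: warn, do not expand
    return "expand_next_level"     # different group: expand normally
```

This lets several users browse in parallel, one per window group, without stepping on each other's expanded content.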
7. The interactive terminal of claim 1, wherein the display comprises a plurality of display panels, the plurality of display panels and the plurality of interactive content window groups being in one-to-one correspondence.
8. The interactive terminal of claim 1, further comprising:
a camera for acquiring an image of a user in the preset interaction area;
the processor is further used for determining attribute information of the user according to the acquired image of the user and determining a product to be recommended according to the attribute information of the user; or for sending the acquired image of the user to a management platform and/or a customer relationship management system to acquire the product to be recommended; and for controlling the display to adjust the interactive content window displayed to the user according to the product to be recommended.
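The recommendation flow of claim 8 can be sketched as matching user attributes against a product catalogue; the attribute keys and catalogue structure here are hypothetical placeholders for whatever the recognition step and management platform actually provide.

```python
# Sketch of claim 8: attribute information derived from the user's
# image is matched against per-product target attributes to choose
# the product to recommend.

def recommend(attributes, catalog):
    """Return the first product whose target attributes all match."""
    for product, targets in catalog.items():
        if all(attributes.get(k) == v for k, v in targets.items()):
            return product
    return None  # no match: fall back to the default interactive content
```

The processor would then adjust the displayed interactive content windows to surface the returned product.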
9. An interactive system, comprising: the interactive terminal of any one of claims 1-8, and a management platform;
The management platform is used for providing an interactive content window for the interactive terminal to display.
10. The interactive system of claim 9, wherein the interactive terminal further comprises a camera for capturing an image of a user in a preset interactive area and transmitting the image to the management platform;
the management platform is further used for determining attribute information of the user according to the image of the user, either by itself or through interaction with a customer relationship management system; determining a product to be recommended according to the attribute information of the user; and sending the product to the interactive terminal;
the processor in the interactive terminal is further used for controlling the display, in response to detecting a selection operation on the interactive content window, to expand for display the next-level interactive content window, corresponding to the product to be recommended, of the selected interactive content window.
11. An interaction method, comprising:
displaying a plurality of interactive content windows, the plurality of interactive content windows being divided into a plurality of interactive content window groups;
detecting a user's selection operation on the interactive content window, or detecting a real-time position and a user's selection operation on the interactive content window; wherein the real-time position comprises: the real-time position of the user in a preset interaction area, or the real-time position of a mouse pointer in the display area of the interactive content window;
adjusting, according to the detected real-time position, the display effect of the interactive content window associated with the real-time position, and adjusting, according to the detected selection operation of the user on the interactive content window, the display effect of the interactive content window group to which the selected interactive content window belongs;
wherein adjusting the display effect of the interactive content window group to which the selected interactive content window belongs, according to the detected selection operation of the user on the interactive content window, comprises: expanding, according to the detected selection operation on the interactive content window, the next-level interactive content window of the selected interactive content window for display; the next-level interactive content window is interactive information of a product corresponding to the selected interactive content window;
the method further comprises, before expanding the next-level interactive content window of the selected interactive content window for display: detecting whether the selected interactive content window is located at the central position of the interactive content window group to which it belongs; if yes, directly performing the operation of expanding the next-level interactive content window of the selected interactive content window for display; if not, first adjusting the selected interactive content window to the central position, and then performing the operation of expanding the next-level interactive content window of the selected interactive content window for display.
12. The interactive method of claim 11, wherein adjusting the display effect of the interactive content window associated with the real-time location based on the detected real-time location comprises:
displaying, according to the detected real-time position, the interactive content window corresponding to the real-time position with a preset first display effect;
and/or displaying a first number of interactive content windows located on both sides of the interactive content window corresponding to the real-time position with a preset second display effect.
13. The interactive method of claim 11, wherein expanding the next level interactive content window of the selected interactive content window for display comprises:
displaying in the following manner: in the interactive content window group to which the selected interactive content window belongs, the next-level interactive content window of the selected interactive content window partially occludes the interactive content windows adjacent to the selected interactive content window, and, for any two adjacent interactive content windows other than the selected one, the interactive content window closer to the middle position partially occludes the one farther from the middle position.
14. The interactive method of claim 13, wherein expanding the next interactive content window of the selected interactive content window for display further comprises:
and displaying, in the interactive content window group to which the selected interactive content window belongs, the interactive content windows other than the selected interactive content window according to a preset third display effect.
15. The interactive method of claim 11, wherein after expanding the next-level interactive content window of the selected interactive content window for display, the method further comprises:
in response to detecting a selection operation of another user on another interactive content window, detecting whether the other interactive content window and the expanded interactive content window belong to the same interactive content window group;
if yes, issuing a prompt without expanding the next-level interactive content window of the other interactive content window;
if not, expanding the next-level interactive content window of the other interactive content window for display.
16. The interaction method of claim 11, further comprising:
acquiring an image of a user in the preset interaction area;
sending the acquired image of the user to a management platform, so that the management platform determines attribute information of the user according to the image of the user, or sends the image to a customer relationship management system for customer identity recognition to determine the attribute information of the user;
receiving a product to be recommended, determined by the management platform according to the attribute information of the user;
and adjusting the interactive content window displayed to the user according to the product to be recommended.
17. A non-transitory computer-readable storage medium having stored therein computer-executable instructions for performing the interaction method of any of claims 11-16.
CN202110872022.7A 2021-07-30 2021-07-30 Interactive terminal, interactive system, interactive method and computer readable storage medium Active CN113485604B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110872022.7A CN113485604B (en) 2021-07-30 2021-07-30 Interactive terminal, interactive system, interactive method and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN113485604A CN113485604A (en) 2021-10-08
CN113485604B true CN113485604B (en) 2024-02-09

Family

ID=77943774

Country Status (1)

Country Link
CN (1) CN113485604B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116126196A (en) * 2021-11-12 2023-05-16 西安青松光电技术有限公司 Display method, device, equipment, system and readable storage medium
CN116301482B (en) * 2023-05-23 2023-09-19 杭州灵伴科技有限公司 Window display method of 3D space and head-mounted display device
CN116594510B (en) * 2023-06-15 2024-04-16 深圳蓝普视讯科技有限公司 Interaction method and intelligent interaction system based on big data and big screen

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001318665A (en) * 2000-05-11 2001-11-16 Matsushita Electric Ind Co Ltd Device and method for controlling display
CN101021765A (en) * 2006-02-14 2007-08-22 三星电子株式会社 Apparatus and method for managing layout of a window
CN101510978A (en) * 2008-02-13 2009-08-19 索尼株式会社 Image display apparatus, image display method, program, and record medium
CN103970442A (en) * 2013-02-01 2014-08-06 海德堡印刷机械股份公司 Display apparatus related to position
CN106489112A (en) * 2015-03-08 2017-03-08 苹果公司 For manipulating equipment, method and the graphic user interface of user interface object using vision and/or touch feedback
CN110097400A (en) * 2019-04-29 2019-08-06 贵州小爱机器人科技有限公司 Information recommendation method, apparatus and system, storage medium, intelligent interaction device
CN110221752A (en) * 2019-05-13 2019-09-10 北京云迹科技有限公司 Large-size screen monitors interactive system and exchange method
CN112528050A (en) * 2020-11-30 2021-03-19 宁波市方略博华文化发展有限公司 Multimedia interaction system and method
CN112947815A (en) * 2021-04-27 2021-06-11 北京仁光科技有限公司 Multi-window interaction method and system, readable storage medium and electronic device


Similar Documents

Publication Publication Date Title
CN113485604B (en) Interactive terminal, interactive system, interactive method and computer readable storage medium
US10579187B2 (en) Display control apparatus, display control method and display control program
US10289371B2 (en) Electronic device and control method thereof
EP2444918B1 (en) Apparatus and method for providing augmented reality user interface
US8823741B2 (en) Transparent display apparatus and method for operating the same
KR102173123B1 (en) Method and apparatus for recognizing object of image in electronic device
CN111062312A (en) Gesture recognition method, gesture control method, device, medium and terminal device
US20120256886A1 (en) Transparent display apparatus and method for operating the same
CN107659769B (en) A kind of image pickup method, first terminal and second terminal
US20120256854A1 (en) Transparent display apparatus and method for operating the same
US20140111542A1 (en) Platform for recognising text using mobile devices with a built-in device video camera and automatically retrieving associated content based on the recognised text
CN102334132A (en) Image object detection browser
US10290120B2 (en) Color analysis and control using an electronic mobile device transparent display screen
US10162507B2 (en) Display control apparatus, display control system, a method of controlling display, and program
US20170053427A1 (en) Apparatus, system, and method of controlling display of image, and recording medium
Robinson et al. The LivePaper system: augmenting paper on an enhanced tabletop
CN112698775B (en) Image display method and device and electronic equipment
KR102303206B1 (en) Method and apparatus for recognizing object of image in electronic device
US20220283698A1 (en) Method for operating an electronic device in order to browse through photos
CN113273167B (en) Data processing apparatus, method and storage medium
US11675496B2 (en) Apparatus, display system, and display control method
JP2011123567A (en) Image processor, image processing method, image processing program and recording medium recording the same
US20230325216A1 (en) System for organizing and displaying information on a display device
US20230388445A1 (en) Non-mirrored preview of text based demonstration object in mirrored mobile webcam image
CN111859202A (en) Information collection method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant