WO2023246312A1 - Interface interaction method, interface interaction apparatus and computer storage medium

Interface interaction method, interface interaction apparatus and computer storage medium

Info

Publication number
WO2023246312A1
WO2023246312A1 PCT/CN2023/091527 CN2023091527W WO2023246312A1 WO 2023246312 A1 WO2023246312 A1 WO 2023246312A1 CN 2023091527 W CN2023091527 W CN 2023091527W WO 2023246312 A1 WO2023246312 A1 WO 2023246312A1
Authority
WO
WIPO (PCT)
Prior art keywords
interface
user
control
display screen
target
Prior art date
Application number
PCT/CN2023/091527
Other languages
English (en)
Chinese (zh)
Inventor
辛孟怡
赵静
崔诚伟
Original Assignee
京东方科技集团股份有限公司
京东方智慧物联科技有限公司
Priority date
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司, 京东方智慧物联科技有限公司
Publication of WO2023246312A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • the present application relates to the field of interaction technology, and in particular to an interface interaction method, an interface interaction device and a computer storage medium.
  • Interface interaction refers to the interaction between the user and the interface displayed on the display screen of the interface interaction device.
  • the user can obtain information through the interface, or input information to the interface interaction device through the interface.
  • multiple controls are displayed through the display screen in the interface interaction device.
  • Each control can have a corresponding information column.
  • the interface interaction device can control the display screen to display the information column corresponding to the clicked control.
  • Embodiments of the present application provide an interface interaction method, an interface interaction device, and a computer storage medium.
  • the technical solutions are as follows:
  • an interface interaction method is provided.
  • the method is applied to an interface interaction device.
  • the interface interaction device includes a display screen and a human gesture recognition component.
  • the method includes:
  • a first interface is displayed through the display screen, and the first interface has a plurality of first controls
  • in response to the human body posture recognition component detecting the human body posture of at least one user in the preset area, displaying an image corresponding to the human body posture of the at least one user at a corresponding position in the first interface based on the relative position of the at least one user and the display screen;
  • in response to the overlap time between the image corresponding to the human body posture and a target first control among the plurality of first controls being greater than a preset value, a second interface corresponding to the target first control is displayed through the display screen.
  • the preset area is located in the area directly facing the display screen
  • Displaying an image corresponding to the human body posture of the at least one user at a corresponding position in the first interface based on the relative position of the user and the display screen includes: displaying, at the corresponding position in the first interface, an image corresponding to the human body posture of the at least one user.
  • In a case where the human body posture recognition component detects the human body postures of at least two users in the preset area, displaying the second interface corresponding to the target first control through the display screen includes:
  • displaying a second interface corresponding to the target first control overlapping with the first image, where the first image is the image corresponding to the human body posture of the first target user.
  • displaying an image corresponding to the human body posture of the at least one user at a corresponding position in the first interface based on the relative position of the at least one user and the display screen includes:
  • an image corresponding to the human body posture of the at least one user is displayed at the corresponding position in the first interface.
  • The size of the image corresponding to the user's human body posture is inversely related to the relative distance between the user and the display screen.
  • the second interface includes a folding control located at an edge of the second interface, and the folding control corresponds to at least two third controls,
  • the method further includes:
  • in response to an operation on the folding control, at least two third controls corresponding to the folding control are displayed in the second interface.
  • the method further includes:
  • in response to an operation on the third control, a second interface corresponding to the third control is displayed through the display screen, and the at least two third controls corresponding to the folding control stop being displayed.
  • displaying the second interface corresponding to the target first control through the display screen includes:
  • the second interface corresponding to the target first control is displayed through the display screen, and the folding control is displayed at the folding position of the second interface.
  • the folding control is in a strip shape and is located at the edge of the second interface.
  • the image corresponding to the human body posture is a humanoid image corresponding to the human body posture.
  • the second interface includes a plurality of second controls, the plurality of second controls are arranged in a horizontal direction in the second interface, and the second interface corresponding to the target first control is displayed through the display screen;
  • the method further includes:
  • based on the position of the target user among the at least one user, displaying, on the second interface, a plurality of second sub-controls corresponding to the second control directly opposite the position of the target user among the plurality of second controls.
  • the plurality of second sub-controls are arranged along a second direction at the position where the second control is located, and the second direction intersects the horizontal direction.
  • displaying the plurality of second sub-controls includes: popping up the plurality of second sub-controls.
  • the interface interaction device also includes a speaker
  • the method further includes:
  • a plurality of gesture control information corresponding to the second interface is played through the speaker.
  • an interface interaction device includes a control component, a display screen, and a human posture recognition component.
  • the control component is electrically connected to the display screen and the human posture recognition component respectively, and the control component includes:
  • a first display module, configured to display a first interface through the display screen, the first interface having a plurality of first controls;
  • a second display module, configured to, in response to the human posture recognition component detecting the human posture of at least one user in the preset area, display an image corresponding to the human body posture of the at least one user at the corresponding position in the first interface based on the relative position of the at least one user and the display screen;
  • a third display module, configured to display the second interface corresponding to the target first control through the display screen in response to the overlap time between the image corresponding to the human body posture and the target first control among the plurality of first controls being greater than a preset value.
  • an interface interaction device includes a processor and a memory.
  • the memory stores at least one instruction, at least one program, a code set or an instruction set, and the at least one instruction, the at least one program, the code set or the instruction set is loaded and executed by the processor to implement the above-mentioned interface interaction method.
  • a computer storage medium stores at least one instruction, at least one program, a code set or an instruction set.
  • the at least one instruction, the at least one program, the code set or the instruction set is loaded and executed by the processor to implement the above-mentioned interface interaction method.
  • a computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
  • the processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the methods provided in the above various optional implementations.
  • when the human posture recognition component detects the user's human posture in the preset area, the image corresponding to the human posture is displayed at the corresponding position in the first interface based on the user's position, and then, when the overlap time between the image corresponding to the human posture and a control is greater than the preset value, the second interface corresponding to the control is displayed.
  • In this way, the problem in the related art that the interface interaction method is relatively complicated is solved, and the effect of simplifying the interface interaction method is achieved.
  • The user can interact with the interface interaction device without clicking on it, which simplifies interface interaction and improves the user experience.
  • Figure 1 is a schematic structural diagram of an interface interaction device provided by an embodiment of the present application.
  • Figure 2 is a flow chart of an interface interaction method according to an embodiment of the present application.
  • Figure 3 is a flow chart of an interface interaction method according to an embodiment of the present application.
  • Figure 4 is a schematic diagram of a first interface in an embodiment of the present application.
  • Figure 5 is a schematic diagram of the positional relationship between a user and a display screen in an embodiment of the present application.
  • Figure 6 is a schematic diagram of the positional relationship between another user and the display screen in the embodiment of the present application.
  • Figure 7 is a schematic diagram of the positional relationship between a user and a display screen in an embodiment of the present application.
  • Figure 8 is a schematic diagram of a second interface provided by an embodiment of the present application.
  • Figure 9 is a schematic diagram of a second interface in an embodiment of the present application.
  • Figure 10 is a schematic diagram of another second interface in the embodiment of the present application.
  • Figure 11 is a schematic diagram of another second interface in the embodiment of the present application.
  • Figure 12 is a schematic diagram of another user and display screen provided by an embodiment of the present application.
  • Figure 13 is a schematic diagram of a third interface in the embodiment of the present application.
  • Figure 14 is a block diagram of an interface interaction device provided by an embodiment of the present application.
  • Interface interaction devices can be various smart devices including display screens, such as smartphones, tablets, laptops, vertical advertising machines, electronic picture frames, and outdoor electronic screens.
  • interface interaction has become an important way of interaction.
  • the user can obtain various information from the interface interaction device.
  • the interface interaction device can also display various information to the user. For example, it can display various preset information to the user.
  • the user is usually required to actively operate the interface interaction device.
  • the user is required to actively click on a control displayed on the display screen of the interface interaction device, so that the interface interaction device can respond to the control clicked by the user.
  • Although this method can display information to the user and realize the function of interface interaction, the interaction process is relatively complicated (complicated here may mean complicated for the user).
  • The user needs to actively stay in front of the interface interaction device and actively click on the controls before interaction can be achieved. This greatly reduces the user's enthusiasm for interface interaction, reduces the possibility of the interface interaction device being used, and further reduces the possibility that the information provided by the interface interaction device becomes known to the user. For example, if the information to be provided by the interface interaction device is advertising information, this method will reduce the possibility that the advertising information is seen by the user and reduce the advertising effect.
  • FIG 1 is a schematic structural diagram of an interface interaction device provided by an embodiment of the present application.
  • the interface interaction device 10 may include a display screen 11 and a human gesture recognition component 12.
  • the human body posture recognition component 12 can be installed on the display screen 11, or can be independently installed outside the display screen 11.
  • Figure 1 shows a structure in which the human body posture recognition component 12 is installed on the display screen 11, but the embodiment of the present application imposes no restriction on this.
  • When the human gesture recognition component 12 is installed on the display screen 11, it can be located on the upper edge or the lower edge of the display screen 11.
  • the display screen 11 may have an image display function, and the display screen 11 may emit light in the facing direction, so that users located in the preset area q in the facing direction can see the image displayed on the display screen 11 .
  • the display screen 11 can also be a touch screen to enhance user experience by enriching interface interaction.
  • the human body posture recognition component 12 can be used to obtain data related to the user's human body posture in the preset area q, such as posture, position, and gesture information.
  • the human posture recognition component 12 can obtain data related to human posture through various methods, such as obtaining data related to human posture through infrared sensors, etc., which are not limited in the embodiments of the present application.
  • the size of the preset area q can be determined by the detection capability of the human gesture recognition component 12 and the preset detection area. For example, if the detection capability of the human gesture recognition component 12 is strong, the size of the preset area q can be larger, and the preset detection area can be less than or equal to the maximum area that the detection capability of the human gesture recognition component 12 can cover.
  • the interface interaction device 10 may also include a control component 13, which may be electrically connected to the display screen 11 and the human gesture recognition component 12 respectively.
  • the control component 13 can be integrated into the display screen 11 , or the control component 13 can be independently provided outside the display screen 11 and connected to the display screen 11 in a wired or wireless manner.
  • the control component 13 can be used to control the display screen 11 and the human gesture recognition component 12 .
  • the image displayed on the display screen 11 can be controlled, or the data related to human posture collected by the human posture recognition component 12 can be obtained.
  • the interface interaction device 10 may also include a speaker 14.
  • the speaker 14 may be electrically connected to the control component 13 for emitting sound under the control of the control component 13.
  • the speaker 14 may be integrated with the display screen 11, or the speaker 14 may be independently arranged outside the display screen 11.
  • the interface interaction device in the embodiment of the present application can be located in various areas, for example, in a bank lobby, a school, a car sales hall, an agricultural product sales point, or a sales department, and can have different functions in different areas. For example, when located in a bank hall, it can introduce finance-related information to users, such as deposit business, relevant regulations, financial services, and various financial products; when located in a car sales hall, it can introduce the appearance, configuration, and selling price of cars to users, and provide functions such as a simulated test drive.
  • FIG. 2 is a flow chart of an interface interaction method according to an embodiment of the present application. This embodiment illustrates an example in which the interface interaction method is applied to the interface interaction device shown in FIG. 1 .
  • the interface interaction method may include the following steps:
  • Step 201 Display a first interface through a display screen, with a plurality of first controls on the first interface.
  • Step 202 In response to the human body posture recognition component detecting the human body posture of at least one user in the preset area, display an image corresponding to the human body posture of the at least one user at the corresponding position in the first interface based on the relative position of the at least one user and the display screen.
  • Step 203 In response to the overlap time between the image corresponding to the human body posture and the target first control among the plurality of first controls being greater than a preset value, display the second interface corresponding to the target first control on the display screen.
  • In the interface interaction method, a plurality of first controls are displayed on the first interface of the display screen; when the human posture recognition component detects the user's human posture in the preset area, the image corresponding to the human posture is displayed at the corresponding position in the first interface based on the user's position; then, when the overlap time between the image corresponding to the human posture and a control is greater than the preset value, the second interface corresponding to the control is displayed.
  • In this way, the problem in the related art that the interface interaction method is relatively complicated is solved, and the effect of simplifying the interface interaction method is achieved.
  • The user can interact with the interface interaction device without clicking on it, which simplifies interface interaction and improves the user experience.
  • FIG. 3 is a flow chart of an interface interaction method according to an embodiment of the present application. This embodiment illustrates an example in which the interface interaction method is applied to the control component in the interface interaction device shown in FIG. 1 .
  • the interface interaction method may include the following steps:
  • Step 301 Display a first interface through a display screen, with a plurality of first controls on the first interface.
  • The control component can control the display screen to display a first interface, the first interface having a plurality of first controls.
  • the controls involved in the embodiments of the present application may refer to icons displayed in the interface of the display screen.
  • Each control may correspond to a control signal, and a control displayed on the screen may be triggered by the user's gesture control or touch control.
  • the display screen can be controlled based on the control signal.
  • the display screen can be controlled to display corresponding content, or the speaker can be controlled to emit corresponding sounds.
  • Step 302 Detect the preset area through the human posture recognition component.
  • The control component can continuously detect the preset area through the human gesture recognition component; specifically, it can detect whether there is a user in the preset area, as well as the user's position, posture, gesture, and other information.
  • The control component can continuously detect the preset area through the human gesture recognition component after the interface interaction device is started; alternatively, it can continuously detect the preset area after startup and after receiving a specified signal; or it can continuously detect the preset area after startup once a preset condition is reached (the condition can be set in advance, such as a specified duration after startup, or a specified moment after startup).
  • Step 303 In response to the human body posture recognition component detecting the human body posture of at least one user in the preset area, display an image corresponding to the human body posture of the at least one user at the corresponding position in the first interface based on the relative position of the at least one user and the display screen.
  • the human posture recognition component detects the human posture of at least one user in the preset area, it indicates that at least one user is in the preset area.
  • the control component can display an image corresponding to the human body posture of the at least one user at a corresponding position in the first interface based on the relative position of the at least one user and the display screen. This can enhance the appeal to users in the preset area, which in turn can increase the probability that a user stays in the preset area and interacts with the interface interaction device, thereby increasing the usage of the interface interaction device.
  • Figure 4 is a schematic diagram of a first interface in an embodiment of the present application, in which a plurality of first controls k1 and an image t1 corresponding to the human body posture are displayed in the first interface m1.
  • the image t1 corresponding to the human body posture is a humanoid image corresponding to the human body posture.
  • the posture of the humanoid image is basically the same as the posture of the user in the preset area. If the user looks at the display screen, he or she will see a small figure on the display screen whose movements are consistent with his or her own, which can greatly increase the user's interest in the interface interaction device.
  • the humanoid image corresponding to the human body posture can have multiple display forms, for example, it can be a humanoid outline, and the outline can be filled with particles, or it can be a stick figure-shaped humanoid image.
  • step 303 may include:
  • the user usually moves in the horizontal direction relative to the interface interaction device, and the control component can obtain the horizontal position of at least one user in the preset area through the human gesture recognition component.
  • the corresponding position may be the position of the user's orthographic projection on the display screen when the user is located directly in front of the display screen, for example, as shown in Figure 5.
  • Figure 5 is a schematic diagram of the positional relationship between a user and the display screen in an embodiment of the present application. If the user 51 is located directly in front of the display screen 11, the image t1 corresponding to the user's human body posture may be located at the orthographic projection position of the user 51 on the display screen 11.
  • the corresponding position may also refer to the corresponding position of the user in the preset area.
  • the position in the preset area and the position in the display screen may be mapped in the horizontal direction, so that each position in the preset area has a corresponding position in the display screen.
  • FIG. 6 is a schematic diagram of the positional relationship between another user and the display screen in an embodiment of the present application.
  • the position of the user 51 in the preset area q may correspond to the opposite position on the display screen 11, and the image t1 corresponding to the human body posture of the user 51 may be located at that opposite position. This achieves the effect that the image corresponding to the user's human body posture moves following the user's movement, giving a better user experience.
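  • As an illustration only (not part of the disclosed embodiments), this horizontal mapping from the preset area to the first interface can be sketched as a linear interpolation; the area width, screen width, and coordinate conventions below are assumptions:

```python
def map_user_to_screen_x(user_x_m: float, area_width_m: float,
                         screen_width_px: int) -> int:
    """Linearly map a user's horizontal position in the preset area q
    (in meters, 0 = left edge of the area) to an x-coordinate in the
    first interface, so the posture image follows the user's movement."""
    ratio = user_x_m / area_width_m
    ratio = min(max(ratio, 0.0), 1.0)  # clamp users at the area borders
    return round(ratio * (screen_width_px - 1))


# Example: a user 2 m from the left edge of a 4 m wide preset area,
# shown on a 1920 px wide display, maps to the middle of the screen.
assert map_user_to_screen_x(2.0, 4.0, 1920) == 960
```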
  • step 303 may also include:
  • the control component can respectively obtain the relative distance and relative position between at least two users and the display screen through the human gesture recognition component.
  • the relative distance may refer to the minimum distance between the user and the display screen, or the vertical distance between the user and the plane where the display screen is located.
  • For the relative position reference can be made to the content shown in Figure 5 above, which will not be described again here.
  • the size of the image corresponding to the user's human body posture is inversely related to the relative distance between the user and the display screen. That is to say, the smaller the distance between the user and the display screen, the larger the size of the image corresponding to the user's human body posture. This makes it easier for the user to find his or her corresponding image on the display screen, increases the interactivity and fun of the interface interaction method, and improves the user experience.
  • Figure 7 is a schematic diagram of the positional relationship between users and the display screen in an embodiment of the present application. The distance between user y1 and the display screen is relatively small, so the image y11 corresponding to the human body posture of user y1 is larger; the distance between user y2 and the display screen is larger, so the image y12 corresponding to the human body posture of user y2 is smaller.
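  • The embodiment states only that the image size is inversely related to the relative distance; the scaling function itself is not specified. A minimal sketch, assuming a linear falloff between illustrative distance bounds, could be:

```python
def avatar_scale(distance_m: float,
                 near_m: float = 0.5, far_m: float = 5.0,
                 min_scale: float = 0.3, max_scale: float = 1.0) -> float:
    """Return a display scale for the posture image that decreases as
    the user's distance to the display screen grows. The bounds are
    illustrative assumptions, not values from this application."""
    d = min(max(distance_m, near_m), far_m)  # clamp to the working range
    t = (d - near_m) / (far_m - near_m)      # 0 at near_m, 1 at far_m
    return max_scale - t * (max_scale - min_scale)


# A close user (like y1) gets a larger image than a distant user (y2).
assert avatar_scale(1.0) > avatar_scale(4.0)
```

  • Any monotonically decreasing function (for example a 1/d falloff) would satisfy the inverse relation equally well.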
  • the image corresponding to the user's human body posture can also move in the vertical direction in the first interface, moving vertically upward or vertically downward in correspondence with the user's movement, in the same manner as the horizontal direction described above.
  • the image corresponding to the user's human body posture in the first interface can present a mirrored display method, that is, the user is displayed in the first interface in a mirrored manner, which improves the user's interactive experience.
  • the control component can control the display screen to display a financial housing sand table and digitally display data such as real estate market value and transaction volume in the interface.
  • a residential model of the housing financial island can be displayed, and an AI smart housekeeper synchronously broadcasts a welcome speech, prompting customers to click on any property through gesture recognition for a detailed experience.
  • For different properties, the property's attention ranking, monthly trading volume, and average price data can be dynamically displayed in real time.
  • the interior scene of the house can be displayed on the display screen, and the AR (augmented reality) real-scene roaming function in the house is provided to the user.
  • the user can look up, down, left, and right to experience the living room scene, and at the same time can view the house purchase plan.
  • the control component can also play house-related information through the speakers, such as detailed introductions to residential properties and apartment types.
  • Step 304 In response to the overlap time between the image corresponding to the human body posture and the target first control among the plurality of first controls being greater than a preset value, display a second interface corresponding to the target first control on the display screen.
  • After the image corresponding to the user's human body posture is displayed on the display screen, the image will move as the user moves, and the user can then move so that the image corresponding to the human body posture overlaps a certain first control.
  • When the image corresponding to the human body posture overlaps a first control, the first control can change in a specified form; for example, it can light up, become larger, or show a collision effect, which is not limited in the embodiments of the present application.
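  • A minimal sketch of this overlap-time ("dwell") trigger follows; the rectangle format, timer source, and 3-second threshold are assumptions for illustration (the text suggests a preset value of 2 to 5 seconds):

```python
import time

PRESET_VALUE_S = 3.0  # assumed dwell threshold, within the suggested 2-5 s


def rects_overlap(a, b):
    """Axis-aligned overlap test; rectangles are (x, y, width, height)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah


class DwellTrigger:
    """Track how long the posture image overlaps each first control and
    report a control once the overlap time exceeds the preset value."""

    def __init__(self):
        self._since = {}  # control id -> time the current overlap began

    def update(self, avatar_rect, controls, now=None):
        now = time.monotonic() if now is None else now
        for cid, rect in controls.items():
            if rects_overlap(avatar_rect, rect):
                start = self._since.setdefault(cid, now)
                if now - start >= PRESET_VALUE_S:
                    return cid  # this becomes the target first control
            else:
                self._since.pop(cid, None)  # overlap broken: reset timer
        return None


# Example: the posture image rests on control "k1" long enough to select it.
trigger = DwellTrigger()
controls = {"k1": (100, 0, 80, 120)}
assert trigger.update((120, 10, 40, 100), controls, now=0.0) is None
assert trigger.update((120, 10, 40, 100), controls, now=3.2) == "k1"
```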
  • the human posture recognition component may detect the human body postures of at least two users in the preset area. In this case, the overlap times between the images corresponding to the human body postures of multiple target users and multiple first controls may all be greater than the preset value.
  • step 304 may include:
  • the control component can determine, among the at least two target users, the first target user who is closest to the display screen; specifically, the control component can determine the first target user closest to the display screen among the multiple users through the human gesture recognition component.
  • the preset value can be set in advance, for example to between 2 seconds and 5 seconds, such as 2 seconds, 3 seconds, or 4 seconds.
  • the nearest first target user is more likely to be a user who actually wants to interact with the interface interaction device.
  • the control component can interact with the first target user, and display the second interface corresponding to the target first control overlapping the first image through the display screen.
  • On the one hand, confusion in the display on the display screen can be avoided; on the other hand, the probability of the interface interaction device mistakenly interacting with users who merely pass through the preset area can be reduced, that is, the probability of invalid interaction can be reduced.
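  • Selecting the first target user in the multi-user case can then be as simple as taking the minimum relative distance reported by the human posture recognition component; the dictionary fields below are hypothetical:

```python
def first_target_user(users):
    """Among the detected users, return the one closest to the display
    screen; 'distance_m' is a hypothetical field assumed to be supplied
    by the human posture recognition component."""
    return min(users, key=lambda u: u["distance_m"], default=None)


users = [{"id": "y1", "distance_m": 1.2}, {"id": "y2", "distance_m": 3.8}]
assert first_target_user(users)["id"] == "y1"
```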
  • FIG. 8 is a schematic diagram of a second interface provided by an embodiment of the present application.
  • the second interface m2 includes a plurality of second controls k2 and a folding control z located on the second interface m2.
  • the folding control z corresponds to at least two third controls. When the folding control z is triggered, the third control can be displayed in the second interface m2.
  • the folding control z is in a strip shape and is located at the edge of the second interface m2, which can prevent the folding control from blocking and affecting other content on the second interface.
  • a method of displaying folding controls may include:
  • the control component can determine the folding position of the folding control in the second interface by referring to the above-mentioned method of determining the position of the image corresponding to the human body posture in the first interface.
  • the folding control can be placed at a position facing the target user in the second interface and can follow the movement of the target user's position, so that the user can easily trigger the folding control at any time from any position.
  • the control component can display the second interface corresponding to the target first control through the display screen, and display the folding control at the folding position of the second interface.
  • FIG. 9 is a schematic diagram of a second interface in an embodiment of the present application, in which the folding control z is located at a position facing the target user 91 in the second interface m2 and can move following the target user 91 to facilitate the target user 91's operations.
  • multiple second controls k2 are arranged in the horizontal direction in the second interface m2.
  • the multiple second controls k2 can also be arranged in the second interface m2 in other ways, which is not restricted in the embodiment of the present application.
  • Step 305 Play multiple gesture control information corresponding to the second interface through the speaker.
  • the control component can play multiple pieces of gesture control information corresponding to the second interface through the speaker, which can help the user learn the gesture control methods available in the second interface.
  • Playing multiple pieces of gesture control information corresponding to the second interface can also attract users to interact with the interface interaction device. There may also be situations where a user happens to be in the preset area and accidentally triggers the first control in the first interface; in this case, the played sound can remind the user, and after hearing the gesture control information the user may become interested in interacting with the interface interaction device, which can enhance the ability of the interface interaction method to promote information.
  • Step 306 Obtain the hand information of the target user among at least one user through the human gesture recognition component.
  • the control component may obtain the hand information of the target user among at least one user through the human gesture recognition component when the second interface is displayed on the display screen.
  • the hand information may include information such as the position, posture, and gesture information of the target user's hand.
  • the target user may refer to the user determined in step 304, for whom the overlap time between the image corresponding to the human body posture and the target first control among the plurality of first controls is greater than the preset value.
  • the target user may be the user closest to the display screen.
  • Step 307 Display the image corresponding to the hand information in the second interface based on the hand information.
  • the control component may display the image corresponding to the hand information in the second interface based on the hand information obtained in step 306.
  • the image may be a palm-shaped image or other shapes, which are not limited in the embodiments of the present application.
  • The control component continuously tracks the position of the user's hand and continuously displays the image corresponding to the hand information, to facilitate the user's gesture control on the second interface.
  • FIG. 10 is a schematic diagram of another second interface in an embodiment of the present application, in which the image t2 corresponding to the hand information is shown in the second interface m2.
  • the user can perform gesture control in the second interface m2 based on the image t2.
  • Step 308 In response to an operation on the folding control issued through the hand information, display at least two third controls corresponding to the folding control in the second interface.
  • the control component can control the display screen to first play a transition animation on the second interface and then display at least two third controls corresponding to the folding control on the second interface.
  • the transition animation may include the third controls gradually popping up from the position of the folding control, or gradually appearing from other positions, and so on. When the number of third controls is large, they can be arranged in multiple rows; when the number of third controls is small, they can be arranged in one row.
  • When the control component receives an operation on the folding control, it can control the display screen so that the folding control gives corresponding feedback, for example one of the following: vibrating, lighting up, breaking at the position triggered by the gesture, or gradually disappearing from the second interface.
  • the operation on the folding control may be a cutting action performed by the user on the folding control through a gesture.
  • FIG 11 is a schematic diagram of another second interface in an embodiment of the present application, in which at least two third controls k3 corresponding to the folding control z are displayed in the second interface m2.
  • the second interface m2 can also display other controls, such as in-car controls and out-of-car controls to display other car-related information.
  • After the in-car control is triggered, the interface can display the car interior for 360° viewing; after the out-of-car control is triggered, the interface can display the outside of the car for 360° viewing.
  • the interface can also display the car's performance parameters on the left and right sides, including fuel consumption per 100 kilometers, engine displacement, top speed, length, width and height, rated power, 0-100 km/h acceleration time, etc.
  • the interface interaction device can provide the user with a simulated test drive.
  • During the simulated test drive, a simulated car scene can be displayed through the display screen, and test drive gesture information can be played through the speaker, such as: Please stretch your arms forward to start the new car test drive. You can use the height of your arms to control the direction: raise the left arm to turn left, and raise the right arm to turn right.
  • the driving speed and driving distance parameters can be displayed and changed in real time.
  • a small map is displayed in the lower right corner of the interface to mark the locations of surrounding business districts during the test drive.
  • 4 navigation points can also be set.
  • when the car reaches a navigation point, a three-dimensional user interface pop-up window on the right side of the vehicle displays information about the merchant at that location, including its name, poster pictures, and promotional activity information.
  • the pop-up window disappears after being displayed for a specified period of time (such as 4 seconds), the car continues to drive, and when it reaches the next point, merchant information pops up on the right.
  • the driving process can be ended without setting.
  • Customers can switch to other modes through the menu.
  • the appointment test drive label can be displayed at a fixed position in the lower right corner of the interface.
  • the label can flip once every designated time interval (such as 4 seconds) to display details.
  • Step 309 In response to an operation on the third control issued through the hand information, display the second interface corresponding to the third control through the display screen, and stop displaying the at least two third controls corresponding to the folding control.
  • the control component can display the second interface corresponding to the third control through the display screen and stop displaying the at least two third controls corresponding to the folding control. When the display of the at least two third controls corresponding to the folding control is stopped, a preset transition animation may be played; for example, the at least two third controls may gradually shrink toward the position of the folding control and disappear at the position of the folding control.
  • Step 310 Based on the position of the target user among the at least one user, display, on the second interface, a plurality of second sub-controls corresponding to the second control directly opposite the position of the target user among the plurality of second controls.
  • The control component may display, on the second interface, a plurality of second sub-controls corresponding to the second control directly opposite the target user's position among the plurality of second controls, based on the target user's position. That is, when the user is in front of a certain second control, the control component can control the display screen to display the second sub-controls, so as to display more information to the user and improve the information interaction efficiency of the interface interaction method.
  • step 310 may include:
  • the control component may determine, through the human gesture recognition component, the target second control facing the position of the target user among the at least one user.
  • This display method can facilitate the user to view the second sub-control corresponding to the second control, and also facilitate the user to further control the second sub-control, thereby improving the user experience.
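  • One way to determine the target second control directly opposite the target user — a sketch assuming each second control exposes its horizontal span in screen pixels and reusing a position mapping like the one sketched earlier — is:

```python
def control_facing_user(user_x_px: int, second_controls):
    """Return the second control whose horizontal span contains the
    user's mapped screen position, i.e. the control the user stands
    directly in front of, or None if the user faces a gap."""
    for ctrl in second_controls:
        if ctrl["x"] <= user_x_px < ctrl["x"] + ctrl["width"]:
            return ctrl
    return None


# Three second controls arranged along the horizontal direction f1.
row = [{"name": "k2-1", "x": 0, "width": 640},
       {"name": "k2-2", "x": 640, "width": 640},
       {"name": "k2-3", "x": 1280, "width": 640}]
assert control_facing_user(960, row)["name"] == "k2-2"
```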
  • FIG. 12 is a schematic diagram of another user and a display screen provided by an embodiment of the present application.
  • a plurality of second sub-controls ss1 are arranged along the second direction f2 at the position of the second control k2, and the second direction f2 intersects the horizontal direction f1.
  • FIG. 12 shows a situation where the second direction f2 and the horizontal direction f1 intersect perpendicularly, but this is not limited.
  • the width of the second sub-control ss1 can be the same as the width of the second control, so as to cover the second control.
  • the target second control can first display the transition animation and then display the multiple second sub-controls ss1.
  • the transition animation may be rotation, gradient, etc., which is not limited in the embodiments of the present application.
  • the second interface m2 shown in Figure 10 can also be displayed on the display screen as the initial page in the interface interaction method provided by the embodiment of the present application. In this way, when a user passes by, different detailed information can be automatically displayed according to the user's position, improving the ability to display information to users.
  • the control component can control the display screen to display green financial products and services.
  • Figure 13 is a schematic diagram of a third interface in an embodiment of the present application, in which the third interface m3 is displayed on the display screen. The scene of the third interface m3 can be a three-dimensional space that includes a technology tree 121; there can be suspended particles in the three-dimensional space, and there may be multiple floating windows c1 on both sides of the interface m3.
  • A plurality of suspended bubbles 1211 are provided on the technology tree 121, and theme concepts can be displayed in the suspended bubbles 1211, including: green credit, green bonds, green funds, green insurance, derivatives, etc.
  • The control component can control the display screen 11 to display a humanoid image corresponding to the user's human body posture in the third interface m3, and the control component can also play a voice through the speaker: Please make a specified gesture (such as extending your right hand with your palm upward) to check changes in your carbon sink.
  • After the user makes the specified gesture, the particles in the three-dimensional scene can come together into a ball and then burst, popping up a three-dimensional user interface window that displays theme concept words, such as: garbage classification, energy saving and emission reduction (the content is divided into two categories, positive feedback and negative feedback); in the three-dimensional user interface, the positive feedback font is green and the negative feedback font is red.
  • The positive feedback can be: water drops fall on the carbon sink balance p1 in the lower right, and the bar chart p2 rises with an animation; the negative feedback can be: water drops fall on the carbon sink balance p1 in the lower right, and the bar chart p2 falls with an animation.
  • the carbon sink balance p1 tilts left or right according to the number of positive feedback and negative feedback words.
  • a data dashboard b1 can be set on the left side of the third interface m3, which can be used to display real-time data of carbon trading platforms in various regions and cities.
  • The data dashboard b1 can display the carbon price trend, carbon trading volume, carbon trading amount, and market share for various cities and regions (such as Beijing, Fujian, Guangdong, Hubei, Shanghai, Shenzhen, Tianjin, and Chongqing).
  • In the interface interaction method, a plurality of first controls are displayed on the first interface of the display screen; when the human posture recognition component detects the user's human posture in the preset area, the image corresponding to the human posture is displayed at the corresponding position in the first interface based on the user's position; then, when the overlap time between the image corresponding to the human posture and a control is greater than the preset value, the second interface corresponding to the control is displayed.
  • In this way, the problem in the related art that the interface interaction method is relatively complicated is solved, and the effect of simplifying the interface interaction method is achieved.
  • The user can interact with the interface interaction device without clicking on it, which simplifies interface interaction and improves the user experience.
  • FIG. 14 is a block diagram of an interface interaction device provided by an embodiment of the present application.
  • the interface interaction device includes a control component 1410, a display screen 1420, and a human gesture recognition component 1430.
  • the control component 1410 is electrically connected to the display screen 1420 and the human gesture recognition component 1430 respectively, and the control component 1410 includes:
  • the first display module 1411, configured to display a first interface through the display screen, the first interface having a plurality of first controls;
  • the second display module 1412, configured to, in response to the human posture recognition component detecting the human posture of at least one user in the preset area, display an image corresponding to the human body posture of the at least one user at the corresponding position in the first interface based on the relative position of the at least one user to the display screen;
  • the third display module 1413 is configured to display the second interface corresponding to the target first control through the display screen in response to the overlap time between the image corresponding to the human body posture and the target first control among the plurality of first controls being greater than a preset value.
  • In the interface interaction device, a plurality of first controls are displayed on the first interface of the display screen; when the human body posture recognition component detects the user's human body posture in the preset area, the image corresponding to the human posture is displayed at the corresponding position in the first interface based on the user's position; then, when the overlap time between the image corresponding to the human posture and a control is greater than the preset value, the second interface corresponding to the control is displayed.
  • In this way, the problem in the related art that the interface interaction method is relatively complicated is solved, and the effect of simplifying the interface interaction method is achieved.
  • The user can interact with the interface interaction device without clicking on it, which simplifies interface interaction and improves the user experience.
  • Embodiments of the present application also provide an interface interaction device.
  • the interface interaction device includes a processor and a memory.
  • the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the above-mentioned interface interaction method.
  • Embodiments of the present application also provide a computer storage medium.
  • the computer storage medium stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the interface interaction method described above.
  • Embodiments of the present application also provide a computer program product or computer program.
  • The computer program product or computer program includes computer instructions stored on a computer-readable storage medium.
  • the processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the methods provided in the above various optional implementations.
  • "At least one of A and B" in this application merely describes an association relationship between associated objects and indicates that three relationships can exist; for example, at least one of A and B can mean: A exists alone, both A and B exist, or B exists alone.
  • Similarly, "at least one of A, B and C" indicates that seven relationships can exist: A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • "At least one of A, B, C and D" indicates that fifteen relationships can exist: A alone, B alone, C alone, D alone, A and B together, A and C together, A and D together, B and C together, B and D together, C and D together, A, B and C together, A, B and D together, A, C and D together, B, C and D together, or A, B, C and D together.
  • the disclosed devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division; in actual implementation, there may be other division methods. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, and the indirect coupling or communication connection of the devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • All or part of the steps of the above embodiments may be implemented by a program instructing relevant hardware, and the program can be stored in a computer-readable storage medium.
  • the above-mentioned storage medium can be a read-only memory, a magnetic disk or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to the technical field of interaction, and discloses an interface interaction method, an interface interaction apparatus, and a computer storage medium. The method comprises: displaying a first interface by means of a display screen; displaying an image corresponding to a human body posture of at least one user at a corresponding position in the first interface; and in response to the overlap time between the image corresponding to the human body posture and a target first control among a plurality of first controls being greater than a preset value, displaying a second interface corresponding to the target first control by means of the display screen. According to the present application, the image corresponding to the human body posture is displayed at the corresponding position in the first interface, and then, when the overlap time between the image corresponding to the human body posture and the control is greater than the preset value, the second interface corresponding to the control is displayed. In this way, the user does not need to perform a click operation; interaction between the user and the interface interaction apparatus is achieved, solving the prior-art problem that interface interaction methods are complex, thereby simplifying the interface interaction method.
PCT/CN2023/091527 2022-06-23 2023-04-28 Interface interaction method, interface interaction apparatus and computer storage medium WO2023246312A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210726631.6 2022-06-23
CN202210726631.6A CN115097995A (zh) 2022-06-23 2022-06-23 Interface interaction method, interface interaction apparatus and computer storage medium

Publications (1)

Publication Number Publication Date
WO2023246312A1 (fr)

Family

ID=83293544

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/091527 WO2023246312A1 (fr) 2022-06-23 2023-04-28 Interface interaction method, interface interaction apparatus and computer storage medium

Country Status (2)

Country Link
CN (1) CN115097995A (fr)
WO (1) WO2023246312A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115097995A (zh) * 2022-06-23 2022-09-23 京东方科技集团股份有限公司 界面交互方法、界面交互装置以及计算机存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5594469A (en) * 1995-02-21 1997-01-14 Mitsubishi Electric Information Technology Center America Inc. Hand gesture machine control system
US20140059484A1 (en) * 2005-12-12 2014-02-27 Sony Computer Entertainment Inc. Voice and video control of interactive electronically simulated environment
CN104871116A (zh) * 2012-12-27 2015-08-26 索尼公司 Information processing apparatus, information processing method, and program
WO2019038205A1 (fr) * 2017-08-22 2019-02-28 Ameria Ag User readiness for touchless gesture-controlled display systems
CN115097995A (zh) * 2022-06-23 2022-09-23 京东方科技集团股份有限公司 Interface interaction method, interface interaction apparatus and computer storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8555207B2 (en) * 2008-02-27 2013-10-08 Qualcomm Incorporated Enhanced input using recognized gestures
US8749557B2 (en) * 2010-06-11 2014-06-10 Microsoft Corporation Interacting with user interface via avatar
CN109144235B (zh) * 2017-06-19 2024-01-23 深圳巧牛科技有限公司 Human-computer interaction method and system based on coordinated head and hand movements
CN111309214A (zh) * 2020-03-17 2020-06-19 网易(杭州)网络有限公司 Video interface setting method and apparatus, electronic device, and storage medium
CN113183133B (zh) * 2021-04-28 2024-02-09 华南理工大学 Gesture interaction method, system, apparatus, and medium for multi-degree-of-freedom robots
CN113760158A (zh) * 2021-04-30 2021-12-07 腾讯科技(深圳)有限公司 Target object display method, object association method, apparatus, medium, and device
CN113419800B (zh) * 2021-06-11 2023-03-24 北京字跳网络技术有限公司 Interaction method, apparatus, medium, and electronic device
CN113778217A (zh) * 2021-09-13 2021-12-10 海信视像科技股份有限公司 Display device and display device control method
CN113946216A (zh) * 2021-10-18 2022-01-18 阿里云计算有限公司 Human-computer interaction method, smart device, storage medium, and program product

Also Published As

Publication number Publication date
CN115097995A (zh) 2022-09-23

Similar Documents

Publication Publication Date Title
JP7073238B2 (ja) Creative camera
CN104503578B (zh) Interactive touch screen gaming metaphors with haptic feedback across platforms
US10275098B1 Laser mid-air hologram touch input buttons for a device
US20140218361A1 Information processing device, client device, information processing method, and program
CN101523482B (zh) Display device, display method, information recording medium, and program
KR20220032653A (ko) Systems, methods, and graphical user interfaces for interacting with augmented and virtual reality environments
JP7008730B2 (ja) Shadow generation for image content inserted into an image
US10341642B2 Display device, control method, and control program for stereoscopically displaying objects
US20140267599A1 User interaction with a holographic poster via a secondary mobile device
WO2023246312A1 (fr) Interface interaction method, interface interaction apparatus and computer storage medium
US20110131502A1 User location-based display method and apparatus
KR20070050878A (ko) Interactive system and method therefor
WO2005116805A1 (fr) Interactive system and method
CN111667589B (zh) Method and apparatus for triggering display of animation effects, electronic device, and storage medium
JPH0944724A (ja) Self-service apparatus and method
TW200926055A (en) Image processing device, image processing method, and information recording medium
EP3764200B1 (fr) Transferring depth photo-augmented information using gestures and user-interface-controlled occlusion planes
EP2778887A2 (fr) Interactive display device
CN111541928B (zh) Live broadcast display method, apparatus, device, and storage medium
JP2010279460A (ja) Program, information storage medium, and game device
CN203043526U (zh) Game device
CN110471731A (zh) Display interface drawing method and apparatus, electronic device, and computer-readable medium
EP3447610B1 (fr) User availability for touchless gesture-controlled display systems
JPH11134084A (ja) Method for improving three-dimensional virtual reality
EP2724307A2 (fr) Virtual salesperson

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23825965

Country of ref document: EP

Kind code of ref document: A1