CN116661656A - Picture interaction method and shooting display system - Google Patents

Picture interaction method and shooting display system

Info

Publication number
CN116661656A
CN116661656A (application number CN202310965367.6A; granted as CN116661656B)
Authority
CN
China
Prior art keywords
display
picture
end user
interaction method
presentation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310965367.6A
Other languages
Chinese (zh)
Other versions
CN116661656B (en)
Inventor
李祥 (Li Xiang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anke Youxuan Shenzhen Technology Co ltd
Original Assignee
Anke Youxuan Shenzhen Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anke Youxuan Shenzhen Technology Co ltd
Priority to CN202310965367.6A
Publication of CN116661656A
Application granted
Publication of CN116661656B
Active legal status
Anticipated expiration

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/018: Input/output arrangements for oriental characters

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application relates to a picture interaction method for interaction between an end user and a camera display system, where the camera display system comprises a display screen, a visible light camera, an infrared light camera, and at least one infrared-emitting finger sleeve. The picture interaction method comprises: performing a control operation on display elements in an element display picture based on an end user instruction action; superimposing the display elements after the control operation onto the end user picture to generate a final display picture; and performing a display operation on the final display picture. The application also provides a camera display system. The picture interaction method and the camera display system receive the end user's instruction actions through the infrared camera, so that the end user can perform personalized interaction control on the display elements in the element display picture, improving the diversity of the picture display effect.

Description

Picture interaction method and shooting display system
Technical Field
The application relates to the field of picture interaction processing, and in particular to a picture interaction method and a camera display system.
Background
With the development of society, people are increasingly interested in various forms of live streaming; watching live-streamed education and live-streamed shopping has become part of many users' daily routine.
When an anchor hosts live-streamed teaching or live-streamed shopping, the anchor may introduce products or present teaching content through a display terminal. However, the display terminal can only show static content or preset dynamic content: the anchor either cannot interactively control the displayed content at all, or can only perform simple interactive control through a mouse or a cursor. As a result, the display picture of an existing live-streaming display terminal does not support personalized interactive control, and the display effect is monotonous.
Therefore, there is a need for a picture interaction method and a camera display system that solve the above technical problems.
Disclosure of Invention
Embodiments of the application provide a picture interaction method and a camera display system capable of personalized interaction control with diverse display effects, to solve the technical problems that, in existing picture interaction methods, the display picture cannot be interactively controlled in a personalized way and the display effect is monotonous.
An embodiment of the application provides a picture interaction method for interaction between an end user and a camera display system. The camera display system comprises a display screen, a visible light camera, an infrared light camera, and at least one infrared-emitting finger sleeve, where the display content of the display screen comprises an end user picture captured by the visible light camera, an end user instruction action generated by the end user with the infrared-emitting finger sleeve and captured by the infrared light camera, and an element display picture that responds to the end user instruction action. The picture interaction method comprises the following steps:
performing a control operation on the display elements in the element display picture based on the end user instruction action;
superimposing the display elements after the control operation onto the end user picture to generate a final display picture; and
performing a display operation on the final display picture.
In the picture interaction method of the application, if the display element is a display object, the control operations performed on the display element include a planar translation operation, a zoom-in and zoom-out operation, a rotation operation, and a deformation operation;
if the display element is a display video, the control operations performed on the display element include a planar translation operation, a zoom-in and zoom-out operation, and a play/pause operation.
In the picture interaction method of the application, the display content of the display screen further includes an end user instruction selection picture with a plurality of control operation selection keys, and the picture interaction method further comprises:
determining the type of the control operation based on the end user instruction action performed by the end user on a control operation selection key.
In the picture interaction method of the application, the zoom-in and zoom-out operation on the display element comprises:
setting an initial position and a display position of the display element;
setting an initial-position preset display size of the display element;
setting a display-position preset display size of the display element, the display-position preset display size being larger than the initial-position preset display size; and
setting, for any intermediate position, an intermediate-position preset display size based on the distance between the intermediate position and the initial position and the distance between the intermediate position and the display position, where the intermediate-position preset display size is larger than the initial-position preset display size and smaller than the display-position preset display size.
In the picture interaction method of the application, the picture interaction method further comprises:
setting an instruction action response space, and setting display elements in the instruction action response space to form the element display picture that responds to the end user instruction action.
In the picture interaction method of the application, if a plurality of display elements are arranged at the same planar picture position of the element display picture, then when the end user performs a control operation on the display elements at that planar picture position, the display element with the largest display depth or the display element with the smallest display depth at that planar picture position is selected.
Alternatively, when a plurality of display elements are arranged at the same planar picture position of the element display picture and the end user performs a control operation on the display elements at that planar picture position, the display elements are spread out around the planar picture position so that the end user can select among them.
In the picture interaction method of the application, the picture interaction method further comprises:
setting a plurality of display elements in the instruction action response space as a display element group;
and the performing of a control operation on the display elements in the element display picture based on the end user instruction action comprises:
simultaneously performing the control operation on all display elements of the display element group in the element display picture based on the end user instruction action.
In the picture interaction method of the application, the camera display system comprises two infrared-emitting finger sleeves, and the deformation operation on the display element is specifically:
performing the deformation operation on the display element based on the change in distance between the end user instruction actions generated by the two infrared-emitting finger sleeves.
An embodiment of the application also provides a camera display system for interacting with an end user, comprising a display screen, a visible light camera, an infrared light camera, at least one infrared-emitting finger sleeve, and a picture processing chip;
the display content of the display screen comprises an end user picture captured by the visible light camera, an end user instruction action generated by the end user with the infrared-emitting finger sleeve and captured by the infrared light camera, and an element display picture that responds to the end user instruction action;
the picture processing chip comprises:
a display element control operation module, configured to perform a control operation on the display elements in the element display picture based on the end user instruction action;
a final display picture generation module, configured to superimpose the display elements after the control operation onto the end user picture to generate a final display picture; and
a display module, configured to display the final display picture on the display screen.
Embodiments of the application also provide a computer storage medium storing processor-executable instructions that are loaded by one or more processors to perform the picture interaction method described above.
Compared with the prior art, the picture interaction method and the camera display system of the application receive the end user's instruction actions through the infrared camera, so that the end user can perform personalized interaction control on the display elements in the element display picture, improving the diversity of the picture display effect; this effectively solves the technical problems that, in existing picture interaction methods, the display picture cannot be interactively controlled in a personalized way and the display effect is monotonous.
Drawings
FIG. 1 is a schematic diagram of an embodiment of a camera display system of the present application;
FIG. 2 is a schematic diagram of a hierarchical structure of display contents of a display screen in a picture interaction method according to the present application;
FIG. 3 is a schematic plan view of the display content of the display screen in the picture interaction method of the present application;
FIG. 4 is a flowchart of an embodiment of the picture interaction method of the present application;
FIG. 5 is a schematic diagram of the zoom-in and zoom-out operation on a display element in an embodiment of the picture interaction method of the present application;
FIG. 6 is a first diagram illustrating a selection operation on display elements in an embodiment of the picture interaction method of the present application;
FIG. 7a is a second diagram illustrating a selection operation on display elements in an embodiment of the picture interaction method of the present application;
FIG. 7b is a third diagram illustrating a selection operation on display elements in an embodiment of the picture interaction method of the present application;
FIGS. 8a-8d are schematic diagrams illustrating control operations on a display element group in an embodiment of the picture interaction method of the present application;
FIG. 9 is a block diagram of an embodiment of a camera display system of the present application;
FIG. 10 is a schematic diagram of the working environment of an electronic device in which the picture interaction method of the present application operates.
Detailed Description
Referring to the drawings, in which like reference numbers represent like elements throughout, the principles of the present application are illustrated in a suitable computing environment. The following description is based on illustrative embodiments of the application and should not be taken as limiting other embodiments not described in detail herein.
In the description that follows, embodiments of the application are described with reference to steps and symbols of operations performed by one or more computers, unless indicated otherwise. It will thus be appreciated that such steps and operations, which are at times referred to as being computer-executed, include the manipulation by the computer's processing unit of electrical signals representing data in a structured form. This manipulation transforms the data or maintains it at locations in the computer's memory system, which reconfigures or otherwise alters the operation of the computer in a manner well understood by those skilled in the art. The data structures in which data is maintained are physical locations of memory that have particular properties defined by the format of the data. However, while the principles of the application are described in this context, this is not meant to be limiting, and those skilled in the art will appreciate that various of the steps and operations described below may also be implemented in hardware.
The picture interaction method of the application can be used for interactive operation between an end user and a camera display system. The camera display system is shown in fig. 1, which is a schematic structural diagram of a camera display system to which the picture interaction method is applicable. The camera display system 10 includes a display screen 11, a visible light camera 12, an infrared light camera 13, and at least one infrared-emitting finger sleeve 14. The visible light camera 12 captures the end user picture under visible light; the infrared light camera 13 captures the end user's instruction actions under infrared light; the end user generates end user instruction actions using the infrared-emitting finger sleeve 14; and the display screen 11 displays the corresponding display content. The display content comprises the end user picture captured by the visible light camera 12, the end user instruction action generated by the end user with the infrared-emitting finger sleeve 14 and captured by the infrared light camera 13, and the element display picture that responds to the end user instruction action.
The display content of the display screen is shown in fig. 2 and fig. 3, where fig. 2 is a schematic hierarchical structure diagram of the display content of the display screen in the picture interaction method of the present application, and fig. 3 is a schematic plan view of the display content of the display screen in the picture interaction method of the present application.
As shown in fig. 2, the end user picture captured by the visible light camera 12 and the end user instruction action captured by the infrared light camera 13 are both located in a first layer 21, while the element display picture controlled by the end user instruction actions is located in an instruction action response space 22, which can be superimposed on the first layer 21. By setting up the instruction action response space 22, display elements 231, 232, and 233 can be placed at different end user picture positions and at different depth positions; that is, not only can the positions of different display elements in the element display picture 33 be set, but the occlusion relationship between different display elements can also be adjusted by setting their depth positions.
Fig. 3 is a schematic plan view of the instruction action response space 22 superimposed on the first layer 21. The display content includes an end user picture 31, an end user instruction action 32 generated by an infrared-emitting finger sleeve, and an element display picture 33.
The end user instruction actions 32 include a trigger action, by which the end user starts the infrared emission of the infrared-emitting finger sleeve 14, and a stop action, by which the end user stops its infrared emission. The end user can use the end user instruction actions 32 to operate the display elements 231, 232, and 233 in the instruction action response space 22, thereby changing the projected positions of the display elements 231, 232, and 233 in the element display picture 33.
Specifically, the infrared-emitting finger sleeve 14 of the application may be a handheld infrared emitter, and the end user can perform the trigger action that starts emitting infrared light, or the stop action that stops emitting infrared light, through a finger movement.
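As an illustrative sketch only, the trigger/stop detection could be implemented by thresholding the infrared camera frame and locating the bright emission spot. The Python/OpenCV code below is an assumption for illustration, with hypothetical threshold and blob-size constants; it is not the patent's disclosed implementation.

```python
# Sketch: detect the finger sleeve's IR emission spot in an infrared frame.
# IR_THRESHOLD and MIN_BLOB_AREA are assumed values, not from the patent.
import cv2
import numpy as np

IR_THRESHOLD = 200   # assumed brightness cutoff for the IR emission spot
MIN_BLOB_AREA = 20   # assumed minimum spot size, in pixels

def detect_ir_command(ir_frame_gray: np.ndarray):
    """Return the (x, y) centroid of the IR spot, or None if no spot is found.

    A visible spot corresponds to the trigger state (IR emission on);
    its disappearance corresponds to the stop state (IR emission off).
    """
    _, mask = cv2.threshold(ir_frame_gray, IR_THRESHOLD, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Take the largest bright region as the finger sleeve's emission spot.
    blob = max(contours, key=cv2.contourArea)
    if cv2.contourArea(blob) < MIN_BLOB_AREA:
        return None
    m = cv2.moments(blob)
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```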
Referring to fig. 4, fig. 4 is a flowchart of an embodiment of the picture interaction method of the present application. The picture interaction method of this embodiment can be executed on the above camera display system and implements the interactive operation between the end user and the camera display system. The picture interaction method comprises the following steps:
step S401, performing a control operation on the display elements in the element display picture based on the end user instruction action;
step S402, superimposing the display elements after the control operation onto the end user picture to generate a final display picture;
step S403, performing a display operation on the final display picture.
The specific flow of each step of the picture interaction method of this embodiment is described in detail below.
In step S401, the camera display system captures the end user's instruction actions through the infrared light camera. The end user uses the end user instruction actions to perform control operations on the display elements in the element display picture, i.e., in the instruction action response space.
Specifically, a display element may be a display object or a display video. When the display element is a display object, the control operations that the end user instruction action can perform on the display element in the element display picture include a planar translation operation, a zoom-in and zoom-out operation, a rotation operation, and a deformation operation.
Here, the planar translation operation refers to moving the display object within the display plane of the display screen, such as moving it up, down, left, or right. The zoom-in and zoom-out operation refers to enlarging or reducing the display size of the display object in the display plane, such as moving the display object to the center of the display plane and enlarging it for closer observation, or moving it to the edge of the display plane and reducing it so it can be temporarily parked there. The rotation operation adjusts the display angle of the display object in the display plane; for example, if the display object is a globe model, the rotation operation can turn the model so that the geographic location of whatever country the end user wants to show faces the display screen. The deformation operation deforms the display object in the display plane; for example, if the display object is a spring ball, the deformation operation can stretch or compress it.
Furthermore, the end user can wear one infrared-emitting finger sleeve on each hand, so that both hands generate end user instruction actions, and the camera display system can perform the deformation operation on the display element based on the change in distance between the instruction actions generated by the two finger sleeves. For example, when the distance between the two instruction actions increases, the display element is stretched; when the distance decreases, the display element is compressed; and so on.
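For illustration, the stretch/compress factor could be derived from the ratio of the current to the initial distance between the two infrared spots; the helper below is a sketch under that assumption, not the patent's algorithm.

```python
# Sketch: map the separation of the two finger sleeves' IR spots to a
# deformation factor (> 1 stretches the element, < 1 compresses it).
import math

def deformation_scale(left_spot, right_spot, initial_distance):
    """left_spot/right_spot: (x, y) centroids of the two IR spots."""
    return math.dist(left_spot, right_spot) / initial_distance

# Example: spots initially 120 px apart, now 180 px apart -> factor 1.5 (stretch).
```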
When the display element is a display video, the control operations that the end user instruction action can perform on it in the element display picture include a planar translation operation, a zoom-in and zoom-out operation, and a play/pause operation.
Here, the planar translation operation refers to moving the display video within the display plane of the display screen, such as moving it up, down, left, or right. The zoom-in and zoom-out operation refers to enlarging or reducing the display size of the display video in the display plane. The play/pause operation triggers the display video to start playing or to stop playing.
To let the end user perform different types of control operations on the display elements in the element display picture, the display content of the display screen further includes an end user instruction selection picture 34 for switching between control operation types. The end user instruction selection picture 34 contains several different control operation selection keys, so the camera display system can determine the type of control operation from the end user instruction action performed on a control operation selection key.
For example, if the end user performs a trigger action on the control operation selection key 341 of the end user instruction selection picture 34, the end user then performs a planar translation operation on the corresponding display element through the trigger action of the infrared-emitting finger sleeve on a display element in the element display picture 33. If the end user performs a trigger action on the control operation selection key 342 of the end user instruction selection picture 34, the end user then performs a zoom-in and zoom-out operation on the corresponding display element in the same way.
If a display element in the element display picture 33 cannot execute the control operation corresponding to the selected control operation selection key (for example, the end user wants to perform a play/pause operation on a display object, which does not support it), the camera display system can issue a control operation error prompt so that the end user can select the correct control operation selection key.
Further, when the end user needs to zoom a display element in and out, refer to fig. 5, a schematic diagram of the zoom-in and zoom-out operation on a display element in an embodiment of the picture interaction method of the present application. The camera display system can preset an initial position 51 and a display position 52 for a display element (here a display video), so that the element can be temporarily parked at the initial position 51 and effectively presented at the display position 52. It can also preset an initial-position display size for the element at the initial position and a display-position display size at the display position, the display-position size being larger than the initial-position size so that the element is presented effectively at the display position. In this way, the end user can zoom the display element in and out simply by performing a planar translation operation on it, with the zoom ratio controlled accurately and automatically. This avoids the situation where the end user must repeatedly alternate planar translation and zoom operations on the element without being able to control the translated position and zoom ratio precisely.
To achieve a smooth gradual enlargement of the display element from the initial position to the display position, the camera display system can set an intermediate-position preset display size for any intermediate position based on the initial-position preset display size and the display-position preset display size. Specifically, the intermediate-position size is set based on the distance from the intermediate position to the initial position and the distance from the intermediate position to the display position, and it should be larger than the initial-position size and smaller than the display-position size. The closer the intermediate position is to the initial position, the smaller the intermediate-position size, approaching the initial-position size; the closer it is to the display position, the larger the intermediate-position size, approaching the display-position size. This effectively produces a gradual enlargement of the display element from the initial position to the display position.
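One way to realize this rule is linear interpolation of the preset display size by relative distance; linearity is an assumption for illustration, since the description only requires the size to grow monotonically between the two bounds:

```python
# Sketch: intermediate-position preset display size by linear interpolation.
import math

def intermediate_size(pos, initial_pos, display_pos, initial_size, display_size):
    """Equals initial_size at the initial position, display_size at the display
    position, and grows smoothly in between according to relative distances."""
    d_initial = math.dist(pos, initial_pos)   # distance to the initial position
    d_display = math.dist(pos, display_pos)   # distance to the display position
    t = d_initial / (d_initial + d_display)   # 0 at initial position, 1 at display position
    return initial_size + t * (display_size - initial_size)
```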
Further, if a plurality of display elements are placed on the display plane of the display screen at the same time, the end user may need to adjust the occlusion relationship between them as required. The camera display system can therefore set the display depth of all display elements at the same planar picture position of the element display picture, thereby realizing the desired occlusion relationship between different display elements.
Specifically, as shown in fig. 2, if the display element 232 should cover the display element 231 (assuming they are located at the same planar picture position), the camera display system only needs to set the depth of the display element 232 (the distance between the display element 232 and the first layer 21) to be greater than the depth of the display element 231 (the distance between the display element 231 and the first layer 21).
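A sketch of this depth-based occlusion rule follows: drawing elements in ascending depth order makes the element with the greater depth land on top, as in the fig. 2 example. The element fields below are assumed names for illustration.

```python
# Sketch: resolve occlusion by sorting elements by display depth.
from dataclasses import dataclass

@dataclass
class Element:
    name: str
    depth: float  # distance from the first layer 21; larger = drawn on top

def render_order(elements):
    """Draw in ascending depth order so the deepest-set element is drawn last
    and covers the others."""
    return sorted(elements, key=lambda e: e.depth)

# Example from fig. 2: depth(232) > depth(231), so 232 is drawn after 231.
print([e.name for e in render_order([Element("232", 2.0), Element("231", 1.0)])])
# ['231', '232']
```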
In addition, when several display elements on the display plane occlude each other, refer to fig. 6, a first schematic diagram of a selection operation on display elements in an embodiment of the picture interaction method of the present application. Several display elements, such as display element 61, display element 62, and display element 63 in fig. 6, are placed at the same planar picture position of the element display picture; the display depth of display element 61 is the largest and that of display element 63 is the smallest.
To meet different users' selection needs, the camera display system can be configured so that, when the end user performs a control operation on the display elements at that planar picture position, the element with the largest display depth is selected, i.e., display element 61. The end user can thus control the topmost element among the overlapped display elements on the display screen and translate it, presenting the effect of the end user translating the display elements away one by one from top to bottom.
Alternatively, the camera display system can be configured so that the element with the smallest display depth at that planar picture position is selected, i.e., display element 63. The end user can then control the bottommost element among the overlapped display elements and translate it, presenting the effect of the end user translating the display elements away one by one from bottom to top.
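Both selection policies can be sketched as a hit-test followed by a maximum or minimum over display depth; the contains method and element fields below are illustrative assumptions, not the patent's interfaces.

```python
# Sketch: select the topmost or bottommost element at a planar picture position.
def select_element(elements, hit_pos, topmost=True):
    """Pick the overlapping element with the largest display depth (topmost
    policy) or the smallest (bottommost policy). `contains` is an assumed
    hit-test method on the element objects."""
    hits = [e for e in elements if e.contains(hit_pos)]
    if not hits:
        return None
    key = lambda e: e.depth
    return max(hits, key=key) if topmost else min(hits, key=key)
```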
Here the end user can translate the bottommost display element or the topmost display element by assigning them different end user instruction actions. For example, with one infrared-emitting finger sleeve worn on each hand, the camera display system can assign the left-hand finger sleeve to translate the bottommost display element and the right-hand finger sleeve to translate the topmost display element, so that the end user can translate the bottommost and topmost elements separately through the two finger sleeves.
Of course, the camera display system can also let the end user select a display element of arbitrary display depth at the planar picture position. That is, when the end user performs a control operation on the display elements at the planar picture position, the camera display system can spread the display elements overlapped at that position out around it so that the end user can select the element to be operated on. Specifically, as shown in figs. 7a and 7b, the display elements 71, 72, and 73 in fig. 7a overlap one another. When the end user performs a control operation at that planar picture position, i.e., issues an end user instruction action there, the display elements 71, 72, and 73 are spread out in sequence around the position (copies of the elements may be kept at the original position). The end user can then select display element 71, 72, or 73 as needed for translation or other control operations, while the unselected elements remain at the original planar picture position; fig. 7b shows the end user performing a planar translation operation on the display element 72.
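For illustration, the spread-out positions could be computed on a circle around the planar picture position; the radius below is an arbitrary assumed value, not specified by the patent.

```python
# Sketch: spread overlapped elements evenly on a circle around the hit point.
import math

def spread_positions(center, count, radius=120.0):
    """Return `count` positions evenly spaced on a circle around `center`."""
    cx, cy = center
    return [
        (cx + radius * math.cos(2 * math.pi * i / count),
         cy + radius * math.sin(2 * math.pi * i / count))
        for i in range(count)
    ]

# Example: three overlapped elements around (400, 300)
# -> roughly [(520.0, 300.0), (340.0, 403.9), (340.0, 196.1)]
```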
Further, the end user can also operate on several display elements in the instruction action response space at once through end user instruction actions. Referring to figs. 8a to 8d, display elements 821, 822, and 823 are placed in the instruction action response space 82. The end user can set the display elements 821, 822, and 823 in the instruction action response space as a display element group (for example, by box-selecting all display elements in a given planar picture area with an end user instruction action and designating them as a group), so that the whole display element group can be operated on together.
Before the end user performs a control operation on the elements of the display element group, the positions of the display elements 821, 822, and 823 in the instruction action response space are as shown in fig. 8a, and their appearance in the element display picture of the display screen is as shown in fig. 8b.
The end user can select a rotation center point and rotate the whole display element group about it, for example about the center of the display element 822 (of course, the end user can also select any point in the instruction action response space 82 as the rotation center point as needed). After the rotation about the center of the display element 822, the positions of the display elements 821, 822, and 823 in the instruction action response space 82 are as shown in fig. 8c, and their appearance in the element display picture of the display screen is as shown in fig. 8d.
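A sketch of rotating a display element group about a chosen center point in the projection plane follows (positions only; each element's own orientation is omitted for brevity):

```python
# Sketch: rotate every element position of a group about a center point.
import math

def rotate_group(positions, center, angle_rad):
    """Apply a 2D rotation of angle_rad about `center` to each (x, y) position."""
    cx, cy = center
    cos_a, sin_a = math.cos(angle_rad), math.sin(angle_rad)
    rotated = []
    for (x, y) in positions:
        dx, dy = x - cx, y - cy
        rotated.append((cx + dx * cos_a - dy * sin_a,
                        cy + dx * sin_a + dy * cos_a))
    return rotated

# Example: rotate elements 821 and 823 by 90 degrees about element 822's center.
print(rotate_group([(100, 0), (300, 0)], center=(200, 0), angle_rad=math.pi / 2))
# [(200.0, -100.0), (200.0, 100.0)]
```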
After the rotation operation, the end user can again select an individual display element in the display element group and perform other control operations on it, such as a zoom-in and zoom-out operation or a deformation operation.
In step S402, after the end user performs control operations on the display elements through end user instruction actions, the camera display system can directly superimpose the display elements, as adjusted by those operations, onto the end user picture to generate the final display picture. In this way, the adjusted display elements are cleanly superimposed onto the end user picture.
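Step S402 can be sketched as ordinary back-to-front alpha compositing of the element layers over the end user picture; the layer representation below is an assumption for illustration, not the picture processing chip's actual pipeline.

```python
# Sketch: alpha-composite the controlled element layers over the user picture.
import numpy as np

def compose_final_frame(user_frame, element_layers):
    """user_frame: HxWx3 uint8 end user picture.
    element_layers: list of (rgb HxWx3 uint8, alpha HxW float in [0, 1]) pairs,
    ordered back-to-front (ascending display depth)."""
    out = user_frame.astype(np.float32)
    for rgb, alpha in element_layers:
        a = alpha[..., None]  # broadcast the mask over the color channels
        out = out * (1.0 - a) + rgb.astype(np.float32) * a
    return out.astype(np.uint8)
```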
In step S403, the camera display system can send the final display picture generated in step S402 to other live-streaming display terminals, so that those terminals can present the personalized final display picture composed of the end user picture and the element display picture.
This completes the interactive operation process between the end user and the camera display system in the picture interaction method of this embodiment.
In the picture interaction method of this embodiment, the infrared camera receives the end user's instruction actions, so that the end user can perform personalized interaction control on the display elements in the element display picture, improving the diversity of the picture display effect.
The application also provides a camera display system for interacting with an end user, as shown in fig. 9, which is a schematic structural diagram of an embodiment of the camera display system of the application. The camera display system 90 includes a display screen 91, a visible light camera 92, an infrared light camera 93, at least one infrared-emitting finger sleeve 94, and a picture processing chip 95.
The display content of the display screen 91 includes the end user picture captured by the visible light camera, the end user instruction action generated by the end user with the infrared-emitting finger sleeve and captured by the infrared light camera, and the element display picture that responds to the end user instruction action.
The picture processing chip 95 includes a display element control operation module 951, a final display picture generation module 952, and a display module 953.
The display element control operation module 951 performs control operations on the display elements in the element display picture based on the end user instruction actions; the final display picture generation module 952 superimposes the display elements after the control operation onto the end user picture to generate a final display picture; and the display module 953 performs a display operation on the final display picture.
The specific working principle of the camera display system of the application is the same as or similar to that described in the above embodiment of the picture interaction method; please refer to the related description in that embodiment.
The picture interaction method and the camera display system of the application receive the end user's instruction actions through the infrared camera, so that the end user can perform personalized interaction control on the display elements in the element display picture, improving the diversity of the picture display effect; this effectively solves the technical problems that, in existing picture interaction methods, the display picture cannot be interactively controlled in a personalized way and the display effect is monotonous.
The terms "component," "module," "system," "interface," "process," and the like as used herein are generally intended to refer to a computer-related entity: hardware, a combination of hardware and software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a controller and the controller can be a component. One or more components can reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between two or more computers.
Fig. 10 and the following discussion provide a brief, general description of the operating environment of an electronic device in which the picture interaction apparatus of the present application may be implemented. The work environment of fig. 10 is only one example of a suitable work environment and is not intended to suggest any limitation as to the scope of use or functionality of the work environment.
Although not required, embodiments are described in the general context of "computer-readable instructions" executed by one or more electronic devices. Computer readable instructions may be distributed via a computer readable medium (discussed below). Computer readable instructions may be implemented as program modules, such as functions, objects, Application Programming Interfaces (APIs), data structures, and the like, that perform particular tasks or implement particular abstract data types. Typically, the functionality of the computer readable instructions may be combined or distributed as desired in various environments.
Fig. 10 illustrates an example of an electronic device 1012 implementing one or more embodiments of the picture interaction apparatus of the present application. In one configuration, electronic device 1012 includes at least one processing unit 1016 and memory 1018. Depending on the exact configuration and type of electronic device, memory 1018 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.), or some combination of the two. This configuration is illustrated in fig. 10 by dashed line 1014.
In other embodiments, electronic device 1012 may include additional features and/or functionality. For example, device 1012 may also include additional storage (e.g., removable and/or non-removable) including, but not limited to, magnetic storage, optical storage, and the like. Such additional storage is illustrated in fig. 10 by storage 1020. In one embodiment, computer readable instructions for implementing one or more embodiments provided herein may be in storage 1020. Storage 1020 may also store other computer readable instructions for implementing an operating system, application programs, and the like. Computer readable instructions may be loaded in memory 1018 for execution by processing unit 1016, for example.
The term "computer readable media" as used herein includes computer storage media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions or other data. Memory 1018 and storage 1020 are examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital Versatile Disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by electronic device 1012. Any such computer storage media may be part of electronic device 1012.
The electronic device 1012 may also include a communication connection 1026 that allows the electronic device 1012 to communicate with other devices. Communication connection 1026 may include, but is not limited to, a modem, a Network Interface Card (NIC), an integrated network interface, a radio frequency transmitter/receiver, an infrared port, a USB connection, or other interface for connecting electronic device 1012 to other electronic devices. The communication connection 1026 may include a wired connection or a wireless connection. Communication connection 1026 may transmit and/or receive communication media.
The term "computer readable media" may include communication media. Communication media typically embodies computer readable instructions or other data in a "modulated data signal" such as a carrier wave or other transport mechanism and includes any information delivery media. The term "modulated data signal" may include such signals: one or more of the signal characteristics are set or changed in such a manner as to encode information into the signal.
Electronic device 1012 may include input device(s) 1024 such as a keyboard, mouse, pen, voice input device, touch input device, infrared camera, video input device, and/or any other input device. Output device(s) 1022 such as one or more displays, speakers, printers, and/or any other output device may also be included in device 1012. Input device 1024 and output device 1022 may be connected to electronic device 1012 via a wired connection, a wireless connection, or any combination thereof. In one embodiment, an input device or an output device from another electronic device may be used as input device 1024 or output device 1022 for electronic device 1012.
The components of the electronic device 1012 may be connected by various interconnects (e.g., buses). Such interconnects may include a Peripheral Component Interconnect (PCI), such as PCI Express, Universal Serial Bus (USB), FireWire (IEEE 1394), an optical bus structure, and the like. In another embodiment, components of electronic device 1012 may be interconnected by a network. For example, memory 1018 may be comprised of multiple physical memory units located in different physical locations interconnected by a network.
Those skilled in the art will appreciate that storage devices for storing computer readable instructions may be distributed across a network. For example, an electronic device 1030 accessible via network 1028 may store computer readable instructions for implementing one or more embodiments of the application. Electronic device 1012 may access electronic device 1030 and download a part or all of the computer readable instructions for execution. Alternatively, electronic device 1012 may download pieces of the computer readable instructions, as needed, or some instructions may be executed at electronic device 1012 and some at electronic device 1030.
Various operations of embodiments are provided herein. In one embodiment, one or more of the operations may constitute computer readable instructions stored on one or more computer readable media, which, when executed by an electronic device, will cause the electronic device to perform the operations. The order in which some or all of the operations are described should not be construed as implying that these operations are necessarily order-dependent; those skilled in the art will appreciate alternative orderings having the benefit of this description. Moreover, it should be understood that not all operations are necessarily present in each embodiment provided herein.
Moreover, although the disclosure has been shown and described with respect to one or more implementations, equivalent alterations and modifications will occur to others skilled in the art based upon a reading and understanding of this specification and the annexed drawings. The disclosure includes all such modifications and alterations and is limited only by the scope of the following claims. In particular regard to the various functions performed by the above described components (e.g., elements, resources, etc.), the terms used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component (e.g., that is functionally equivalent), even though not structurally equivalent to the disclosed structure which performs the function in the herein illustrated exemplary implementations of the disclosure. Furthermore, while a particular feature of the disclosure may have been disclosed with respect to only one of several implementations, such feature may be combined with one or more other features of the other implementations as may be desired and advantageous for a given or particular application. Moreover, to the extent that the terms "includes," "has," "contains," or variants thereof are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term "comprising."
The functional units in the embodiment of the application can be integrated in one processing module, or each unit can exist alone physically, or two or more units are integrated in one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a computer readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product. The above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, or the like. Each of the devices or systems described above may perform the method in the corresponding method embodiment.
In summary, although the embodiments of the application have been described above, the numbering of the embodiments is for convenience of description only and does not limit their order of preference. Moreover, the above embodiments are not intended to limit the application; those skilled in the art can make various modifications and variations without departing from the spirit and scope of the application, so the scope of protection of the application is defined by the claims.

Claims (10)

1. A picture interaction method for interaction between an end user and a camera display system, characterized in that the camera display system comprises a display screen, a visible light camera, an infrared light camera, and at least one infrared-emitting finger sleeve, wherein the display content of the display screen comprises an end user picture captured by the visible light camera, an end user instruction action generated by the end user with the infrared-emitting finger sleeve and captured by the infrared light camera, and an element display picture responding to the end user instruction action; the picture interaction method comprises the following steps:
performing a control operation on the display elements in the element display picture based on the end user instruction action;
superimposing the display elements after the control operation onto the end user picture to generate a final display picture; and
performing a display operation on the final display picture.
2. The picture interaction method according to claim 1, wherein if the display element is a display object, the control operations performed on the display element include a planar translation operation, a zoom-in and zoom-out operation, a rotation operation, and a deformation operation;
if the display element is a display video, the control operations performed on the display element include a planar translation operation, a zoom-in and zoom-out operation, and a play/pause operation.
3. The picture interaction method according to claim 2, wherein the display content of the display screen further includes an end user instruction selection picture with control operation selection keys, the picture interaction method further comprising:
determining the type of the control operation based on the end user instruction action performed by the end user on a control operation selection key.
4. The picture interaction method according to claim 2, wherein the zoom-in and zoom-out operation on the display element comprises:
setting an initial position and a display position of the display element;
setting an initial-position preset display size of the display element;
setting a display-position preset display size of the display element, the display-position preset display size being larger than the initial-position preset display size; and
setting, for any intermediate position, an intermediate-position preset display size based on the distance between the intermediate position and the initial position and the distance between the intermediate position and the display position, wherein the intermediate-position preset display size is larger than the initial-position preset display size and smaller than the display-position preset display size.
5. The picture interaction method according to claim 2, further comprising:
setting an instruction action response space, and setting display elements in the instruction action response space to form the element display picture responding to the end user instruction action.
6. The picture interaction method according to claim 5, further comprising:
setting a plurality of display elements in the instruction action response space as a display element group;
wherein the performing of a control operation on the display elements in the element display picture based on the end user instruction action comprises:
simultaneously performing the control operation on all display elements of the display element group in the element display picture based on the end user instruction action.
7. The picture interaction method according to claim 5, wherein a plurality of display elements are arranged at a same planar picture position of the element display picture, and when the end user performs a control operation on the display elements at the planar picture position, the display element with the largest display depth or the display element with the smallest display depth at the planar picture position is selected.
8. The picture interaction method according to claim 5, wherein a plurality of display elements are arranged at a same planar picture position of the element display picture, and when the end user performs a control operation on the display elements at the planar picture position, the plurality of display elements are spread out around the planar picture position so that the end user can select among them.
9. The picture interaction method according to claim 2, wherein the camera display system comprises two infrared-emitting finger sleeves, and the deformation operation on the display element is specifically:
performing the deformation operation on the display element based on the change in distance between the end user instruction actions generated by the two infrared-emitting finger sleeves.
10. A camera display system for interacting with an end user, characterized by comprising a display screen, a visible light camera, an infrared light camera, at least one infrared-emitting finger sleeve, and a picture processing chip;
the display content of the display screen comprises an end user picture captured by the visible light camera, an end user instruction action generated by the end user with the infrared-emitting finger sleeve and captured by the infrared light camera, and an element display picture responding to the end user instruction action;
the picture processing chip comprises:
a display element control operation module, configured to perform a control operation on the display elements in the element display picture based on the end user instruction action;
a final display picture generation module, configured to superimpose the display elements after the control operation onto the end user picture to generate a final display picture; and
a display module, configured to display the final display picture on the display screen.
CN202310965367.6A 2023-08-02 2023-08-02 Picture interaction method and shooting display system Active CN116661656B (en)

Priority Applications (1)

Application Number: CN202310965367.6A (CN116661656B) · Priority Date: 2023-08-02 · Filing Date: 2023-08-02 · Title: Picture interaction method and shooting display system

Applications Claiming Priority (1)

Application Number: CN202310965367.6A (CN116661656B) · Priority Date: 2023-08-02 · Filing Date: 2023-08-02 · Title: Picture interaction method and shooting display system

Publications (2)

CN116661656A, published 2023-08-29
CN116661656B, published 2024-03-12

Family

ID=87724695

Family Applications (1)

Application Number: CN202310965367.6A · Title: Picture interaction method and shooting display system · Priority Date: 2023-08-02 · Filing Date: 2023-08-02 · Status: Active

Country Status (1)

Country Link
CN (1) CN116661656B (en)

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104834394A (en) * 2014-02-09 2015-08-12 神画科技(深圳)有限公司 Interaction display system
CN104978077A (en) * 2014-04-08 2015-10-14 联想(北京)有限公司 Interaction method and interaction system
CN104020897A (en) * 2014-05-23 2014-09-03 神画科技(深圳)有限公司 Interaction display system
US20150346813A1 (en) * 2014-06-03 2015-12-03 Aaron Michael Vargas Hands free image viewing on head mounted display
CN104199547A (en) * 2014-08-29 2014-12-10 福州瑞芯微电子有限公司 Man-machine interactive type virtual touch device, system and method
CN109716276A (zh) * 2016-06-10 2019-05-03 赫克斯冈技术中心 System and method for accessing visually occluded elements in a three-dimensional model
US20180018917A1 (en) * 2016-07-15 2018-01-18 Samsung Display Co., Ltd. Organic light emitting display device and head mounted display system having the same
CN106446801A (en) * 2016-09-06 2017-02-22 清华大学 Micro-gesture identification method and system based on ultrasonic active detection
CN107257439A (zh) * 2017-07-26 2017-10-17 维沃移动通信有限公司 Image pickup method and mobile terminal
CN111766936A (en) * 2019-04-02 2020-10-13 广东虚拟现实科技有限公司 Virtual content control method and device, terminal equipment and storage medium
CN114830066A (en) * 2019-12-23 2022-07-29 苹果公司 Device, method and graphical user interface for displaying applications in a three-dimensional environment
CN115167676A (en) * 2019-12-23 2022-10-11 苹果公司 Apparatus and method for displaying applications in a three-dimensional environment
CN111880648A (en) * 2020-06-19 2020-11-03 华为技术有限公司 Three-dimensional element control method and terminal
CN114428571A (en) * 2020-10-29 2022-05-03 深圳Tcl新技术有限公司 Interaction method, computer equipment and computer readable storage medium
CN114915852A (en) * 2021-02-09 2022-08-16 腾讯科技(深圳)有限公司 Video call interaction method and device, computer equipment and storage medium
US20220317776A1 (en) * 2021-03-22 2022-10-06 Apple Inc. Methods for manipulating objects in an environment
US20230092282A1 (en) * 2021-09-23 2023-03-23 Apple Inc. Methods for moving objects in a three-dimensional environment
US20230221833A1 (en) * 2022-01-10 2023-07-13 Apple Inc. Methods for displaying user interface elements relative to media content
CN114422755A (en) * 2022-01-12 2022-04-29 中驰智能科技(扬州)有限公司 Multi-camera shooting picture display method based on vehicle-mounted shooting adjustment

Also Published As

Publication number Publication date
CN116661656B (en) 2024-03-12

Similar Documents

Publication Publication Date Title
US7761813B2 (en) Three-dimensional motion graphic user interface and method and apparatus for providing the same
US8997021B2 (en) Parallax and/or three-dimensional effects for thumbnail image displays
US9026938B2 (en) Dynamic detail-in-context user interface for application access and content access on electronic displays
US20070120846A1 (en) Three-dimensional motion graphic user interface and apparatus and method for providing three-dimensional motion graphic user interface
US20140359435A1 (en) Gesture Manipulations for Configuring System Settings
JP2017513106A (en) Generate screenshot
US20080065992A1 (en) Cascaded display of video media
CN112230914B Method, device, terminal and storage medium for producing a mini program
US20150212713A1 (en) Information processing apparatus, information processing method, and computer-readable recording medium
CN108205431A (en) Show equipment and its control method
US20170142372A1 (en) Method of displaying surveillance video and computer program product therefor
TWI442305B (en) A operation method and a system of the multi-touch
CN107943381A Hot zone adjustment method and device, and client
US20140317549A1 (en) Method for Controlling Touchscreen by Using Virtual Trackball
CN116661656B (en) Picture interaction method and shooting display system
CN112740161A (en) Terminal, method for controlling terminal, and recording medium having recorded therein program for implementing the method
US9292165B2 (en) Multiple-mode interface for spatial input devices
US20130201095A1 (en) Presentation techniques
US11557065B2 (en) Automatic segmentation for screen-based tutorials using AR image anchors
US20170351415A1 (en) System and interfaces for an interactive system
KR102223554B1 Terminal, method for controlling the same, and recording medium on which a program for implementing the method is recorded
KR102245042B1 Terminal, method for controlling the same, and recording medium on which a program for implementing the method is recorded
EP3848785B1 (en) Mobile device and method for controlling mobile device
KR102092156B1 (en) Encoding method for image using display device
KR20200137594A (en) A mobile apparatus and a method for controlling the mobile apparatus

Legal Events

Date Code Title Description
PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant