CN114816206A - Data processing method and electronic equipment

Data processing method and electronic equipment

Info

Publication number
CN114816206A
Authority
CN
China
Prior art keywords
display screen
transparent display
touch information
target object
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210251905.0A
Other languages
Chinese (zh)
Inventor
段勇
马焱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN202210251905.0A
Publication of CN114816206A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a data processing method, which comprises the following steps: acquiring depth image information of a target object and touch information on a transparent display screen; establishing a mapping relation between the touch information and the target object based on the depth image information and the touch information; and controlling the touch information to be displayed on the transparent display screen based on the mapping relation so that the touch information moves along with the target object. The present application further provides an electronic device.

Description

Data processing method and electronic equipment
Technical Field
The present disclosure relates to data processing technologies, and in particular, to a data processing method and an electronic device.
Background
In a conventional scenario, when a user performs a touch operation on a touch screen, the corresponding touch information can only be displayed on the touch screen itself and cannot be combined with people or objects in the real scene, so effective interaction between a person in front of the screen and a person behind the screen cannot be achieved.
Disclosure of Invention
In view of this, an embodiment of the present application provides a data processing method, where the method includes:
acquiring depth image information of a target object and touch information on a transparent display screen;
establishing a mapping relation between the touch information and the target object based on the depth image information and the touch information;
and controlling the touch information to be displayed on the transparent display screen based on the mapping relation so that the touch information moves along with the target object.
In the foregoing solution, the acquiring depth image information of the target object and touch information on the transparent display screen includes:
acquiring depth image information acquired by an image acquisition component in a first surface direction of the transparent display screen aiming at the target object, wherein the target object is positioned between the image acquisition component and the transparent display screen;
acquiring first touch information acquired by a touch detection component in the direction of a second surface of the transparent display screen; wherein the second surface and the first surface are oppositely disposed.
In the foregoing solution, the acquiring depth image information of the target object and touch information on the transparent display screen includes:
acquiring depth image information acquired by an image acquisition component aiming at the target object in the second surface direction of the transparent display screen, wherein the target object is positioned in the first surface direction of the transparent display screen; wherein the second surface and the first surface are oppositely disposed;
and acquiring first touch information acquired by a touch detection part in the direction of the second surface of the transparent display screen.
In the foregoing solution, the method further comprises:
acquiring second touch information acquired by the touch detection component in the direction of the first surface of the transparent display screen; and the display modes of the first touch information and the second touch information are the same or different.
In the foregoing solution, the obtaining depth image information of the target object and touch information on the transparent display screen at least further includes:
acquiring touch information on the transparent display screen at a first moment; and acquiring depth image information of the target object at a second moment; wherein the first moment is earlier than the second moment, or the first moment is later than the second moment.
In the foregoing solution, the establishing a mapping relationship between the touch information and the target object based on the depth image information and the touch information includes:
and establishing a mapping relation between the touch information and the target object based on the depth image information and the coordinate position of the touch information.
In the foregoing solution, the establishing a mapping relationship between the touch information and the target object based on the depth image information and the coordinate position of the touch information includes:
determining a first coordinate position of the touch information on the transparent display screen based on the area measurement data of the touch detection component on the transparent display screen;
determining a second coordinate position of the depth image information on the transparent display screen based on the spatial position relation between the image acquisition component and the transparent display screen;
and establishing a mapping relation between the touch information and the target object based on the first coordinate position and the second coordinate position.
In the foregoing solution, the method further includes:
detecting a behavioral action of the target object;
and if the behavior action meets an action condition, switching the current display mode of the touch information based on the behavior action.
In the foregoing solution, the method further comprises:
receiving a mode switching instruction aiming at the touch information;
and switching the current display mode of the touch information based on the mode switching instruction.
According to another aspect of the present application, there is provided an electronic device including:
the acquisition unit is used for acquiring depth image information of the target object and touch information on the transparent display screen;
the establishing unit is used for establishing a mapping relation between the touch information and the target object based on the depth image information and the touch information;
and the display unit is used for controlling the touch information to be displayed on the transparent display screen based on the mapping relation so that the touch information moves along with the target object.
According to the data processing method and the electronic device provided by the present application, the depth image information of the target object and the touch information on the transparent display screen are obtained; a mapping relation between the touch information and the target object is established based on the depth image information and the touch information; and the touch information is controlled to be displayed on the transparent display screen based on the mapping relation so that the touch information moves along with the target object. In this way, the touch information can be combined with people or objects in a real scene, achieving an augmented-reality effect.
Drawings
FIG. 1 is a schematic flowchart of a data processing method according to the present application;
FIG. 2 is a diagram illustrating a first scenario of the present application;
FIG. 3 is a diagram illustrating a second scenario of the present application;
FIG. 4 is a diagram illustrating a third scenario of the present application;
FIG. 5 is a diagram illustrating a fourth scenario of the present application;
FIG. 6 is a diagram illustrating a fifth scenario of the present application;
FIG. 7 is a first schematic structural diagram of an electronic device according to the present application;
FIG. 8 is a second schematic structural diagram of an electronic device according to the present application;
FIG. 9 is a schematic structural diagram of a data processing system according to the present application.
Detailed Description
The technical solution of the present application is further described in detail with reference to the drawings and specific embodiments.
FIG. 1 is a schematic flowchart of a data processing method in the present application. As shown in FIG. 1, the method includes:
step 1, obtaining depth image information of a target object and touch information on a transparent display screen;
step 2, establishing a mapping relation between the touch information and the target object based on the depth image information and the touch information;
step 3, controlling the touch information to be displayed on the transparent display screen based on the mapping relation so that the touch information moves along with the target object (an illustrative sketch of these steps follows).
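For illustration only, the following Python sketch shows one possible realization of the three steps above. The class names, the nearest-joint binding rule and the offset-based redraw are assumptions made for this example and are not part of the disclosure.

```python
# A minimal sketch of the three steps shown in FIG. 1. All class names, helper
# functions and the nearest-joint binding strategy are illustrative assumptions,
# not part of the patent disclosure.
from dataclasses import dataclass

@dataclass
class DepthImageInfo:
    # skeletal feature name -> (x, y) position already projected onto the screen
    joints: dict

@dataclass
class TouchInfo:
    screen_xy: tuple   # first coordinate position of the drawn content
    strokes: list      # e.g. the points of a drawn "love heart"

def build_mapping(depth_info: DepthImageInfo, touch_info: TouchInfo) -> dict:
    """Step 2: bind the touch content to the nearest skeletal feature."""
    joint_name, joint_xy = min(
        depth_info.joints.items(),
        key=lambda kv: (kv[1][0] - touch_info.screen_xy[0]) ** 2
                     + (kv[1][1] - touch_info.screen_xy[1]) ** 2,
    )
    offset = (touch_info.screen_xy[0] - joint_xy[0],
              touch_info.screen_xy[1] - joint_xy[1])
    return {"joint": joint_name, "offset": offset}

def render_position(depth_info: DepthImageInfo, mapping: dict) -> tuple:
    """Step 3: recompute where to draw the touch content for the current frame."""
    jx, jy = depth_info.joints[mapping["joint"]]
    ox, oy = mapping["offset"]
    return (jx + ox, jy + oy)   # the drawing follows the mapped joint

# Example: a "love heart" drawn near the heart stays attached as the person moves.
touch = TouchInfo(screen_xy=(405.0, 298.0), strokes=[])
mapping = build_mapping(DepthImageInfo({"heart": (400.0, 300.0)}), touch)
print(render_position(DepthImageInfo({"heart": (420.0, 310.0)}), mapping))  # (425.0, 308.0)
```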
In the present application, the method may be applied to an electronic device, which may be a Personal Computer (PC), a server, a signal processor, or the like. The electronic device may include an image acquisition component and a transparent display screen, and the transparent display screen may have touch capability. Alternatively, the electronic device may be connected to an external image acquisition component and an external transparent display screen, respectively, in a wired or wireless manner, and the transparent display screen may have touch capability.
In a first example of the present application, the transparent display screen may have a first surface and a second surface, wherein the image acquisition component may be disposed in the direction of the first surface of the transparent display screen, and the second surface of the transparent display screen may be a touch surface with touch capability.
In a case where a target object is present between the image acquisition component and the transparent display screen, the electronic device may acquire depth image information acquired by the image acquisition component for the target object in the direction of the first surface of the transparent display screen.
Here, the depth image information may be actively sent to the electronic device by the image acquisition component. Alternatively, the electronic device may send an image acquisition request to the image acquisition component, and the image acquisition component sends the depth image information acquired for the target object to the electronic device based on the image acquisition request.
Here, the target object includes, but is not limited to, a person, another living being, or an object, where the object may be a solid, a liquid, a gas, etc.
In this example, taking the target object being a person as an example, the image acquisition component may be located behind the person if the person is facing the transparent display screen.
In this application, when the electronic device obtains the touch information on the transparent display screen, a touch detection component may be disposed in the second surface direction of the transparent display screen, and the touch detection component may be connected to the electronic device in a wired or wireless manner. When the touch detection component detects a touch operation performed by a user on the touch surface in the direction of the second surface of the transparent display screen, the touch detection component may send first touch information corresponding to the touch operation to the electronic device. Therefore, the electronic equipment can acquire the touch information on the transparent display screen.
Here, the touch detection part includes, but is not limited to, a laser radar and an infrared detection part.
As shown in FIG. 2, the system includes a host 10, a transparent display screen 20, a camera 30 and a touch detection component 40. The transparent display screen 20 has a first surface 21 and a second surface 22; the camera 30 is disposed in the direction of the first surface 21 of the transparent display screen 20, the second surface 22 is a touch surface, and the touch detection component 40 is disposed in the direction of the second surface 22 of the transparent display screen 20. When user A is located between the camera 30 and the transparent display screen 20, the camera 30 may acquire images of user A to obtain depth image information of user A (e.g., skeletal image information of user A) and send the depth image information to the host 10, and the host 10 projects the depth image information of user A onto the transparent display screen 20 according to the spatial position relationship between the camera 30 and the transparent display screen 20 (the depth image information of user A is indicated by a dotted line in the figure). User B is located in the direction of the second surface 22 of the transparent display screen 20 and writes on the second surface 22 by touch, for example, drawing a "love heart" at the heart portion represented by the depth image information. The touch detection component 40 sends the "love heart" touch information to the host 10, and the host 10, based on the area measurement data of the second surface 22 obtained by the touch detection component 40, projects the "love heart" onto the display area of the transparent display screen 20 at the heart portion represented by the depth image information of user A. In this way, from the perspective of user B, the "love heart" appears to be attached directly to the heart region of user A and moves along with user A.
In the second example of the present application, the image acquisition component may instead be disposed in the direction of the second surface of the transparent display screen, i.e., on the same side as the touch surface of the transparent display screen.
When the image acquisition component and the touch surface are on the same side of the transparent display screen, the target object may be located in the direction of the first surface of the transparent display screen. The image acquisition component located in the direction of the second surface of the transparent display screen may acquire images of the target object to obtain depth image information of the target object and send the depth image information to the electronic device. Alternatively, the electronic device may send an image acquisition request to the image acquisition component located in the direction of the second surface of the transparent display screen, and the image acquisition component may send the depth image information acquired for the target object to the electronic device based on the image acquisition request. In this way, the electronic device can acquire the depth image information of the target object.
For example, taking the example of a target object being a person, the transparent display screen may be located between the image capture component and the person with the person facing the transparent display screen.
In the second example, the process of acquiring the touch information on the transparent display screen by the electronic device is the same as that in the first example, and is not described herein again.
As shown in fig. 3, the difference from fig. 2 is that the camera 30 is disposed in the direction of the second surface 22 of the transparent display 20, the user a is located in the direction of the first surface 21 of the transparent display 20, and the camera 30 collects an image of the user a to obtain depth image information of the user a (e.g., bone image information of the user a), and sends the depth image information to the host 10.
Here, the camera 30 may also be disposed in other directions of the transparent display screen 20; the specific installation position of the camera 30 is not limited, as long as the camera can acquire images of the target object, obtain depth image information of the target object, and send the depth image information to the host 10.
In a third example of the present application, the first surface and the second surface of the transparent display screen may both be touch surfaces, a first touch detection component may be disposed in a direction of the first surface of the transparent display screen, a second touch detection component may be disposed in a direction of the second surface of the transparent display screen, and both the first touch detection component and the second touch detection component may be connected to the electronic device in a wired or wireless manner. When the first touch detection component detects a first touch operation performed by a user on the touch surface in the direction of the first surface of the transparent display screen, the first touch detection component may send first touch information corresponding to the first touch operation to the electronic device. When the second touch detection component detects a second touch operation performed by the user with respect to the touch surface in the direction of the second surface of the transparent display screen, the second touch detection component may send second touch information corresponding to the second touch operation to the electronic device. Therefore, the electronic equipment can acquire touch information from different surfaces of the transparent display screen.
In this example, a first image acquisition component may be disposed in the direction of the first surface of the transparent display screen, a second image acquisition component may be disposed in the direction of the second surface of the transparent display screen, and both may be connected to the electronic device in a wired or wireless manner. When the target object is located in the direction of the first surface of the transparent display screen, the first image acquisition component may acquire images of the target object, the second image acquisition component may acquire images of the target object, or the first and second image acquisition components may acquire images of the target object simultaneously, and the acquired depth images are sent to the electronic device.
In this example, when the first image acquisition component and the second image acquisition component acquire images of the target object simultaneously, the electronic device, upon receiving the first depth image information acquired by the first image acquisition component and the second depth image information acquired by the second image acquisition component, may further fuse the first depth image information and the second depth image information to form third depth image information of the target object. In this way, the completeness of the depth image information of the target object acquired by the electronic device can be improved.
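For illustration only, the following Python sketch shows one simple way the first and second depth image information could be fused into third depth image information. The NumPy representation, the zero-as-missing convention and the nearest-value rule are assumptions; the patent does not prescribe a particular fusion algorithm.

```python
# Illustrative fusion of two depth maps that have been registered to a common
# viewpoint. Zero marks a missing measurement (an assumed convention).
import numpy as np

def fuse_depth_maps(depth_a: np.ndarray, depth_b: np.ndarray) -> np.ndarray:
    fused = depth_a.copy()
    holes = fused == 0                      # pixels the first camera did not see
    fused[holes] = depth_b[holes]           # fill them from the second camera
    both = (depth_a > 0) & (depth_b > 0)    # pixels seen by both cameras
    fused[both] = np.minimum(depth_a[both], depth_b[both])  # keep the nearer reading
    return fused
```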
As shown in FIG. 4, the difference from FIG. 3 and FIG. 2 is that both the first surface 21 and the second surface 22 of the transparent display screen 20 are touch surfaces, a camera 30-1 is disposed in the direction of the first surface 21, a camera 30-2 is disposed in the direction of the second surface 22, and user A is located between the camera 30-1 and the transparent display screen 20. The cameras 30-1 and 30-2 may acquire images of user A simultaneously to obtain first depth image information and second depth image information of user A, and send them to the host 10. After receiving the first depth image information and the second depth image information sent by the cameras 30-1 and 30-2, the host 10 may fuse them to obtain third depth image information that is more complete for user A, and then project the third depth image information onto the display area of the transparent display screen 20 based on the spatial position relationships among the camera 30-1, the camera 30-2 and the transparent display screen 20. User A can then also write on the first surface 21 of the transparent display screen 20 by touch, for example, drawing a "hat" on the head portion represented by the depth image information of user A, while user B writes on the second surface 22 of the transparent display screen 20 by touch, for example, drawing a "love heart" on the heart portion represented by the depth image information of user A. The touch information of the "hat" and of the "love heart" is sent to the host 10 by the touch detection component 40, and the host 10 projects both onto the display area of the transparent display screen 20. In this way, from the perspective of user B, the "hat" appears to be worn directly on user A's head and can move with user A; and the "love heart" appears to be attached directly to the heart portion of user A and also moves as user A moves.
In the first to third examples of the application, after obtaining the depth image information of the target object and the touch information on the transparent display screen, the electronic device may further establish a mapping relationship between the touch information and the target object based on the depth image information and the touch information.
In one implementation, the electronic device may establish a mapping relationship between the touch information and the target object based on the depth image information and the coordinate position of the touch information.
Here, the touch detection component may emit light beams over the touch display area of the transparent display screen. When a user performs a touch operation in the touch display area, an object such as the user's hand, arm or a stylus blocks part of the light beams, so the touch detection component can determine, based on its area measurement data of the transparent display screen, a first coordinate position on the transparent display screen of the touch information corresponding to the touch operation. A spatial position relationship can also be established between the image acquisition component and the transparent display screen, and a second coordinate position of the depth image information of the target object on the transparent display screen can be determined based on this spatial position relationship, so that the depth image information can be displayed on the transparent display screen based on the second coordinate position. The electronic device can then establish the mapping relation between the touch information and the target object based on the first coordinate position of the touch information on the transparent display screen and the second coordinate position of the depth image information on the transparent display screen.
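For illustration only, the sketch below shows how the second coordinate position might be derived from a calibrated spatial position relationship between the image acquisition component and the transparent display screen. The rigid transform (R, t), the screen-plane parameters and the orthogonal projection are assumptions made for this example.

```python
# Map a 3D point in camera coordinates to (x, y) on the screen plane, using an
# assumed camera-to-screen rigid transform and screen-plane axes.
import numpy as np

def camera_point_to_screen_xy(p_cam, R, t, screen_origin, screen_x_axis, screen_y_axis):
    p = R @ np.asarray(p_cam, dtype=float) + np.asarray(t, dtype=float)  # camera -> screen frame
    d = p - np.asarray(screen_origin, dtype=float)
    x = float(np.dot(d, screen_x_axis))    # second coordinate position, x component
    y = float(np.dot(d, screen_y_axis))    # second coordinate position, y component
    return x, y

# Example with an identity transform: a skeletal point 0.3 m right of and 0.2 m
# above the screen origin lands at (0.3, 0.2) in screen-plane coordinates.
print(camera_point_to_screen_xy([0.3, 0.2, 1.5], np.eye(3), [0, 0, 0],
                                [0, 0, 0], [1, 0, 0], [0, 1, 0]))
```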
In this application, the electronic device may further control the touch information to be displayed on the transparent display screen based on the mapping relationship, so that the touch information moves along with the target object.
Here, when the electronic device controls the touch information to be displayed on the transparent display screen based on the mapping relationship, a current display mode of the touch information may be a following mode, which represents that the touch information may be combined with a target object and may move in a touch display area along with movement of the target object.
For example, a depth camera A is arranged in the direction of the first surface of the transparent display screen, a touch detection component B is arranged in the direction of the second surface of the transparent display screen, and user A, as the target object, is located between the depth camera A and the transparent display screen. The depth camera A can acquire images of user A to obtain depth image information of user A and send the acquired depth image information to the electronic device, and the electronic device can project the depth image information onto the display area of the transparent display screen based on the spatial position relationship between the depth camera A and the transparent display screen.
Here, the depth image information may represent skeletal feature information of user A. If user B draws a "love heart" on the depth image information through the transparent display screen, the touch detection component B can detect the touch operation of user B and send the touch information "love heart" corresponding to the touch operation to the electronic device. The electronic device can determine the coordinate position of the touch information "love heart" on the transparent display screen based on the area measurement data of the transparent display screen obtained by the touch detection component B, so that the touch information "love heart" can be projected onto the display area of the transparent display screen based on that coordinate position. Then, the electronic device can establish a mapping relation between the touch information "love heart" and user A based on the coordinate position of the depth image information on the transparent display screen and the coordinate position of the touch information "love heart" on the transparent display screen, and can control the touch information "love heart" to be displayed on the transparent display screen based on the mapping relation.
Here, since the relative positions of the skeletal features represented by the depth image information of user A do not change as user A moves, the effect viewed from the perspective of user B when user A moves is that the touch information "love heart" stays attached to the heart of user A and moves along with user A, thereby achieving an augmented-reality effect.
In the present application, when acquiring the depth image information of the target object and the touch information on the transparent display screen, the electronic device may acquire the touch information on the transparent display screen at a first moment and acquire the depth image information of the target object at a second moment, where the first moment may be earlier than the second moment, or the first moment may be later than the second moment. That is, the electronic device may first acquire the touch information on the transparent display screen and then acquire the depth image information of the target object, or it may first acquire the depth image information of the target object and then acquire the touch information on the transparent display screen. Of course, the first moment may also be equal to the second moment, that is, the electronic device may acquire the touch information and the depth image information at the same time. This can be set according to the specific usage scenario.
In the following, an implementation scenario in which the first moment is earlier than the second moment is described as an example:
For example, as shown in FIG. 5, user B draws a desk b and a backpack a on the second surface 22 of the transparent display screen 20 by touch, where the backpack a is placed on the desk b. The touch detection component 40 then sends the touch information of the backpack a and the desk b to the electronic device 10, and after receiving it, the electronic device 10 projects the touch information of the backpack a and the desk b onto the display area of the transparent display screen 20. At this moment, user A walks in from the direction of the first surface 21 of the transparent display screen 20; the image acquisition component 30 can acquire the depth image information of user A and send it to the electronic device 10, and the electronic device 10 can project the depth image information of user A onto the display area of the transparent display screen 20 based on the spatial position relationship between the image acquisition component 30 and the transparent display screen 20. Then, when user A walks to the side of the desk b, the electronic device 10 may establish a mapping relationship between the backpack a and user A based on a first coordinate position of the backpack a on the transparent display screen 20 and a second coordinate position of the depth image information of user A on the transparent display screen 20, and based on this mapping relationship may control the display mode of the backpack a on the transparent display screen 20 to be the following mode. Since the electronic device 10 does not establish a mapping relationship between the desk b and user A, the display mode of the desk b on the transparent display screen 20 remains the static mode. The effect viewed from the perspective of user B is thus: when user A walks to the side of the desk b, the backpack a on the desk b can be lifted directly through a hand-raising action and carried along as user A moves, while the desk b remains still, achieving an augmented-reality effect. Therefore, in teaching scenarios, abstract teaching content can be combined with the real scene through the solution of the present application, making it easier for students to understand and accept, thereby improving teaching quality.
In this application, when the first surface and the second surface of the transparent display screen are both touch surfaces, the display modes of the first touch information on the first surface and the second touch information on the second surface may be the same or different.
For example, when the display modes of the first touch information and the second touch information are different, the display mode of the first touch information on the transparent display screen may be the following mode, indicating that the first touch information moves along with the movement of the target object, while the display mode of the second touch information on the transparent display screen may be the static mode, indicating that the second touch information is independent of the target object: when the target object moves, the second touch information remains stationary and does not move along with the target object.
When the display modes of the first touch information and the second touch information are the same, the display modes of the first touch information and the second touch information on the transparent display screen can both be following modes; or, the display modes of the first touch information and the second touch information on the transparent display screen may both be static modes.
In this way, user A and user B, who are located on different sides of the transparent display screen, can each perform touch operations on the transparent display screen.
In the present application, the electronic device can also detect a behavior action of the target object; and if the detected behavior action meets an action condition, switch the current display mode of the touch information based on the behavior action.
Here, a plurality of behavior actions may be preset in the electronic device, and when the detected behavior action matches one of the preset actions, it may be determined that the detected behavior action satisfies the action condition. Alternatively, the electronic device may receive multiple types of touch information, and when the detected behavior action corresponds to target touch information, it may be determined that the detected behavior action satisfies the action condition.
For example, as shown in FIG. 5, the target object is user A. During the movement of user A, the image acquisition component 30 may continuously acquire images of user A and send the acquired depth image information to the electronic device 10. After receiving the depth image information of user A, the electronic device 10 may compare successive frames of the depth image information and determine a behavior action of user A from the comparison result. If the detected behavior action of the user is a "carry bag" action, the electronic device 10 may search, based on the coordinate position on the transparent display screen 20 of the depth image information corresponding to that action, for touch information at that coordinate position. If corresponding touch information, such as the backpack a, is found at that coordinate position, it is determined that the current behavior action satisfies the action condition; a mapping relationship between the touch information "backpack a" and user A is established, and the touch information "backpack a" is controlled to be displayed on the transparent display screen 20 based on the mapping relationship, so that the current display mode of the touch information "backpack a" is switched from the static mode to the following mode. In the following mode, the backpack a is lifted when user A performs the bag-carrying action and moves on the transparent display screen along with the movement of user A.
As shown in FIG. 6, when the electronic device 10 detects that user A walks to the side of the desk b and detects a "put down bag" action, the electronic device 10 may switch the touch information "backpack a" from the current following mode to the static mode based on the coordinate position on the transparent display screen 20 of the depth image information corresponding to the "put down bag" action, so that the touch information "backpack a" is placed on the desk b. In the static mode, the backpack a does not move with the movement of user A.
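For illustration only, the sketch below shows how a detected behavior action that satisfies the action condition could switch a drawn item between a static mode and a following mode. The action labels, the proximity test and the 50-pixel threshold are assumptions made for this example.

```python
# Switch the display mode of drawn items near a detected action (assumed labels
# "pick_up" / "put_down" and an assumed 50-pixel proximity threshold).
from enum import Enum, auto

class DisplayMode(Enum):
    FOLLOWING = auto()   # the drawn content moves with the target object
    STATIC = auto()      # the drawn content stays where it was drawn

def switch_mode_on_action(action: str, action_xy, touch_items: dict) -> None:
    """touch_items maps an item name to {"xy": (x, y), "mode": DisplayMode}."""
    for item in touch_items.values():
        dx = item["xy"][0] - action_xy[0]
        dy = item["xy"][1] - action_xy[1]
        if (dx * dx + dy * dy) ** 0.5 > 50:          # action happened elsewhere
            continue
        if action == "pick_up" and item["mode"] is DisplayMode.STATIC:
            item["mode"] = DisplayMode.FOLLOWING     # e.g. backpack a is lifted
        elif action == "put_down" and item["mode"] is DisplayMode.FOLLOWING:
            item["mode"] = DisplayMode.STATIC        # e.g. backpack a is put on desk b

# Example: the backpack drawn at (120, 80) starts static and follows after "pick_up".
items = {"backpack_a": {"xy": (120, 80), "mode": DisplayMode.STATIC}}
switch_mode_on_action("pick_up", (118, 82), items)
print(items["backpack_a"]["mode"])   # DisplayMode.FOLLOWING
```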
In the present application, the electronic device may further receive a mode switching instruction for the touch information, and switch the current display mode of the touch information based on the mode switching instruction.
For example, the current display mode of the touch information is the static mode. The user may send a mode switching instruction to the electronic device through a gesture, voice, or a button, and the electronic device may control the touch information to switch from the current static mode to the following mode based on the mode switching instruction.
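For illustration only, a minimal sketch of handling such a mode switching instruction is given below; the instruction string and the plain-string modes are assumptions, and the instruction itself may originate from a gesture, a voice command or a button.

```python
# Toggle an item's display mode on an explicit user instruction (assumed string).
def handle_mode_switch_instruction(instruction: str, item: dict) -> None:
    if instruction == "toggle_mode":
        item["mode"] = "static" if item["mode"] == "following" else "following"

item = {"mode": "static"}
handle_mode_switch_instruction("toggle_mode", item)
print(item["mode"])   # "following"
```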
According to the data processing method, the depth image information of the target object and the touch information on the transparent display screen are obtained; establishing a mapping relation between the touch information and the target object based on the depth image information and the touch information; and controlling the touch information to be displayed on the transparent display screen based on the mapping relation so that the touch information moves along with the target object. Therefore, the touch information can be combined with people or objects in a real scene, and the effect of augmented reality is achieved.
Fig. 7 is a schematic structural diagram of an electronic device in the present application, and as shown in fig. 7, the electronic device includes:
an obtaining unit 701, configured to obtain depth image information of a target object and touch information on a transparent display screen;
an establishing unit 702, configured to establish a mapping relationship between the touch information and the target object based on the depth image information and the touch information;
a display unit 703, configured to control the touch information to be displayed on the transparent display screen based on the mapping relationship, so that the touch information moves along with the target object.
In a preferred embodiment, the obtaining unit 701 is specifically configured to obtain depth image information, which is collected by an image collecting component in a first surface direction of the transparent display screen, for the target object, where the target object is located between the image collecting component and the transparent display screen; acquiring first touch information acquired by a touch detection part in the direction of the second surface of the transparent display screen; wherein the second surface is disposed opposite the first surface.
In a preferred embodiment, the obtaining unit 701 is further specifically configured to obtain depth image information, which is obtained by the image obtaining component in a second surface direction of the transparent display screen for the target object, where the target object is located in the first surface direction of the transparent display screen; wherein the second surface and the first surface are oppositely disposed; and acquiring first touch information acquired by the touch detection component in the direction of the second surface of the transparent display screen.
In a preferred embodiment, the obtaining unit 701 is further specifically configured to obtain second touch information collected by the touch detection component in the direction of the first surface of the transparent display screen; and the display modes of the first touch information and the second touch information are the same or different.
In a preferred embodiment, the obtaining unit 701 is further specifically configured to obtain touch information on the transparent display screen at a first moment, and obtain depth image information of the target object at a second moment; wherein the first moment is earlier than the second moment, or the first moment is later than the second moment.
In a preferred embodiment, the establishing unit 702 is specifically configured to establish a mapping relationship between the touch information and the target object based on the depth image information and the coordinate position of the touch information.
In a preferred aspect, the electronic device further includes:
a determining unit 704, configured to determine a first coordinate position of the touch information on the transparent display screen based on area measurement data of the transparent display screen by the touch detection component; determining a second coordinate position of the depth image information on the transparent display screen based on the spatial position relation between the image acquisition component and the transparent display screen;
the establishing unit 702 is further specifically configured to establish a mapping relationship between the touch information and the target object based on the first coordinate position and the second coordinate position.
In a preferred aspect, the electronic device further includes: a detection unit 705 and a switching unit 706;
the detection unit 705 is configured to detect a behavior action of the target object;
if the behavior action satisfies the action condition, the detection unit 705 triggers the switching unit 706, so that the switching unit 706 switches the current display mode of the touch information based on the behavior action.
In a preferred embodiment, the electronic device further includes:
a receiving unit 707 configured to receive a mode switching instruction for the touch information;
the switching unit 706 is further configured to switch a current display mode of the touch information based on the mode switching instruction.
It should be noted that, in the above embodiment, the division of the electronic device into the above program modules is merely illustrative of how data processing may be performed; in practical applications, the processing may be allocated to different program modules as needed, that is, the internal structure of the apparatus may be divided into different program modules to complete all or part of the processing described above. In addition, the electronic device provided by the above embodiment and the data processing method embodiments belong to the same concept; the specific implementation process is detailed in the method embodiments and is not repeated here.
Fig. 8 is a schematic structural component diagram of an electronic device 800 in the present application, where the electronic device may be a personal computer, a digital terminal, an information transceiver, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like. The electronic device 800 shown in fig. 8 includes: at least one processor 801, memory 802, at least one network interface 804, and a user interface 803. The various components in the electronic device 800 are coupled together by a bus system 805. It is understood that the bus system 805 is used to enable communications among the components connected. The bus system 805 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 805 in fig. 8.
The user interface 803 may include, among other things, a display, a keyboard, a mouse, a trackball, a click wheel, a key, a button, a touch pad, or a touch screen.
It will be appreciated that the memory 802 can be either volatile memory or nonvolatile memory, and can include both volatile and nonvolatile memory. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a ferroelectric random access memory (FRAM), a Flash Memory, a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM); the magnetic surface memory may be disk storage or tape storage. The volatile memory may be a Random Access Memory (RAM), which acts as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Synchronous Static Random Access Memory (SSRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), SyncLink Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DRRAM). The memory 802 described in the embodiments herein is intended to comprise, without being limited to, these and any other suitable types of memory.
The memory 802 in the embodiments of the present application is used to store various types of data to support the operation of the electronic device 800. Examples of such data include: any computer programs for operating on electronic device 800, such as operating system 8021 and application programs 8022; contact data; telephone book data; a message; a picture; video, etc. Operating system 8021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, for implementing various basic services and processing hardware-based tasks. The application 8022 may contain various applications such as a Media Player (Media Player), a Browser (Browser), and the like for implementing various application services. A program implementing the method of an embodiment of the present application may be included in application program 8022.
The method disclosed in the embodiments of the present application may be applied to the processor 801 or implemented by the processor 801. The processor 801 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or instructions in the form of software in the processor 801. The Processor 801 may be a general purpose Processor, a Digital Signal Processor (DSP), or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like. The processor 801 may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of the method disclosed in the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software modules may be located in a storage medium that is located in the memory 802, and the processor 801 reads the information in the memory 802 to perform the steps of the aforementioned methods in conjunction with its hardware.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, Micro Controller Units (MCUs), microprocessors, or other electronic components for performing the foregoing methods.
In an exemplary embodiment, the present application further provides a computer readable storage medium, such as the memory 802 including a computer program, which can be executed by the processor 801 of the electronic device 800 to perform the steps of the foregoing method. The computer readable storage medium can be Memory such as FRAM, ROM, PROM, EPROM, EEPROM, Flash Memory, magnetic surface Memory, optical disk, or CD-ROM; or may be a variety of devices including one or any combination of the above memories, such as a mobile phone, computer, tablet device, personal digital assistant, etc.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out any one of the method steps of the data processing method described above.
Fig. 9 is a schematic structural diagram of a data processing system according to the present application, and as shown in fig. 9, the data processing system includes:
the transparent display screen 91 is provided with a first surface and a second surface, wherein the first surface and/or the second surface are touch surfaces;
here, the transparent display 91 is further provided with a touch detection component 911 in the direction of the touch surface, for collecting touch information corresponding to a touch operation when the user performs the touch operation on the touch surface.
The image acquisition component 92 is positioned on at least one surface of the transparent display screen 91 and is used for acquiring an image of the target object a to obtain depth image information of the target object a;
here, the target object a may be located between the image pickup section 92 and the transparent display screen 91, and depth image information of the target object a is indicated by a dotted line in the drawing.
The host 93 is connected with the image acquisition component 92 and with the touch detection component 911 on the transparent display screen 91, and is configured to receive the depth image information acquired by the image acquisition component 92 for the target object a and the touch information sent by the touch detection component 911 for the transparent display screen 91; to establish a mapping relationship between the touch information and the target object a based on the depth image information and the touch information; and to control the touch information to be displayed on the transparent display screen 91 based on the mapping relationship, so that the touch information can move along with the movement of the target object a.
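For illustration only, the following sketch outlines how the host 93 might coordinate the image acquisition component, the touch detection component and the transparent display screen on each frame. The device objects and their method names are assumptions, and build_mapping and render_position stand in for the mapping and display steps sketched earlier.

```python
# Per-frame coordination loop on the host side (all device APIs are assumed).
def host_loop(camera, touch_sensor, screen, build_mapping, render_position):
    mapping, touch = None, None
    while True:
        depth = camera.capture_depth()              # depth image information of target object a
        new_touch = touch_sensor.read_touches()     # touch information, if the user drew something
        if new_touch is not None:
            touch = new_touch
            mapping = build_mapping(depth, touch)   # establish the mapping relationship
        if mapping is not None:
            screen.draw(touch, at=render_position(depth, mapping))  # content follows target a
```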
The host 93 is equivalent to the electronic device described above, the data processing system and the data processing method embodiment provided above belong to the same concept, and the specific implementation process thereof is described in detail in the method embodiment and is not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The methods disclosed in the several method embodiments provided in the present application may be combined arbitrarily without conflict to obtain new method embodiments.
Features disclosed in several of the product embodiments provided in the present application may be combined in any combination to yield new product embodiments without conflict.
The features disclosed in the several method or apparatus embodiments provided in the present application may be combined arbitrarily, without conflict, to arrive at new method embodiments or apparatus embodiments.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method of data processing, the method comprising:
acquiring depth image information of a target object and touch information on a transparent display screen;
establishing a mapping relation between the touch information and the target object based on the depth image information and the touch information;
and controlling the touch information to be displayed on the transparent display screen based on the mapping relation so that the touch information moves along with the target object.
2. The method of claim 1, wherein the acquiring depth image information of the target object and touch information on the transparent display screen comprises:
acquiring depth image information acquired by an image acquisition component in a first surface direction of the transparent display screen aiming at the target object, wherein the target object is positioned between the image acquisition component and the transparent display screen;
acquiring first touch information acquired by a touch detection component in the direction of a second surface of the transparent display screen; wherein the second surface and the first surface are oppositely disposed.
3. The method of claim 1, wherein the acquiring depth image information of the target object and touch information on the transparent display screen comprises:
acquiring depth image information acquired by an image acquisition component aiming at the target object in the second surface direction of the transparent display screen, wherein the target object is positioned in the first surface direction of the transparent display screen; wherein the second surface and the first surface are oppositely disposed;
and acquiring first touch information acquired by a touch detection part in the direction of the second surface of the transparent display screen.
4. The method of claim 2 or 3, wherein the method further comprises:
acquiring second touch information acquired by the touch detection component in the direction of the first surface of the transparent display screen; and the display modes of the first touch information and the second touch information are the same or different.
5. The method of claim 1, wherein the obtaining depth image information of the target object and touch information on the transparent display screen further comprises at least:
acquiring touch information on the transparent display screen at a first moment; and acquiring depth image information of the target object at a second moment; wherein the first moment is earlier than the second moment, or the first moment is later than the second moment.
6. The method of claim 1, wherein the establishing a mapping relationship of the touch information to the target object based on the depth image information and the touch information comprises:
and establishing a mapping relation between the touch information and the target object based on the depth image information and the coordinate position of the touch information.
7. The method of claim 6, wherein the establishing a mapping relationship of the touch information to the target object based on the depth image information and the coordinate position of the touch information comprises:
determining a first coordinate position of the touch information on the transparent display screen based on the area measurement data of the transparent display screen by the touch detection component;
determining a second coordinate position of the depth image information on the transparent display screen based on the spatial position relation between the image acquisition component and the transparent display screen;
and establishing a mapping relation between the touch information and the target object based on the first coordinate position and the second coordinate position.
8. The method of claim 1, wherein the method further comprises:
detecting a behavioral action of the target object;
and if the behavior action meets an action condition, switching the current display mode of the touch information based on the behavior action.
9. The method of claim 1, wherein the method further comprises:
receiving a mode switching instruction aiming at the touch information;
and switching the current display mode of the touch information based on the mode switching instruction.
10. An electronic device, comprising:
the acquisition unit is used for acquiring depth image information of the target object and touch information on the transparent display screen;
the establishing unit is used for establishing a mapping relation between the touch information and the target object based on the depth image information and the touch information;
and the display unit is used for controlling the touch information to be displayed on the transparent display screen based on the mapping relation so that the touch information moves along with the target object.
CN202210251905.0A 2022-03-15 2022-03-15 Data processing method and electronic equipment Pending CN114816206A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210251905.0A CN114816206A (en) 2022-03-15 2022-03-15 Data processing method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210251905.0A CN114816206A (en) 2022-03-15 2022-03-15 Data processing method and electronic equipment

Publications (1)

Publication Number Publication Date
CN114816206A true CN114816206A (en) 2022-07-29

Family

ID=82529407

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210251905.0A Pending CN114816206A (en) 2022-03-15 2022-03-15 Data processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN114816206A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010012001A1 (en) * 1997-07-07 2001-08-09 Junichi Rekimoto Information input apparatus
CN111610852A (en) * 2019-02-26 2020-09-01 北京海益同展信息科技有限公司 Information display method and device and storage medium
CN112130799A (en) * 2020-09-24 2020-12-25 联想(北京)有限公司 Control method and electronic equipment



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination