CN110866979A - Data processing method, device, computing equipment and medium - Google Patents

Data processing method, device, computing equipment and medium

Info

Publication number
CN110866979A
CN110866979A (application CN201911117909.4A)
Authority
CN
China
Prior art keywords
reference image
target device
determining
target
relative
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911117909.4A
Other languages
Chinese (zh)
Inventor
唐河云
赵琳璐
王晓陆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201911117909.4A priority Critical patent/CN110866979A/en
Publication of CN110866979A publication Critical patent/CN110866979A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Abstract

The present disclosure provides a data processing method, including: acquiring a reference image in a virtual display space; determining a relative reference position between the reference image and a target device; determining a relative target position between a virtual image and the target device based on the relative reference position; determining display data of the virtual image based on the relative target position; and displaying the virtual image on a display unit of the target device based on the display data, so that the virtual image is perceived to be located at the relative target position when viewed through the target device. The present disclosure also provides a data processing apparatus, a computing device, and a computer-readable storage medium.

Description

Data processing method, device, computing equipment and medium
Technical Field
The present disclosure relates to a data processing method, a data processing apparatus, a computing device, and a computer-readable storage medium.
Background
An AR (Augmented Reality) device has a function of displaying a virtual image. When an AR device displays a virtual image so as to merge it with the real environment, the display position of the virtual image must first be determined. This display position is generally determined by recognizing position information of the surrounding spatial environment. However, because the spatial environment is complex, the display position determined in this way is often not accurate enough, so the displayed virtual image does not meet the user's expectations and the user experience is poor.
Disclosure of Invention
One aspect of the present disclosure provides a data processing method, including: obtaining a reference image in a virtual display space; determining a relative reference position between the reference image and a target device; determining a relative target position between a virtual image and the target device based on the relative reference position; determining display data of the virtual image based on the relative target position; and displaying the virtual image on a display unit of the target device based on the display data, so that the virtual image is perceived to be located at the relative target position when viewed through the target device.
Optionally, the method further includes: determining whether a user wears the target device; and, in response to determining that the user wears the target device, sending indication information through the target device to an external device connected with the target device, so that the external device displays the reference image based on the indication information. In this case, obtaining the reference image includes: acquiring, through the target device, the reference image displayed on the external device.
Optionally, determining the relative reference position between the reference image and the target device includes: acquiring a plurality of feature points in the reference image, and determining the relative reference position between the reference image and the target device based on the distribution of the plurality of feature points.
Optionally, the method further includes: determining a plurality of feature points of the reference image. Determining the plurality of feature points includes: obtaining a plurality of pixels in the reference image; determining a current pixel among the plurality of pixels; determining a target area based on the current pixel, where the target area includes a plurality of reference pixels; and determining the current pixel as one of the plurality of feature points when the pixel value of the current pixel and the pixel values of the plurality of reference pixels satisfy a preset condition.
Optionally, the method further includes: identifying a plurality of initial images through the target device to obtain an identification result; and, after acquiring the reference image, identifying the reference image based on the identification result to determine whether the reference image is at least one of the plurality of initial images. Determining the relative reference position between the reference image and the target device then includes: upon determining that the reference image is at least one of the plurality of initial images, determining the relative reference position between the reference image and the target device.
Another aspect of the present disclosure provides a data processing apparatus including: the device comprises an acquisition module, a first determination module, a second determination module, a third determination module and a display module. The acquisition module acquires a reference image in the virtual display space. A first determination module that determines a relative reference position between the reference image and a target device. A second determination module that determines a relative target position between the virtual image and the target device based on the relative reference position. A third determination module that determines display data for the virtual image based on the relative target position. A display module that displays the virtual image on a display unit of the target device based on the display data so as to perceive that the virtual image is located at the relative target position when the virtual image is viewed through the target device.
Optionally, the apparatus further comprises a fourth determining module and a sending module. The fourth determining module determines whether the target device is worn by the user. The sending module is configured to, in response to determining that a user wears the target device, send indication information through the target device to an external device connected to the target device, so that the external device displays the reference image based on the indication information. In this case, obtaining the reference image includes: acquiring, through the target device, the reference image displayed on the external device.
Optionally, determining the relative reference position between the reference image and the target device includes: acquiring a plurality of feature points in the reference image, and determining the relative reference position between the reference image and the target device based on the distribution of the plurality of feature points.
Optionally, the apparatus further comprises a fifth determining module, configured to determine a plurality of feature points of the reference image. Determining the plurality of feature points includes: obtaining a plurality of pixels in the reference image; determining a current pixel among the plurality of pixels; determining a target area based on the current pixel, where the target area includes a plurality of reference pixels; and determining the current pixel as one of the plurality of feature points when the pixel value of the current pixel and the pixel values of the plurality of reference pixels satisfy a preset condition.
Optionally, the apparatus further comprises an identification module and a sixth determination module. The identification module identifies a plurality of initial images through the target device to obtain an identification result. The sixth determination module, after the reference image is acquired, identifies the reference image based on the identification result to determine whether the reference image is at least one of the plurality of initial images. Determining the relative reference position between the reference image and the target device then includes: upon determining that the reference image is at least one of the plurality of initial images, determining the relative reference position between the reference image and the target device.
Another aspect of the disclosure provides a non-transitory readable storage medium storing computer-executable instructions that, when executed, implement the method described above.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions that, when executed, implement the method described above.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
fig. 1 schematically shows an application scenario of a data processing method and a data processing apparatus according to an embodiment of the present disclosure;
FIG. 2 schematically shows a flow chart of a data processing method according to an embodiment of the present disclosure;
FIG. 3 schematically illustrates a schematic diagram of a target device connected to an external device according to an embodiment of the disclosure;
FIGS. 4A-4B schematically illustrate determining relative position information from feature points according to an embodiment of the disclosure;
FIG. 4C schematically illustrates a schematic diagram of determining a virtual screen according to an embodiment of the present disclosure;
FIG. 5 schematically shows a block diagram of a data processing apparatus according to an embodiment of the present disclosure; and
FIG. 6 schematically shows a block diagram of a computer system for implementing data processing according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.). Where a convention analogous to "at least one of A, B or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.).
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable control apparatus to produce a machine, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Accordingly, the techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable medium having instructions stored thereon for use by or in connection with an instruction execution system. In the context of this disclosure, a computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the instructions. For example, the computer readable medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the computer readable medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
An embodiment of the present disclosure provides a data processing method, including: a reference image in the virtual display space is acquired, and a relative reference position between the reference image and the target device is determined. Then, a relative target position between the virtual image and the target device is determined based on the relative reference position, and display data of the virtual image is determined based on the relative target position. Thereafter, the virtual image may be displayed on a display unit of the target device based on the display data so as to perceive the virtual image as being located at a relative target position when the virtual image is viewed through the target device.
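The five operations above can be sketched end to end. This is a deliberately minimal, hypothetical model, not the disclosed implementation: "positions" are 3-element (x, y, z) offsets from the target device in metres, the virtual image is anchored at a fixed offset from the reference image, and the display data is a toy pinhole projection. All function names, offsets, and the focal model are illustrative assumptions.

```python
# Minimal sketch of operations S210-S250 (illustrative names and geometry).

def acquire_reference_position():
    """S210/S220: pretend the reference image was recognized 2 m ahead."""
    return (0.0, 0.0, 2.0)  # relative reference position (x, y, z) in metres

def relative_target_position(ref_pos, anchor_offset=(0.0, 0.5, 0.0)):
    """S230: anchor the virtual image at a fixed offset from the reference image."""
    return tuple(r + o for r, o in zip(ref_pos, anchor_offset))

def display_data(target_pos):
    """S240: here the display data is just an on-screen anchor; a real device
    would project target_pos through its display calibration."""
    x, y, z = target_pos
    # Simple pinhole projection with focal length f = 1 (illustrative).
    return {"screen_x": x / z, "screen_y": y / z, "depth": z}

ref = acquire_reference_position()
target = relative_target_position(ref)
data = display_data(target)  # S250 would render the virtual image using this
print(target, data)
```

The key property the disclosure relies on is that `target` is fixed once computed: subsequent device motion updates the rendering, not the anchor.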
Fig. 1 schematically shows an application scenario of a data processing method and a data processing apparatus according to an embodiment of the present disclosure. It should be noted that fig. 1 is only an example of a scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
As shown in fig. 1, the application scenario 100 includes, for example, a target device 110. The target device 110 may be, for example, an AR (Augmented Reality) device.
In the embodiments of the present disclosure, AR glasses are taken as an example of the AR device; the AR glasses have a function of displaying a virtual image. When the user wears the AR glasses, the user can view a real image of the surrounding space through them, and the AR glasses can also display a virtual image at a corresponding position in the surrounding space so that the virtual image merges with the real image.
To facilitate understanding of the technical solution of the embodiments of the disclosure, take a forest as the surrounding environment: when the user wears the AR glasses, the user can view the trees in a certain surrounding spatial region through the glasses. The AR glasses may then display virtual images at corresponding positions in the spatial region viewed by the user. For example, the AR glasses display the image 120 including the virtual image 130; the virtual image 130 may be a bird, so that the virtual image 130 blends into the surrounding spatial region.
The data processing method according to the exemplary embodiment of the present disclosure is described below with reference to fig. 2, fig. 3, and fig. 4A to 4C in conjunction with the application scenario of fig. 1. It should be noted that the above application scenarios are merely illustrated for the convenience of understanding the spirit and principles of the present disclosure, and the embodiments of the present disclosure are not limited in this respect. Rather, embodiments of the present disclosure may be applied to any scenario where applicable.
Fig. 2 schematically shows a flow chart of a data processing method according to an embodiment of the present disclosure.
As shown in fig. 2, the method includes operations S210 to S250. The method of the embodiment of the present disclosure can be applied to a target device; the target device includes, for example, AR glasses.
In operation S210, a reference image in a virtual display space is acquired.
According to an embodiment of the present disclosure, the virtual display space may be, for example, a space that the target device can display, and the size of the virtual display space is determined by, for example, the visual range of the target device. In other words, when the user wears the target device, the spatial region that the user can view through the target device is the virtual display space of the target device.
In the embodiments of the present disclosure, the reference image in the virtual display space may be acquired by the target device, for example. For example, the target device may recognize an arbitrary object in the virtual display space, and when a reference image is recognized, the reference image may be acquired so as to determine relative position information between the target device and the reference image based on the reference image.
According to the embodiment of the disclosure, to ensure that the target device can successfully recognize the reference image in the virtual display space, the target device needs to identify the reference image in advance and store the identification result. Specifically, the identification result may be obtained by the target device identifying a plurality of initial images, and the reference image may be any one of the plurality of initial images.
In operation S220, a relative reference position between the reference image and the target device is determined.
For example, after the reference image is acquired, it may be identified based on the stored identification result to determine whether it is at least one of the plurality of initial images. When the reference image is determined to be at least one of the plurality of initial images, the relative reference position between the reference image and the target device may be determined.
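The pre-registration check above can be sketched as a nearest-descriptor match: the target device stores one descriptor per initial image, and a captured image is accepted as the reference image only if its descriptor is close enough to a stored one. The toy 4-vector descriptors, L2 metric, and threshold are assumptions for illustration; a real system would use binary feature descriptors computed from the images.

```python
# Hypothetical sketch: match a captured image against pre-registered
# initial-image descriptors (toy 4-vectors, not real feature descriptors).

def l2(a, b):
    """Euclidean distance between two equal-length descriptors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def match_initial_image(captured, registered, threshold=0.5):
    """Return the index of the closest registered descriptor, or None if
    even the best match exceeds the acceptance threshold."""
    best_i, best_d = None, float("inf")
    for i, desc in enumerate(registered):
        d = l2(captured, desc)
        if d < best_d:
            best_i, best_d = i, d
    return best_i if best_d <= threshold else None

registered = [(1.0, 0.0, 0.0, 0.0), (0.0, 1.0, 0.0, 0.0)]
print(match_initial_image((0.9, 0.1, 0.0, 0.0), registered))  # matches image 0
print(match_initial_image((0.0, 0.0, 0.9, 0.9), registered))  # no match -> None
```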
In operation S230, a relative target position between the virtual image and the target device is determined based on the relative reference position.
According to the embodiment of the present disclosure, after the relative reference position between the reference image and the target device is determined, the relative target position of the virtual image in the virtual display space with respect to the target device may be determined from the relative reference position. The relative target position is, for example, an initial position of the virtual image, determined when the target device is started. Once determined, this initial position does not change as the target device moves during subsequent use. In other words, after the initial position of the virtual image in the virtual display space is determined, the user may wear the target device and move through the virtual display space; during this movement, the position of the virtual image in the virtual display space no longer changes. The initial position of the virtual image may be re-determined when the target device is restarted.
In operation S240, display data of the virtual image is determined based on the relative target position.
According to the embodiment of the disclosure, after the relative target position is determined, the target device may determine display data of the virtual image in the virtual display space according to the relative target position. The display data indicates, for example, an initial position of the virtual image in the virtual display space, and the initial position of the virtual image does not change with movement of the target device during subsequent use of the target device.
In operation S250, a virtual image is displayed on a display unit of a target device based on display data so as to perceive that the virtual image is located at a relative target position when the virtual image is viewed through the target device.
According to the embodiments of the present disclosure, the target device may display the virtual image on its display unit according to the display data. When a user wears the target device, the user can view the virtual image displayed on the display unit and perceive its initial position in the virtual display space. When the user moves while wearing the target device, the virtual image does not move with the user, achieving the technical effect that the virtual image is integrated with the real environment.
The embodiment of the disclosure determines the initial position of the virtual image from the reference image so as to display the virtual image at that position; during subsequent use of the target device, the position of the virtual image does not change as the target device moves, so the virtual image and the real environment are integrated. In addition, because the initial position is determined from the reference image, the calculation process is simple and fast, and the determined initial position is highly accurate.
Fig. 3 schematically illustrates a schematic diagram of a target device and an external device according to an embodiment of the present disclosure.
As shown in fig. 3, the reference image may be displayed, for example, by the external device 320. The external device 320 may be, for example, a computer, a tablet, or another display device. The target device 310 is connected to the external device 320, for example, through a USB cable, Wi-Fi, or Bluetooth.
In the disclosed embodiments, it may first be determined whether the user is wearing the target device. In response to determining that the user is wearing the target device, indication information may be sent through the target device to the external device connected to it, so that the external device displays the reference image based on the indication information. That is, when the user wears the target device, the target device may instruct the external device to display the reference image, so that the target device can conveniently acquire the reference image displayed on the external device.
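The wear-detection handshake can be sketched as a simple message exchange: when the target device detects that it is worn (e.g., via a proximity sensor), it sends an indication to the connected external device, which responds by showing the reference image. The class names and the message string are assumptions; the disclosure does not specify a message format.

```python
# Illustrative sketch of the wear-detection handshake (names and message
# format are hypothetical).

class ExternalDevice:
    def __init__(self):
        self.displaying_reference = False

    def receive(self, message):
        # The external device shows the reference image on request.
        if message == "SHOW_REFERENCE_IMAGE":
            self.displaying_reference = True

class TargetDevice:
    def __init__(self, external):
        self.external = external

    def on_wear_state(self, worn):
        # Only prompt the external device once the user actually wears it.
        if worn:
            self.external.receive("SHOW_REFERENCE_IMAGE")

screen = ExternalDevice()
glasses = TargetDevice(screen)
glasses.on_wear_state(worn=False)
print(screen.displaying_reference)  # False
glasses.on_wear_state(worn=True)
print(screen.displaying_reference)  # True
```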
According to the embodiment of the disclosure, the reference image includes a plurality of pixel points, for example, a part of the pixel points in the reference image may be determined as a plurality of feature points of the reference image, so as to determine the relative position information between the reference image and the target device based on the plurality of feature points.
Specifically, a plurality of pixels in the reference image may first be obtained, and each pixel in turn is taken as the current pixel. A target area is then determined based on the current pixel; the target area includes a plurality of reference pixels and may, for example, take the current pixel as the center of a circle, with every pixel whose distance from that center is within a preset range serving as a reference pixel. When the pixel value of the current pixel and the pixel values of the reference pixels satisfy a preset condition, the current pixel is determined to be one of the feature points. The preset condition may be, for example, that the difference between the pixel value of the current pixel and the pixel value of a reference pixel is greater than a preset threshold; a difference above the threshold indicates that the current pixel differs substantially from its reference pixels, and the current pixel may then be determined to be a feature point.
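The feature-point test above can be sketched directly (it is close in spirit to FAST corner detection): a pixel is a feature point when its value differs from enough reference pixels on a surrounding ring by more than a threshold. The ring offsets, threshold, and "enough" fraction below are illustrative assumptions, not values from the disclosure.

```python
# Minimal sketch of the described feature-point test on a grayscale image
# given as a list of rows (illustrative parameters).

RING = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def is_feature_point(img, r, c, threshold=50, min_fraction=0.75):
    """True when enough ring pixels differ from the center by > threshold."""
    center = img[r][c]
    refs = [img[r + dr][c + dc] for dr, dc in RING]
    differing = sum(1 for v in refs if abs(center - v) > threshold)
    return differing >= min_fraction * len(refs)

# 3x3 toy image: a bright pixel on a dark background is a feature point.
corner = [[10, 10, 10],
          [10, 200, 10],
          [10, 10, 10]]
flat = [[100] * 3 for _ in range(3)]
print(is_feature_point(corner, 1, 1))  # True
print(is_feature_point(flat, 1, 1))    # False
```

In practice a larger ring radius (the "preset distance range") makes the test more selective, which is the trade-off the preset condition controls.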
According to an embodiment of the present disclosure, determining a relative reference position between the reference image and the target device includes: a plurality of feature points in the reference image are acquired, and a relative reference position between the reference image and the target device is determined based on a distribution of the plurality of feature points.
According to the embodiment of the present disclosure, the target device may acquire, for example, distribution information of the plurality of feature points in the reference image, including the display distances between feature points as displayed on the display unit of the target device and their deformation information. The display distances can characterize the relative distance between the reference image and the target device, and the deformation information can characterize the relative orientation between them.
FIGS. 4A and 4B schematically show determining relative position information from feature points according to an embodiment of the present disclosure.
As shown in fig. 4A, the reference image displayed on the external device includes, for example, 4 feature points distributed in a rectangle. The reference image displayed on the display unit of the target device may be, for example, the reference image 410 or the reference image 420. The display distance between the feature points (e.g., feature points A, B, C and D) in the reference image 410 is greater than that in the reference image 420, indicating that the relative distance between the reference image 410 and the target device is less than the relative distance between the reference image 420 and the target device.
As shown in fig. 4B, the reference image displayed on the display unit of the target device may be, for example, the reference image 430 or the reference image 440. The feature points in the reference image 430 are distributed, for example, in a rectangle (small deformation), indicating that the target device is directly in front of the reference image. The feature points in the reference image 440 are distributed, for example, in a parallelogram (large deformation), indicating that the target device is to the side of the reference image.
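Both cues of FIGS. 4A-4B can be sketched numerically for four feature points that form a square on the physical display: apparent size gives relative distance under a pinhole model (larger on screen means closer), and skew of the on-screen quadrilateral distinguishes a frontal view (rectangle, adjacent edges perpendicular) from an oblique one (parallelogram). The focal length, marker size, and corner ordering are illustrative assumptions.

```python
# Illustrative sketch: relative distance and orientation from the on-screen
# distribution of four feature points [A, B, C, D] (A-B top edge, D-A left
# edge), in pixel coordinates. Constants are assumptions.

def edge_len(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def estimate_distance(points, real_size=0.2, focal=800.0):
    """Pinhole model Z = f * W / w: a 0.2 m edge seen as w pixels."""
    apparent = edge_len(points[0], points[1])  # top-edge length in pixels
    return focal * real_size / apparent

def is_frontal(points, tol=1e-6):
    """Frontal view when the quad is a rectangle: top edge perpendicular
    to left edge (a sheared parallelogram fails this test)."""
    (ax, ay), (bx, by), _, (dx, dy) = points
    top = (bx - ax, by - ay)
    left = (dx - ax, dy - ay)
    return abs(top[0] * left[0] + top[1] * left[1]) < tol

square = [(0, 0), (80, 0), (80, 80), (0, 80)]    # rectangle: frontal view
skewed = [(0, 0), (80, 0), (100, 80), (20, 80)]  # parallelogram: oblique view
print(estimate_distance(square))  # ~2.0 m under the assumed constants
print(is_frontal(square), is_frontal(skewed))
```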
The embodiment of the disclosure determines the relative position information between the reference image and the target device through a plurality of feature points in the reference image. This improves the positioning efficiency and accuracy of the target device and at least partially achieves fast positioning without complex positioning calculations, so that the initial position of the virtual image can be determined from the positioned target device. Displaying the virtual image at this initial position integrates it with the virtual display space and thereby improves the user's experience of the target device.
Fig. 4C schematically illustrates a schematic diagram of determining a virtual screen according to an embodiment of the present disclosure.
According to the embodiment of the present disclosure, after the relative reference position between the target device and the reference image is determined based on the reference image displayed on the external device, for example, the display position of the virtual screen may be determined according to the relative reference position. The virtual screen can be used for displaying a virtual image, for example, and the virtual screen can be displayed on a plane where the reference image is located, so that the virtual screen and the reference image are combined into a large-size display area, and a function of expanding the display area through the virtual screen is realized.
Specifically, the display mode of the virtual screen may be determined according to the display mode of the reference image on the target device. For example, as shown in the left diagram of fig. 4C, when the reference image is displayed on the target device as a rectangle, the target device is directly in front of the reference image. The display mode of the virtual screen 450 may then match that of the reference image, i.e., the virtual screen 450 may also be displayed as a rectangle, ensuring that the virtual screen 450 and the reference image combine into a large display area in the same plane. The left diagram of fig. 4C shows the virtual screen 450 above the reference image; it is understood that the virtual screen 450 may be at any position in the plane of the reference image, for example to the left, to the right, or below.
As shown in the right diagram of fig. 4C, when the reference image appears in the target device as a parallelogram, the target device is to the side of the reference image. In this case, the display mode of the virtual screen 460 may match that of the reference image, i.e., the virtual screen 460 is also displayed as a parallelogram, ensuring that the virtual screen 460 and the reference image combine into a large-size display area in the same plane. The right diagram of fig. 4C shows the virtual screen 460 above the reference image; it is understood that the virtual screen 460 of the embodiment of the disclosure may be at any position in the plane of the reference image, for example to the left, to the right, or below.
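As a minimal sketch of the geometry just described — assuming the four corners of the reference image have already been detected as a quadrilateral, and noting that the patent does not specify this routine — the virtual screen can be placed coplanar with and congruent to the reference image by shifting the quad along its own side vectors:

```python
def place_virtual_screen(ref_corners):
    """Given the four corners of the detected reference image
    (top-left, top-right, bottom-right, bottom-left), return the
    corners of a virtual screen of the same shape placed directly
    above it in the same plane.  Hypothetical helper for
    illustration only.
    """
    (tlx, tly), (trx, try_), (brx, bry), (blx, bly) = ref_corners
    # Shift the top edge by the quad's own side vectors: this keeps
    # the virtual screen coplanar and congruent with the reference
    # image, so a rectangle stays a rectangle (device in front) and
    # a parallelogram stays a parallelogram (device to the side).
    up_l = (tlx - blx, tly - bly)
    up_r = (trx - brx, try_ - bry)
    return [(tlx + up_l[0], tly + up_l[1]),
            (trx + up_r[0], try_ + up_r[1]),
            (trx, try_),
            (tlx, tly)]

# A 2x1 rectangular reference image with its bottom-left at the origin:
screen = place_virtual_screen([(0.0, 1.0), (2.0, 1.0), (2.0, 0.0), (0.0, 0.0)])
```

Placing the screen to the left, right, or below, as the embodiment allows, amounts to shifting by a different side vector of the same quad.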
Fig. 5 schematically shows a block diagram of a data processing apparatus according to an embodiment of the present disclosure.
As shown in fig. 5, the data processing apparatus 500 includes an acquisition module 510, a first determination module 520, a second determination module 530, a third determination module 540, and a display module 550.
The acquisition module 510 may be used to acquire a reference image in a virtual display space. According to an embodiment of the present disclosure, the acquisition module 510 may perform, for example, operation S210 described above with reference to fig. 2, which is not described herein again.
The first determination module 520 may be used to determine a relative reference position between the reference image and the target device. According to the embodiment of the present disclosure, the first determining module 520 may perform, for example, operation S220 described above with reference to fig. 2, which is not described herein again.
The second determination module 530 may be used to determine a relative target location between the virtual image and the target device based on the relative reference location. According to an embodiment of the present disclosure, the second determining module 530 may perform, for example, the operation S230 described above with reference to fig. 2, which is not described herein again.
The third determination module 540 may be used to determine display data for the virtual image based on the relative target position. According to an embodiment of the present disclosure, the third determining module 540 may, for example, perform operation S240 described above with reference to fig. 2, which is not described herein again.
The display module 550 may be configured to display the virtual image on a display unit of the target device based on the display data so as to perceive the virtual image as being located at a relative target position when the virtual image is viewed through the target device. According to the embodiment of the disclosure, the display module 550 may perform, for example, the operation S250 described above with reference to fig. 2, which is not described herein again.
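The five modules above form a simple pipeline from reference image to display data. A minimal sketch of that flow, with placeholder computations and hypothetical function names (the disclosure does not prescribe these signatures), might look like:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A relative position in the target device's coordinate frame."""
    x: float
    y: float
    z: float

def determine_relative_reference_position(reference_image) -> Pose:
    # Placeholder: in the disclosure this is derived from the
    # distribution of feature points in the reference image.
    return Pose(0.0, 0.0, 1.5)

def determine_relative_target_position(ref_pose: Pose) -> Pose:
    # Illustrative rule: place the virtual image a fixed offset
    # above the reference image.
    return Pose(ref_pose.x, ref_pose.y + 0.5, ref_pose.z)

def determine_display_data(target_pose: Pose) -> dict:
    # Display data the display unit needs to render the virtual image
    # so that it is perceived at the relative target position.
    return {"position": (target_pose.x, target_pose.y, target_pose.z)}

def process(reference_image) -> dict:
    ref_pose = determine_relative_reference_position(reference_image)
    target_pose = determine_relative_target_position(ref_pose)
    return determine_display_data(target_pose)

display_data = process(reference_image=None)
```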
According to an embodiment of the present disclosure, the data processing apparatus 500 may further include, for example, a fourth determination module and a sending module. The fourth determination module determines whether the target device is worn by a user. The sending module is configured to, in response to determining that the target device is worn by the user, send indication information through the target device to an external device connected to the target device, so that the external device displays a reference image based on the indication information. In this case, acquiring the reference image includes acquiring, through the target device, the reference image displayed on the external device.
According to an embodiment of the present disclosure, determining a relative reference position between the reference image and the target device includes: a plurality of feature points in the reference image are acquired, and a relative reference position between the reference image and the target device is determined based on distribution of the plurality of feature points.
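One intuition behind inferring position from the distribution of feature points is that the spread of the points shrinks as the target device moves away from the reference image. The toy pinhole-model estimate below illustrates only this distance component (Z = f·W/w); a real system would recover a full 6-DoF pose, e.g. via a homography, and all parameter names here are assumptions:

```python
def estimate_distance(feature_xs, real_width_m, focal_px):
    """Estimate the distance from the target device to the reference
    image from the horizontal spread of its feature points, using the
    pinhole relation Z = f * W / w.  Illustrative sketch only.
    """
    apparent_width_px = max(feature_xs) - min(feature_xs)
    return focal_px * real_width_m / apparent_width_px

# Feature points spanning 200 px of a 0.30 m wide reference image,
# seen through a camera with an 800 px focal length:
d = estimate_distance([100, 180, 250, 300], real_width_m=0.30, focal_px=800)
```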
According to an embodiment of the present disclosure, the data processing apparatus 500 may further include, for example, a fifth determination module for determining a plurality of feature points of the reference image. Determining the plurality of feature points includes: acquiring a plurality of pixel points in the reference image; determining a current pixel point among the plurality of pixel points; determining a target area based on the current pixel point, the target area comprising a plurality of reference pixel points; and determining the current pixel point as one of the plurality of feature points when the pixel value of the current pixel point and the pixel values of the plurality of reference pixel points satisfy a preset condition.
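The test just described — comparing the current pixel against reference pixels in a surrounding target area — resembles a FAST-style corner check. The sketch below uses a 16-pixel circle of radius 3 as the target area and a simple count-above-threshold condition; this is one plausible instantiation, not the patent's exact preset condition:

```python
def is_feature_point(img, r, c, threshold=25, min_count=9):
    """Mark (r, c) as a feature point when at least `min_count` of the
    16 reference pixels on a radius-3 circle (the target area) differ
    from the current pixel by more than `threshold`.  FAST-9-like
    sketch; parameters are assumptions.
    """
    circle = [(-3, 0), (-3, 1), (-2, 2), (-1, 3), (0, 3), (1, 3),
              (2, 2), (3, 1), (3, 0), (3, -1), (2, -2), (1, -3),
              (0, -3), (-1, -3), (-2, -2), (-3, -1)]
    center = img[r][c]
    count = sum(1 for dr, dc in circle
                if abs(img[r + dr][c + dc] - center) > threshold)
    return count >= min_count

# A bright 4x4 patch on a dark 11x11 background: the patch corner at
# (4, 4) should be a feature point; a nearby background pixel should not.
size = 11
img = [[0] * size for _ in range(size)]
for r in range(4, 8):
    for c in range(4, 8):
        img[r][c] = 255
```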
According to an embodiment of the present disclosure, the data processing apparatus 500 may further include, for example, an identification module and a sixth determination module. The identification module identifies a plurality of initial images through the target device to obtain an identification result. After the reference image is acquired, the sixth determination module identifies the reference image based on the identification result to determine whether the reference image is at least one of the plurality of initial images. Determining the relative reference position between the reference image and the target device then includes: upon determining that the reference image is at least one of the plurality of initial images, determining the relative reference position between the reference image and the target device.
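The control flow — compute the relative position only when the acquired image matches a known initial image — can be sketched as follows. A content hash stands in for a real identification result (which would more plausibly be feature descriptors); the names here are illustrative assumptions:

```python
import hashlib

def signature(image_bytes: bytes) -> str:
    # Stand-in for an identification result; a hash only matches
    # byte-identical images, which suffices to show the gating logic.
    return hashlib.sha256(image_bytes).hexdigest()

# Identification result for the known initial images.
initial_signatures = {signature(b"marker-a"), signature(b"marker-b")}

def locate_if_recognized(image_bytes: bytes):
    # Skip positioning entirely for unrecognized images.
    if signature(image_bytes) not in initial_signatures:
        return None
    # Proceed to determine the relative reference position.
    return "determine_relative_reference_position"

recognized = locate_if_recognized(b"marker-a")
unrecognized = locate_if_recognized(b"unknown")
```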
Any number of modules, sub-modules, units, sub-units, or at least part of the functionality of any number thereof according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, sub-modules, units, sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules, sub-modules, units, sub-units according to embodiments of the disclosure may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
For example, any plurality of the acquisition module 510, the first determination module 520, the second determination module 530, the third determination module 540, and the display module 550 may be combined and implemented in one module, or any one of them may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to an embodiment of the disclosure, at least one of the obtaining module 510, the first determining module 520, the second determining module 530, the third determining module 540, and the displaying module 550 may be at least partially implemented as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented by hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or implemented by any one of three implementations of software, hardware, and firmware, or an appropriate combination of any of them. Alternatively, at least one of the obtaining module 510, the first determining module 520, the second determining module 530, the third determining module 540, and the displaying module 550 may be at least partially implemented as a computer program module, which when executed, may perform a corresponding function.
FIG. 6 schematically shows a block diagram of a computer system for implementing data processing according to an embodiment of the present disclosure. The computer system illustrated in FIG. 6 is only one example and should not impose any limitations on the scope of use or functionality of embodiments of the disclosure.
As shown in fig. 6, a computer system 600 implementing data processing includes a processor 601 and a computer-readable storage medium 602. The system 600 may perform a method according to an embodiment of the present disclosure.
In particular, processor 601 may include, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 601 may also include onboard memory for caching purposes. The processor 601 may be a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
Computer-readable storage medium 602 may be, for example, any medium that can contain, store, communicate, propagate, or transport the instructions. For example, a readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the readable storage medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
The computer-readable storage medium 602 may comprise a computer program 603, which computer program 603 may comprise code/computer-executable instructions that, when executed by the processor 601, cause the processor 601 to perform a method according to an embodiment of the disclosure or any variant thereof.
The computer program 603 may be configured with computer program code, for example comprising computer program modules. For example, in an example embodiment, code in the computer program 603 may include one or more program modules, such as modules 603A, 603B, and so on. It should be noted that the division and number of modules are not fixed; those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation. When these program modules are executed by the processor 601, the processor 601 may perform the method according to the embodiment of the present disclosure or any variation thereof.
According to an embodiment of the present disclosure, at least one of the obtaining module 510, the first determining module 520, the second determining module 530, the third determining module 540, and the display module 550 may be implemented as a computer program module described with reference to fig. 6, which, when executed by the processor 601, may implement the respective operations described above.
The present disclosure also provides a computer-readable medium, which may be embodied in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable medium carries one or more programs which, when executed, implement the above data processing method.
According to embodiments of the present disclosure, a computer readable medium may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, optical fiber cable, radio frequency signals, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that various combinations and/or combinations of features recited in the various embodiments and/or claims of the present disclosure can be made, even if such combinations or combinations are not expressly recited in the present disclosure. In particular, various combinations and/or combinations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit or teaching of the present disclosure. All such combinations and/or associations are within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (10)

1. A method of data processing, comprising:
acquiring a reference image in a virtual display space;
determining a relative reference position between the reference image and a target device;
determining a relative target position between the virtual image and the target device based on the relative reference position;
determining display data for the virtual image based on the relative target position; and
based on the display data, displaying the virtual image on a display unit of the target device so as to perceive the virtual image as being located at the relative target position when the virtual image is viewed through the target device.
2. The method of claim 1, further comprising:
determining whether the target device is worn by a user; and
in response to determining that the target device is worn by a user, sending indication information to an external device connected with the target device through the target device so that the external device displays the reference image based on the indication information;
wherein the acquiring a reference image comprises: and acquiring the reference image displayed on the external equipment through the target equipment.
3. The method of claim 1 or 2, wherein the determining a relative reference position between the reference image and a target device comprises:
acquiring a plurality of feature points in the reference image; and
determining a relative reference position between the reference image and a target device based on the distribution of the plurality of feature points.
4. The method of claim 3, further comprising: determining a plurality of feature points of the reference image;
the determining a plurality of feature points of the reference image comprises:
acquiring a plurality of pixel points in the reference image;
determining a current pixel point of the plurality of pixel points;
determining a target area based on the current pixel point, wherein the target area comprises a plurality of reference pixel points; and
and when the pixel value of the current pixel point and the pixel values of the reference pixel points meet a preset condition, determining the current pixel point as one of the characteristic points.
5. The method of claim 1, further comprising:
identifying a plurality of initial images through the target device to obtain an identification result; and
after acquiring the reference image, identifying the reference image based on the identification result to determine whether the reference image is at least one of the plurality of initial images;
wherein the determining a relative reference position between the reference image and a target device comprises: upon determining that the reference image is at least one of the plurality of initial images, determining a relative reference position between the reference image and a target device.
6. A data processing apparatus comprising:
the acquisition module acquires a reference image in the virtual display space;
a first determination module that determines a relative reference position between the reference image and a target device;
a second determination module that determines a relative target position between the virtual image and the target device based on the relative reference position;
a third determination module that determines display data of a virtual image based on the relative target position; and
a display module that displays the virtual image on a display unit of the target device based on the display data so as to perceive that the virtual image is located at the relative target position when the virtual image is viewed through the target device.
7. The apparatus of claim 6, further comprising:
a fourth determination module that determines whether the target device is worn by a user; and
a sending module that, in response to determining that the target device is worn by the user, sends indication information through the target device to an external device connected to the target device, so that the external device displays the reference image based on the indication information;
wherein the acquiring a reference image comprises: and acquiring the reference image displayed on the external equipment through the target equipment.
8. The apparatus of claim 7, wherein the determining a relative reference position between the reference image and a target device comprises:
acquiring a plurality of feature points in the reference image; and
determining a relative reference position between the reference image and a target device based on the distribution of the plurality of feature points.
9. A computing device, comprising:
one or more processors; and
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-5.
10. A computer-readable storage medium storing computer-executable instructions for implementing the method of any one of claims 1 to 5 when executed.
CN201911117909.4A 2019-11-14 2019-11-14 Data processing method, device, computing equipment and medium Pending CN110866979A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911117909.4A CN110866979A (en) 2019-11-14 2019-11-14 Data processing method, device, computing equipment and medium


Publications (1)

Publication Number Publication Date
CN110866979A true CN110866979A (en) 2020-03-06

Family

ID=69653778

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911117909.4A Pending CN110866979A (en) 2019-11-14 2019-11-14 Data processing method, device, computing equipment and medium

Country Status (1)

Country Link
CN (1) CN110866979A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106980382A (en) * 2017-03-31 2017-07-25 维沃移动通信有限公司 A kind of method, mobile terminal and the VR equipment of the control of VR device plays
CN107908524A (en) * 2017-12-11 2018-04-13 深圳创维-Rgb电子有限公司 Information processing method, device and the readable storage medium storing program for executing of virtual reality terminal
US20190095696A1 (en) * 2017-09-28 2019-03-28 Cal-Comp Big Data, Inc. Body information analysis apparatus and method of auxiliary comparison of eyebrow shapes thereof
CN109584295A (en) * 2017-09-29 2019-04-05 阿里巴巴集团控股有限公司 The method, apparatus and system of automatic marking are carried out to target object in image
CN109831662A (en) * 2019-03-22 2019-05-31 芋头科技(杭州)有限公司 Real-time pictures projective techniques, device and the controller and medium of AR glasses screen
CN109981982A (en) * 2019-03-25 2019-07-05 联想(北京)有限公司 Control method, device and system


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
李倩宇 (Li Qianyu): "Research and Implementation of Target Recognition and 3D Registration Algorithms in Augmented Reality", China Master's Theses Full-text Database, Information Science and Technology Series *

Similar Documents

Publication Publication Date Title
EP3345073B1 (en) Localizing devices in an augmented reality environment
US20190369613A1 (en) Electronic device and method for controlling multiple drones
EP2933707A1 (en) Head mounted display presentation adjustment
US11202163B2 (en) Audio output method, electronic device, and audio output apparatus
US20210253103A1 (en) Method, system, and device for determining overtaking trajectory for autonomous vehicles
CN109508579B (en) Method and device for acquiring virtual point cloud data
CN112258519B (en) Automatic extraction method and device for way-giving line of road in high-precision map making
CN110059623B (en) Method and apparatus for generating information
CN109829447B (en) Method and device for determining a three-dimensional frame of a vehicle
US20220114785A1 (en) Three-dimensional model generation method and three-dimensional model generation device
KR102491386B1 (en) Electronic device and control method thereof
US20200380717A1 (en) Positioning method, positioning device and nonvolatile computer-readable storage medium
CN109981982B (en) Control method, device and system
CN110866979A (en) Data processing method, device, computing equipment and medium
CN112558036B (en) Method and device for outputting information
EP4080479A2 (en) Method for identifying traffic light, device, cloud control platform and vehicle-road coordination system
US20210264673A1 (en) Electronic device for location-based ar linking of object-based augmentation contents and operating method thereof
US20190102909A1 (en) Automated identification of parts of an assembly
US20180136470A1 (en) Feature balancing
CN115511870A (en) Object detection method and device, electronic equipment and storage medium
EP4206977A1 (en) Electronic device and control method of electronic device
CN110555833B (en) Image processing method, image processing apparatus, electronic device, and medium
CN110418059B (en) Image processing method and device applied to electronic equipment, electronic equipment and medium
US20160217559A1 (en) Two-dimensional image processing based on third dimension data
KR20180097004A (en) Method of position calculation between radar target lists and vision image ROI

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination