CN117979414A - Method, device, electronic equipment and storage medium for searching for article - Google Patents

Method, device, electronic equipment and storage medium for searching for article

Info

Publication number
CN117979414A
CN117979414A
Authority
CN
China
Prior art keywords
augmented reality
target object
target
wireless communication
reality device
Prior art date
Legal status
Pending
Application number
CN202211320938.2A
Other languages
Chinese (zh)
Inventor
程林
Current Assignee
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202211320938.2A priority Critical patent/CN117979414A/en
Publication of CN117979414A publication Critical patent/CN117979414A/en
Pending legal-status Critical Current

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The present disclosure relates to the field of computer technology, and in particular to a method, an apparatus, an electronic device, and a storage medium for searching for an item. An augmented reality device establishes a wireless communication connection with a target item, determines the position of the target item based on that connection, and displays guidance information pointing to the target item's position in the augmented reality space. The search process is thereby fused with the real environment, improving both the efficiency of finding the target item and the user experience.

Description

Method, device, electronic equipment and storage medium for searching for article
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method, an apparatus, an electronic device, and a storage medium for searching for an article.
Background
Extended reality (XR) technology combines the real and the virtual through a computer and provides users with an extended reality space for human-computer interaction. In the extended reality space, a user may engage in social interaction, entertainment, learning, work, remote office, and authoring user-generated content (UGC) through a virtual reality device such as a head-mounted display (HMD).
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
In a first aspect, according to one or more embodiments of the present disclosure, there is provided a method of finding an item, comprising:
establishing a wireless communication connection between the augmented reality device and a target item;
determining a location of the target item based on the wireless communication connection; and
displaying, in an augmented reality space, guidance information for guiding to the position of the target item.
In a second aspect, according to one or more embodiments of the present disclosure, there is provided an apparatus for finding an item, comprising:
a connection unit, configured to establish a wireless communication connection between the augmented reality device and the target item;
a determining unit, configured to determine a location of the target item based on the wireless communication connection; and
a display unit, configured to display, in the augmented reality space, guidance information for guiding to the position of the target item.
In a third aspect, according to one or more embodiments of the present disclosure, there is provided an electronic device comprising: at least one memory and at least one processor; wherein the memory is for storing program code, the processor is for invoking the program code stored by the memory to cause the electronic device to perform a method of finding an item provided in accordance with one or more embodiments of the present disclosure.
In a fourth aspect, according to one or more embodiments of the present disclosure, there is provided a non-transitory computer storage medium storing program code which, when executed by a computer device, causes the computer device to perform a method of finding an item provided according to one or more embodiments of the present disclosure.
In a fifth aspect, according to one or more embodiments of the present disclosure, there is provided a computer program product comprising instructions that, when executed by a computer device, cause the computer device to perform a method of finding an item provided according to one or more embodiments of the present disclosure.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
FIG. 1 is a flow chart of a method of finding an item provided in an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of an augmented reality device according to an embodiment of the present disclosure;
FIG. 3 is an alternative schematic diagram of a virtual field of view of an augmented reality device provided in accordance with an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of an augmented reality space provided in accordance with an embodiment of the present disclosure;
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that the disclosure will be thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit its scope.
It should be understood that the steps recited in the embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Furthermore, embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". The term "responsive to" and related terms mean that one signal or event is affected to some extent by another signal or event, but not necessarily completely or directly. If event x occurs "in response to" event y, x may be directly or indirectly in response to y. For example, the occurrence of y may ultimately lead to the occurrence of x, but other intermediate events and/or conditions may exist. In other cases, y may not necessarily result in the occurrence of x, and x may occur even though y has not yet occurred. Furthermore, the term "responsive to" may also mean "at least partially responsive to".
The term "determining" broadly encompasses a wide variety of actions, which may include obtaining, calculating, computing, processing, deriving, investigating, looking up (e.g., looking up in a table, a database, or another data structure), ascertaining, and the like; it may also include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory), and the like, as well as parsing, selecting, choosing, establishing, and the like. Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a" and "an" in this disclosure are illustrative rather than limiting; those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
For the purposes of this disclosure, the phrase "a and/or B" means (a), (B), or (a and B).
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
In some embodiments, the execution subject of the method of finding an item provided by the present disclosure may include an augmented reality device. Extended reality (XR) technology combines the real and the virtual through a computer and provides users with an extended reality space for human-computer interaction. In the extended reality space, a user may engage in social interaction, entertainment, learning, work, remote office, and authoring user-generated content (UGC) through a virtual reality device such as a head-mounted display (HMD).
The augmented reality device may include, but is not limited to, the following types:
A computer-side virtual reality (PCVR) device: the PC performs the computation related to the virtual reality functions and outputs the data, and the external PCVR device uses the data output by the PC to achieve the virtual reality effect.
A mobile virtual reality device: supports mounting a mobile terminal (such as a smartphone) in various ways (such as a head-mounted display provided with a dedicated card slot). Connected to the mobile terminal in a wired or wireless manner, the mobile terminal performs the computation related to the virtual reality function and outputs the data to the mobile virtual reality device, for example to watch a virtual reality video through an app on the mobile terminal.
An integrated (all-in-one) virtual reality device: it has its own processor for performing the computation related to virtual functions, and therefore has independent virtual reality input and output capabilities; it needs no connection to a PC or a mobile terminal and offers a high degree of freedom in use.
Of course, the form of implementation of the augmented reality device is not limited to this, and may be further miniaturized or enlarged as needed.
A sensor for detecting posture (such as a nine-axis sensor) is arranged in the augmented reality device to detect posture changes of the device in real time. When a user wears the device and the posture of the user's head changes, the real-time head posture is transmitted to the processor, which calculates the gaze point of the user's line of sight in the virtual environment. From the gaze point, the image within the user's gaze range (i.e., the virtual field of view) is computed from the three-dimensional model of the virtual environment and displayed on the display screen, giving the user an experience as if viewing the real environment.
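As a minimal sketch of the gaze-direction step above: a head-pose quaternion reported by the posture sensor can be applied to a fixed forward vector to obtain the gaze direction in the virtual environment. The (w, x, y, z) quaternion convention and the -Z forward axis are illustrative assumptions, not specified by the patent.

```python
import math

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = 2 * (q_vec x v); then v' = v + w*t + (q_vec x t)
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    rx = vx + w * tx + (y * tz - z * ty)
    ry = vy + w * ty + (z * tx - x * tz)
    rz = vz + w * tz + (x * ty - y * tx)
    return (rx, ry, rz)

def gaze_direction(head_quat, forward=(0.0, 0.0, -1.0)):
    """Map a head-pose quaternion from the posture sensor to a gaze
    direction vector (forward axis is an assumed convention)."""
    return quat_rotate(head_quat, forward)

# Head turned 90 degrees to the left about the +Y (up) axis:
half = math.radians(90) / 2
q = (math.cos(half), 0.0, math.sin(half), 0.0)
dx, dy, dz = gaze_direction(q)  # now looking along -X
```

The resulting direction vector is what the processor would intersect with the three-dimensional model of the virtual environment to find the gaze point.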
Referring to fig. 2, a user may enter the augmented reality space through an intelligent terminal device such as head-mounted VR glasses, and control his or her avatar in the augmented reality space to engage in social interaction, entertainment, learning, remote office, etc. with avatars controlled by other users. In some embodiments, the avatar may be displayed as an animated figure or as a real image of the user, which is not limited herein.
In one embodiment, in the augmented reality space, the user may perform related interactive operations through a controller, which may be a handle; for example, the user performs operation control through the keys of the handle. Of course, in other embodiments, the target object in the virtual reality device may instead be controlled by gestures, voice, or a multimodal control mode, without using the controller.
Fig. 3 shows an alternative schematic view of the virtual field of view of an augmented reality device according to an embodiment of the disclosure. A horizontal field-of-view angle and a vertical field-of-view angle describe the distribution range of the virtual field of view in the virtual environment: the vertical range is represented by the vertical field-of-view angle BOC, and the horizontal range by the horizontal field-of-view angle AOB. Through a lens, the human eye can always perceive the image of the virtual field of view in the virtual environment. The field-of-view angle represents the distribution range of viewing angles that the lens has when sensing an environment. For example, the field-of-view angle of an augmented reality device represents the range of viewing angles that the human eye has when perceiving the virtual environment through the device's lens; likewise, in a mobile terminal provided with a camera, the camera's field-of-view angle is the range of viewing angles the camera has when sensing the real environment for shooting.
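The field-of-view angles AOB and BOC described above reduce to simple lens geometry: the angle subtended at the optical center O by a display aperture at a given distance. The aperture and distance values below are purely illustrative, not taken from the patent.

```python
import math

def field_of_view_deg(aperture_mm: float, distance_mm: float) -> float:
    """Angle (in degrees) subtended at the lens center O by an aperture
    (display width or height) placed at the given distance -- i.e. the
    angle AOB or BOC in Fig. 3."""
    return math.degrees(2 * math.atan(aperture_mm / (2 * distance_mm)))

# Illustrative numbers: a 120 mm-wide, 90 mm-tall display 40 mm from O.
horizontal = field_of_view_deg(aperture_mm=120.0, distance_mm=40.0)  # angle AOB
vertical = field_of_view_deg(aperture_mm=90.0, distance_mm=40.0)     # angle BOC
```

With these assumed dimensions the horizontal angle comes out near 113 degrees and the vertical angle near 97 degrees, which is in the range typical of consumer HMDs.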
An augmented reality device, such as an HMD, incorporates several cameras (e.g., depth cameras, RGB cameras, etc.) whose purpose is not limited to providing a pass-through view. The camera images and an integrated inertial measurement unit (IMU) provide data that can be processed by computer vision methods to automatically analyze and understand the environment. HMDs are designed to support not only passive but also active computer vision analysis. Passive computer vision methods analyze image information captured from the environment; they may be monoscopic (images from a single camera) or stereoscopic (images from two cameras), and include, but are not limited to, feature tracking, object recognition, and depth estimation. Active computer vision methods add information to the environment by projecting a pattern that is visible to the camera but not necessarily to the human visual system; such techniques include time-of-flight (ToF) cameras, laser scanning, and structured light, which simplify the stereo-matching problem. Active computer vision is used to implement scene depth reconstruction.
Referring to fig. 1, fig. 1 shows a flowchart of a method 100 for searching for an item according to an embodiment of the present disclosure, where the method 100 includes steps S120-S160.
Step S120: a wireless communication connection is established with the target item.
In some embodiments, the target item may be an electronic device with wireless communication capability, including but not limited to a mobile terminal, a wearable device, an augmented reality device, and the like. If the target item itself lacks a suitable wireless communication function, an electronic tag, such as a radio frequency identification (RFID) tag, may be attached to the target item to provide that function.
In some embodiments, the wireless communication connection established between the augmented reality device and the target item includes, but is not limited to, radio frequency (RF) communication (e.g., RFID), the ZigBee protocol, WiFi, infrared, wireless Universal Serial Bus (USB), ultra-wideband (UWB), the Bluetooth protocol, and cellular communication, such as code division multiple access (CDMA) or the Global System for Mobile Communications (GSM).
In a specific embodiment, after the augmented reality device is started, it may detect devices available for wireless connection in response to a user's search operation, or it may automatically detect such devices in a preset frequency band and communicate wirelessly, so as to find searchable devices within signal range. If such a device is detected, it is calibrated and a wireless communication connection is established with it. The search operation includes, but is not limited to, a somatosensory control operation, a gesture control operation, an eye-movement operation, a touch operation, a voice instruction, or an operation on an external control device. In one embodiment, the search operation may further include user input of text, voice, an image, or the like corresponding to the target item, but the present disclosure is not limited thereto.
Step S140: a location of the target item is determined based on the wireless communication connection.
In some embodiments, the position of the target item relative to the augmented reality device may be determined based on signal time of flight or signal phase between the target item and the augmented reality device; and/or position information sent by the target item may be received through the wireless communication connection.
In one embodiment, ultra-wideband (UWB) technology may be used to determine the location of the target item. UWB is a wireless carrier communication technology that does not use the sinusoidal carrier of traditional communication systems; instead it transmits data using nanosecond-level non-sinusoidal narrow pulses, so it occupies a large spectrum range, and although wireless, its data rate can reach hundreds of megabits per second or more. UWB has the advantages of low system complexity, low power spectral density of the transmitted signal, insensitivity to channel fading, low probability of interception, and high positioning accuracy, and is particularly suitable for high-speed wireless access in dense multipath environments such as indoors. Positioning with UWB allows the location of a signal source to be determined with high accuracy and very low delay. Common UWB-based positioning methods include time of flight (TOF), time difference of arrival (TDOA), angle of arrival (AOA), phase difference of arrival (PDOA), and received signal strength (RSS) analysis.
Illustratively, the distance between the augmented reality device and the target item may be obtained by time-of-flight (TOF) ranging, which measures the distance between two radio transceivers by multiplying the flight time of a signal by the speed of light. The bearing of the target item relative to the augmented reality device can be obtained by angle-of-arrival (AOA) positioning: AOA requires the augmented reality device to have at least two antennas, and determines the time (or phase) difference between the antennas when detecting the incoming signal; from that difference the augmented reality device can calculate the angle from which the signal arrives.
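The TOF and AOA computations described above reduce to a few lines each; the sketch below assumes the responder's reply delay has already been subtracted from the round-trip time, and the antenna spacing and carrier frequency are illustrative values, not taken from the patent.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s: float) -> float:
    """TOF ranging: one-way flight time multiplied by the speed of light.
    Assumes the responder's reply delay was already removed."""
    return (round_trip_time_s / 2) * SPEED_OF_LIGHT

def pdoa_angle_deg(phase_diff_rad: float, antenna_spacing_m: float,
                   carrier_freq_hz: float) -> float:
    """Angle of arrival from the phase difference measured between two
    antennas (the PDOA variant of AOA mentioned above)."""
    wavelength = SPEED_OF_LIGHT / carrier_freq_hz
    # Phase difference -> path difference between antennas -> sin(theta)
    sin_theta = phase_diff_rad * wavelength / (2 * math.pi * antenna_spacing_m)
    return math.degrees(math.asin(max(-1.0, min(1.0, sin_theta))))

# ~33.36 ns round trip corresponds to a target item about 5 m away.
d = tof_distance(33.356e-9)
# Illustrative: ~6.5 GHz UWB carrier, antennas ~half a wavelength apart.
angle = pdoa_angle_deg(phase_diff_rad=math.pi / 2,
                       antenna_spacing_m=0.023, carrier_freq_hz=6.5e9)
```

Combining the TOF distance with the AOA bearing gives the target item's position relative to the augmented reality device in polar form.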
It should be noted that the determination of the location information of the target item may be performed by the augmented reality device or by a cloud server, which is not limited herein. It will be appreciated that if the determination is performed by the server, the augmented reality device may receive the location information of the target item sent by the server.
In a specific embodiment, the augmented reality device may also directly receive, through the wireless communication connection, the location information sent by the target item itself. Illustratively, the target item is equipped with a positioning system whose position is determined by techniques such as simultaneous localization and mapping (SLAM) or inertial navigation; an inertial measurement unit (IMU) may be provided on the target item for this purpose.
Other suitable indoor positioning technologies may also be used to determine the position of the target object, such as bluetooth positioning technology, wiFi positioning technology, zigBee positioning technology, radio frequency technology, inertial navigation technology, geomagnetic positioning technology, visual positioning technology, etc., which are not described herein.
Step S160: and displaying guide information for guiding the position of the target object in the augmented reality space.
The augmented reality space may be a simulation of the real world, a semi-simulated and semi-fictional virtual scene, or a purely fictional virtual scene. The virtual scene may be two-dimensional, 2.5-dimensional, or three-dimensional; the embodiments of the present application do not limit its dimensionality. For example, a virtual scene may include sky, land, and sea; the land may include environmental elements such as desert and city; and a user may control a virtual character (avatar) to move within the virtual scene.
In some embodiments, the augmented reality space includes a real environment image of the surroundings in which the augmented reality device is located. For example, the real environment image may be generated based on an optical see-through technique or a video see-through technique. The optical see-through technique lets the user directly see the surrounding external environment through an optical lens. The video see-through technique captures real-time images of the surrounding environment with a camera, processes the captured images with an anti-distortion algorithm, and then outputs them on the head-mounted display to reproduce the surrounding environment. A non-optical-see-through device can thus reproduce the real environment via video see-through; and because the displayed environment image is produced by algorithmic processing, content in the see-through picture can be replaced in combination with other image-processing techniques. In this embodiment, by superimposing the guidance information for the target item on the real environment image displayed in the augmented reality device, the user can see the position of the target item at a glance, which improves the efficiency of finding it.
In some embodiments, the guidance information may include one or more of the following: text, images, and three-dimensional models. For example, a preset two-dimensional image or three-dimensional model corresponding to the target item may be displayed at the target item's position in the real environment image, to indicate that position to the user. The two-dimensional image or three-dimensional model may be pre-stored in the augmented reality device, pre-entered into it by the user, or acquired by it from the cloud based on the user's input, which is not limited herein.
In this way, according to one or more embodiments of the present disclosure, by having the augmented reality device establish a wireless communication connection with the target item, determine the position of the target item based on that connection, and display guidance information for that position in the augmented reality space, the search process is fused with the real environment, and both the efficiency of finding the target item and the user experience are improved.
In some embodiments, an environment map of the target space may be predetermined, and a thumbnail of the environment map may be displayed superimposed on the real environment image.
The environment map may be a two-dimensional map image or a three-dimensional map model, for example. The thumbnail of the environment map may include prompt information for prompting the current positions of the augmented reality device and the target item in the thumbnail, and navigation information of the augmented reality device to the target item.
Further, in some embodiments, it may be determined whether the augmented reality device and the target item are currently located in the same subspace of the target space; if not, the thumbnail of the environment map is displayed superimposed on the real environment image, so that the user can go to the other subspace to find the target item.
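The same-subspace check described above can be sketched as a lookup against the environment map. Here each room is modeled as an axis-aligned rectangle, an illustrative simplification of the SLAM-built map; the room names and bounds are assumptions, not from the patent.

```python
def room_of(point, rooms):
    """Look up which subspace (room) a 2-D point falls in, given an
    environment map of axis-aligned room rectangles (x0, y0, x1, y1)."""
    x, y = point
    for name, (x0, y0, x1, y1) in rooms.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None  # outside the mapped target space

def should_show_thumbnail(device_pos, item_pos, rooms):
    """Overlay the map thumbnail only when the augmented reality device
    and the target item are in different subspaces."""
    return room_of(device_pos, rooms) != room_of(item_pos, rooms)

# Illustrative two-room house layout (meters):
rooms = {"living room": (0, 0, 5, 4), "bedroom": (5, 0, 9, 4)}
show = should_show_thumbnail((1.0, 2.0), (7.0, 1.0), rooms)  # different rooms
```

When the check returns true, the device would proceed to render the thumbnail with the two position markers and the navigation path.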
The following description takes a house as an example of the target space. The environment map of the house can be obtained in advance, for example with simultaneous localization and mapping: the mapping device starts from any place in the house, localizes its own position and posture from repeatedly observed environment features during motion, and incrementally builds a map of the surroundings according to its position, thereby achieving simultaneous localization and map construction. The mapping device may be equipped with a laser SLAM sensor or a visual SLAM sensor.
It should be noted that the mapping device may be an augmented reality device or other devices, and the disclosure is not limited herein.
After the environment map of the house has been determined, when it is determined that the augmented reality device and the target item are currently located in different rooms, a thumbnail of the environment map may be displayed at a preset position in the augmented reality space (e.g., the lower-left or upper-right corner of the virtual field of view), with the current positions of the augmented reality device and the target item, and the planned navigation path between them, marked in the thumbnail.
Referring to fig. 4, in the augmented reality space 10 that the augmented reality device presents to the user, a real environment image 20 of the room in which the user is located is displayed. A thumbnail 30 of the three-dimensional model of the entire house is superimposed in the upper-right corner of the real environment image 20. Displayed in the thumbnail 30 are a first mark 31 indicating the current location of the augmented reality device, a second mark 32 indicating the location of the target item, and a navigation planning path 33 from the current location of the augmented reality device to the location of the target item.
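The navigation planning path drawn in the thumbnail could be produced by any shortest-path search over the environment map. This sketch uses breadth-first search on a coarse occupancy grid with '#' marking walls; the grid layout is illustrative, and the patent does not specify a planning algorithm.

```python
from collections import deque

def plan_path(grid, start, goal):
    """BFS over a coarse occupancy grid of the house ('#' = wall),
    returning the shortest cell path from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            # Walk the predecessor chain back to the start.
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and (nr, nc) not in prev):
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # no route between the two rooms

# Two rooms separated by a wall (column 4) with a doorway in row 2:
grid = [
    "....#....",
    "....#....",
    ".........",
    "....#....",
]
path = plan_path(grid, start=(0, 1), goal=(0, 7))
```

The returned cell sequence would then be scaled to map coordinates and rendered as the path 33 in the thumbnail.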
According to one or more embodiments of the present disclosure, by determining an environment map of the target space in advance and, if the augmented reality device and the target item are currently located in different subspaces of the target space, also displaying a thumbnail of the environment map in the augmented reality space, the efficiency with which the user locates and finds the target item can be further improved.
In some embodiments, the thumbnail may be rotated in response to a user's adjustment operation on it. The adjustment operation includes, but is not limited to, a somatosensory control operation, a gesture control operation, an eye-movement operation, a touch operation, a voice instruction, or an operation on an external control device. By rotating the thumbnail in response to the user's adjustment, more viewing angles of the environment map can be provided, further improving the efficiency of locating and finding the target item.
In some embodiments, the method 100 further comprises:
step S110: and pre-binding the augmented reality device and the target object.
For example, the augmented reality device may be pre-bound to the target item before the target item is misplaced or lost, e.g., by having the augmented reality device search for the target item's signal and pair with it. After binding, the augmented reality device can store binding information for the target item (such as authentication information and key information), so that the target item can be searched for in a targeted manner the next time the device is used, improving both the connection efficiency and the security between the augmented reality device and the target item.
In some embodiments, after the augmented reality device establishes an association with the target item, the device ID of the target item may be displayed in a preset interface of the augmented reality device, and when the user triggers the target item ID, the augmented reality device may initiate a connection request to the target item to establish a wireless communication connection with the target item.
In some embodiments, step S120 includes establishing the wireless communication connection between the augmented reality device and the target item in a preset frequency band. The signal sent by the target item occupies the preset frequency band, and the augmented reality device searches for the target item only in that band. Connecting in the preset band therefore lets the augmented reality device connect to the target item in a targeted manner, improving the connection efficiency and the security between them.
In some embodiments, if it is detected that the target item is obscured by another covering in the real environment image, the guidance information is displayed above that covering. Displaying the guidance information above the covering when the target item is blocked makes it easier for the user to find the covered target item.
For example, the target object in the augmented reality device may be image-identified based on an image-identification technique, and when it is identified that there are other overlays on the target object, the guidance information may be displayed on the overlay, but the disclosure is not limited thereto.
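The occlusion handling above can be sketched with bounding boxes. The box format and the anchoring rule below are hypothetical; the patent only states that the guidance is shown above the covering object.

```python
# Illustrative occlusion handling for placing guidance information.
def boxes_overlap(a, b):
    """Axis-aligned boxes (x1, y1, x2, y2); True if they intersect."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def guidance_anchor(item_box, cover_boxes):
    """Anchor the guidance above the cover if the item is occluded,
    otherwise directly above the item itself."""
    for cover in cover_boxes:
        if boxes_overlap(item_box, cover):
            # anchor at the top edge of the covering object
            return ((cover[0] + cover[2]) // 2, cover[1])
    return ((item_box[0] + item_box[2]) // 2, item_box[1])

item = (100, 200, 140, 240)
covers = [(90, 180, 200, 260)]  # e.g. a blanket lying over the item
assert guidance_anchor(item, covers) == (145, 180)
assert guidance_anchor(item, []) == (120, 200)
```

In a real pipeline the boxes would come from the image-recognition step the passage mentions; the anchor point is then where the arrow, text, or model is rendered in the augmented reality space.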
Accordingly, an embodiment of the present disclosure provides an apparatus for searching for an item, comprising:
a connection unit configured to establish a wireless communication connection between an augmented reality device and a target item;
a determining unit configured to determine a location of the target item based on the wireless communication connection; and
a display unit configured to display, in an augmented reality space, guidance information for guiding the user to the location of the target item.
In some embodiments, the augmented reality space includes a real environment image of an ambient environment in which the augmented reality device is located.
In some embodiments, the real environment image is generated based on an optical perspective technique or a video perspective technique.
In some embodiments, the apparatus further comprises:
a map determining unit configured to determine an environment map of a target space in advance; and
a map display unit configured to display a thumbnail of the environment map superimposed on the real environment image.
In some embodiments, the map display unit is further configured to display the thumbnail of the environment map in the augmented reality space if the augmented reality device and the target item are currently located in different subspaces of the target space.
In some embodiments, the thumbnail includes prompt information for indicating the current locations of the augmented reality device and the target item in the thumbnail.
In some embodiments, the thumbnail further includes navigation information of the augmented reality device to the target item.
In some embodiments, the target space comprises a house and the subspace comprises a room within the house.
In some embodiments, the apparatus further comprises:
a thumbnail rotation unit configured to rotate the thumbnail in response to a user operation on the thumbnail.
In some embodiments, the apparatus further comprises:
a binding unit configured to pre-bind the augmented reality device and the target item.
In some embodiments, the connection unit is configured to establish the wireless communication connection between the augmented reality device and the target item on a preset frequency band.
In some embodiments, the display unit is configured to display the guidance information above another covering object if it is detected in the real environment image that the target item is blocked by that covering object.
In some embodiments, the determining unit is configured to determine a position of the target item relative to the augmented reality device based on a signal time of flight or a signal phase between the target item and the augmented reality device, and/or to receive, through the wireless communication connection, position information transmitted by the target item.
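The two ranging approaches named above can be illustrated numerically. The physics (speed of light, wavelength) is standard; the function names and signal parameters are illustrative, not from the patent.

```python
# Sketch of time-of-flight and phase-based ranging between the augmented
# reality device and the target item.
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(t_round_trip_s: float) -> float:
    """Two-way time of flight: the signal travels out and back,
    so the one-way distance is half the round-trip path."""
    return C * t_round_trip_s / 2.0

def distance_from_phase(phase_rad: float, freq_hz: float) -> float:
    """Phase-based ranging resolves distance within one wavelength
    of the carrier; coarser methods disambiguate whole wavelengths."""
    wavelength = C / freq_hz
    return (phase_rad / (2.0 * math.pi)) * wavelength

# A 20 ns round trip corresponds to roughly 3 m of separation.
assert abs(distance_from_round_trip(20e-9) - 2.998) < 0.01
```

With a distance (and, using multiple antennas, an angle) in hand, the determining unit can express the target item's position relative to the augmented reality device, which is what the display unit then renders as guidance.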
In some embodiments, the guidance information includes one or more of the following: text, images, three-dimensional models.
Since the apparatus embodiments essentially correspond to the method embodiments, reference may be made to the description of the method embodiments for the relevant details. The apparatus embodiments described above are merely illustrative; the modules described as separate modules may or may not be physically separate. Some or all of the modules may be selected according to actual needs to achieve the purposes of the solution of this embodiment, and those of ordinary skill in the art can understand and implement them without undue effort.
Accordingly, in accordance with one or more embodiments of the present disclosure, there is provided an electronic device comprising:
At least one memory and at least one processor;
wherein the memory is configured to store program code, and the processor is configured to invoke the program code stored in the memory to cause the electronic device to perform a method of finding an item provided in accordance with one or more embodiments of the present disclosure.
Accordingly, in accordance with one or more embodiments of the present disclosure, there is provided a non-transitory computer storage medium having program code stored thereon, the program code being executable by a computer device to cause the computer device to perform a method of finding an item provided in accordance with one or more embodiments of the present disclosure.
Referring now to fig. 5, a schematic diagram of an electronic device (e.g., a terminal device or server) 800 suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 5 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 5, the electronic device 800 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 801, which may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 802 or a program loaded from a storage means 808 into a random access memory (RAM) 803. The RAM 803 also stores various programs and data required for the operation of the electronic device 800. The processing device 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
In general, the following devices may be connected to the I/O interface 805: input devices 806 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, and the like; an output device 807 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, etc.; storage 808 including, for example, magnetic tape, hard disk, etc.; communication means 809. The communication means 809 may allow the electronic device 800 to communicate wirelessly or by wire with other devices to exchange data. While fig. 5 shows an electronic device 800 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via communication device 809, or installed from storage device 808, or installed from ROM 802. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 801.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed networks.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the methods of the present disclosure described above.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages or combinations thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, there is provided a method of finding an item, comprising: establishing a wireless communication connection between an augmented reality device and a target item; determining a location of the target item based on the wireless communication connection; and displaying, in an augmented reality space, guidance information for guiding the user to the location of the target item.
According to one or more embodiments of the present disclosure, the augmented reality space includes a real environment image of an ambient environment in which the augmented reality device is located.
In accordance with one or more embodiments of the present disclosure, the real-world environment image is generated based on an optical perspective technique or a video perspective technique.
According to one or more embodiments of the present disclosure, the method further comprises: determining an environment map of a target space in advance; and displaying a thumbnail of the environment map superimposed on the real environment image.
According to one or more embodiments of the present disclosure, the displaying the thumbnail of the environment map superimposed on the real environment image includes: if the augmented reality device and the target item are currently located in different subspaces of the target space, displaying the thumbnail of the environment map in the augmented reality space.
In accordance with one or more embodiments of the present disclosure, the thumbnail includes prompt information for prompting a current location of the augmented reality device and the target item in the thumbnail.
In accordance with one or more embodiments of the present disclosure, the thumbnail further includes navigation information of the augmented reality device to the target item.
According to one or more embodiments of the present disclosure, the target space comprises a house and the subspace comprises a room within the house.
In accordance with one or more embodiments of the present disclosure, the thumbnail is rotated in response to a user operation on the thumbnail.
According to one or more embodiments of the present disclosure, the augmented reality device is pre-bound to the target item.
According to one or more embodiments of the present disclosure, the establishing a wireless communication connection between the augmented reality device and the target item includes: establishing the wireless communication connection between the augmented reality device and the target item on a preset frequency band.
According to one or more embodiments of the present disclosure, the displaying, in the augmented reality space, guidance information for guiding the user to the location of the target item includes: if it is detected in the real environment image that the target item is blocked by another covering object, displaying the guidance information above that covering object.
According to one or more embodiments of the present disclosure, the determining the location of the target item based on the wireless communication connection includes: determining a position of the target item relative to the augmented reality device based on a signal time of flight or a signal phase between the target item and the augmented reality device; and/or receiving, through the wireless communication connection, position information transmitted by the target item.
According to one or more embodiments of the present disclosure, the guidance information includes one or more of the following: text, images, three-dimensional models.
According to one or more embodiments of the present disclosure, there is provided an apparatus for searching for an item, comprising: a connection unit configured to establish a wireless communication connection between an augmented reality device and a target item; a determining unit configured to determine a location of the target item based on the wireless communication connection; and a display unit configured to display, in an augmented reality space, guidance information for guiding the user to the location of the target item.
According to one or more embodiments of the present disclosure, there is provided an electronic device including: at least one memory and at least one processor; wherein the memory is for storing program code, the processor is for invoking the program code stored by the memory to cause the electronic device to perform a method of finding an item provided in accordance with one or more embodiments of the present disclosure.
According to one or more embodiments of the present disclosure, there is provided a non-transitory computer storage medium storing program code which, when executed by a computer device, causes the computer device to perform a method of finding an item provided according to one or more embodiments of the present disclosure.
According to one or more embodiments of the present disclosure, there is provided a computer program product comprising instructions that, when executed by a computer device, cause the computer device to perform a method of finding an item provided according to one or more embodiments of the present disclosure.
The foregoing description is merely a description of the preferred embodiments of the present disclosure and of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the specific combinations of the features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, technical solutions formed by substituting the above features with (but not limited to) technical features having similar functions disclosed in the present disclosure.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (18)

1. A method of finding an item, comprising:
establishing a wireless communication connection between an augmented reality device and a target item;
determining a location of the target item based on the wireless communication connection; and
displaying, in an augmented reality space, guidance information for guiding the user to the location of the target item.
2. The method of claim 1, wherein the augmented reality space comprises a real environment image of an ambient environment in which the augmented reality device is located.
3. The method of claim 2, wherein the real environment image is generated based on an optical perspective technique or a video perspective technique.
4. The method as recited in claim 2, further comprising:
determining an environment map of a target space in advance; and
displaying a thumbnail of the environment map superimposed on the real environment image.
5. The method of claim 4, wherein the displaying the thumbnail of the environment map superimposed on the real environment image comprises:
if the augmented reality device and the target item are currently located in different subspaces of the target space, displaying the thumbnail of the environment map superimposed on the real environment image.
6. The method of claim 4, wherein the thumbnail includes hint information for hinting at a current location of the augmented reality device and the target item in the thumbnail.
7. The method of claim 6, wherein the thumbnail further comprises navigation information of the augmented reality device to the target item.
8. The method of claim 5, wherein the target space comprises a house and the subspace comprises a room within the house.
9. The method as recited in claim 4, further comprising:
rotating the thumbnail in response to a user operation on the thumbnail.
10. The method as recited in claim 1, further comprising:
pre-binding the augmented reality device and the target item.
11. The method of claim 1, wherein the establishing a wireless communication connection between the augmented reality device and the target item comprises:
establishing the wireless communication connection between the augmented reality device and the target item on a preset frequency band.
12. The method of claim 2, wherein the displaying, in the augmented reality space, guidance information for guiding the user to the location of the target item comprises:
if it is detected in the real environment image that the target item is blocked by another covering object, displaying the guidance information above that covering object.
13. The method of claim 1, wherein the determining the location of the target item based on the wireless communication connection comprises:
determining a position of the target item relative to the augmented reality device based on a signal time of flight or a signal phase between the target item and the augmented reality device; and/or,
receiving, through the wireless communication connection, position information transmitted by the target item.
14. The method of claim 1, wherein the guidance information comprises one or more of: text, images, three-dimensional models.
15. An apparatus for searching for items, comprising:
a connection unit configured to establish a wireless communication connection between an augmented reality device and a target item;
a determining unit configured to determine a location of the target item based on the wireless communication connection; and
a display unit configured to display, in an augmented reality space, guidance information for guiding the user to the location of the target item.
16. An electronic device, comprising:
At least one memory and at least one processor;
wherein the memory is for storing program code and the processor is for invoking the program code stored by the memory to cause the electronic device to perform the method of any of claims 1-14.
17. A non-transitory computer storage medium,
wherein the non-transitory computer storage medium stores program code that, when executed by a computer device, causes the computer device to perform the method of any of claims 1 to 14.
18. A computer program product, characterized in that it comprises instructions which, when executed by a computer device, cause the computer device to perform the method according to any of claims 1 to 14.
CN202211320938.2A 2022-10-26 2022-10-26 Method, device, electronic equipment and storage medium for searching for article Pending CN117979414A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211320938.2A CN117979414A (en) 2022-10-26 2022-10-26 Method, device, electronic equipment and storage medium for searching for article


Publications (1)

Publication Number Publication Date
CN117979414A true CN117979414A (en) 2024-05-03

Family

ID=90860354


Country Status (1)

Country Link
CN (1) CN117979414A (en)

Similar Documents

Publication Publication Date Title
US20150379770A1 (en) Digital action in response to object interaction
CN111742281B (en) Electronic device for providing second content according to movement of external object for first content displayed on display and operating method thereof
US11132842B2 (en) Method and system for synchronizing a plurality of augmented reality devices to a virtual reality device
WO2022005717A1 (en) Generating ground truth datasets for virtual reality experiences
US20240031678A1 (en) Pose tracking for rolling shutter camera
US20240089695A1 (en) Locating Content In An Environment
CN117979414A (en) Method, device, electronic equipment and storage medium for searching for article
US20200351608A1 (en) Locating Content in an Environment
US20240078734A1 (en) Information interaction method and apparatus, electronic device and storage medium
WO2023231666A1 (en) Information exchange method and apparatus, and electronic device and storage medium
CN117991889A (en) Information interaction method, device, electronic equipment and storage medium
KR20180055764A (en) Method and apparatus for displaying augmented reality object based on geometry recognition
US20240095086A1 (en) Mobile device resource optimized kiosk mode
EP4100918B1 (en) Method and system for aligning a digital model of a structure with a video stream
US20230401796A1 (en) Fast ar device pairing using depth predictions
WO2024016880A1 (en) Information interaction method and apparatus, and electronic device and storage medium
US20240127006A1 (en) Sign language interpretation with collaborative agents
CN117435040A (en) Information interaction method, device, electronic equipment and storage medium
CN117631921A (en) Information interaction method, device, electronic equipment and storage medium
CN115981544A (en) Interaction method and device based on augmented reality, electronic equipment and storage medium
CN117435041A (en) Information interaction method, device, electronic equipment and storage medium
CN117631904A (en) Information interaction method, device, electronic equipment and storage medium
CN117519457A (en) Information interaction method, device, electronic equipment and storage medium
CN117994284A (en) Collision detection method, collision detection device, electronic equipment and storage medium
CN117641040A (en) Video processing method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination