CN111580655A - Information processing method and device and electronic equipment - Google Patents

Information processing method and device and electronic equipment

Info

Publication number
CN111580655A
Authority
CN
China
Prior art keywords
content
user
editing
editing panel
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010381695.8A
Other languages
Chinese (zh)
Inventor
习羽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202010381695.8A priority Critical patent/CN111580655A/en
Publication of CN111580655A publication Critical patent/CN111580655A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Abstract

The application discloses an information processing method, an information processing apparatus, and an electronic device, belonging to the technical field of communications. The information processing method includes the following steps: displaying an editing panel on which target content is displayed; receiving a first input of a user; in response to the first input, acquiring the user's gesture information in an AR space; and editing the target content according to the gesture information. According to the embodiments of the application, the editing panel can be displayed in the AR space, and the target content on it can be edited by means of the user's gesture information in the AR space, making it convenient for users to express their own opinions and meeting the need for efficient communication of content in an AR conference scene.

Description

Information processing method and device and electronic equipment
Technical Field
The application belongs to the technical field of communication, and particularly relates to an information processing method and device and electronic equipment.
Background
Augmented Reality (AR) is a technology that seamlessly merges virtual-world information with real-world information. With the rapid development of AR technology, AR is applied more and more widely in people's daily life and work; the AR conference is one such application, bringing a realistic on-site conference experience to participants located in different spaces. However, in existing AR conference scenes, users' need to communicate content efficiently cannot be satisfied.
Disclosure of Invention
The embodiments of the present application aim to provide an information processing method, an information processing apparatus, and an electronic device, so as to solve the problem that users' need to communicate content efficiently cannot be met in existing AR conference scenes.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides an information processing method, which is applied to a first AR device, and the method includes:
displaying an editing panel; wherein the editing panel is displayed with target content;
receiving a first input of a user;
responding to the first input, and acquiring gesture information of a user in an AR space;
and editing the target content according to the gesture information.
In a second aspect, an embodiment of the present application provides an information processing method, which is applied in an AR server, and the method includes:
receiving second content sent by the first AR device; the second content is obtained by editing display content on an editing panel corresponding to the first AR device by the first AR device according to the user gesture information;
and sending the second content to a second AR device, and displaying the second content on a corresponding editing panel of the second AR device.
In a third aspect, an embodiment of the present application provides an information processing apparatus, which is applied to a first AR device, and includes:
the display module is used for displaying the editing panel; wherein the editing panel is displayed with target content;
the first receiving module is used for receiving a first input of a user;
the acquisition module is used for responding to the first input and acquiring gesture information of a user in an AR space;
and the editing module is used for editing the target content according to the gesture information.
In a fourth aspect, an embodiment of the present application provides an information processing apparatus, which is applied in an AR server, and includes:
a third receiving module, configured to receive second content sent by the first AR device; the second content is obtained by editing display content on an editing panel corresponding to the first AR device by the first AR device according to the user gesture information;
and the second sending module is used for sending the second content to a second AR device, and the second content is displayed on a corresponding editing panel of the second AR device.
In a fifth aspect, embodiments of the present application provide an electronic device, where the electronic device includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect or the steps of the method according to the second aspect.
In a sixth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect, or the steps of the method according to the second aspect.
In a seventh aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect or the method according to the second aspect.
In this embodiment, the AR device may display an editing panel on which target content is displayed, receive a first input of a user, obtain the user's gesture information in the AR space in response to the first input, and edit the target content according to the gesture information. Therefore, when the AR device is applied to an AR conference scene, the editing panel can be displayed in the AR space and the target content on it can be edited by means of the user's gesture information, making it convenient for users to express their own opinions and meeting the need for efficient communication of content in an AR conference scene.
Drawings
FIG. 1 is a flow chart of an information processing method according to an embodiment of the present application;
FIG. 2 is a flow chart of another information processing method according to an embodiment of the present application;
fig. 3 is a flowchart of an information interaction process in an AR conference according to an embodiment of the present application;
FIG. 4 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present application;
FIG. 5 is a second schematic structural diagram of an information processing apparatus according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 7 is a second schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and not necessarily to describe a particular sequential or chronological order. It is to be understood that data so used are interchangeable under appropriate circumstances, so that the embodiments of the application can operate in sequences other than those illustrated or described herein. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects before and after it.
To facilitate understanding of the embodiments of the present application, the following is first explained.
In a real face-to-face meeting, participants often discuss a particular topic. In that case, text and drawings are often put on a panel such as a whiteboard or blackboard, so that participants can express their opinions more clearly and efficiently, and the content under discussion can be presented and recorded visually; the whole discussion is therefore more intuitive and efficient than spoken language alone. In an AR conference scene, each participant wears an AR device such as AR glasses and is in a different space, so it is inconvenient for participants to edit a real panel and thereby visually present and record the discussion. To bring this efficient panel-based mode of communication into the AR conference scene, the embodiments of the present application propose the concept of an editing panel: a virtual panel in the AR space (which may also be called a virtual whiteboard, etc.) whose displayed content is edited by means of the gesture information of a user (i.e., a participant) in the AR space, making it convenient for users to express their own opinions and meeting the need for efficient communication of content in an AR conference scene.
Understandably, the embodiments of the application are mainly applicable to AR conference scenes.
Optionally, an embodiment of the present application provides an information interaction system for an AR conference. The system includes at least the AR glasses worn by each participant. A Simultaneous Localization And Mapping (SLAM) system on the AR glasses (including at least a depth camera, an RGB camera, an Inertial Measurement Unit (IMU), and the like) can localize and 3D-map the space in which the participant is located and determine the placement position of the virtual editing panel in the AR space, while the optical engine display system of the AR glasses presents the virtual editing panel in front of every participant in real time. When a participant stretches out a hand, a gesture recognition and tracking system on the AR glasses (including at least a depth camera and a Time of Flight (ToF) ranging device) can detect and track the hand movement, recognize different gestures, and perform different edits on the content displayed on the editing panel based on the recognized gesture. The displayed content is thus edited in real time through gestures, similar to operating a physical panel in the real world.
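To make the division of labor among these subsystems concrete, here is a minimal per-frame sketch in Python; every class and function in it is a placeholder standing in for the SLAM, gesture-tracking, and optical display subsystems, not an API of any real AR glasses.

```python
# Hypothetical per-frame loop tying together the subsystems above: the
# SLAM system anchors the panel, the optical engine renders it, and the
# gesture system turns recognized hand motion into panel edits.

class SlamStub:
    def localize(self):                  # depth camera + RGB camera + IMU
        return (0.0, 1.5, 2.0)           # panel anchor pose in the room map

class GestureStub:
    def track_hand(self):                # depth camera + ToF ranging device
        return {"pos": (0.1, 1.5, 2.0)}
    def recognize(self, hand):
        return "write"                   # one of the recognized gestures

class DisplayStub:
    def render(self, panel, anchor):     # optical engine display system
        print(f"panel with {len(panel)} edits rendered at {anchor}")

def run_frame(slam, gestures, display, panel_edits):
    anchor = slam.localize()
    display.render(panel_edits, anchor)
    hand = gestures.track_hand()
    if hand is not None:                 # only edit when a hand is tracked
        panel_edits.append(gestures.recognize(hand))

edits = []
run_frame(SlamStub(), GestureStub(), DisplayStub(), edits)
```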
The information processing method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Referring to fig. 1, fig. 1 is a flowchart of an information processing method provided in an embodiment of the present application, where the method is applied to a first AR device, and as shown in fig. 1, the method includes the following steps:
step 101: and displaying an editing panel.
In this embodiment, target content is displayed on the editing panel. The editing panel is a virtual panel in the corresponding AR space and may also be referred to as a virtual whiteboard, etc. The target content is whatever content is currently needed. For example, in an AR conference scene, the target content is the content discussed in the current conference.
Alternatively, the first AR device may be AR glasses, an AR helmet, or the like worn by the user.
In one embodiment, the creation process of the editing panel may include: acquiring a panel creation instruction input by a user; and creating the editing panel in the AR space in response to the panel creation instruction. In this embodiment, the first AR device may pre-store panel rendering data, so that after acquiring a panel creation instruction input by the user it creates the editing panel from the pre-stored panel rendering data. The panel creation instruction may be a creation instruction determined based on the user's gesture information, or a voice instruction of the user.
In another embodiment, the creation process of the editing panel may include: acquiring a panel creation instruction input by a user; sending the panel creation instruction to the AR server; receiving panel rendering data sent by the AR server based on the panel creation instruction; and creating the editing panel in the AR space according to the panel rendering data. Again, the panel creation instruction may be a creation instruction determined based on the user's gesture information, a voice instruction of the user, or the like.
For example, taking AR glasses as an example: when a user stretches out a hand to drag or slide a pendant of the editing panel in the AR space, the gesture recognition system and the positioning and tracking system on the AR glasses (implemented, for instance, with multiple sensors such as a depth camera, an RGB camera, and a ToF sensor) can recognize the position information of the hand. When the position information of the hand coincides with that of the pendant, the hand is considered to have touched the pendant; the user's drag or slide input on the pendant is then recognized, and a panel creation instruction is obtained in response to that input, so that the editing panel is created from the pre-stored panel rendering data and displayed in the AR space. The AR space may be the meeting place of an AR conference; in this case, after one user (a participant) creates the editing panel, the AR glasses worn by that participant may send the panel creation instruction and/or panel rendering data to the AR glasses worn by the other participants, so that the optical engine display systems on those glasses can simultaneously present the editing panel in front of the corresponding participants, and every participant can share the editing panel during the conference.
For another example, again taking AR glasses: when a participant in an AR conference inputs a voice instruction such as "I want to use a panel for the meeting", the voice recognition system on that participant's AR glasses can recognize the voice instruction (i.e., a panel creation instruction) and transmit it to the AR server. The AR server can generate 3D rendering data of the editing panel based on the voice instruction and send the rendering data to the AR glasses worn by every participant in the AR conference, so that the optical engine display system on each pair of glasses presents the editing panel in front of the corresponding participant at the same time, and all participants can share the editing panel during the conference.
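As a minimal illustration of the two creation paths above, the following Python sketch handles a panel creation instruction either from pre-stored rendering data (first embodiment) or by requesting rendering data from the AR server (second embodiment). Every name in it (PRESTORED_PANEL_DATA, request_panel_data, render_panel, and the stub class) is a hypothetical placeholder, not an API of any real AR device or of this disclosure.

```python
# Minimal sketch of the two panel-creation embodiments described above.

PRESTORED_PANEL_DATA = {"size": (1.2, 0.8), "color": "white"}  # first embodiment

class ARServerStub:
    """Stands in for the AR server of the second embodiment."""
    def request_panel_data(self, instruction):
        # The real server would generate 3D rendering data for the panel.
        return {"size": (1.2, 0.8), "color": "white", "from": "server"}

def render_panel(rendering_data):
    # The real device would hand this to its optical engine display system.
    print(f"editing panel rendered in AR space: {rendering_data}")
    return rendering_data

def create_editing_panel(instruction, server=None):
    """Create the virtual editing panel from a creation instruction
    (gesture-derived or voice). With a server, fetch rendering data from
    it (second embodiment); otherwise use pre-stored data (first)."""
    if server is not None:
        rendering_data = server.request_panel_data(instruction)
    else:
        rendering_data = PRESTORED_PANEL_DATA
    return render_panel(rendering_data)

create_editing_panel("drag pendant")                            # first embodiment
create_editing_panel("I want to use a panel", ARServerStub())   # second embodiment
```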
Step 102: a first input is received from a user.
In this embodiment, the first input may be a gesture made by the user in the AR space and directed at the editing panel.
Step 103: and responding to the first input, and acquiring gesture information of the user in the AR space.
In one embodiment, taking AR glasses as an example, the gesture recognition system and the positioning and tracking system on the AR glasses (implemented, for instance, with multiple sensors such as a depth camera, an RGB camera, and a ToF sensor) can capture the user's gestures directed at the editing panel in the AR space and recognize the corresponding gesture information.
Step 104: and editing the target content according to the gesture information.
In this embodiment, different gesture information may correspond to different editing operations on the content displayed on the editing panel. To edit the displayed content according to gesture information, a correspondence between gesture information and editing operations may be stored in the first AR device in advance, so that after the user's gesture information in the AR space is acquired, the corresponding editing operation is performed on the displayed content according to that correspondence.
Alternatively, the editing operation for the target content displayed on the editing panel may include, but is not limited to, a content writing operation, a content deleting operation, a content storing operation, and the like.
For example, if the user's gesture in the AR space is extending a finger (e.g., the index or middle finger) to write or draw on the editing panel, the finger may touch the editing panel (i.e., the position information of the finger at least partially coincides with that of the editing panel) or may not touch it (e.g., the finger hovers above or below the panel); the corresponding editing operation may be writing content on the editing panel.
For another example, if the user's gesture in the AR space is extending a palm to perform an erasing motion on the editing panel, the palm may touch the editing panel (i.e., the position information of the palm at least partially coincides with that of the editing panel) or may not touch it (e.g., the palm hovers above or below the panel); the corresponding editing operation may be deleting part or all of the content displayed on the editing panel. The deleted content may depend on the panel area covered by the erase gesture: if the erase gesture covers the whole panel area, all content displayed on the panel is deleted; if it covers only part of the panel area, only the content displayed in that part is deleted.
For another example, if the user's gesture in the AR space is forming a rectangular "camera" frame with the thumbs and index fingers of both hands, the hands may touch the editing panel (i.e., the position information of the hands at least partially coincides with that of the editing panel) or may not touch it (e.g., the hands hover above or below the panel); the corresponding editing operation may be storing the content displayed on the editing panel.
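The pre-stored correspondence between gesture information and editing operations can be as simple as a lookup table. The sketch below dispatches the three example gestures above to write, erase, and store handlers; the gesture labels and handler names are assumptions for illustration, not part of the disclosure.

```python
# Hypothetical pre-stored gesture -> editing-operation table with dispatch.

def write_content(panel, stroke):
    panel["strokes"].append(stroke)                  # content writing operation

def erase_content(panel, region):
    # Delete only the strokes whose positions the erase gesture covers.
    panel["strokes"] = [s for s in panel["strokes"] if s["pos"] not in region]

def store_content(panel, _):
    panel["archive"].append(list(panel["strokes"]))  # content storing operation

GESTURE_ACTIONS = {
    "index_finger_write": write_content,   # extended finger writing
    "palm_erase":         erase_content,   # extended palm erasing
    "frame_camera":       store_content,   # thumbs + index fingers "camera"
}

def edit_target_content(panel, gesture, payload):
    """Look up the recognized gesture and apply the matching edit."""
    action = GESTURE_ACTIONS.get(gesture)
    if action is not None:
        action(panel, payload)

panel = {"strokes": [], "archive": []}
edit_target_content(panel, "index_finger_write", {"pos": (3, 4), "ink": "A"})
edit_target_content(panel, "frame_camera", None)
print(panel)
```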
In one embodiment, when storing the content displayed on the editing panel, the first AR device may store it locally, or send it to the AR server in the form of a document and/or picture, in which case the AR server archives the content in its database in that form.
In another embodiment, when the content displayed on the editing panel is sent to the AR server for storage, the first AR device may display an information-recipient list for the user to choose from, receive the user's selection from that list, and send indication information to the AR server based on the selection. The indication information instructs the AR server to send the stored content to the selected information receivers through a target server (e.g., a mailbox server and/or an instant messaging server), i.e., to the selected receivers' mailboxes and/or instant messaging software. The selected information receiver may be, for example, the user himself (whoever archives the content receives it), or all related users in the AR space.
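As an illustration of these storage paths, the sketch below serializes the panel content, stores it locally or archives it with the AR server, and attaches the indication information naming the selected recipients. The class and method names (ARServerStub, archive, forward) are assumptions made for this sketch only.

```python
# Hypothetical sketch of the storage paths above: local storage, or upload
# to the AR server plus an indication of the selected information receivers.
import json

class ARServerStub:
    def archive(self, document):
        print("archived in server database:", document)
    def forward(self, document, recipients):
        print("forward via mailbox/IM target server to:", recipients)

def store_panel_content(panel, server=None, recipients=None):
    document = json.dumps(panel["strokes"])          # document/picture form
    if server is None:
        panel["archive"].append(document)            # store locally
        return
    server.archive(document)                         # archive on the AR server
    if recipients:
        # Indication information: tells the AR server to forward the stored
        # content to the selected receivers through a target server.
        server.forward(document, recipients)

panel = {"strokes": [{"pos": [1, 2], "ink": "A"}], "archive": []}
store_panel_content(panel, ARServerStub(), recipients=["participant 2"])
```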
In the information processing method of this embodiment of the application, the first AR device may display an editing panel on which target content is displayed, receive a first input of a user, obtain the user's gesture information in the AR space in response to the first input, and edit the target content according to the gesture information. Therefore, when the first AR device is applied to an AR conference scene, the editing panel can be displayed in the AR space and the target content on it can be edited by means of the user's gesture information, making it convenient for users to express their own opinions and meeting the need for efficient communication of content in an AR conference scene.
Understandably, while in the AR space a user may move a hand, for instance stretching out a finger, without intending to provide input to the editing panel. To prevent such gestures from editing the displayed content, the above process of acquiring the user's gesture information in the AR space may be: acquiring the gesture information only when the user's hand at least partially overlaps the editing panel. By adding the restriction that the hand and the editing panel at least partially overlap, editing based on invalid gestures is avoided and the effectiveness of editing operations is improved.
In one embodiment, the user's hand at least partially overlapping the editing panel may be understood as the position information of the hand at least partially coinciding with the position information of the editing panel.
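The overlap condition can be pictured as a simple intersection test between the tracked hand volume and the panel's placement volume. Below is a minimal axis-aligned bounding-box version in Python; the coordinate convention and box format are assumptions of the sketch, not something specified by the disclosure.

```python
# Minimal sketch of the "hand at least partially coincides with the editing
# panel" check, using 3D axis-aligned bounding boxes in AR-space coordinates.

def boxes_overlap(a_min, a_max, b_min, b_max):
    """True if two 3D axis-aligned boxes share any volume."""
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(3))

def should_capture_gesture(hand_box, panel_box):
    # Only treat hand motion as panel input when the tracked hand volume
    # intersects the panel volume; otherwise the gesture is ignored.
    return boxes_overlap(*hand_box, *panel_box)

hand  = ((0.4, 0.9, 1.0), (0.6, 1.1, 1.2))   # tracked hand volume (min, max)
panel = ((0.0, 0.8, 1.1), (1.2, 1.6, 1.15))  # panel placement volume
print(should_capture_gesture(hand, panel))    # True: capture gesture info
```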
In this embodiment, the first AR device may establish a communication connection with a second AR device. The second AR device may be one or more AR devices in the AR space other than the first AR device. After the target content is edited, the method may further include: the first AR device sends the edited target content to the second AR device, so that the second AR device displays it on its corresponding editing panel. Through direct communication between the first and second AR devices, the content displayed on each device's editing panel can be conveniently updated in real time and kept synchronized, establishing a scene in which multiple people edit one editing panel simultaneously and improving the user experience.
Optionally, after the target content is edited, the method may further include: the first AR device sends the edited target content to the AR server, and the AR server sends it to the second AR device, so that the second AR device displays it on its corresponding editing panel. With the relay function of the AR server, the content displayed on each device's editing panel can likewise be updated synchronously in real time, constructing a scene in which multiple people edit one editing panel simultaneously and improving the user experience.
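A minimal sketch of both synchronization paths, with illustrative class names (ARDevice, ARServerStub) that are not part of the disclosure:

```python
# Hypothetical sketch of the two synchronization paths described above.

class ARDevice:
    def __init__(self, name):
        self.name = name
        self.panel = []                          # content on its editing panel
    def display(self, content):
        self.panel.append(content)
        print(f"{self.name} AR device panel shows: {self.panel}")

class ARServerStub:
    """Relay path: the server forwards an edit to every other device."""
    def __init__(self, devices):
        self.devices = devices
    def relay(self, edited, sender):
        for device in self.devices:
            if device is not sender:
                device.display(edited)

first, second = ARDevice("first"), ARDevice("second")

# Direct path: the first AR device sends straight to the second AR device.
second.display("edited target content")

# Server path: the first AR device uploads, the server relays to the rest.
ARServerStub([first, second]).relay("edited target content", sender=first)
```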
Optionally, when the first AR device has established a communication connection with the second AR device, the first AR device may further perform the following operations: receiving a second input of the user directed at first content within the target content displayed on the editing panel; and, in response to the second input, moving the first content and, if the first content at least partially coincides with a rendered image of the user wearing the second AR device, sending the first content to the to-do list of the second AR device. In this way, task allocation can be realized efficiently.
Note that the to-do list of the second AR device may be a to-do list bound to the information of the user wearing that device, representing that user's pending tasks. After the first content is sent to the second AR device's to-do list (i.e., stored in the corresponding to-do task list), the first content may still be displayed on the editing panel, and the to-do task list may be sent separately to the corresponding user. The second input may be a drag operation on the first content, i.e., dragging the first content in the AR space until it at least partially coincides with the rendered image of the user wearing the second AR device, or an operation of grabbing the first content and placing it onto that rendered image, or the like, as illustrated in the sketch below.
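A sketch of this task-allocation flow under assumed names and a simplified 2D overlap test (rects_overlap, on_drag_end, and the avatar rectangles are all hypothetical):

```python
# Hypothetical sketch: when dragged first content overlaps the rendered
# image (avatar) of a user wearing the second AR device, a copy is sent
# into that user's to-do list while the content stays on the panel.

def rects_overlap(a, b):
    """a, b: (x_min, y_min, x_max, y_max) in AR-space coordinates."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def on_drag_end(first_content, content_rect, avatars, todo_lists):
    """Called when the second input (the drag operation) finishes."""
    for user, avatar_rect in avatars.items():
        if rects_overlap(content_rect, avatar_rect):
            todo_lists.setdefault(user, []).append(first_content)
            print(f"sent to to-do list of {user}: {first_content}")

avatars = {"user of second AR device": (2.0, 0.0, 3.0, 1.8)}
todos = {}
on_drag_end("review section 2", (2.5, 1.0, 2.8, 1.2), avatars, todos)
```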
Optionally, the editing panel may be collapsed. The process of collapsing the editing panel may be: the first AR device acquires a panel retraction instruction input by the user and, in response to it, collapses the editing panel, e.g., folds it back into the corresponding pendant. The panel retraction instruction may be a retraction instruction determined based on the user's gesture information, a voice instruction of the user (e.g., "I want to retract the editing panel"), or the like.
Referring to fig. 2, fig. 2 is a flowchart of an information processing method provided in an embodiment of the present application, where the method is applied in an AR server, and as shown in fig. 2, the method includes the following steps:
step 201: and receiving second content sent by the first AR device.
In this embodiment, the second content is obtained by editing, by the first AR device, the display content on the editing panel corresponding to the first AR device according to the user gesture information.
Step 202: the second content is sent to the second AR device.
Optionally, after receiving the second content, the second AR device may display it on its corresponding editing panel. The second AR device may be one or more AR devices in the AR space other than the first AR device.
According to this information processing method, the relay function of the AR server allows the content displayed on each AR device's editing panel to be updated synchronously in real time, constructing a scene in which multiple people edit one editing panel simultaneously and improving the user experience.
Optionally, when the first AR device comprises multiple AR devices, sending the second content to the second AR device may proceed as follows: the AR server sorts the second contents received from the multiple AR devices into a third content, i.e., the unified result of those second contents as it should appear on an editing panel, and sends the third content to the second AR device for display on its corresponding editing panel. Through this content-consolidation process on the AR server, the content displayed on each device's editing panel can be updated efficiently in real time, as sketched below.
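A minimal sketch of this consolidation step, assuming edits arrive as timestamped tuples (the tuple format and all names are illustrative only):

```python
# Hypothetical server-side merge: second contents received from several
# first AR devices are ordered into one unified third content and broadcast.

def merge_second_contents(second_contents):
    """second_contents: list of (timestamp, device_id, edit) tuples.
    Returns the unified third content, ordered by edit time."""
    return [edit for _, _, edit in sorted(second_contents)]

class DeviceStub:
    def display(self, content):
        print("second AR device panel shows:", content)

def broadcast(third_content, second_devices):
    for device in second_devices:
        device.display(third_content)

received = [
    (2.0, "device-A", "circle the diagram"),
    (1.0, "device-B", "write the heading"),
]
broadcast(merge_second_contents(received), [DeviceStub()])
```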
Optionally, the AR server may further perform the following operations: receiving storage information sent by a first AR device, wherein the storage information at least comprises target content displayed on an editing panel corresponding to the first AR device; and sending the storage information to a target server so that the target server sends the storage information to a target user. In this way, the target user can conveniently acquire the target content displayed on the stored editing panel.
In one embodiment, the target content displayed on the editing panel corresponding to the first AR device may exist in the storage information in the form of a document and/or a picture. The target user may be the user wearing the first AR device (whoever archives the content collects it), or all related users in the AR space.
In another embodiment, the target server may be a mailbox server and/or an instant messaging server; in that case, the storage information may be sent to the target user's mailbox and/or instant messaging software, as sketched below.
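A sketch of this forwarding step with hypothetical channel names and handlers (send_mail and send_im are placeholders, not real mailbox or instant-messaging APIs):

```python
# Hypothetical sketch of the AR server handing stored panel content to
# target servers (mailbox and/or instant messaging) for delivery.

def send_mail(user, payload):
    print(f"mailbox server -> {user}: {payload}")

def send_im(user, payload):
    print(f"instant messaging server -> {user}: {payload}")

TARGET_SERVERS = {"mail": send_mail, "im": send_im}

def forward_storage_info(storage_info, target_user, channels=("mail", "im")):
    """The AR server hands the stored content (document and/or picture
    form) to each selected target server for delivery to the target user."""
    for channel in channels:
        TARGET_SERVERS[channel](target_user, storage_info)

forward_storage_info({"doc": "panel-content.pdf"}, "participant 1")
```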
The following describes an information interaction process in a specific example of the present application with reference to fig. 3 by taking a specific AR conference scene as an example.
In this specific example, as shown in fig. 3, the AR conference involves conference participant 1, conference participant 2, and conference participant 3, each located in a separate real space and each wearing AR glasses. Before the AR conference starts, the AR glasses worn by each participant may upload the participant's related information to the AR server; the AR server establishes the space of the AR conference (which may be called the AR space) and sends rendering data of the AR space to each participant's AR glasses. Each participant's AR glasses can then establish the AR space based on that rendering data and determine the placement position of the virtual editing panel within it.
After the AR conference starts, conference participant 1 (or equally participant 2 or participant 3) may create an editing panel by inputting a panel creation instruction (a creation instruction determined from gesture information as described above, or a voice instruction). The editing panel is then presented in front of every participant simultaneously, so that each participant can edit the content displayed on it by gesture (this content may be the content discussed in the current conference). The editing can happen in real time, with the edited content synchronized to the editing panel in front of every participant. For the specific editing operations, refer to the content above; they are not repeated here.
Further, by means of a participant's gesture information, the target content displayed on the editing panel (e.g., in document and/or picture form) can also be archived in the AR server's database. The AR server can also push the stored panel content together with participant information to a mailbox server and/or instant messaging server, which then delivers it to the mailboxes and/or instant messaging software of the selected participants; the selected participants may be some or all of the participants. In addition, the editing panel can be collapsed and the AR conference ended by means of any participant's gesture information.
In this way, from creating the editing panel, through real-time and synchronized editing of the displayed content, to archiving and forwarding that content and ending the AR conference, an efficient, convenient, and smooth AR conference can be presented around the editing panel. Archiving and real-time forwarding of the panel content save participants the time of compiling meeting minutes after the conference, bringing a more informative and more intelligent conference experience.
It should be noted that, in the information processing method provided in the embodiments of the present application, the execution subject may be an information processing apparatus, or a control module within the information processing apparatus for executing the information processing method. In the embodiments of the present application, the information processing method is described taking an information processing apparatus executing the method as an example.
Referring to fig. 4, fig. 4 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present application, where the apparatus is applied to a first AR device, and as shown in fig. 4, the information processing apparatus 40 includes:
a display module 41 for displaying an editing panel; target content is displayed on the editing panel;
a first receiving module 42, configured to receive a first input of a user;
an obtaining module 43, configured to obtain gesture information of the user in the AR space in response to the first input;
and the editing module 44 is configured to edit the target content according to the gesture information.
Optionally, the obtaining module 43 is specifically configured to:
acquiring the gesture information of the user in an AR space under the condition that the hand of the user is at least partially overlapped with the editing panel.
Optionally, the first AR device and the second AR device establish a communication connection, and the information processing apparatus 40 further includes:
and the first sending module is used for sending the edited target content to the second AR device, and the second AR device displays the edited target content on a corresponding editing panel.
Optionally, the first AR device and the second AR device establish a communication connection, and the information processing apparatus 40 further includes:
the second receiving module is used for receiving second input of the user aiming at the first content in the target content;
and the storage module is used for responding to the second input, moving the first content, and sending the first content to the to-do of the second AR device under the condition that the first content is at least partially overlapped with the rendering image of the user wearing the second AR device.
It can be understood that the information processing apparatus 40 provided in the embodiment of the present application can implement each process implemented in the method embodiment shown in fig. 1, and can achieve the same technical effect, and for avoiding repetition, details are not described here again.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present application, where the apparatus is applied to an AR server, and as shown in fig. 5, the information processing apparatus 50 includes:
a third receiving module 51, configured to receive the second content sent by the first AR device; the second content is obtained by editing display content on an editing panel corresponding to the first AR equipment by the first AR equipment according to the user gesture information;
a second sending module 52, configured to send the second content to a second AR device, and the second AR device displays the second content on its corresponding editing panel.
Optionally, the information processing apparatus 50 further includes:
a fourth receiving module, configured to receive storage information sent by the first AR device; the storage information comprises target content displayed on an editing panel corresponding to the first AR device;
and the third sending module is used for sending the storage information to a target server, and the target server sends the storage information to a target user.
It can be understood that the information processing apparatus 50 provided in the embodiment of the present application can implement each process implemented in the method embodiment shown in fig. 2, and can achieve the same technical effect, and for avoiding repetition, details are not described here again.
The information processing apparatus in the embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a kiosk, and the like, and the embodiments of the present application are not particularly limited.
The information processing apparatus in the embodiments of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; the embodiments of the present application are not specifically limited in this respect.
Optionally, as shown in fig. 6, an embodiment of the present application further provides an electronic device 60, including a processor 61, a memory 62, and a program or instructions stored in the memory 62 and executable on the processor 61, where the components of the electronic device 60 are coupled together through a bus interface 63. When the program or instructions are executed by the processor 61, each process of the method embodiment shown in fig. 1 or fig. 2 is implemented, achieving the same technical effect; this is not repeated here. The electronic device 60 may be the first AR device or the AR server described above.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 7 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, and a processor 710.
Those skilled in the art will appreciate that the electronic device 700 may also include a power supply (e.g., a battery) for powering the various components; the power supply may be logically coupled to the processor 710 via a power management system, so that charging, discharging, and power-consumption management are performed through it. The electronic device structure shown in fig. 7 does not constitute a limitation of the electronic device: the electronic device may include more or fewer components than shown, combine some components, or arrange the components differently; details are omitted here.
Optionally, when the electronic device 700 is the first AR device, the display unit 706 is configured to display an editing panel; target content is displayed on the editing panel;
a user input unit 707 for receiving a first input by a user;
a processor 710 for obtaining gesture information of a user in an AR space in response to the first input; and editing the target content according to the gesture information.
Therefore, when the electronic device 700 is applied to an AR conference scene, the editing panel can be displayed in the AR space and the target content on it can be edited by means of the user's gesture information in the AR space, making it convenient for users to express their own opinions and meeting the need for efficient communication of content in an AR conference scene.
Further, the processor 710 is further configured to: acquiring the gesture information of the user in an AR space under the condition that the hand of the user is at least partially overlapped with the editing panel.
Further, the radio frequency unit 701 is configured to send the edited target content to the second AR device, so as to display the edited target content on an editing panel corresponding to the second AR device. The electronic device 700 establishes a communication connection with a second AR device.
Further, the user input unit 707 is further configured to receive a second input of the user for the first content in the target content;
the processor 710 is further configured to move the first content in response to the second input, and send the first content to a to-do of a second AR device if the first content at least partially coincides with a rendered image of a user wearing the second AR device.
Optionally, when the electronic device 700 is the above-mentioned AR server, the radio frequency unit 701 is configured to receive second content sent by the first AR device, where the second content is obtained by the first AR device editing the content displayed on its corresponding editing panel according to the user's gesture information, and to send the second content to a second AR device so that it is displayed on the editing panel corresponding to the second AR device.
Further, the radio frequency unit 701 is further configured to receive storage information sent by the first AR device; the storage information comprises target content displayed on an editing panel corresponding to the first AR device; and sending the storage information to a target server, and sending the storage information to a target user by the target server.
In this way, with the relay function of the AR server, the content displayed on each AR device's editing panel can be updated synchronously in real time, constructing a scene in which multiple people edit one editing panel simultaneously and improving the user experience.
An embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the method embodiment shown in fig. 1 or fig. 2, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
An embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the method embodiment shown in fig. 1 or fig. 2, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; the functions may be performed in a substantially simultaneous manner or in reverse order depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (14)

1. An information processing method applied to a first Augmented Reality (AR) device is characterized by comprising the following steps:
displaying an editing panel; wherein the editing panel is displayed with target content;
receiving a first input of a user;
responding to the first input, and acquiring gesture information of a user in an AR space;
and editing the target content according to the gesture information.
2. The method of claim 1, wherein the obtaining gesture information of the user in the AR space comprises:
acquiring the gesture information of the user in an AR space under the condition that the hand of the user is at least partially overlapped with the editing panel.
3. The method of claim 1, wherein the first AR device establishes a communication connection with a second AR device, and wherein after editing the target content according to the gesture information, the method further comprises:
and sending the edited target content to the second AR device, and displaying the edited target content on a corresponding editing panel of the second AR device.
4. The method of claim 1, wherein the first AR device establishes a communication connection with a second AR device, the method further comprising:
receiving a second input of a user for the first content in the target content;
in response to the second input, moving the first content, and sending the first content into a to-do of the second AR device if the first content at least partially coincides with a rendered image of a user wearing the second AR device.
5. An information processing method applied to an AR server, the method comprising:
receiving second content sent by the first AR device; the second content is obtained by editing display content on an editing panel corresponding to the first AR device by the first AR device according to the user gesture information;
and sending the second content to a second AR device, and displaying the second content on a corresponding editing panel of the second AR device.
6. The method of claim 5, further comprising:
receiving storage information sent by the first AR device; wherein the storage information includes target content displayed on an editing panel corresponding to the first AR device;
and sending the storage information to a target server, and sending the storage information to a target user by the target server.
7. An information processing apparatus applied to a first AR device, the apparatus comprising:
the display module is used for displaying the editing panel; wherein the editing panel is displayed with target content;
the first receiving module is used for receiving a first input of a user;
the acquisition module is used for responding to the first input and acquiring gesture information of a user in an AR space;
and the editing module is used for editing the target content according to the gesture information.
8. The apparatus of claim 7, wherein the obtaining module is specifically configured to:
acquiring the gesture information of the user in an AR space under the condition that the hand of the user is at least partially overlapped with the editing panel.
9. The apparatus of claim 7, wherein the first AR device establishes a communication connection with a second AR device, the apparatus further comprising:
and the first sending module is used for sending the edited target content to the second AR device, and the second AR device displays the edited target content on a corresponding editing panel.
10. The apparatus of claim 7, wherein the first AR device establishes a communication connection with a second AR device, the apparatus further comprising:
the second receiving module is used for receiving second input of the user aiming at the first content in the target content;
and the storage module is used for responding to the second input, moving the first content, and sending the first content to the to-do of the second AR device under the condition that the first content is at least partially overlapped with the rendering image of the user wearing the second AR device.
11. An information processing apparatus applied to an AR server, the apparatus comprising:
a third receiving module, configured to receive second content sent by the first AR device; the second content is obtained by editing display content on an editing panel corresponding to the first AR device by the first AR device according to the user gesture information;
and the second sending module is used for sending the second content to a second AR device, and the second content is displayed on a corresponding editing panel of the second AR device.
12. The apparatus of claim 11, further comprising:
a fourth receiving module, configured to receive storage information sent by the first AR device; wherein the storage information includes target content displayed on an editing panel corresponding to the first AR device;
and the third sending module is used for sending the storage information to a target server, and the target server sends the storage information to a target user.
13. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions, when executed by the processor, implementing the steps of the information processing method according to any one of claims 1 to 4, or the steps of the information processing method according to claim 5 or 6.
14. A readable storage medium, characterized in that a program or instructions are stored thereon, which when executed by a processor implement the steps of the information processing method according to any one of claims 1 to 4, or the steps of the information processing method according to claim 5 or 6.
CN202010381695.8A 2020-05-08 2020-05-08 Information processing method and device and electronic equipment Pending CN111580655A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010381695.8A CN111580655A (en) 2020-05-08 2020-05-08 Information processing method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010381695.8A CN111580655A (en) 2020-05-08 2020-05-08 Information processing method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN111580655A 2020-08-25

Family

ID=72112202

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010381695.8A Pending CN111580655A (en) 2020-05-08 2020-05-08 Information processing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111580655A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106155311A (en) * 2016-06-28 2016-11-23 努比亚技术有限公司 AR helmet, AR interactive system and the exchange method of AR scene
CN108346179A (en) * 2018-02-11 2018-07-31 北京小米移动软件有限公司 AR equipment display methods and device
US20190310757A1 (en) * 2018-04-09 2019-10-10 Spatial Systems Inc. Augmented reality computing environments - mobile device join and load
CN110573992A (en) * 2017-04-27 2019-12-13 西门子股份公司 Editing augmented reality experiences using augmented reality and virtual reality

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106155311A (en) * 2016-06-28 2016-11-23 努比亚技术有限公司 AR helmet, AR interactive system and the exchange method of AR scene
CN110573992A (en) * 2017-04-27 2019-12-13 西门子股份公司 Editing augmented reality experiences using augmented reality and virtual reality
CN108346179A (en) * 2018-02-11 2018-07-31 北京小米移动软件有限公司 AR equipment display methods and device
US20190310757A1 (en) * 2018-04-09 2019-10-10 Spatial Systems Inc. Augmented reality computing environments - mobile device join and load

Similar Documents

Publication Publication Date Title
US10754490B2 (en) User interface for collaborative efforts
EP2770729B1 (en) Apparatus and method for synthesizing an image in a portable terminal equipped with a dual camera
CN106775334B (en) File calling method and device on mobile terminal and mobile terminal
US20130198653A1 (en) Method of displaying input during a collaboration session and interactive board employing same
US9465507B2 (en) Techniques to facilitate asynchronous communication
KR102042461B1 (en) Mobile terminal and method for controlling of the same
US9324305B2 (en) Method of synthesizing images photographed by portable terminal, machine-readable storage medium, and portable terminal
CN108182016A (en) Mobile terminal and its control method
EP2919121A1 (en) Video information terminal and video display system
KR20150033308A (en) Mobile terminal and controlling method thereof
CN108351880A (en) Image processing method, device, electronic equipment and graphic user interface
EP3183644A1 (en) Digital media message generation
CN109669658A (en) A kind of display methods, device and display system
CN113918522A (en) File generation method and device and electronic equipment
CN110489031A (en) Content display method and terminal device
WO2023155858A1 (en) Document editing method and apparatus
CN111580655A (en) Information processing method and device and electronic equipment
CN115408763A (en) BIM platform-based component generation method
CN114374761A (en) Information interaction method and device, electronic equipment and medium
CN114584704A (en) Shooting method and device and electronic equipment
CN113360060A (en) Task implementation method and device and electronic equipment
CN115543176A (en) Information processing method and device and electronic equipment
CN117729415A (en) Image processing method, device, electronic equipment and medium
CN116319129A (en) Virtual conference processing method and device and electronic equipment
TW202403627A (en) Systems and methods for managing digital notes for collaboration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200825