WO2020253800A1 - Identity recognition method for simulation object, related apparatus, and system

仿真对象的身份识别方法、相关装置及系统 (Identity recognition method for simulation object, related apparatus, and system)

Info

Publication number
WO2020253800A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
simulation object
simulation
legal
simulated
Prior art date
Application number
PCT/CN2020/096933
Other languages
English (en)
French (fr)
Inventor
潘凌寒
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to EP20827624.6A (EP3964985A4)
Priority to US17/620,424 (US11887261B2)
Publication of WO2020253800A1

Classifications

    • G - PHYSICS
        • G02 - OPTICS
            • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
                • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
                    • G02B 27/01 - Head-up displays
                        • G02B 27/017 - Head mounted
        • G06 - COMPUTING; CALCULATING OR COUNTING
            • G06F - ELECTRIC DIGITAL DATA PROCESSING
                • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
                            • G06F 3/013 - Eye tracking input arrangements
                        • G06F 3/016 - Input arrangements with force or tactile feedback as computer generated output to the user
                        • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
                • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
                    • G06F 21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
                        • G06F 21/31 - User authentication
                            • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
                            • G06F 21/34 - User authentication involving the use of external additional devices, e.g. dongles or smart cards
                    • G06F 21/60 - Protecting data
                        • G06F 21/62 - Protecting access to data via a platform, e.g. using keys or access control rules
                            • G06F 21/6218 - Protecting access to data via a platform, to a system of files or objects, e.g. local or distributed file system or database
                                • G06F 21/6245 - Protecting personal data, e.g. for financial or medical purposes
                        • G06F 21/64 - Protecting data integrity, e.g. using checksums, certificates or signatures
                • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                    • G06T 19/00 - Manipulating 3D models or images for computer graphics
                        • G06T 19/006 - Mixed reality

Definitions

  • This application relates to computer graphics technology and identity recognition technology, and in particular to methods, related devices and systems for identifying simulated objects.
  • VR (virtual reality), AR (augmented reality), MR (mixed reality)
  • VR uses computer simulation to generate a three-dimensional virtual world, providing users with visual and other sensory simulation effects, making users feel as if they are in the environment, and allowing users to interact with the simulated virtual world.
  • AR is a technology that integrates real-world information with virtual-world information. Physical information that is difficult to experience within a certain time and space of the real world, such as visual information, sound, taste, and touch, is simulated by computer technology and superimposed on the real world, so that the user obtains a sensory experience beyond reality.
  • MR mixes the real world and the virtual world to produce a new visual environment.
  • the environment contains both real world information and virtual world information, and the real world and virtual world can interact in real time.
  • In VR/AR/MR scenarios, the virtual images may be generated by other devices such as mobile phones or servers.
  • The virtual image may be replaced while it is being sent from the mobile phone or server to the electronic device, so the virtual image finally presented by the electronic device may not be safe or legal. Therefore, a technical solution is needed to identify whether the virtual image presented by the electronic device is safe and legal, so as to avoid possible security risks.
  • This application provides a method, related devices and systems for identifying the identity of simulated objects, which can present a secure virtual world to users, thereby protecting user privacy and improving the safety of electronic equipment.
  • In a first aspect, an embodiment of the present application provides a method for identifying the identity of a simulation object, and the method is applied to an electronic device.
  • The method may include: the electronic device displays a simulation object through a display device, where the simulation object is a virtual image generated by computer graphics technology; the electronic device determines whether the simulation object is legal; and, in response to a result of determining whether the simulation object is legal, the electronic device outputs first prompt information, where the first prompt information is used to indicate whether the simulation object is legal. A simulation object registered with the registration device corresponding to the electronic device is a legal simulation object, and a simulation object that has not been registered with the registration device corresponding to the electronic device is an illegal simulation object.
  • In this way, the electronic device can prompt the user whether the currently displayed simulation object is a legal simulation object.
  • If the simulation object is illegal, the user's alertness can be improved, the user can be prevented from divulging personal information under the inducement of the illegal simulation object, and the safety of user operations can be ensured, thereby protecting user privacy and improving the use security of the electronic device.
  • the registration device will perform a registration operation for the simulation object generated by the device trusted by the electronic device. That is to say, the simulation object generated by the device trusted by the electronic device is legal, and the simulation object generated by the device not trusted by the electronic device is illegal.
  • the devices trusted by the electronic device may include: 1. The electronic device itself. For example, VR/AR/MR glasses, etc. 2. Other devices that the electronic device is connected to. For example, the mobile phone to which the electronic device is connected. 3. The server provided by the manufacturer of the electronic device. 4. The server provided by the developer of the VR/AR/MR application installed on the electronic device.
  • the display device of the electronic device may have the following two implementation modes:
  • Mode 1: the display device is a display screen, and the electronic device displays the simulation object on the display screen.
  • Mode 2: the display device includes an optical device, and the electronic device projects an optical signal corresponding to the simulation object through the optical device; the user's retina receives the optical signal, and the user thus sees the simulation object.
  • the simulation object displayed by the electronic device is generated by the electronic device, or the simulation object is generated by the computing device and sent to the electronic device.
  • the first prompt information output by the electronic device may include one or more of the following: visual elements displayed by the display device, voice, indicator light feedback, or vibration feedback.
  • For example, when using visual elements displayed by the display device: if the displayed simulation object is legal, the electronic device can display the text "legal" on the simulation object; if the displayed simulation object is illegal, the electronic device can display the text "illegal" on the simulation object.
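  • As an illustration of how the first prompt information could combine these modalities, a short sketch is given below. It is only a sketch: the PromptRenderer class and the show_label, play_voice, and vibrate calls are hypothetical stand-ins for whatever display, audio, and vibration interfaces the electronic device actually exposes, and are not part of this application.

```python
# Hypothetical sketch: maps the legality result onto the prompt modalities listed above
# (visual element, voice, vibration). All wrapper objects are assumed, not real APIs.

class PromptRenderer:
    def __init__(self, display, audio, motor):
        self.display = display   # assumed wrapper around the display device 180
        self.audio = audio       # assumed wrapper around the speaker
        self.motor = motor       # assumed wrapper around a vibration motor

    def output_first_prompt(self, simulation_object, is_legal: bool) -> None:
        """Indicate to the user whether the displayed simulation object is legal."""
        label = "legal" if is_legal else "illegal"
        # Visual element: overlay a text label on the simulation object.
        self.display.show_label(simulation_object, text=label)
        # Voice prompt.
        self.audio.play_voice(f"This simulation object is {label}.")
        # Vibration feedback only in the risky case, to raise the user's alertness.
        if not is_legal:
            self.motor.vibrate(duration_ms=300)
```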
  • the electronic device can perform the operation of determining whether the displayed simulation object is legal in any of the following situations:
  • For example, the electronic device detects a first operation acting on the simulation object and, in response to the first operation, determines whether the simulation object is legal.
  • The first operation acting on the simulation object may include any one of the following: a gesture input acting on the simulation object, a blinking operation acting on the simulation object, a gaze operation acting on the simulation object (for example, a gaze operation exceeding a preset duration), or a voice command used to determine whether the simulation object is legal (for example, the voice command "authentication").
  • the user can independently decide whether to trigger the electronic device to authenticate the simulated object.
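  • The trigger logic for the first operation can be pictured with the sketch below. The event classes and the 2-second gaze threshold are assumptions introduced only for illustration; the application merely requires that a gesture, a blink, a sufficiently long gaze, or a voice command such as "authentication" starts the legality check.

```python
# Hypothetical sketch of deciding whether a detected input counts as the "first operation"
# that triggers authentication of the simulation object. Event types and the gaze
# threshold are assumptions for illustration.

from dataclasses import dataclass

GAZE_THRESHOLD_S = 2.0  # assumed preset duration for a gaze operation


@dataclass
class GestureEvent:
    target_id: str          # identifier of the simulation object the gesture acts on


@dataclass
class BlinkEvent:
    target_id: str


@dataclass
class GazeEvent:
    target_id: str
    duration_s: float       # how long the user has gazed at the object


@dataclass
class VoiceEvent:
    target_id: str
    command: str            # recognized voice command text


def is_first_operation(event) -> bool:
    """Return True if this input should trigger the legality determination."""
    if isinstance(event, (GestureEvent, BlinkEvent)):
        return True
    if isinstance(event, GazeEvent):
        return event.duration_s >= GAZE_THRESHOLD_S
    if isinstance(event, VoiceEvent):
        return event.command.strip().lower() == "authentication"
    return False
```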
  • When the electronic device is determining whether the simulation object is legal, it may also output second prompt information, where the second prompt information is used to indicate that the electronic device is determining whether the simulation object is legal.
  • the second prompt information includes one or more of the following: visual elements, voice, indicator light feedback, or vibration feedback displayed by the display device.
  • If the simulation object is determined to be illegal, the electronic device can actively perform one or more of the following operations on the simulation object:
  • the electronic device stops displaying the simulation object.
  • the electronic device reports the simulation object. Specifically, the electronic device may send the identification of the simulation object to the authentication server, so that the authentication server marks the simulation object as an illegal simulation object.
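  • A minimal sketch of these two handling actions (stopping display and reporting) is given below. The scene handle, the endpoint URL, and the use of the requests library are assumptions made for illustration; the application does not prescribe a particular transport or API.

```python
# Hypothetical sketch: handle a simulation object determined to be illegal by removing it
# from the rendered scene and reporting its identifier to the authentication server,
# which can then mark it as an illegal simulation object.

import requests

REPORT_URL = "https://auth.example.com/report"  # hypothetical authentication-server endpoint


def handle_illegal_object(scene, object_id: str) -> None:
    # 1. Stop displaying the simulation object ('scene' is an assumed handle to the
    #    VR/AR/MR scene rendered by the electronic device).
    scene.remove(object_id)
    # 2. Report the simulation object by sending its identifier to the authentication server.
    requests.post(REPORT_URL, json={"object_id": object_id}, timeout=5)
```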
  • the electronic device can perform one or more of the following operations on the simulated object under the trigger of the user:
  • the electronic device detects a second operation acting on the simulation object, and in response to the second operation, the electronic device stops displaying the simulation object.
  • The second operation may include any one of the following: a gesture input acting on the simulation object, a blinking operation acting on the simulation object, a gaze operation acting on the simulation object, or a voice command used to determine whether the simulation object is legal.
  • the electronic device detects a third operation that acts on the simulation object, and in response to the third operation, the electronic device reports the simulation object.
  • the specific operation of the electronic device reporting the simulation object can refer to the related description of the previous embodiment.
  • The third operation may include any one of the following: a gesture input acting on the simulation object, a blinking operation acting on the simulation object, a gaze operation acting on the simulation object, or a voice command used to determine whether the simulation object is legal.
  • The electronic device may also display a physical object and execute: outputting third prompt information, where the third prompt information is used to indicate that the simulation object is a virtual image generated by computer graphics technology; and/or outputting fourth prompt information, where the fourth prompt information is used to indicate that the object corresponding to the physical object exists in the real world.
  • the user can be prompted which of the images currently displayed by the electronic device are physical objects and which are simulated objects.
  • the implementation manners of the third prompt information and the fourth prompt information are similar to the above-mentioned implementation manners of the first prompt information and the second prompt information, and you may refer to related descriptions.
  • The electronic device can display the physical object in either of the following ways: 1. The display device of the electronic device is a display screen, the physical object is displayed on the display screen, and the physical object is captured by the camera of the electronic device. 2. The display device of the electronic device includes a transparent lens, through which the user can directly see the physical object in the real world.
  • The process for the electronic device to determine whether the simulation object is legal may specifically include: the electronic device sends the identifier of the simulation object to the authentication server, so that the authentication server determines whether the simulation object is legal; the electronic device receives the authentication result returned by the authentication server, where the authentication result indicates whether the simulation object is legal; and the electronic device determines whether the simulation object is legal according to the authentication result.
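  • The determination flow just described can be summarized in the sketch below. The endpoint URL, the JSON field names, and the requests calls are assumptions; the application only specifies that the identifier of the simulation object is sent to the authentication server and that the returned authentication result indicates whether the object is legal.

```python
# Hypothetical sketch of the device-side determination flow: send the simulation object's
# identifier to the authentication server and interpret the returned authentication result.

import requests

AUTH_URL = "https://auth.example.com/authenticate"  # hypothetical endpoint


def determine_legality(object_id: str) -> bool:
    """Ask the authentication server whether the simulation object is legal."""
    resp = requests.post(AUTH_URL, json={"object_id": object_id}, timeout=5)
    resp.raise_for_status()
    result = resp.json()              # e.g. {"object_id": "...", "legal": true}
    return bool(result.get("legal", False))


# Illustrative usage: the result then drives the first prompt information.
# is_legal = determine_legality(displayed_object_id)
# prompt_renderer.output_first_prompt(displayed_object, is_legal)
```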
  • the simulated objects displayed by the electronic device include one or more of the following: simulated characters, simulated animals, simulated trees, or simulated buildings.
  • In a second aspect, an embodiment of the present application provides a method for identifying the identity of a simulation object, which is applied to a head-mounted device.
  • The method may include: the head-mounted device displays a simulation object and a physical object through a display device, where the simulation object is a virtual image generated using computer graphics technology; the head-mounted device detects a first operation acting on the simulation object and, in response to the first operation, determines whether the simulation object is legal; the head-mounted device outputs second prompt information, where the second prompt information is used to indicate that the head-mounted device is determining whether the simulation object is legal; in response to a result of determining whether the simulation object is legal, the head-mounted device outputs first prompt information, where the first prompt information is used to indicate whether the simulation object is legal; and, if the simulation object is an illegal simulation object, the head-mounted device detects a second operation acting on the simulation object and, in response to the second operation, stops displaying the simulation object. A simulation object registered with the registration device corresponding to the head-mounted device is a legal simulation object, and a simulation object that has not been registered with the registration device corresponding to the head-mounted device is an illegal simulation object.
  • the head-mounted device may include, but is not limited to: AR/MR glasses, AR/MR head-mounted display device, AR/MR all-in-one machine, etc.
  • Since the head-mounted device displays simulation objects and physical objects through the display device, the head-mounted device can provide an AR/MR display scene.
  • the head-mounted device can prompt the user whether the currently displayed simulation object is a legal simulation object when providing an AR/MR display scene.
  • If the simulation object is illegal, this can increase the user's alertness, prevent the user from leaking personal information under the inducement of the illegal simulation object in the AR/MR scene, and ensure the safety of user operations, thereby protecting user privacy and improving the use security of the electronic device.
  • The registration device performs a registration operation for the simulation object generated by a device trusted by the head-mounted device. That is to say, a simulation object generated by a device trusted by the head-mounted device is legal, and a simulation object generated by a device not trusted by the head-mounted device is illegal.
  • For the devices trusted by the head-mounted device, refer to the related description in the first aspect.
  • In a third aspect, the embodiments of the present application provide a graphical user interface on an electronic device.
  • The electronic device includes a display device, a memory, and one or more processors, where the one or more processors are configured to execute one or more computer programs stored in the memory. The graphical user interface includes: displaying a simulation object, where the simulation object is a virtual image generated by using computer graphics technology; and, in response to a result of determining whether the simulation object is legal, outputting first prompt information, where the first prompt information is used to indicate whether the simulation object is legal. A simulation object registered with the registration device corresponding to the electronic device is a legal simulation object, and a simulation object that has not been registered with the registration device corresponding to the electronic device is an illegal simulation object.
  • the display device of the electronic device may have the following two implementation modes:
  • Mode 1: the display device is a display screen. In this case, the graphical user interface specifically includes: displaying the simulation object on the display screen.
  • Mode 2: the display device includes an optical device. In this case, the graphical user interface specifically includes: projecting an optical signal corresponding to the simulation object through the optical device.
  • the displayed simulation object is generated by the electronic device, or the simulation object is generated by the computing device and then sent to the electronic device.
  • the first prompt information includes one or more of the following: visual elements displayed by the display device, voice, indicator light feedback, or vibration feedback.
  • the graphical user interface further includes: before outputting the first prompt information, outputting second prompt information, where the second prompt information is used to indicate that the electronic device is determining whether the simulation object is legitimate.
  • the second prompt information includes one or more of the following: visual elements, voice, indicator light feedback, or vibration feedback displayed by the display device.
  • the graphical user interface further includes: in response to a determination result that the simulation object is illegal, stopping displaying the simulation object.
  • The graphical user interface further includes: if the simulation object is an illegal simulation object, detecting a second operation acting on the simulation object, and, in response to the second operation, stopping displaying the simulation object.
  • the second operation can refer to the related description in the method of the second aspect.
  • The graphical user interface further includes: displaying a physical object; outputting third prompt information, where the third prompt information is used to indicate that the simulation object is a virtual image generated using computer graphics technology; and/or outputting fourth prompt information, where the fourth prompt information is used to indicate that the object corresponding to the physical object exists in the real world.
  • the third prompt information and the fourth prompt information can refer to related descriptions in the method of the second aspect.
  • In a fourth aspect, the embodiments of the present application provide a graphical user interface on a head-mounted device.
  • The head-mounted device includes a display device, a memory, and one or more processors, where the one or more processors are configured to execute one or more computer programs stored in the memory. The graphical user interface includes: displaying a physical object and a simulation object, where the simulation object is a virtual image generated by computer graphics technology; detecting a first operation acting on the simulation object and, in response to the first operation, determining whether the simulation object is legal;
  • outputting second prompt information, where the second prompt information is used to indicate that the head-mounted device is determining whether the simulation object is legal;
  • in response to a result of determining whether the simulation object is legal, outputting first prompt information, where the first prompt information is used to indicate whether the simulation object is legal; and, if the simulation object is an illegal simulation object, detecting a second operation acting on the simulation object and, in response to the second operation, stopping displaying the simulation object;
  • where a simulation object registered with the registration device corresponding to the head-mounted device is a legal simulation object, and a simulation object that has not been registered with the registration device corresponding to the head-mounted device is an illegal simulation object; the simulation object includes a simulated character or a simulated animal.
  • For the graphical user interface of the fourth aspect and its beneficial effects, refer to the first aspect, the possible implementations of the first aspect, and the beneficial effects they bring.
  • For the implementation of the graphical user interface, refer to the implementation of the first aspect and each possible method of the first aspect; details are not repeated here.
  • In a fifth aspect, an embodiment of the present application provides an electronic device, which is configured to execute the method described in the first aspect.
  • The electronic device includes: one or more processors, a memory, and a display device. The memory is coupled with the one or more processors and is used to store computer program code, where the computer program code includes computer instructions. The one or more processors invoke the computer instructions to cause the electronic device to execute: displaying a simulation object through the display device, where the simulation object is a virtual image generated by computer graphics technology; determining whether the simulation object is legal; and, in response to a result of determining whether the simulation object is legal, outputting first prompt information, where the first prompt information is used to indicate whether the simulation object is legal. A simulation object registered with the registration device corresponding to the electronic device is a legal simulation object, and a simulation object that has not been registered with the registration device corresponding to the electronic device is an illegal simulation object.
  • the registration device will perform a registration operation for the simulation object generated by the device trusted by the electronic device. That is to say, the simulation object generated by the device trusted by the electronic device is legal, and the simulation object generated by the device not trusted by the electronic device is illegal.
  • For the devices trusted by the electronic device, refer to the related description in the first aspect.
  • the electronic device of the fifth aspect can be used to execute the method in the first aspect or any one of the possible implementations of the first aspect. Therefore, for the operations performed by the electronic device of the fifth aspect and the beneficial effects brought by the electronic device, reference may be made to the related description in the first aspect or any one of the possible implementation manners of the first aspect, which will not be repeated here.
  • In a sixth aspect, an embodiment of the present application provides a head-mounted device, which is configured to execute the method described in the second aspect.
  • The head-mounted device includes: one or more processors, a memory, and a display device. The memory is coupled with the one or more processors and is used to store computer program code, where the computer program code includes computer instructions. The one or more processors invoke the computer instructions to cause the head-mounted device to execute: displaying a simulation object and a physical object through the display device, where the simulation object is a virtual image generated by computer graphics technology; determining whether the simulation object is legal; and, in response to a result of determining whether the simulation object is legal, outputting first prompt information, where the first prompt information is used to indicate whether the simulation object is legal. A simulation object registered with the registration device corresponding to the head-mounted device is a legal simulation object, and a simulation object that has not been registered with the registration device corresponding to the head-mounted device is an illegal simulation object.
  • The registration device performs a registration operation for the simulation object generated by a device trusted by the head-mounted device. That is to say, a simulation object generated by a device trusted by the head-mounted device is legal, and a simulation object generated by a device not trusted by the head-mounted device is illegal.
  • For the devices trusted by the head-mounted device, refer to the related description in the second aspect.
  • The head-mounted device of the sixth aspect can be used to execute the method in the first aspect or any one of the possible implementations of the first aspect. Therefore, for the operations performed by the head-mounted device of the sixth aspect and the beneficial effects brought by the head-mounted device, refer to the related description in the first aspect or any one of the possible implementations of the first aspect; details are not repeated here.
  • In a seventh aspect, an embodiment of the present application provides an identity recognition system for a simulation object.
  • the system includes an electronic device, a registration device, and an authentication server.
  • the registration device is used to register the simulation object;
  • the electronic device is used to display the simulation object through the display device;
  • The electronic device is the electronic device described in the fifth aspect or any one of the possible implementation manners of the fifth aspect.
  • The authentication server is used to determine whether the simulation object displayed by the electronic device is legal, where a simulation object registered with the registration device is a legal simulation object, and a simulation object that has not been registered with the registration device is an illegal simulation object.
  • In an eighth aspect, the embodiments of the present application provide an identity recognition system for a simulation object, the system including: a head-mounted device, a registration device, and an authentication server.
  • the registration device is used to register the simulated object;
  • the head-mounted device is used to display the simulated object through the display device;
  • The head-mounted device is the head-mounted device described in the sixth aspect or any one of the possible implementation manners of the sixth aspect.
  • The authentication server is used to determine whether the simulation object displayed by the head-mounted device is legal, where a simulation object registered with the registration device is a legal simulation object, and a simulation object that has not been registered with the registration device is an illegal simulation object.
  • An embodiment of the present application provides a computer program product containing instructions. When the computer program product runs on an electronic device, the electronic device is caused to execute the method described in the first aspect or any possible implementation of the first aspect.
  • An embodiment of the present application provides a computer-readable storage medium, including instructions, where, when the instructions run on an electronic device, the electronic device is caused to execute the method described in the first aspect or any possible implementation of the first aspect.
  • An embodiment of the present application provides a computer program product containing instructions. When the computer program product runs on an electronic device, the electronic device is caused to execute the method described in the second aspect or any possible implementation of the second aspect.
  • An embodiment of the present application provides a computer-readable storage medium, including instructions, where, when the instructions run on an electronic device, the electronic device is caused to execute the method described in the second aspect or any possible implementation of the second aspect.
  • When the electronic device displays a simulation object, it can prompt the user whether the currently displayed simulation object is a legal simulation object.
  • If the simulation object is illegal, the user's alertness can be improved, the user can be prevented from divulging personal information under the inducement of the illegal simulation object, and the safety of user operations can be ensured, thereby protecting user privacy and improving the use security of the electronic device.
  • FIG. 1 is a schematic structural diagram of a simulation object identification system provided by an embodiment of the present application
  • Figure 2 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 3A is a real image provided by an embodiment of this application
  • FIG. 3B is a schematic diagram of human-computer interaction provided by an embodiment of this application
  • FIG. 3C is an image seen by a user provided by an embodiment of this application;
  • FIG. 5B, FIG. 6B, FIG. 7B, FIG. 8B, FIG. 9B, FIG. 9D, FIG. 10B, and FIG. 11B are the images seen by the user according to the embodiments of the application;
  • FIG. 12 is a schematic diagram of a registration process of a simulation object provided by an embodiment of the application.
  • FIG. 13 is a schematic diagram of an authentication process of a simulation object provided by an embodiment of the application.
  • The terms "first" and "second" are only used for descriptive purposes and cannot be understood as indicating or implying relative importance or implicitly indicating the number of indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of such features. In the description of the embodiments of the present application, unless otherwise specified, "plurality" means two or more.
  • the embodiments of the present application provide a method, related devices, and systems for identifying a simulated object's identity.
  • When an electronic device uses VR, AR, MR, or other technologies to display a simulation object, it can identify whether the simulation object is legal.
  • the electronic device can notify the user of the recognition result, and can also process the recognized illegal simulation object.
  • the processing of the illegal simulation object by the electronic device may include: stopping displaying the illegal simulation object, or reporting the illegal simulation object.
  • Implementing the method provided in the embodiments of the present application can present a secure virtual world to the user, thereby protecting user privacy and improving the use safety of electronic devices.
  • a simulation object refers to a virtual image generated and displayed by rendering using computer graphics technology, computer simulation technology, display technology, etc., and may also be referred to as a virtual object or a virtual element.
  • The simulation object is fabricated rather than a real image of the physical world.
  • the simulation object can be a virtual object imitating an object existing in the real world, which can bring an immersive experience to the user.
  • This application does not limit the types of simulation objects.
  • Simulation objects may include simulated animals, simulated characters, simulated trees, simulated buildings, virtual tags, landscapes, text information, icons, pictures, or videos, etc.
  • the simulation object can be two-dimensional or three-dimensional.
  • the simulation object registered in the registered device corresponding to the electronic device is a legal simulation object
  • the simulation object that has not been registered in the registered device corresponding to the electronic device is an illegal simulation object.
  • the registered device performs a registration operation for the simulation object generated by the device trusted by the electronic device. That is to say, the simulation object generated by the device trusted by the electronic device is legal, and the simulation object generated by the device not trusted by the electronic device is illegal.
  • The devices trusted by the electronic device may include: 1. The electronic device itself, for example, VR/AR/MR glasses, a VR/AR/MR head-mounted display device, or a VR/AR/MR all-in-one machine. 2. Other devices to which the electronic device is connected, for example, a mobile phone, tablet computer, or personal computer connected to the electronic device in a wireless or wired manner. 3. A server provided by the manufacturer of the electronic device, for example, a server provided by HUAWEI, the manufacturer of Huawei VR/AR/MR head-mounted display devices, which is used to generate simulation objects. 4. A server provided by the developer of a VR/AR/MR application (such as a VR game application or an AR navigation application) installed on the electronic device.
  • a legal simulation object may also be called a safe simulation object, and an illegal simulation object may also be called an unsafe simulation object.
  • The user can see these simulation objects (for example, simulated characters) and interact with them. Specifically, the user can walk or drive (for example, following navigation) under the instruction of a simulation object, input information to the electronic device, swing the body, and so on. For example, if the simulation object is a simulated character, when the simulated character attempts to interact with the user, the electronic device may call the speaker to play the voice "Hello, what is your name?" and call the microphone of the electronic device to collect the voice information (such as the user's name) input by the user in response; that is, the simulated character may induce the user to disclose personal information.
  • the simulation object generated by the device trusted by the electronic device is safe and reliable, and the operation performed by the user under the instruction of this kind of simulation object is safe.
  • For example, the information input by the user to the electronic device will not be leaked, the navigation route is accurate, and so on.
  • the security of simulation objects generated by devices that are not trusted by electronic devices cannot be guaranteed, and the security of operations performed by users under the instructions of such simulation objects cannot be guaranteed.
  • For example, information input by the user to the electronic device may be leaked, the navigation route may be inaccurate, and so on. Therefore, in the embodiments of the present application, the electronic device notifies the user of the identification result, which can increase the user's alertness and ensure the safety of the user's operations, thereby protecting user privacy and improving the use security of the electronic device.
  • The simulation object identification system involved in the embodiments of the present application may involve simulation objects generated using VR, AR, MR, and other technologies.
  • the simulation object identification system may include: an electronic device 100, a computing device 200 for generating a simulation object, a registration device 300, a server 400, and an authentication server 500. Among them:
  • the electronic device 100 and the computing device 200 can form a VR/AR/MR display system.
  • the electronic device 100 is a terminal device that can display simulation objects using technologies such as VR/AR/MR, and provide users with a display environment such as VR/AR/MR.
  • the electronic device 100 can use VR technology to present simulated objects, so that the user feels a completely simulated virtual world, that is, the electronic device 100 can provide the user with a VR display environment.
  • The electronic device 100 can use AR/MR and other technologies to superimpose and present simulation objects on real objects that actually exist in the physical world, so that the user can feel the effect of augmented reality; that is, the electronic device 100 can provide the user with an AR/MR display environment.
  • the real objects that actually exist in the physical world may be captured by the camera of the electronic device 100, or may be directly seen by the user's eyes.
  • The simulation object presented by the electronic device 100 may be generated by the electronic device 100 itself through computer graphics technology, computer simulation technology, etc., or it may be generated by another computing device 200 connected to the electronic device 100 using computer graphics technology, computer simulation technology, etc., and then sent to the electronic device 100.
  • the computing device 200 may be a server as shown in FIG. 1, or may be a mobile phone, a computer, or the like to which the electronic device 100 is connected or paired. That is, in some embodiments, the electronic device 100 can be used with a computing device 200 and the like, and the computing device 200 is used to generate and provide a simulation object for the electronic device 100.
  • the simulated object presented by the electronic device 100 can interact with the user.
  • the user may interact with the simulated object presented by the electronic device 100 through interactive methods such as hand/arm movement, head movement, and eyeball rotation.
  • the electronic device 100 can be used with a handheld device (not shown in FIG. 1), and the user can interact with the simulated object presented by the electronic device 100 by manipulating the handheld device.
  • the handheld device may be, for example, a controller, a gyro mouse, or other handheld computing devices.
  • the handheld device can be configured with a variety of sensors such as acceleration sensors, gyroscope sensors, magnetic sensors, etc., which can be used to detect and track its own motion.
  • the handheld device can communicate with the electronic device 100 through short-distance transmission technologies such as Bluetooth (bluetooth), near field communication (NFC), and ZigBee.
  • The electronic device 100 may be worn on the user's head; for example, it may be VR/AR/MR glasses, a VR/AR/MR head-mounted display (HMD), a VR/AR/MR all-in-one machine, etc.
  • the electronic device 100 may also be a non-portable electronic device such as a desktop computer supporting VR/AR/MR technology, a smart TV, a vehicle containing a display screen, and the like.
  • The computing device 200 can, in response to a request from the electronic device 100, generate a simulation object through computer graphics technology, computer simulation technology, and other technologies, and send the generated simulation object to the electronic device 100 for display, so that the electronic device 100 provides the user with a VR/AR/MR display environment.
  • the computing device 200 is a device trusted by the electronic device 100.
  • For example, the computing device 200 can be a mobile phone to which the electronic device 100 is connected, a server provided by the manufacturer of the electronic device 100, or a server provided by the developer of a VR/AR/MR application already installed on the electronic device 100, etc. It is understandable that, when the electronic device 100 itself can generate a simulation object through computer graphics technology, computer simulation technology, and other technologies, the simulation object identification system provided in the embodiments of the present application may not include the computing device 200.
  • the registration device 300 is a device that provides registration services for virtual objects.
  • the registration device 300 is used to provide registration services for legal simulation objects, that is, used to assign identifications to legal simulation objects.
  • the identifier is used to indicate the simulation object, for example, the identifier may be a universally unique identifier (UUID).
  • The registration device 300 may send two pieces of data to the electronic device 100 and to the server 400 containing the database for storage. The two pieces of data are: the identifier assigned to the legal simulation object, and the subscriber authentication key (Ki) obtained by encrypting the identifier using the first algorithm.
  • the first algorithm may be the K4 algorithm.
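  • The registration step can be pictured with the sketch below. Because the first algorithm (possibly K4) is not specified in this part of the application, HMAC-SHA256 keyed with a registration secret is used purely as a stand-in, and plain dictionaries stand in for the storage on the electronic device 100 and the server 400.

```python
# Hypothetical sketch of registration: assign an identifier (UUID) to a legal simulation
# object, derive the subscriber authentication key Ki from that identifier, and store the
# (identifier, Ki) pair on both the electronic device and the server.
# HMAC-SHA256 is only a stand-in for the unspecified first algorithm (e.g. K4).

import hashlib
import hmac
import uuid

REGISTRATION_SECRET = b"registration-device-secret"  # hypothetical secret of the registration device


def register_simulation_object(device_store: dict, server_store: dict) -> str:
    """Register a legal simulation object and return its assigned identifier."""
    object_id = str(uuid.uuid4())                                  # identifier (UUID)
    ki = hmac.new(REGISTRATION_SECRET, object_id.encode(), hashlib.sha256).hexdigest()
    # Both the electronic device 100 and the server 400 store the same (identifier, Ki) pair.
    device_store[object_id] = ki
    server_store[object_id] = ki
    return object_id
```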
  • the server 400 and the authentication server 500 can provide authentication services for the simulated objects presented by the electronic device 100.
  • the server 400 is used to correlate and store two data from the registration device 300: the identifier assigned to the legal simulation object, and the Ki obtained by encrypting the identifier using the first algorithm.
  • the second algorithm and the third algorithm are also stored in the server 400, which are used to authenticate the simulation object.
  • the second algorithm may be the A3 algorithm
  • the third algorithm may be the A8 algorithm.
  • the electronic device 100 also associates and stores two data from the registration device 300: the identifier assigned to the legal simulation object, and the Ki obtained by encrypting the identifier using the first algorithm.
  • the electronic device 100 also stores the second algorithm and the third algorithm, which are used to authenticate the simulated object.
  • The authentication server 500 provides an authentication service for the simulation object displayed by the electronic device 100. Specifically, the authentication server 500 can, in response to an authentication request of the electronic device 100 for the simulation object, identify whether the simulation object is generated by a device trusted by the electronic device 100, that is, perform legality authentication or security identification for the simulation object and judge whether the simulation object is legal.
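  • Storing the same Ki together with the second (A3-like) and third (A8-like) algorithms on both sides makes a GSM-style challenge-response check possible; the detailed procedure is described later with reference to FIG. 13. The sketch below is only that rough mental model: HMAC-SHA256 stands in for the A3 and A8 algorithms, and the message flow is an assumption rather than the procedure of this application.

```python
# Hypothetical sketch of a challenge-response check built on the shared Ki.
# HMAC-SHA256 stands in for the second (A3-like) and third (A8-like) algorithms;
# this is not the procedure claimed by the application.

import hashlib
import hmac
import os


def a3_stand_in(ki: str, rand: bytes) -> str:
    """Stand-in for the second algorithm: derive a signed response from Ki and a challenge."""
    return hmac.new(ki.encode(), b"A3" + rand, hashlib.sha256).hexdigest()


def a8_stand_in(ki: str, rand: bytes) -> str:
    """Stand-in for the third algorithm: derive a session key from Ki and the same challenge."""
    return hmac.new(ki.encode(), b"A8" + rand, hashlib.sha256).hexdigest()


def device_respond(device_store: dict, object_id: str, rand: bytes) -> str:
    """Device side: answer the challenge using the locally stored Ki."""
    return a3_stand_in(device_store.get(object_id, ""), rand)


def server_authenticate(server_store: dict, object_id: str, respond) -> bool:
    """Server side: challenge the device and compare the signed responses."""
    ki = server_store.get(object_id)
    if ki is None:
        return False                       # never registered: an illegal simulation object
    rand = os.urandom(16)                  # random challenge
    expected = a3_stand_in(ki, rand)
    session_key = a8_stand_in(ki, rand)    # session key; its later use is not shown here
    return hmac.compare_digest(expected, respond(object_id, rand))
```

  • For example, server_authenticate(server_store, object_id, lambda oid, r: device_respond(device_store, oid, r)) would succeed only for identifiers that went through a registration like the sketch above.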
  • FIG. 2 is a schematic structural diagram of an exemplary electronic device 100 provided in this application.
  • The electronic device may include a processor 110, a memory 120, a communication module 130, a sensor module 140, buttons 150, an input/output interface 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a display device 180, a camera 190, and the like.
  • the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device.
  • the electronic device may include more or fewer components than those shown in the figure, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a video processing unit (VPU), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units can be independent devices or integrated in one or more processors.
  • the controller can be the nerve center and command center of the electronic device.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory can store instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it can be directly called from the memory. Repeated accesses are avoided, the waiting time of the processor 110 is reduced, and the efficiency of the system is improved.
  • the processor 110 may include one or more interfaces.
  • The interfaces may include an inter-integrated circuit (I2C) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, a universal serial bus (USB) interface, and/or a serial peripheral interface (SPI), etc.
  • the I2C interface is a two-way synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses. The processor 110 may be respectively coupled to the touch sensor, the charger, the camera 190, etc. through different I2C bus interfaces.
  • the I2S interface can be used for audio communication.
  • the processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to realize communication between the processor 110 and the audio module 170.
  • the audio module 170 may transmit audio signals to the wireless communication module through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communication to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module may be coupled through a PCM bus interface.
  • the audio module 170 may also transmit audio signals to the wireless communication module in the communication module 130 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a two-way communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the communication module 130.
  • the processor 110 communicates with the Bluetooth module in the communication module 130 through the UART interface to realize the Bluetooth function.
  • the MIPI interface can be used to connect the processor 110 with the display device 180, the camera 190 and other peripheral devices.
  • the MIPI interface includes a camera serial interface (camera serial interface, CSI), a display device serial interface (display serial interface, DSI), and so on.
  • the processor 110 and the camera 190 communicate through a CSI interface to implement the shooting function of the electronic device.
  • the processor 110 and the display device 180 communicate through a DSI interface to realize the display function of the electronic device.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 190, the display device 180, the communication module 130, the sensor module 140, the microphone 170C, and the like.
  • GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • The USB interface is an interface that complies with the USB standard specifications, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like.
  • the USB interface can be used to connect a charger to charge the electronic device, and can also be used to transfer data between the electronic device and the peripheral device. It can also be used to connect headphones and play audio through the headphones. This interface can also be used to connect to other electronic devices, such as mobile phones.
  • the USB interface can be USB3.0, which is compatible with high-speed display port (DP) signal transmission, and can transmit high-speed video and audio data.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is merely a schematic description, and does not constitute a structural limitation of the electronic device.
  • the electronic device may also adopt different interface connection modes in the foregoing embodiments, or a combination of multiple interface connection modes.
  • the electronic device can implement a wireless communication function through the communication module 130.
  • the communication module 130 may include an antenna, a wireless communication module, a mobile communication module, a modem processor, and a baseband processor.
  • the antenna is used to transmit and receive electromagnetic wave signals.
  • the electronic device can contain multiple antennas, and each antenna can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization. For example: a certain antenna can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna can be used in combination with a tuning switch.
  • the mobile communication module can provide wireless communication solutions such as 2G/3G/4G/5G that are applied to electronic devices.
  • the mobile communication module may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module can receive electromagnetic waves by the antenna, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation by the antenna.
  • at least part of the functional modules of the mobile communication module may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker, etc.), or displays an image or video through the display device 180.
  • the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module or other functional modules.
  • The wireless communication module can provide wireless communication solutions applied to the electronic device, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module may be one or more devices integrating at least one communication processing module.
  • the wireless communication module receives electromagnetic waves via an antenna, modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module may also receive the signal to be sent from the processor 110, perform frequency modulation, amplify, and convert it into electromagnetic waves to radiate through the antenna.
  • the antenna of the electronic device is coupled with the mobile communication module, so that the electronic device can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include global positioning system (GPS), global navigation satellite system (GLONASS), BeiDou navigation satellite system (BDS), quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the electronic device realizes the display function through the GPU, the display device 180, and the application processor.
  • the GPU is a microprocessor for image processing and is connected to the display device 180 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display device 180 may provide a VR display effect.
  • the display device 180 may provide the VR display effect in either of the following two ways:
  • the display device 180 is a display screen, and the display screen may include a display panel.
  • the display panel of the display device 180 may display a simulated object. The user can see the simulation object from the display panel, thereby realizing the VR display effect.
  • the display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), or the like.
  • the display device 180 may include an optical device for directly projecting an optical signal (such as a light beam) onto the user's retina; the user can directly see the simulated object through the optical signal output by the optical device, thereby realizing the VR display effect.
  • the optical device may be a micro projector or the like.
  • the display device 180 may provide AR/MR display effects.
  • the display device 180 may provide the AR/MR display effect in either of the following two ways:
  • the display device 180 includes a display panel; the display device 180 can display, on the display panel, real objects that actually exist in the physical world, and superimpose the simulation object on the real objects.
  • the real object displayed on the display panel may be captured by the camera 190.
  • the display device may include lenses and optical devices.
  • the lens can be transparent, and the user's eyes can see real objects that actually exist in the physical world through the lens.
  • the material of the lens can be poly(methyl methacrylate) (PMMA), optical plastic, etc.
  • the optical device can directly project the optical signal onto the user's retina so that the user can see the simulated object.
  • the display device 180 formed by the combination of the lens and the optical device can make the user feel the AR/MR display effect.
  • the simulation object presented to the user by the display device 180 may be generated by the electronic device 100 itself using computer graphics technology, computer simulation technology, etc.
  • for example, the GPU of the electronic device 100 may generate the simulation object using computer graphics technology, computer simulation technology, etc.; the simulation object may also be generated by other computing devices such as mobile phones, computers, or servers using computer graphics technology, computer simulation technology, and other technologies and then sent to the electronic device 100, which is not limited in this application.
  • the number of the display device 180 in the electronic device may be two, respectively corresponding to the two eyeballs of the user.
  • the content displayed on the two display devices can be displayed independently. Different images can be displayed on the two display devices to improve the three-dimensional sense of the image.
  • the number of the display device 180 in the electronic device may also be one, corresponding to two eyeballs of the user at the same time.
  • the electronic device can realize the shooting function through the ISP, the camera 190, the video codec, the GPU, the display device 180, and the application processor.
  • the ISP is used to process the data fed back by the camera 190. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transfers the electrical signal to the ISP for processing and is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 190.
  • the camera 190 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats.
  • the electronic device may include 1 or N cameras 190, and N is a positive integer greater than 1.
  • the camera 190 may include, but is not limited to, a traditional color camera (RGB camera), a depth camera (RGB depth camera), a dynamic vision sensor (DVS) camera, etc.
  • the camera 190 can collect a user's hand image or body image, and the processor 110 can be used to analyze the image collected by the camera 190 to recognize the hand movement or body movement input by the user.
  • the camera 190 can be used in conjunction with an infrared device (such as an infrared transmitter) to detect the user's eye movements, such as eye gaze direction, blink operation, gaze operation, etc., so as to realize eye tracking.
  • the digital signal processor is used to process digital signals, in addition to processing digital image signals, it can also process other digital signals. For example, when the electronic device selects the frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device can support one or more video codecs.
  • the electronic device can play or record videos in a variety of encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural-network (NN) computing processor.
  • NPU can realize the intelligent cognition of electronic devices and other applications, such as: image recognition, face recognition, voice recognition, text understanding, etc.
  • the memory 120 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device by running instructions stored in the memory 120.
  • the memory 120 may include a program storage area and a data storage area.
  • the storage program area can store an operating system, an application program (such as a VR/AR/MR application) required by at least one function (such as a sound playback function, an image playback function, etc.).
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the electronic device.
  • the memory 120 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the electronic device can implement audio functions through the audio module 170, the speaker 170A, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module can also be used to encode and decode audio signals.
  • the audio module may be provided in the processor 110, or some functional modules of the audio module may be provided in the processor 110.
  • the speaker 170A also called a “speaker” is used to convert audio electrical signals into sound signals. Electronic devices can listen to music through speakers, or listen to hands-free calls.
  • the microphone 170C, also called a "mic" or "mike", is used to convert sound signals into electrical signals.
  • the electronic device may be provided with at least one microphone 170C. In other embodiments, the electronic device may be provided with two microphones 170C, which can realize a noise reduction function in addition to collecting sound signals. In some other embodiments, the electronic device may also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the microphone 170C can detect the voice signal used to control the portable electronic device.
  • the processor 110 can then process the voice signal to recognize voice commands. For example, when the microphone 170C receives an input voice command for authenticating the simulated object, the electronic device 100 may verify the legality of the simulated object.
  • the headphone jack is used to connect wired headphones.
  • the earphone interface can be a USB interface, a 3.5mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the electronic device may include one or more buttons 150, which may control the electronic device and provide users with access to functions on the electronic device.
  • the key 150 may be in the form of a button, a switch, a dial, and a touch or proximity sensing device (such as a touch sensor). Specifically, for example, the user can turn on the display device 180 of the electronic device by pressing a button.
  • the button 150 includes a power button, a volume button and so on.
  • the button 150 may be a mechanical button. It can also be a touch button.
  • the electronic device can receive key input and generate key signal input related to user settings and function control of the electronic device.
  • the electronic device may include an input-output interface 160, and the input-output interface 160 may connect other devices to the electronic device through appropriate components.
  • Components may include audio/video jacks, data connectors, etc., for example.
  • the sensor module 140 may include various sensors, for example, a proximity light sensor, a distance sensor, a gyroscope sensor, an ambient light sensor, an acceleration sensor, a temperature sensor, a magnetic sensor, a bone conduction sensor, a fingerprint sensor, and so on.
  • the proximity light sensor may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device emits infrared light through the light-emitting diode.
  • Electronic devices use photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device. When insufficient reflected light is detected, the electronic device can determine that there is no object near the electronic device.
  • the electronic device may use the proximity light sensor to detect a gesture operation at a specific position of the electronic device 100 to achieve the purpose of associating the gesture operation with an operation command.
  • the distance sensor can be used to measure distance.
  • Electronic equipment can measure distance through infrared or laser.
  • the gyroscope sensor can be used to determine the movement posture of the electronic device.
  • the angular velocity of the electronic device around three axes (i.e., the x, y, and z axes) can be determined by the gyroscope sensor.
  • the gyroscope sensor can also be used for navigation and somatosensory game scenes.
  • the ambient light sensor is used to sense the brightness of the ambient light.
  • the electronic device can adaptively adjust the brightness of the display device 180 according to the perceived brightness of the ambient light.
  • the ambient light sensor can also be used to automatically adjust the white balance when taking pictures.
  • the acceleration sensor can detect the magnitude of the acceleration of the electronic device in various directions (usually three axes).
  • the magnitude and direction of gravity can be detected when the electronic device is stationary. It can also be used to identify the posture of electronic devices and apply to applications such as pedometers.
  • the electronic device 100 may track the movement of the user's head according to an acceleration sensor, a gyroscope sensor, a magnetic sensor, and the like.
  • the temperature sensor is used to detect temperature.
  • the electronic device uses the temperature detected by the temperature sensor to execute the temperature processing strategy. For example, when the temperature reported by the temperature sensor exceeds a threshold value, the electronic device may reduce the performance of the processor located near the temperature sensor, so as to reduce power consumption and implement thermal protection.
  • when the temperature is lower than another threshold, the electronic device heats the battery 1100 to avoid abnormal shutdown of the electronic device due to low temperature.
  • when the temperature is lower than still another threshold, the electronic device boosts the output voltage of the battery 1100 to avoid abnormal shutdown caused by low temperature.
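  • As an illustration of such a temperature processing strategy, the following sketch maps a reported temperature to the actions described above; the threshold values and the control hooks (throttle_processor, heat_battery, boost_battery_voltage) are assumptions and are not specified in this application.

```python
# Illustrative sketch of a temperature processing strategy.
# The thresholds and the device control hooks are hypothetical.

HIGH_TEMP_THRESHOLD = 45.0      # degrees Celsius, assumed
LOW_TEMP_THRESHOLD = 0.0        # assumed
CRITICAL_LOW_THRESHOLD = -10.0  # assumed

def apply_thermal_policy(temperature_c: float, device) -> None:
    """Adjust device behaviour based on the temperature reported by the sensor."""
    if temperature_c > HIGH_TEMP_THRESHOLD:
        # Reduce the performance of the processor near the sensor to lower
        # power consumption and implement thermal protection.
        device.throttle_processor()
    elif temperature_c < CRITICAL_LOW_THRESHOLD:
        # Boost the battery output voltage to avoid an abnormal shutdown.
        device.boost_battery_voltage()
    elif temperature_c < LOW_TEMP_THRESHOLD:
        # Heat the battery to avoid an abnormal shutdown caused by low temperature.
        device.heat_battery()
```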
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture, etc., which is not limited in this application.
  • the electronic device 100 in the embodiment of the present application may be equipped with iOS, android, microsoft or other operating systems.
  • the embodiment of the present application provides a simulation object identification method.
  • when an electronic device uses VR, AR, MR, and other technologies to display a simulated object, it can identify whether the simulated object is legal, inform the user of the recognition result, and also process the identified illegal simulation object, so as to present a safe virtual world to the user. This method can protect user privacy and improve the use safety of electronic equipment.
  • the following uses the AR/MR display scene provided by the electronic device 100 in the embodiment of the present application as an example to introduce the method for identifying the identity of the simulated object.
  • Figs. 3A-3C exemplarily show an AR/MR display scene.
  • the user wears the electronic device 100, and sees the simulated object superimposed on the physical object in the real world through the electronic device 100.
  • Fig. 3A exemplarily shows a real image in the physical world.
  • the real image includes one or more physical objects, such as running athletes and trees.
  • FIG. 3B exemplarily shows the user interface 31 provided by the electronic device 100.
  • the electronic device 100 may include two display screens, both of which display: an image of an athlete 301, an image of an athlete 302, an image of a tree 303, an image of a tree 304, text information 305, text information 306, and a simulated character 307.
  • the objects corresponding to the athlete's image 301, the athlete's image 302, the tree's image 303, and the tree's image 304, such as athletes and trees, are physical objects that actually exist in the physical world.
  • the text information 305, the text information 306, and the simulated character 307 are simulated objects provided by the electronic device 100.
  • the simulated object displayed by the electronic device 100 can interact with the user.
  • the speaker can be used to play voice, and the microphone is used to collect the voice input by the user, so that the user and the simulated character 307 can interact with each other.
  • voice interaction can make users feel like chatting with simulated characters.
  • the content displayed on the two display screens of the electronic device 100 may be different, thereby presenting a stereoscopic visual effect to the user.
  • the position, relative to the display screen, of the content displayed on the left and right display screens of the electronic device 100 may be slightly different.
  • images of physical objects such as an image of an athlete 301, an image of an athlete 302, an image of a tree 303, and an image of a tree 304 may be captured by the electronic device 100 through the camera 190.
  • Simulation objects such as the text information 305, the text information 306, and the simulated character 307 can be generated by the electronic device 100 using computer graphics technology, computer simulation technology, etc., or generated by other computing devices such as mobile phones, computers, or servers using computer graphics technology, computer simulation technology, and other technologies and then sent to the electronic device 100.
  • FIG. 3C exemplarily shows an image seen by the user through the electronic device 100.
  • the user's left eye views the content displayed on the left display screen shown in FIG. 3B, the user's right eye views the content displayed on the right display screen shown in FIG. 3B, and the brain fuses the two views to create a sense of space, so that the user perceives the image as shown in FIG. 3C.
  • the user can see the physical object in the real world through the electronic device 100, and can also see the simulation object superimposed on the physical object, that is, the user can feel the AR/MR display scene.
  • the electronic device 100 may also provide only one display screen, so that the user can see the image shown in FIG. 3C through the electronic device 100.
  • the electronic device 100 may also be configured with a transparent lens, so that the user can directly see, through the lens, the physical objects that actually exist in the real world.
  • the electronic device 100 may also be equipped with an optical device that can project optical signals onto the user's retina so that the user can see the simulated objects; that is, the user can see the image shown in FIG. 3C through the electronic device 100.
  • the realism of the simulated objects seen by the user through the electronic device 100 will become stronger and stronger, and the user may not be able to distinguish between the physical objects and the simulated objects presented by the electronic device 100.
  • when the electronic device 100 provides an AR/MR display scene, it can prompt the user which of the objects in the currently displayed image are physical objects and which are simulated objects.
  • the electronic device 100 may determine whether the displayed object is a physical object or a simulated object by any of the following methods:
  • the electronic device 100 can determine whether an object is a physical object or a simulation object based on the source of each object in the displayed image. Specifically, among the images displayed by the electronic device 100, the objects whose images are captured by the camera 190 of the electronic device 100 are real physical objects in the physical world, and the objects whose images are generated only through GPU rendering by the electronic device 100 are simulation objects.
  • the image generated by the electronic device 100 through GPU rendering may be an image generated locally by the electronic device 100 using computer graphics technology, computer simulation technology, etc., or an image that was generated by another computing device 200, received by the electronic device 100, and then rendered using the local GPU.
  • in some embodiments, the electronic device superimposes the simulation object on the physical object, and both the simulation object and the physical object seen by the user are rendered by the GPU; in this case, the image captured by the camera and then rendered by the GPU corresponds to the physical object, while the image rendered by the GPU alone is the simulation object.
  • the electronic device 100 can extract the characteristics of each object in the displayed image, and determine whether the object is a physical object or a simulation object based on the characteristics of each object.
  • Object features can include color features, texture features, shape features, and so on. In general, the features of a physical object are more abundant than those of a simulation object.
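  • The two determination methods described above can be summarized in a short sketch. The object representation, the "source" tags, and the feature-richness threshold below are assumptions for illustration only and are not defined in this application.

```python
# Illustrative sketch: decide whether a displayed object is a physical object
# or a simulation object. Field names and the threshold are assumed.

def classify_by_source(displayed_object: dict) -> str:
    """Method 1: decide by the source of the object's image."""
    if displayed_object["source"] == "camera":
        # Captured by the camera (and then rendered): a physical object.
        return "physical"
    if displayed_object["source"] == "gpu_only":
        # Generated locally, or received from another computing device, and
        # rendered only by the GPU: a simulation object.
        return "simulated"
    return "unknown"

def feature_richness(features: dict) -> float:
    # Placeholder metric: fraction of feature channels that are non-empty.
    return sum(1 for value in features.values() if value) / max(len(features), 1)

def classify_by_features(displayed_object: dict, richness_threshold: float = 0.6) -> str:
    """Method 2: decide by extracted color/texture/shape features; physical
    objects generally have richer features than simulation objects."""
    richness = feature_richness(displayed_object["features"])
    return "physical" if richness >= richness_threshold else "simulated"
```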
  • the electronic device 100 can prompt the user which of the images currently displayed are physical objects and which are simulated objects.
  • the electronic device 100 may prompt the user in any of the following situations:
  • the electronic device 100 may continue to prompt the user which are the physical objects and which are the simulation objects in the currently displayed image when providing the AR/MR display scene for the user.
  • the electronic device 100 may prompt the user that the object is a simulation object within a period of time when the simulation object is initially displayed.
  • the electronic device 100 may prompt the user that the object is a simulation object or a physical object when the user selects an object in the currently displayed image.
  • FIG. 4A shows a possible way for the electronic device 100 to prompt the user which of the images currently displayed are physical objects and which are simulated objects.
  • the electronic device 100 may add a dashed frame around the simulation object displayed on the user interface 31, thereby prompting the user that the content displayed in the dashed frame is the simulation object.
  • FIG. 4B shows the image seen by the user when the electronic device 100 displays the user interface 31 as shown in FIG. 4A. The user can see that a dashed frame is added around the simulated object, so as to accurately distinguish the physical objects and the simulation objects in the image.
  • the electronic device 100 may also prompt the user through other display methods.
  • the electronic device 100 may also display simulated objects with animation effects (for example, display simulated objects intermittently rather than continuously), add icons or text to the simulated objects, add icons or text to the physical objects, etc., to remind the user which of the objects in the currently displayed image are physical objects and which are simulated objects.
  • the electronic device 100 may also prompt the user through voice, vibration, flashing light, etc., which are physical objects and which are simulated objects in the currently displayed image.
  • the electronic device 100 may be equipped with a motor.
  • for example, if the object selected by the user is a physical object, the motor can output one vibration feedback; if the object is a simulation object, the motor can output two vibration feedbacks.
  • the information used to indicate that the object displayed by the electronic device 100 is a simulation object may be referred to as third prompt information.
  • the third prompt information may include, but is not limited to: visual elements displayed by the electronic device 100, voice, indicator light feedback, or vibration feedback.
  • the information used to indicate that the object displayed by the electronic device 100 is a physical object may be referred to as fourth prompt information.
  • the fourth prompt information may include, but is not limited to: visual elements displayed by the electronic device 100, voice, indicator light feedback, or vibration feedback.
  • the electronic device 100 may initiate authentication of the currently displayed simulation object, that is, identify whether the currently displayed simulation object is a legal simulation object.
  • the electronic device 100 can initiate authentication of the currently displayed simulation object in the following two situations:
  • the electronic device 100 detects an operation for authenticating the displayed one or more simulated objects, and in response to the operation, the electronic device 100 authenticates the one or more simulated objects.
  • the user can select or direct to the simulation object to be authenticated among one or more simulation objects displayed on the electronic device 100.
  • the user can select or be directed to the simulation object to be authenticated in the following ways:
  • the user selects or orients to the simulation object to be authenticated through hand movement.
  • FIG. 5A shows a manner in which the user selects or orients to the simulated object to be authenticated through hand movement.
  • the electronic device 100 can be used with a handheld device, and the movement of the handheld device in the real physical world can be converted into interactive operations between the user and the objects displayed by the electronic device 100.
  • the user holds the handheld device with his hand to exercise, the handheld device can detect and track its own motion information and send it to the electronic device 100, so that the electronic device 100 can grasp the movement of the handheld device.
  • the handheld device can detect and track its own movement through the configured acceleration sensor, gyroscope sensor, magnetic sensor and other sensors.
  • according to the movement of the handheld device, the electronic device 100 may display on the display screen prompt information that simulates the handheld device being directed to the simulated object. As shown in FIG. 5A, the electronic device 100 may display prompt information 501 on the user interface 31.
  • the prompt information 501 is used to prompt the user that the simulation object currently directed to or selected by the user is the simulation character 307.
  • the prompt information 501 may be a virtual arrow-shaped light beam.
  • FIG. 5B is an image seen by the user when the electronic device 100 displays the user interface 31 as shown in FIG. 5A.
  • the electronic device 100 may also convert the movement of the user's hand into interaction between the user and the simulation objects displayed by the electronic device 100. For example, the electronic device 100 may capture an image of the user's hand movement through the camera 190, and display on the display screen, based on that image, prompt information simulating that the user's hand movement is directed to the simulated object.
  • the user selects or directs to the simulated object to be authenticated through eye movement.
  • FIG. 6A shows a manner in which the user selects or orients to the simulated object to be authenticated through eye movement.
  • the user's eyeballs can look at the display screen of the electronic device 100 and rotate the eyeballs to select or orientate to the simulated object to be authenticated.
  • the electronic device 100 can detect the position of the user's eyeball gazing on the display screen through an infrared device (such as an infrared transmitter) and/or a camera 190 through eye tracking technology.
  • according to the detected position where the user's eyeball gazes at the display screen, the electronic device 100 may display, at that position, prompt information indicating that the user's eyeball is directed to the simulated object.
  • the electronic device 100 may display prompt information 601 on the user interface 31.
  • the prompt information 601 is used to prompt the user that the simulation object currently directed to or selected by the user is the simulation character 307.
  • the prompt information 601 may be a virtual circular icon.
  • FIG. 6B is an image seen by the user when the electronic device 100 displays the user interface 31 as shown in FIG. 6A.
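  • As a simplified illustration of eye-movement selection, the sketch below maps the gaze position detected on the display screen to the displayed object it falls on. The bounding-box representation of displayed objects and the drawing helper are assumptions.

```python
# Illustrative sketch of gaze-based selection: map the gaze point reported by
# the eye-tracking hardware to the displayed object under it.

def object_at_gaze(gaze_x: float, gaze_y: float, displayed_objects: list):
    """Return the object whose on-screen bounding box contains the gaze point."""
    for obj in displayed_objects:
        left, top, right, bottom = obj["bbox"]
        if left <= gaze_x <= right and top <= gaze_y <= bottom:
            return obj
    return None

# Usage sketch: mark the selected simulation object, similar in spirit to the
# circular prompt information 601 (draw_circle_marker is hypothetical).
# selected = object_at_gaze(x, y, displayed_objects)
# if selected is not None and selected["kind"] == "simulated":
#     draw_circle_marker(selected["bbox"])
```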
  • the embodiment of the present application is not limited to selecting the simulation object to be authenticated through hand movement or eye movement; the simulation object to be authenticated may also be selected in other ways.
  • the user can also select the simulation object to be authenticated by inputting a voice command.
  • the electronic device 100 can be triggered to authenticate the simulation object.
  • the user triggering the electronic device 100 to authenticate the selected or directed simulation object may include the following types:
  • the user inputs a user operation (such as a pressing operation or a touch operation) on a physical key or virtual key provided on the electronic device 100 or on a handheld device connected to the electronic device 100, and in response to the user operation, the electronic device 100 authenticates the selected or targeted simulation object.
  • the user can input a pressing operation on a physical key on the handheld device, and after the handheld device detects the operation, the instruction information of the operation is sent to the electronic device 100.
  • the electronic device 100 authenticates the simulation object 307 selected or directed to by the user.
  • the user triggers the electronic device 100 to authenticate the simulated object selected or directed to by the user through eye movements.
  • the eye movement may include: a gaze operation on the simulation object that exceeds a preset duration, one or more blink operations input while gazing at the simulation object, and so on.
  • the electronic device 100 may detect the eye movement through the infrared device and/or the camera 190, and in response to the detected eye movement, the electronic device 100 authenticates the selected or directed simulated object.
  • the user can input a voice command to the electronic device 100, the electronic device 100 can detect the voice command through the microphone 170C, and respond to the voice command to authenticate the selected or directed simulation object.
  • the user can also trigger the electronic device 100 to authenticate the selected or directed simulated object in other ways, which is not limited in the embodiment of the present application.
  • the user can also simulate the gesture of drawing a circle on the simulated object by hand movement, and trigger the electronic device 100 to authenticate it.
  • the operation for triggering the electronic device 100 to authenticate the simulated object may be referred to as the first operation. That is, the first operation may include, but is not limited to: a gesture input acting on the simulated object, a blinking operation on the simulated object, a gaze operation on the simulated object (for example, a gaze operation exceeding a preset duration), or a voice command used to determine whether the simulation object is legal (for example, the voice command "authentication").
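  • The different forms of the first operation can be handled with a simple input dispatcher, as sketched below; the event structure, the gaze-duration value, and the authenticate() entry point are assumptions used only for illustration.

```python
# Illustrative dispatcher for the "first operation" that triggers
# authentication of the selected simulation object.

GAZE_TRIGGER_SECONDS = 2.0  # assumed preset duration

def handle_input_event(event: dict, selected_object, authenticate) -> None:
    """Trigger authentication of the selected object for any recognized first operation."""
    if selected_object is None:
        return
    if event["type"] == "key_press":
        # Pressing or touching a key on the electronic device or handheld device.
        authenticate(selected_object)
    elif event["type"] == "blink":
        # One or more blinks while gazing at the simulation object.
        authenticate(selected_object)
    elif event["type"] == "gaze" and event["duration"] >= GAZE_TRIGGER_SECONDS:
        # A gaze operation exceeding the preset duration.
        authenticate(selected_object)
    elif event["type"] == "voice" and event["command"] == "authentication":
        # A voice command used to determine whether the simulation object is legal.
        authenticate(selected_object)
```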
  • the electronic device 100 actively initiates authentication of one or more simulation objects currently displayed.
  • the electronic device 100 may authenticate a new simulation object every time the simulation object is displayed. In other embodiments, the electronic device 100 may also periodically authenticate one or more displayed simulated objects. For example, the electronic device 100 may authenticate all displayed simulated objects every half hour.
  • the electronic device 100 may also initiate the authentication of the currently displayed simulation object in other cases.
  • the user can also trigger the electronic device 100 on a mobile phone connected to or paired with the electronic device 100 to authenticate the currently displayed simulation object.
  • the process of the electronic device 100 authenticating the simulated object involves interaction with the server 400 and the authentication server 500.
  • the authentication process will be described in detail in subsequent embodiments, and will not be repeated here.
  • the user may be prompted that the authentication operation is currently being performed.
  • the electronic device 100 may prompt the user that the authentication operation is currently being performed in the following ways:
  • the electronic device 100 may display prompt information on the display screen, and the prompt information is used to prompt the user that the authentication operation is currently being performed.
  • FIG. 7A shows a possible prompt information displayed by the electronic device 100 on the display screen.
  • the electronic device 100 may display a window 701 on the provided user interface 31.
  • the window 701 includes the text message "Authenticating the simulated object, please wait", which can be used to remind the user that the authentication operation is currently being performed.
  • FIG. 7B is an image seen by the user when the electronic device 100 displays the user interface 31 as shown in FIG. 7A.
  • the window 701 disappears automatically after being displayed on the display screen of the electronic device 100 for a period of time without user interaction.
  • the electronic device 100 may also display other forms of prompt information on the display screen, such as icons, animations, etc., to prompt the user that the authentication operation is currently being performed, which is not limited in this embodiment.
  • the electronic device 100 can prompt the user that the authentication operation is currently being performed by means of voice, flashing indicator lights, vibration, etc.
  • the electronic device 100 may also use the speaker 170A to play the voice "authentication in progress", control the blinking of the indicator light, control the vibration of the motor, etc. to remind the user that the authentication operation is currently being performed.
  • any of the above-mentioned information output by the electronic device 100 for prompting that the authentication operation is currently being performed may be referred to as second prompt information. That is, the second prompt information may include, but is not limited to: visual elements displayed by the electronic device 100, voice, indicator light feedback, or vibration feedback.
  • after the electronic device 100 performs an authentication operation on the displayed simulation object, it can determine whether the simulation object is legal.
  • the electronic device 100 may notify the user of the authentication result, that is, prompt the user whether the one or more simulation objects are legal.
  • the electronic device 100 prompts the user whether the one or more simulation objects are legal may include the following:
  • the electronic device 100 may display prompt information on the display screen, and the prompt information is used to prompt the user whether the simulation object is legal.
  • FIG. 8A shows a possible prompt information displayed by the electronic device 100 on the display screen.
  • the electronic device 100 may display an icon 801 on the provided user interface 31, and the icon 801 may be used to prompt the user that the simulated object 307 is illegal.
  • the icon 801 disappears automatically after being displayed on the display screen of the electronic device 100 for a period of time without user interaction.
  • FIG. 8B is an image seen by the user when the electronic device 100 displays the user interface as shown in FIG. 8A.
  • the electronic device 100 may also display other forms of prompt information, such as text, animation, etc., on the display screen, thereby prompting the user whether the simulated object is legal, which is not limited in this embodiment of the application.
  • the electronic device 100 can prompt the user whether the simulation object is legal or not by means of voice, flashing indicator lights, vibration, etc.
  • the electronic device 100 may play the voice "legal" through the speaker 170A to prompt the user that the simulated object is legal, and play the voice "illegal" through the speaker 170A to prompt the user that the simulated object is illegal.
  • the electronic device 100 may control the indicator light to flash once to remind the user that the simulation object is legal, and control the indicator light to flash twice to remind the user that the simulation object is illegal.
  • the electronic device 100 may control the motor to output one vibration to remind the user that the simulated object is legal, and control the motor to output two vibrations to remind the user that the simulated object is illegal.
  • the first prompt information may include, but is not limited to: visual elements displayed by the electronic device 100, voice, indicator light feedback, or vibration feedback.
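  • The following sketch shows one way the authentication result could be mapped to the first prompt information; the output hooks (show_icon, play_voice, blink_led, vibrate) are hypothetical and stand in for the display, speaker, indicator light, and motor described above.

```python
# Illustrative mapping from the authentication result to first prompt
# information. All output hooks are hypothetical.

def prompt_authentication_result(device, simulation_object, is_legal: bool) -> None:
    if is_legal:
        device.play_voice("legal")     # voice prompt
        device.blink_led(times=1)      # indicator light flashes once
        device.vibrate(times=1)        # motor outputs one vibration
    else:
        device.show_icon(simulation_object, icon="illegal")  # e.g. icon 801
        device.play_voice("illegal")
        device.blink_led(times=2)      # indicator light flashes twice
        device.vibrate(times=2)        # motor outputs two vibrations
```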
  • in this way, the user can know whether the simulation object currently seen is legal. If the currently displayed simulation object is illegal, this can increase the user's alertness and prevent the user from divulging personal information (such as a home address or phone number) or performing unsafe operations (such as clicking on an unsafe website or link), so as to protect user privacy and improve the use safety of electronic equipment.
  • after the electronic device 100 prompts the user of the authentication result of the simulated object in any of the above-mentioned ways, the user can learn whether the simulated object is legal. In some embodiments, the user can choose to further process the illegal simulation object.
  • the processing may include: blocking, and/or, reporting. The following introduces several ways for users to block or report illegal simulation objects.
  • in response to a user operation acting on the icon 801, the electronic device 100 displays one or more of the following controls on the user interface 31: the control 901 and the control 902; that is, the electronic device 100 displays a user interface as shown in FIG. 9C.
  • the user interface 31 shown in FIG. 9A is the same as the user interface 31 shown in FIG. 8A, and reference may be made to related descriptions.
  • the user operation acting on the icon 801 may include: after the user selects or orients to the icon 801 through hand movement or eye movement, a user operation (such as a pressing operation or a touch operation) input on a key provided on the electronic device 100 or a handheld device, or an operation of the user's eyes gazing at the icon 801 for a long time, or one or more blinking operations input while the user's eyes are gazing at the icon 801, or a voice command input by the user, etc.
  • the control 901 can receive a user operation.
  • the electronic device 100 stops displaying the simulated object 307 on the display screen, that is, shields the simulated object 307.
  • the user operation acting on the control 901 may include: after the user selects or orients to the control 901 through hand movement or eye movement, a user operation (such as a pressing operation or a touch operation) input on a key provided on the electronic device 100 or a handheld device, or an operation of the user gazing at the control 901 for a long time, or one or more blinking operations input while the user is gazing at the control 901, or a voice command input by the user, etc.
  • for the manner in which the user selects or orients to the control 901 through hand movement or eye movement, please refer to the related description above.
  • the control 902 can receive a user operation.
  • the electronic device 100 reports the simulation object 307.
  • the electronic device 100 reporting the simulated object 307 means that the electronic device 100 sends the identification of the simulated object 307 to the authentication server 500. The authentication server 500 may store the identification of the illegal simulation object reported by the electronic device 100.
  • the user operation acting on the control 902 may include: after the user selects or orients to the control 902 through hand movement or eye movement, a user operation (such as a pressing operation or a touch operation) input on the electronic device 100 or a handheld device, or an operation of the user gazing at the control 902 for a long time, or one or more blinking operations input while the user is gazing at the control 902, or a voice command input by the user, etc.
  • FIG. 9B shows an image seen by a user when the electronic device 100 displays a user interface as shown in FIG. 9A.
  • FIG. 9D shows an image seen by a user when the electronic device 100 displays a user interface as shown in FIG. 9C.
  • the electronic device 100 may also use other methods to shield or report illegal simulation objects.
  • the user can also input a simulated cross gesture on the simulated object 307 to trigger the electronic device 100 to report the simulated object 307.
  • the user can also input a simulated circle gesture on the simulated object 307 to trigger the electronic device 100 to shield the simulated object 307.
  • any of the above-mentioned operations for triggering the electronic device to shield the illegal simulation object may be referred to as the second operation. That is to say, the second operation may include, but is not limited to: a gesture input acting on the illegal simulation object, a blinking operation on the illegal simulation object, a gaze operation on the illegal simulation object (for example, a gaze operation exceeding a preset duration), or a voice command used to shield the illegal simulation object.
  • any of the above-mentioned operations for triggering the electronic device to report the illegal simulation object may be referred to as the third operation. That is to say, the third operation may include, but is not limited to: a gesture input acting on the illegal simulation object, a blinking operation on the illegal simulation object, a gaze operation on the illegal simulation object (for example, a gaze operation exceeding a preset duration), or a voice command used to report the illegal simulation object.
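  • A minimal sketch of the shielding (second operation) and reporting (third operation) handling is shown below; the scene API and the network call are placeholders rather than an actual implementation.

```python
# Illustrative handling of shielding and reporting an illegal simulation object.

def shield_simulation_object(scene, simulation_object) -> None:
    """Second operation: stop displaying the illegal simulation object."""
    scene.remove(simulation_object)
    scene.show_message("Illegal simulation objects have been shielded for you.")

def report_simulation_object(network, simulation_object) -> None:
    """Third operation: send the object's identifier to the authentication
    server so that it can be recorded as an illegal simulation object."""
    network.send_to_authentication_server({
        "action": "report",
        "simulation_object_id": simulation_object["id"],
    })
```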
  • in some other embodiments, the electronic device 100 may not notify the user of the authentication result, that is, the electronic device 100 may not prompt the user whether the one or more simulation objects are legal, but instead take the initiative to further process illegal simulation objects.
  • the processing may include: blocking, and/or, reporting.
  • the user interface 31 shown in FIG. 10A is an interface displayed after the electronic device 100 shields an illegal simulation object.
  • the electronic device 100 stops displaying the illegal simulation object (ie, the simulation character 307) on the display screen.
  • the electronic device 100 may prompt the user that the electronic device 100 has blocked the illegal simulation object.
  • FIG. 10A shows a way for the electronic device 100 to prompt the user.
  • the electronic device 100 may display a window 1001 on the display screen.
  • the window 1001 includes prompt information (for example, the text message "Illegal simulation objects have been shielded for you, please continue to experience the AR/MR scene!").
  • FIG. 10B shows an image seen by the user when the electronic device 100 displays the user interface 31 shown in FIG. 10A. Not limited to the window 1001 shown in FIG. 10A, the electronic device 100 may also display other forms of prompt information, such as icons, animations, etc., on the display screen. Not limited to the prompt information displayed on the display screen, the electronic device 100 can also prompt the user through voice, vibration, a flashing light, etc., that the electronic device 100 has shielded an illegal simulation object, which is not limited in this embodiment of the application.
  • the electronic device 100 may prompt the user that the electronic device 100 has reported the illegal simulation object.
  • FIG. 11A shows a way for the electronic device 100 to prompt the user.
  • the electronic device 100 may display a window 1101 on the display screen.
  • the window 1101 includes prompt information (for example, the text message "The illegal simulation object has been successfully reported, please continue to experience the AR/MR scene!").
  • FIG. 11B shows an image seen by the user when the electronic device 100 displays the user interface 31 as shown in FIG. 11A. Not limited to the window 1101 shown in FIG. 11A, the electronic device 100 may also display other forms of prompt information, such as icons, animations, etc., on the display screen. Not limited to the prompt information displayed on the display screen, the electronic device 100 can also prompt the user through voice, vibration, a flashing light, etc., that the electronic device 100 has reported the illegal simulation object, which is not limited in this embodiment of the application.
  • the method for recognizing a simulated object in the embodiment of the present application can also be applied to a VR display scene.
  • the electronic device 100 provides a VR display scene for the user, it can also implement the simulation object recognition method provided in the embodiment of the present application.
  • not limited to the manner shown in the above figures of displaying the simulated objects directly on the display screen, the electronic device can also be equipped with an optical device that projects optical signals onto the user's retina, so that the user sees these simulated objects through the electronic device 100. For example, the arrow-shaped light beam shown in FIG. 5B and the icon 801 shown in FIG. 8A for prompting the user that the simulated object 307 is illegal may be objects that the user sees after the optical device projects the corresponding optical signals onto the user's retina.
  • the following describes the process of the electronic device 100 authenticating the simulation object provided by the embodiment of the present application, that is, the process of the electronic device 100 identifying whether the simulation object is legal.
  • FIG. 12 shows a registration process of a legal simulation object provided by an example of the present application. As shown in Figure 12, the registration process may include the following steps:
  • step S110 the electronic device 100 or the computing device 200 generates a simulation object.
  • Step S120 the electronic device 100 or the computing device 200 sends a registration request to the registration device 300.
  • Step S130 In response to the registration request, the registration device 300 assigns an identifier to the simulation object.
  • the identifier assigned by the registration device 300 to the simulation object may be a universally unique identifier UUID.
  • Step S140 The registration device 300 uses the first algorithm to encrypt the identifier to obtain a key value, and sends the identifier and the key value to the electronic device 100 and the server 400, respectively.
  • the first algorithm may be the K4 algorithm, and the key value obtained by encrypting the identification of the simulation object using the first algorithm is the authentication key (Ki).
  • in step S140, the registration device 300 transmits only specific values (the identifier and the key value), which can avoid leaking the first algorithm.
  • Step S150 the electronic device 100 and the server 400 store the identification and key value of the simulation object in association.
  • after registration, the electronic device 100 and the server 400 each store, in association, the identifier of the legal simulation object and the corresponding key value.
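  • The registration exchange of FIG. 12 can be summarized in the sketch below. A UUID stands in for the identifier assignment, and k4_encrypt() is only a hash-based stand-in for the unspecified first algorithm; the storage structures are likewise assumptions.

```python
# Illustrative sketch of the legal-simulation-object registration flow
# (steps S110-S150). Not an actual implementation of the K4 algorithm.

import hashlib
import uuid

def k4_encrypt(identifier: str) -> str:
    # Stand-in for the first algorithm: derive the key value (authentication
    # key Ki) from the identifier. A hash is used here purely for illustration.
    return hashlib.sha256(identifier.encode()).hexdigest()

class RegistrationDevice:
    def register(self) -> tuple:
        # S130: assign an identifier (e.g. a UUID) to the simulation object.
        identifier = str(uuid.uuid4())
        # S140: derive the key value; only the identifier and key value are
        # transmitted, so the first algorithm itself is not leaked.
        key_value = k4_encrypt(identifier)
        return identifier, key_value

# S150: the electronic device 100 and the server 400 store identifier -> Ki.
electronic_device_store: dict = {}
server_store: dict = {}

def complete_registration(identifier: str, key_value: str) -> None:
    electronic_device_store[identifier] = key_value
    server_store[identifier] = key_value
```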
  • FIG. 13 shows a process in which the electronic device 100 according to an embodiment of the present application authenticates a simulation object.
  • the authentication process may include the following steps:
  • Step S210 The electronic device 100 sends an authentication request to the authentication server 500, and the authentication request carries the identifier of the simulation object for which the electronic device 100 requests authentication.
  • the identifier of the simulation object may be a universally unique identifier (UUID). If the simulation object is legal, the UUID of the simulation object may be assigned to the simulation object by the registration device 300.
  • Step S220 The authentication server 500 receives the authentication request, and sends the identifier of the simulation object to the server 400.
  • Step S230 The server 400 searches for the key value corresponding to the identification of the simulation object, and calculates the key value at least through the second algorithm and/or the third algorithm to generate the first authentication response value.
  • the server 400 stores the identification of the legal simulation object and the corresponding key value.
  • the identification of the legal simulation object and the corresponding key value may be issued to the server 400 by the registration device 300.
  • the key value may be a value obtained after calculating the identifier by the first algorithm.
  • the key value corresponding to the identification of the legal simulation object may be the authentication key Ki.
  • the first authentication response value may be a sign response (sign response, SRES).
  • Step S240 The server 400 sends the first authentication response value to the authentication server 500.
  • Step S250 The authentication server 500 receives the first authentication response value, and sends an authentication request to the electronic device 100.
  • Step S260 The electronic device 100 receives the authentication request, searches for the key value corresponding to the identification of the simulation object, and calculates the key value through the second algorithm and the third algorithm to generate a second authentication response value.
  • like the server 400, the electronic device 100 also stores the identification of the legal simulation object and the corresponding key value.
  • the identification of the legal simulation object and the corresponding key value may be issued to the electronic device 100 by the registration device 300.
  • the key value may be a value obtained after calculating the identifier by the first algorithm.
  • the first algorithm may be the K4 algorithm
  • the key value corresponding to the identification of the legal simulation object may be the authentication key Ki.
  • the second algorithm may be the A3 algorithm, and the third algorithm may be the A8 algorithm.
  • the second authentication response value may be a sign response (sign response, SRES).
  • Step S270 The electronic device 100 sends the second authentication response value to the authentication server 500.
  • Step S280 The authentication server 500 receives the second authentication response value, and compares the first authentication response value with the second authentication response value. If the two are the same, the simulation object is legal; if the two are different, the simulation object is illegal.
  • the above steps S240 and S270 transmit only specific values, which can avoid leaking the second algorithm and the third algorithm.
  • Step S290 The authentication server 500 sends the authentication result to the electronic device 100.
  • the first algorithm, the second algorithm, and the third algorithm mentioned above are only examples. In some other embodiments of the present application, the first algorithm, the second algorithm, and the third algorithm can also be replaced with other algorithms.
  • the illegal simulation object may also have a corresponding identification. However, since the illegal simulation object has not gone through the registration process, and the electronic device 100 cannot learn the first algorithm, the key value of the illegal simulation object stored in the electronic device 100 is invalid. Similarly, since the illegal simulation object has not undergone the registration process and the server 400 cannot learn the first algorithm, the key value of the illegal simulation object stored in the server 400 is also invalid.
  • in this case, the first authentication response value calculated by the server 400 is different from the second authentication response value calculated by the electronic device 100, and the authentication server 500 can determine that the simulation object is illegal.
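  • The challenge-response comparison of FIG. 13 can be sketched as follows. derive_response() is a keyed-hash stand-in for the second and third algorithms (e.g. A3/A8), and the key tables reuse the registration sketch above; in this sketch an unregistered (illegal) object simply has no valid key value on one side, so authentication fails.

```python
# Illustrative sketch of the authentication flow (steps S210-S290).
# Not an actual implementation of the A3/A8 algorithms.

import hashlib
import hmac

def derive_response(key_value: str, identifier: str) -> str:
    # Stand-in for computing a sign response (SRES) from the key value.
    return hmac.new(key_value.encode(), identifier.encode(),
                    hashlib.sha256).hexdigest()

def authenticate(identifier: str, electronic_device_store: dict,
                 server_store: dict) -> bool:
    server_key = server_store.get(identifier)
    device_key = electronic_device_store.get(identifier)
    if server_key is None or device_key is None:
        # Unregistered (illegal) simulation object: no valid Ki is stored.
        return False
    # S230: the server 400 computes the first authentication response value.
    first_response = derive_response(server_key, identifier)
    # S260: the electronic device 100 computes the second authentication
    # response value.
    second_response = derive_response(device_key, identifier)
    # S280: the authentication server 500 compares the two values.
    return first_response == second_response  # True: legal; False: illegal
```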
  • the simulation object can also be authenticated in other ways, which is not limited in the embodiment of the present application.
  • for example, the simulation object can also be authenticated through a valid international mobile subscriber identification number (IMSI) authentication method used in technologies such as the universal mobile telecommunications system (UMTS) and long term evolution (LTE).
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or a data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, and a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

一种仿真对象的身份识别方法、相关装置及系统,应用于VR、AR、MR等领域。电子设备在显示仿真对象后,确定该仿真对象是否合法,并且把确定结果告知用户。所述方法可以提高用户的警觉性,避免用户在该非法仿真对象的诱导下泄露个人信息,从而保护用户隐私,提升电子设备的使用安全。

Description

仿真对象的身份识别方法、相关装置及系统
本申请要求于2019年6月21日提交中国国家知识产权局、申请号为201910543117.7、发明名称为“仿真对象的身份识别方法、相关装置及系统”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及计算机图形技术、身份识别技术,特别涉及仿真对象的身份识别方法、相关装置及系统。
背景技术
随着计算机图形技术的发展,虚拟现实(virtual reality,VR)、增强现实(augmented reality,AR)、混合现实(mixed reality,MR)等技术逐渐应用到人们的生活中。
VR利用电脑模拟产生一个三维空间的虚拟世界,提供用户关于视觉等感官的模拟效果,让用户感觉仿佛身历其境,并允许用户与该模拟的虚拟世界进行交互。
AR是一种将真实世界信息和虚拟世界信息集成的技术,将在现实世界的一定时间空间范围内很难体验到的实体信息,如视觉信息,声音,味道,触觉等等,通过电脑等科学技术模拟仿真后叠加到真实的世界中,从而使用户得到超越现实的感官体验。
MR将真实世界和虚拟世界混合在一起,产生新的可视化环境,环境中同时包含了真实世界信息和虚拟世界信息,真实世界和虚拟世界可以实时交互。
VR/AR头盔、AR眼镜等电子设备在通过VR、AR、MR等技术展示虚拟的图像时,由于电子设备自身的计算能力有限,该虚拟图像有可能是由其它设备例如手机或服务器生成的。虚拟图像在由手机或服务器发送至电子设备的过程中有可能被更换,导致电子设备最终呈现的虚拟图像并不是安全或者合法的。因此,有必要提出一种技术方案,来鉴别电子设备呈现的虚拟图像是否安全或者合法,避免可能的安全隐患。
发明内容
本申请提供了仿真对象的身份识别方法、相关装置及系统,可以给用户呈现一个安全的虚拟世界,从而保护用户隐私,提升电子设备的使用安全。
第一方面,本申请实施例提供了一种仿真对象的身份识别方法,该方法应用于电子设备。该方法可包括:电子设备通过显示装置显示仿真对象,该仿真对象是利用计算机图形技术生成的虚拟图像;该电子设备确定该仿真对象是否合法;响应于该仿真对象是否合法的确定结果,该电子设备输出第一提示信息,该第一提示信息用于指示该仿真对象是否合法;其中,在该电子设备对应的注册设备注册过的仿真对象是合法的仿真对象,未在该电子设备对应的注册设备注册过的仿真对象是非法的仿真对象。
实施第一方面的方法,电子设备可以提示用户当前显示的仿真对象是否是合法的仿真对象。在该仿真对象非法时,可以提高用户的警觉性,避免用户在该非法仿真对象的诱导下泄露个人信息,保障用户操作的安全,从而保护用户隐私,提升电子设备的使用安全。
结合第一方面,在一些实施例中,注册设备会为电子设备信任的设备生成的仿真对象执行注册操作。也就是说,由电子设备信任的设备生成的仿真对象是合法的,由电子设备不信任的设备生成的仿真对象是非法的。电子设备信任的设备可包括:1.该电子设备本身。例如,VR/AR/MR眼镜等。2.该电子设备连接到的其他设备。例如,该电子设备连接到的手机等。3.该电子设备的生产厂商提供的服务器。4.电子设备上已经安装的VR/AR/MR类应用的开发商所提供的服务器。
结合第一方面,在一些实施例中,电子设备的显示装置可以有以下两种实现方式:
1、该显示装置为显示屏。该电子设备通过该显示屏显示仿真对象。
2、该显示装置包括光学装置。该电子设备通过该光学装置投射和该仿真对象所对应的光学信号。这样,用户的视网膜可以接收到该光学信号,从而看到该仿真对象。
结合第一方面,在一些实施例中,电子设备显示的仿真对象是由该电子设备生成的,或者,该仿真对象是由计算设备生成后发送给该电子设备的。
结合第一方面,在一些实施例中,电子设备输出的第一提示信息可以包括以下一项或多项:该显示装置显示的可视化元素、语音、指示灯反馈或者振动反馈。例如,当显示的仿真对象合法时,电子设备可以在该仿真对象上显示文本信息“合法”,当显示的仿真对象非法时,电子设备可以在该仿真对象上显示文本信息“非法”。
结合第一方面,在一些实施例中,电子设备可以在以下任意一种情况下执行确定显示的仿真对象是否合法的操作:
1、在检测到作用于该仿真对象的第一操作时,响应于该第一操作,该电子设备确定该仿真对象是否合法。在一些实施例中,作用于该仿真对象的第一操作包括以下任意一项:作用于该仿真对象的手势输入,作用于该仿真对象的眨眼操作,作用于该仿真对象的注视操作(例如超过预设时长的注视操作),或者,用于确定该仿真对象是否合法的语音指令(例如语音指令“鉴权”)。上述第1种情况,可以由用户自主决定是否触发电子设备对仿真对象进行鉴权。
2、在显示该仿真对象的同时,确定该仿真对象是否合法。
3、周期性地确定该仿真对象是否合法。上述第2、3种情况,电子设备可以主动对仿真对象发起鉴权。
结合第一方面,在一些实施例中,电子设备确定该仿真对象是否合法时,还可以输出第二提示信息,该第二提示信息用于指示该电子设备正在确定该仿真对象是否合法。这里,该第二提示信息,包括以下一项或多项:该显示装置显示的可视化元素、语音、指示灯反馈或者振动反馈。
结合第一方面,在一些实施例中,如果仿真对象的鉴权结果为非法,则电子设备可以主动对该仿真对象执行以下一项或多项操作:
1、响应于该仿真对象非法的确定结果,该电子设备停止显示该仿真对象。
2、响应于该仿真对象非法的确定结果,该电子设备举报该仿真对象。具体的,电子设备可以将该仿真对象的标识发送至鉴权服务器,以使得鉴权服务器将该仿真对象标记为非法的仿真对象。
结合第一方面,在一些实施例中,如果仿真对象的鉴权结果为非法,则电子设备可以在用户的触发下对该仿真对象执行以下一项或多项操作:
1、若该仿真对象为非法的仿真对象,该电子设备检测作用于该仿真对象的第二操作,响应于该第二操作,该电子设备停止显示该仿真对象。这里,第二操作可包括以下任意一项: 作用于该仿真对象的手势输入,作用于该仿真对象的眨眼操作,作用于该仿真对象的注视操作,或者,用于确定该仿真对象是否合法的语音指令。
2、若该仿真对象为非法的仿真对象,该电子设备检测作用于该仿真对象的第三操作,响应于该第三操作,该电子设备举报该仿真对象。这里,电子设备举报仿真对象的具体操作可参照前一实施例的相关描述。第三操作可包括以下任意一项:作用于该仿真对象的手势输入,作用于该仿真对象的眨眼操作,作用于该仿真对象的注视操作,或者,用于确定该仿真对象是否合法的语音指令。
结合第一方面,在一些实施例中,电子设备还可以显示实体对象,并执行:输出第三提示信息,该第三提示信息用于指示该仿真对象是利用计算机图形技术生成的虚拟图像;和/或,该电子设备输出第四提示信息,该第四提示信息用于指示该实体对象对应的物体存在于真实世界中。通过这种方式,可以提示用户电子设备当前显示的图像中哪些是实体对象,哪些是仿真对象。第三提示信息、第四提示信息的实现方式和上述第一提示信息、第二提示信息的实现方式类似,可参考相关描述。
这里,电子设备可以通过以下任意一种方式来显示实体对象:1、电子设备的显示装置为显示屏,在该显示屏上显示实体对象,该实体对象由电子设备的摄像头采集得到。2、电子设备的显示装置包括透明的镜片,用户透过该镜片直接看到真实世界中的实体对象。
结合第一方面,在一些实施例中,电子设备确定该仿真对象是否合法的过程可具体包括:该电子设备将该仿真对象的标识发送给鉴权服务器,以使得该鉴权服务器判断该仿真对象是否合法;该电子设备接收该鉴权服务器返回的鉴权结果,该鉴权结果指示该仿真对象是否合法;该电子设备根据该鉴权结果确定该仿真对象是否合法。
结合第一方面,在一些实施例中,电子设备显示的仿真对象包括以下一项或多项:仿真人物、仿真动物、仿真树木或仿真建筑物。
第二方面,本申请实施例提供了一种仿真对象的识别方法,该方法应用于头戴式设备。该方法可包括:头戴式设备通过显示装置显示仿真对象和实体对象,该仿真对象是利用计算机图形技术生成的虚拟图像;该头戴式设备检测到作用于仿真对象的第一操作,响应于该第一操作确定该仿真对象是否合法;输出第二提示信息,该第二提示信息用于指示该头戴式设备正在确定该仿真对象是否合法;响应于该仿真对象是否合法的确定结果,该头戴式设备输出第一提示信息,该第一提示信息用于指示该仿真对象是否合法;若该仿真对象为非法的仿真对象,头戴式设备检测作用于该仿真对象的第二操作,响应于第二操作停止显示该仿真对象;其中,在该头戴式设备对应的注册设备注册过的仿真对象是合法的仿真对象,未在该头戴式设备对应的注册设备注册过的仿真对象是非法的仿真对象;该仿真对象包括仿真人物或仿真动物。
这里,头戴式设备可包括但不限于:AR/MR眼镜、AR/MR头戴式显示设备、AR/MR一体机等。头戴式设备通过显示装置显示仿真对象和实体对象时,该头戴式设备可提供AR/MR显示场景。
实施第二方面的方法,头戴式设备可以在提供AR/MR显示场景时,提示用户当前显示的仿真对象是否是合法的仿真对象。在该仿真对象非法时,可以提高用户的警觉性,避免用户在AR/MR场景下,在非法仿真对象的诱导下泄露个人信息,保障用户操作的安全,从而保护用户隐私,提升电子设备的使用安全。
结合第二方面,在一些实施例中,注册设备会为头戴式设备信任的设备生成的仿真对象 执行注册操作。也就是说,由头戴式设备信任的设备生成的仿真对象是合法的,由头戴式设备不信任的设备生成的仿真对象是非法的。头戴式设备信任的设备可参照第一方面中的相关描述。
基于同一发明构思,由于第二方面的仿真对象的识别方法以及有益效果可以参见上述第一方面和第一方面的各可能的方法实施方式以及所带来的有益效果,因此该仿真对象的识别方法的实施可以参见上述第一方面和第一方面的各可能的方法的实施方式,重复之处不再赘述。
第三方面,本申请实施例提供了一种电子设备上的图形用户界面,该电子设备包括显示装置、存储器、以及一个或多个处理器,该一个或多个处理器用于执行存储在该存储器中的一个或多个计算机程序,其特征在于,该图形用户界面包括:显示仿真对象,该仿真对象是利用计算机图形技术生成的虚拟图像;响应于该仿真对象是否合法的确定结果,输出第一提示信息,该第一提示信息用于指示该仿真对象是否合法;其中,在该电子设备对应的注册设备注册过的仿真对象是合法的仿真对象,未在该电子设备对应的注册设备注册过的仿真对象是非法的仿真对象。
结合第三方面,在一些实施例中,电子设备的显示装置可以有以下两种实现方式:
1、该显示装置为显示屏。该图形用户界面具体包括:通过该显示屏显示仿真对象。
2、该显示装置包括光学装置。该图形用户界面具体包括:通过该光学装置投射和该仿真对象所对应的光学信号。
结合第三方面,在一些实施例中,显示的仿真对象是由该电子设备生成的,或者,该仿真对象是由计算设备生成后发送给该电子设备的。
结合第三方面,在一些实施例中,第一提示信息,包括以下一项或多项:该显示装置显示的可视化元素、语音、指示灯反馈或者振动反馈。
结合第三方面,在一些实施例中,该图形用户界面还包括:在输出该第一提示信息之前,输出第二提示信息,该第二提示信息用于指示该电子设备正在确定该仿真对象是否合法。这里,该第二提示信息,包括以下一项或多项:该显示装置显示的可视化元素、语音、指示灯反馈或者振动反馈。
结合第三方面,在一些实施例中,该图形用户界面还包括:响应于该仿真对象非法的确定结果,停止显示该仿真对象。
结合第三方面,在一些实施例中,该图形用户界面还包括:若该仿真对象为非法的仿真对象,检测作用于该仿真对象的第二操作,响应于该第二操作,停止显示该仿真对象。这里,第二操作可参照第二方面的方法中的相关描述。
结合第三方面,在一些实施例中,该图形用户界面还包括:显示实体对象;输出第三提示信息,该第三提示信息用于指示该仿真对象是利用计算机图形技术生成的虚拟图像,和/或,输出第四提示信息,该第四提示信息用于指示该实体对象对应的物体存在于真实世界中。这里,第三提示信息、第四提示信息可参照第二方面的方法中的相关描述。
第四方面,本申请实施例提供了一种头戴式设备上的图形用户界面,该头戴式设备包括显示装置、存储器、以及一个或多个处理器,该一个或多个处理器用于执行存储在该存储器中的一个或多个计算机程序,其特征在于,该图形用户界面包括:显示实体对象和仿真对象,该仿真对象是利用计算机图形技术生成的虚拟图像;检测到作用于该仿真对象的第一操作,响应于第一操作,确定该仿真对象是否合法;
输出第二提示信息,该第二提示信息用于指示头戴式设备正在确定该仿真对象是否合法;
响应于该仿真对象是否合法的确定结果,输出第一提示信息,该第一提示信息用于指示该仿真对象是否合法;若该仿真对象为非法的仿真对象,检测作用于该仿真对象的第二操作,响应于第二操作,停止显示该仿真对象;
其中,在该头戴式设备对应的注册设备注册过的仿真对象是合法的仿真对象,未在该头戴式设备对应的注册设备注册过的仿真对象是非法的仿真对象;该仿真对象包括仿真人物或仿真动物。
基于同一发明构思,由于第四方面的图形用户界面以及有益效果可以参见上述第一方面和第一方面的各可能的实施方式以及所带来的有益效果,因此该图形用户界面的实施可以参见上述第一方面和第一方面的各可能的方法的实施方式,重复之处不再赘述。
第五方面,本申请实施例提供了一种电子设备,该电子设备用于执行上述第一方面所描述的方法。该电子设备包括:一个或多个处理器、存储器和显示装置;该存储器与该一个或多个处理器耦合,该存储器用于存储计算机程序代码,该计算机程序代码包括计算机指令,该一个或多个处理器调用该计算机指令以使得该电子设备执行:通过显示装置显示仿真对象,该仿真对象是利用计算机图形技术生成的虚拟图像;确定该仿真对象是否合法;响应于该仿真对象是否合法的确定结果,输出第一提示信息,该第一提示信息用于指示该仿真对象是否合法;其中,在该电子设备对应的注册设备注册过的仿真对象是合法的仿真对象,未在该电子设备对应的注册设备注册过的仿真对象是非法的仿真对象。
结合第五方面,在一些实施例中,注册设备会为电子设备信任的设备生成的仿真对象执行注册操作。也就是说,由电子设备信任的设备生成的仿真对象是合法的,由电子设备不信任的设备生成的仿真对象是非法的。电子设备信任的设备可参照第一方面的相关描述。
基于同一发明构思,第五方面的电子设备可用于执行第一方面或第一方面任意一种可能的实施方式中的方法。因此,第五方面的电子设备所执行的操作以及该电子设备带来的有益效果,可参照上述第一方面或第一方面任意一种可能的实施方式中的相关描述,这里不再赘述。
第六方面,本申请实施例提供了一种头戴式设备,该头戴式设备用于执行上述第二方面所描述的方法。该头戴式设备包括:一个或多个处理器、存储器和显示装置;该存储器与该一个或多个处理器耦合,该存储器用于存储计算机程序代码,该计算机程序代码包括计算机指令,该一个或多个处理器调用该计算机指令以使得该头戴式设备执行:通过显示装置显示仿真对象和实体对象,该仿真对象是利用计算机图形技术生成的虚拟图像;确定该仿真对象是否合法;响应于该仿真对象是否合法的确定结果,输出第一提示信息,该第一提示信息用于指示该仿真对象是否合法;其中,在该头戴式设备对应的注册设备注册过的仿真对象是合法的仿真对象,未在该头戴式设备对应的注册设备注册过的仿真对象是非法的仿真对象。
结合第六方面,在一些实施例中,注册设备会为头戴式设备信任的设备生成的仿真对象执行注册操作。也就是说,由头戴式设备信任的设备生成的仿真对象是合法的,由头戴式设备不信任的设备生成的仿真对象是非法的。头戴式设备信任的设备可参照第二方面的相关描述。
基于同一发明构思,第六方面的头戴式设备可用于执行第一方面或第一方面任意一种可能的实施方式中的方法。因此,第六方面的头戴式设备所执行的操作以及该头戴式设备带来的有益效果,可参照上述第一方面或第一方面任意一种可能的实施方式中的相关描述,这里 不再赘述。
第七方面,本申请实施例提供了一种仿真对象的身份识别系统,该系统包括:电子设备、注册设备、鉴权服务器。其中,该注册设备用于对仿真对象进行注册;该电子设备用于通过显示装置显示仿真对象;该电子设备为第五方面或第五方面任意一种可能的实施方式所描述的电子设备;该鉴权服务器用于判断该电子设备显示的仿真对象是否合法;其中,在该注册设备注册过的仿真对象是合法的仿真对象,未在该注册设备注册过的仿真对象是非法的仿真对象。
第八方面本申请实施例提供了一种仿真对象的身份识别系统,该系统包括:头戴式设备、注册设备、鉴权服务器。其中,该注册设备用于对仿真对象进行注册;该头戴式设备用于通过显示装置显示仿真对象;该头戴式设备为第六方面或第六方面任意一种可能的实施方式所描述的头戴式设备;该鉴权服务器用于判断该头戴式设备显示的仿真对象是否合法;其中,在该注册设备注册过的仿真对象是合法的仿真对象,未在该注册设备注册过的仿真对象是非法的仿真对象。
第九方面,本申请实施例提供了一种包含指令的计算机程序产品,当上述计算机程序产品在电子设备上运行时,使得上述电子设备执行如第一方面以及第一方面中任一可能的实现方式描述的方法。
第十方面,本申请实施例提供一种计算机可读存储介质,包括指令,当上述指令在电子设备上运行时,使得上述电子设备执行如第一方面以及第一方面中任一可能的实现方式描述的方法。
第十一方面,本申请实施例提供了一种包含指令的计算机程序产品,当上述计算机程序产品在电子设备上运行时,使得上述电子设备执行如第二方面以及第二方面中任一可能的实现方式描述的方法。
第十二方面,本申请实施例提供一种计算机可读存储介质,包括指令,当上述指令在电子设备上运行时,使得上述电子设备执行如第二方面以及第二方面中任一可能的实现方式描述的方法。
实施本申请实施例提供的技术方案,电子设备在显示仿真对象时,可以提示用户当前显示的仿真对象是否是合法的仿真对象。在该仿真对象非法时,可以提高用户的警觉性,避免用户在该非法仿真对象的诱导下泄露个人信息,保障用户操作的安全,从而保护用户隐私,提升电子设备的使用安全。
附图说明
图1是本申请实施例提供的仿真对象的身份识别系统的结构示意图;
图2是本申请实施例提供的电子设备的结构示意图;
图3A为本申请实施例提供的真实图像、图3B为本申请实施例提供的人机交互示意图、图3C为本申请实施例提供的用户看到的图像;
图4A、图5A、图6A、图7A、图8A、图9A、图9C、图10A、图11A为本申请实施例提供的人机交互示意图;
图4B、图5B、图6B、图7B、图8B、图9B、图9D、图10B、图11B为本申请实施例提供的用户看到的图像;
图12为本申请实施例提供的仿真对象的注册流程示意图;
图13为本申请实施例提供的仿真对象的鉴权流程示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。
其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B;本文中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。另外,在本申请实施例的描述中,“多个”是指两个或多于两个。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。在本申请实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
本申请实施例提供了仿真对象的身份识别方法、相关装置及系统,电子设备在利用VR、AR、MR等技术显示仿真对象时,可以识别该仿真对象是否合法。电子设备可以将识别的结果告知用户,还可以对识别到的非法的仿真对象做处理。电子设备对非法的仿真对象的处理可包括:停止显示该非法的仿真对象,或者,举报该非法的仿真对象。实施本申请实施例提供的方法,可以给用户呈现一个安全的虚拟世界,从而保护用户隐私,提升电子设备的使用安全。
在本申请以下实施例中,仿真对象是指利用计算机图形技术、计算机仿真技术、显示技术等渲染生成并显示的虚拟图像,也可以被称为虚拟对象或虚拟元素。仿真对象是假的而非物理世界真实存在的图像。仿真对象可以是仿照真实世界存在的物体的虚拟物体,可以给用户带来沉浸式体验。本申请对仿真对象的类别不做限制,仿真对象可包括仿真动物,仿真人物,仿真树木,仿真建筑物、虚拟的标签、风景、文本信息、图标、图片或者视频等等。仿真对象可以是二维的,也可以是三维的。
在本申请实施例中,在电子设备对应的注册设备注册过的仿真对象是合法的仿真对象,未在电子设备对应的注册设备注册过的仿真对象是非法的仿真对象。在一些实施例中,注册设备会为电子设备信任的设备生成的仿真对象执行注册操作。也就是说,由电子设备信任的设备生成的仿真对象是合法的,由电子设备不信任的设备生成的仿真对象是非法的。
电子设备信任的设备可包括:1.该电子设备本身。例如,VR/AR/MR眼镜、VR/AR/MR头戴式显示设备、VR/AR/MR一体机等。2.该电子设备连接到的其他设备。例如,该电子设备通过无线或者有线的方式连接到的手机、平板电脑或个人电脑等。3.该电子设备的生产厂商提供的服务器。例如,华为VR/AR/MR头戴式显示设备的生产厂商华为(HUAWEI)提供的用于生成仿真对象的服务器。4.电子设备上已经安装的VR/AR/MR类应用的开发商所提供的服务器。例如,电子设备上已经安装的VR/AR/MR类应用(例如VR游戏应用、AR导航应用等)的开发商提供的服务器。在一些实施例中,合法的仿真对象也可以被称为安全的仿真对象,非法的仿真对象也可以被称为不安全的仿真对象。
可理解的,电子设备显示仿真对象后,用户可以看到这些仿真对象(例如仿真人物)并和仿真对象产生交互。具体的,用户可以在仿真对象的指示下行走或驾驶(例如导航)、向电子设备输入信息、摆动身体等等。举例来说,如果仿真对象为仿真人物,则该仿真人物尝试 和用户交互时,电子设备可能调用扬声器播放语音“你好,你叫什么名字呀?”,同时调用电子设备的麦克风采集用户响应于该语音而输入的语音信息(例如用户的名字),即该仿真人物可能诱导用户泄露个人信息。
可理解的,电子设备信任的设备生成的仿真对象是安全可靠的,用户在这类仿真对象的指示下执行的操作是安全的,例如用户向电子设备输入的信息不会被泄露、导航时的路线是准确的等等。而电子设备不信任的设备生成的仿真对象的安全性不能得到保证,用户在这类仿真对象的指示下执行的操作的安全性不能得到保障,例如用户向电子设备输入的信息可能被泄露、导航时的路线可能不准确等等。因此,在本申请实施例中,电子设备将识别结果告知用户,可以提高用户的警觉性,保障用户操作的安全,从而保护用户隐私,提升电子设备的使用安全。
下面,首先介绍本申请实施例中涉及的仿真对象身份识别系统。该仿真对象身份识别系统中涉及到的仿真对象可以包括利用VR、AR、MR等技术生成的仿真对象。参考图1,该仿真对象身份识别系统可包括:电子设备100、用于生成仿真对象的计算设备200、注册设备300、服务器400以及鉴权服务器500、其中:
电子设备100和计算设备200可以组成VR/AR/MR显示系统。
在本申请实施例中,电子设备100是可以利用VR/AR/MR等技术显示仿真对象,为用户提供VR/AR/MR等显示环境的终端设备。
在一些实施例中,电子设备100可以利用VR技术呈现仿真对象,使用户感受到一个完全模拟的虚拟世界,即电子设备100可以为用户提供VR显示环境。
在另一些实施例中,电子设备100可以利用AR/MR等技术在物理世界中实际存在的真实对象上叠加呈现仿真对象,使用户感受到增强现实的效果,即电子设备100可以为用户提供AR/MR显示环境。这里,该物理世界中实际存在的真实对象可以是电子设备100的摄像头捕获到的,也可以是用户的眼睛直接看到的。
电子设备100呈现的仿真对象可以是由电子设备100本身通过计算机图形技术、计算机仿真技术等技术生成的,也可以是电子设备100连接到的其他计算设备200利用计算机图形技术、计算机仿真技术等技术生成后发送给电子设备100的。计算设备200可以是如图1所示的服务器,还可以是电子设备100连接到或者配对的手机、电脑等。也就是说,在一些实施例中,电子设备100可以配合计算设备200等一起使用,计算设备200用于为电子设备100生成并提供仿真对象。
电子设备100呈现的仿真对象可以和用户产生交互。在一些实施例中,用户可以通过手部/手臂运动、头部运动、眼球转动等交互方式来与电子设备100呈现的仿真对象进行交互。在另一些实施例中,电子设备100可以配合手持设备(图1中未示出)一起使用,用户可以通过对手持设备的操控来和电子设备100呈现的仿真对象进行交互。该手持设备例如可以是控制器、陀螺鼠标或者其他手持计算设备。手持设备可配置有多种传感器例如加速度传感器、陀螺仪传感器、磁传感器等,可用于检测和追踪自身运动。手持设备可以通过蓝牙(bluetooth)、近场通信(near field communication,NFC)、ZigBee等近距离传输技术和电子设备100通信。
电子设备100可以安装在用户头部,例如可以为VR/AR/MR眼镜、VR/AR/MR头戴式显示设备(head-mounted display,HMD)、VR/AR/MR一体机等。在本申请其他一些实施例中,电子设备100也可以是支持VR/AR/MR技术的台式计算机、智能电视机、包含显示屏的车辆等非便携式电子设备。
计算设备200可以响应于电子设备100的请求,通过计算机图形技术、计算机仿真技术等技术生成仿真对象,并将生成的仿真对象发送至电子设备100进行显示,使得电子设备100为用户提供VR/AR/MR显示环境。计算设备200是电子设备100信任的设备,该计算设备200可以是电子设备100连接到的手机、电子设备100的生产厂商提供的服务器,也可以是电子设备100上已经安装的VR/AR/MR类应用的开发商所提供的服务器等等。可理解的,在电子设备100本身可以通过计算机图形技术、计算机仿真技术等技术生成仿真对象的情况下,本申请实施例提供的仿真对象身份识别系统可以不设置计算设备200。
注册设备300是提供虚拟物体的注册服务的设备。注册设备300用于为合法的仿真对象提供注册服务,即用于为合法的仿真对象分配标识。该标识用于指示该仿真对象,例如该标识可以是通用唯一识别码(universally unique identifier,UUID)。注册设备300可以将两个数据分别发送给电子设备100以及包括数据库的服务器400进行存储,该两个数据为:分配给合法仿真对象的标识、使用第一算法对该标识加密后得到的用户鉴权键(subscriber authentication key,Ki)。在一些实施例中,该第一算法可以为K4算法。
服务器400以及鉴权服务器500可以为电子设备100呈现的仿真对象提供鉴权服务。
服务器400用于关联存储来自注册设备300的两个数据:分配给合法仿真对象的标识、使用第一算法对该标识加密后得到的Ki。服务器400中还存储了第二算法和第三算法,用于对仿真对象进行鉴权。在一些实施例中,该第二算法可以为A3算法,该第三算法可以为A8算法。
在本申请实施例中,电子设备100同样也关联存储有来自注册设备300的两个数据:分配给合法仿真对象的标识、使用第一算法对该标识加密后得到的Ki。电子设备100还存储了第二算法和第三算法,用于对仿真对象进行鉴权。
鉴权服务器500提供针对电子设备100所显示的仿真对象的鉴权服务。具体的,鉴权服务器500可以响应于电子设备100针对仿真对象的鉴权请求,识别该仿真对象是否是由电子设备100所信任的设备生成的,即对该仿真对象进行合法性鉴别或者安全性鉴别,判断该仿真对象是否合法。
仿真对象的鉴权流程可参考后续实施例的相关描述,在此暂不赘述。
下面详细介绍本申请实施例提供的示例性电子设备100的结构。参见图2,图2为本申请提供的示例性电子设备100的结构示意图。
如图2所示,电子设备可以包括处理器110、存储器120、通信模块130、传感器模块140、按键150、输入输出接口160、音频模块170、扬声器170A、受话器170B、麦克风170C、耳机接口170D、显示装置180、摄像头190以及电池1100等。
可以理解的是,本申请实施例示意的结构并不构成对电子设备的具体限定。在本申请另一些实施例中,电子设备可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),视频处理单元(video processing unit,VPU)控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单 元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口,串行外设接口(serial peripheral interface,SPI)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(serial clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器,充电器,摄像头190等。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向通信模块130中的无线通信模块传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与通信模块130。例如:处理器110通过UART接口与通信模块130中的蓝牙模块通信,实现蓝牙功能。
MIPI接口可以被用于连接处理器110与显示装置180,摄像头190等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示装置串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头190通过CSI接口通信,实现电子设备的拍摄功能。处理器110和显示装置180通过DSI接口通信,实现电子设备的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头190,显示装置180,通信模块130,传感器模块140,麦克风170C等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口可以用于连接充电器为电子设备充电,也可以用于电子设备与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接 其他电子设备,例如手机等。USB接口可以是USB3.0,用于兼容高速显示接口(display port,DP)信号传输,可以传输视音频高速数据。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备的结构限定。在本申请另一些实施例中,电子设备也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
电子设备可以通过通信模块130实现无线通信功能。通信模块130可以包括天线、无线通信模块、移动通信模块、调制解调处理器以及基带处理器等。
天线用于发射和接收电磁波信号。电子设备中可以包含多个天线,每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将某个天线复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块可以提供应用在电子设备上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块可以由天线接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块还可以对经调制解调处理器调制后的信号放大,经天线转为电磁波辐射出去。在一些实施例中,移动通信模块的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器等)输出声音信号,或通过显示装置180显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块或其他功能模块设置在同一个器件中。
无线通信模块可以提供应用在电子设备上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块经由天线接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线转为电磁波辐射出去。
在一些实施例中,电子设备的天线和移动通信模块耦合,使得电子设备可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou  navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备通过GPU,显示装置180,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示装置180和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
在一些实施例中,显示装置180可以提供VR显示效果。显示装置180提供VR显示效果的方式可包括以下两种:1.在一些实施例中,显示装置180为显示屏,显示屏可包括显示面板,显示装置180的显示面板上可以显示仿真对象。用户可以从显示面板上看到该仿真对象,从而实现VR显示效果。显示面板可以采用液晶显示装置(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。2.在另一些实施例中,显示装置180可包括用于将光学信号(例如光束)直接投射到用户视网膜上的光学装置,用户可以通过该光学装置打出的光学信号直接看到仿真对象,从而实现VR显示效果。该光学装置可以是微型投影仪等等。
在另一些实施例中,显示装置180可以提供AR/MR显示效果。显示装置180提供AR/MR显示效果的方式可包括以下两种:1.在一些实施例中,显示装置180包括显示面板,显示装置180可以在显示面板上显示物理世界中实际存在的真实对象,并在该真实对象上叠加显示仿真对象。显示面板上显示的真实对象可以是由摄像头190捕获到的。显示面板的材料可参照前文相关描述。2.在另一些实施例中,显示装置可包括镜片和光学装置。该镜片可以是透明的,用户的眼睛可以透过该镜片看到物理世界中实际存在的真实对象。镜片的材料可以是有机玻璃(poly(methyl methacrylate),PMMA)、光学塑胶等。光学装置可以将光学信号直接投射到用户的视网膜上,使得用户看到仿真对象。镜片和光学装置组合而成的显示装置180可以使得用户感受到AR/MR显示效果。
在上述实施例中,显示装置180呈现给用户的仿真对象可以是电子设备100本身利用计算机图形技术、计算机仿真技术等技术生成的,例如可以是电子设备100的GPU利用计算机图形技术、计算机仿真技术等生成的,也可以是其他计算设备例如手机、电脑或者服务器等利用计算机图形技术、计算机仿真技术等技术生成后发送给电子设备100的,本申请对此不做限制。
电子设备中显示装置180的数量可以是两个,分别对应用户的两个眼球。这两个显示装置上显示的内容可以独立显示。这两个显示装置上可以显示不同的图像来提高图像的立体感。在一些可能的实施例中,电子设备中显示装置180的数量也可以是一个,同时对应用户的两个眼球。
电子设备可以通过ISP,摄像头190,视频编解码器,GPU,显示装置180以及应用处理器等实现拍摄功能。
ISP用于处理摄像头190反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头190中。
摄像头190用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备可以包括1个或N个摄像头190,N为大于1的正整数。摄像头190可包括但不限于传统彩色摄像头(RGB camera)、深度摄像头(RGB depth camera)、动态视觉传感器(dynamic vision sensor,DVS)相机等。
在一些实施例中,摄像头190可以采集用户的手部图像或者身体图像,处理器110可用于对摄像头190采集到的图像进行分析,从而识别用户输入的手部动作或身体动作。
在一些实施例中,摄像头190可以和红外设备(如红外发射器)配合使用来检测用户的眼部动作,例如眼球注视方向、眨眼操作、注视操作等等,从而实现眼球追踪(eye tracking)。
其中,数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备可以支持一种或多种视频编解码器。这样,电子设备可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
存储器120可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在存储器120的指令,从而执行电子设备的各种功能应用以及数据处理。存储器120可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能(比如声音播放功能,图像播放功能等)所需的应用程序(比如VR/AR/MR应用)等。存储数据区可存储电子设备使用过程中所创建的数据(比如音频数据,电话本等)等。此外,存储器120可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
电子设备可以通过音频模块170,扬声器170A,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块还可以用于对音频信号编码和解码。在一些实施例中,音频模块可以设置于处理器110中,或将音频模块的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备可以通过扬声器收听音乐,或收听免提通话。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。电子设备可以设置至少一个麦克风170C。在另一些实施例中,电子设备可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,电子设备还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
在一些实施例中,麦克风170C可以检测到用于控制便携电子设备的语音信号。处理器 110随后可以处理该语音信号,识别语音命令。例如,当麦克风170C接收到输入的用于对仿真对象进行鉴权的语音指令时,电子设备100可以验证该仿真对象的合法性。
耳机接口用于连接有线耳机。耳机接口可以是USB接口,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
在一些实施例中,电子设备可以包括一个或多个按键150,这些按键可以控制电子设备,为用户提供访问电子设备上的功能。按键150的形式可以是按钮、开关、刻度盘和触摸或近触摸传感设备(如触摸传感器)。具体的,例如,用户可以通过按下按钮来打开电子设备的显示装置180。按键150包括开机键,音量键等。按键150可以是机械按键。也可以是触摸式按键。电子设备可以接收按键输入,产生与电子设备的用户设置以及功能控制有关的键信号输入。
在一些实施例中,电子设备可以包括输入输出接口160,输入输出接口160可以通过合适的组件将其他装置连接到电子设备。组件例如可以包括音频/视频插孔,数据连接器等。
传感器模块140可以包含多种传感器,例如,接近光传感器、距离传感器、陀螺仪传感器、环境光传感器、加速度传感器、温度传感器、磁传感器、骨传导传感器和指纹传感器等等。
接近光传感器可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备通过发光二极管向外发射红外光。电子设备使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定电子设备附近有物体。当检测到不充分的反射光时,电子设备可以确定电子设备附近没有物体。电子设备可以利用接近光传感器检测处于电子设备100特定位置的手势操作,以实现手势操作与操作命令相关联的目的。
距离传感器可以用于测量距离。电子设备可以通过红外或激光测量距离。
陀螺仪传感器可以用于确定电子设备的运动姿态。在一些实施例中,可以通过陀螺仪传感器确定电子设备围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器还可以用于导航,体感游戏场景。
环境光传感器用于感知环境光亮度。电子设备可以根据感知的环境光亮度自适应调节显示装置180的亮度。环境光传感器也可用于拍照时自动调节白平衡。
加速度传感器可检测电子设备在各个方向上(一般为三轴)加速度的大小。当电子设备静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于计步器等应用。
在本申请的一些实施例中,电子设备100可以根据加速度传感器、陀螺仪传感器、磁传感器等来跟踪用户头部的移动。
温度传感器用于检测温度。在一些实施例中,电子设备利用温度传感器检测的温度,执行温度处理策略。例如,当温度传感器上报的温度超过阈值,电子设备执行降低位于温度传感器附近的处理器的性能,以便降低功耗实施热保护。在另一些实施例中,当温度低于另一阈值时,电子设备对电池1100加热,以避免低温导致电子设备异常关机。在其他一些实施例中,当温度低于又一阈值时,电子设备对电池1100的输出电压执行升压,以避免低温导致的异常关机。
电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构等,本申请对此不做限制。例如,本申请实施例中的电子设备100可以搭载iOS、 android、microsoft或者其他操作系统。
基于图1所示的仿真对象身份识别系统以及图2所示的电子设备100,本申请实施例提供了仿真对象的身份识别方法。实施本申请实施例提供的方法,电子设备在利用VR、AR、MR等技术显示仿真对象时,可以识别该仿真对象是否合法,并可以将识别的结果告知用户,还可以对识别到的非法的仿真对象做处理,给用户呈现一个安全的虚拟世界。该方法可以保护用户隐私,提升电子设备设备的使用安全。
下面以本申请实施例中电子设备100提供的AR/MR显示场景为例,介绍仿真对象的身份识别方法。
参见图3A-图3C,图3A-图3C示例性示出了一种AR/MR显示场景。在该AR/MR显示场景中,用户佩戴电子设备100,并且通过电子设备100看到叠加在真实世界中的实体对象上的仿真对象。
图3A示例性示出了一幅物理世界中的真实图像。该真实图像中包括一个或多个实体对象,例如奔跑的运动员、树木等。
图3B示例性示出了电子设备100提供的用户界面31。如图3B所示,电子设备100可以包括两个显示屏,两个显示屏上均显示有:运动员的图像301、运动员的图像302、树木的图像303、树木的图像304、文本信息305、文本信息306及仿真人物307。
其中,运动员的图像301、运动员的图像302、树木的图像303、树木的图像304所对应的对象例如运动员、树木为物理世界中真实存在的实体对象。文本信息305、文本信息306及仿真人物307为电子设备100提供的仿真对象。电子设备100显示的仿真对象可以和用户进行交互。例如,电子设备显示仿真人物307的同时,可以使用扬声器播放语音,使用麦克风采集用户输入的语音,从而使用户和仿真人物307产生语音交互。这样的语音交互可以让用户产生和仿真人物聊天的感觉。
可理解的,电子设备100的两个显示屏上所显示的内容可以不同,从而给用户呈现立体的视觉效果,例如,电子设备100在左右两个显示屏上显示的内容相对于显示屏的位置可稍有不同。
在图3B所示的用户界面31中,实体对象的图像例如运动员的图像301、运动员的图像302、树木的图像303、树木的图像304可以是由电子设备100通过摄像头190捕获到的。仿真对象例如文本信息305、文本信息306及仿真人物307可以是电子设备100利用计算机图形技术、计算机仿真技术等技术生成的,也可以是其他计算设备例如手机、电脑或者服务器等利用计算机图形技术、计算机仿真技术等技术生成后发送给电子设备100的。
图3C示例性示出了用户通过电子设备100看到的图像。用户的左眼通过图3B所示的左侧的显示屏查看其显示的内容,用户的右眼通过图3B所示的右侧的显示屏查看其显示的内容,并在大脑中产生有空间感的立体视觉效果,用户可以感觉到自己看到的图像如图3C所示。用户可以通过电子设备100看到真实世界中的实体对象,还可以看到叠加在该实体对象上的仿真对象,即用户可以感受到AR/MR显示场景。
不限于图3B所示的2个显示屏,在一些实施例中,电子设备100还可以仅提供一个显示屏,使得用户通过电子设备100看到图3C所示的图像。
不限于图3B所示的在显示屏上显示摄像头190采集到的真实图像,在一些实施例中,电子设备100还可以是配置有透明的镜片,使得用户可以通过镜片直接看到真实世界中的实 体对象。
不限于图3B所示的在显示屏上直接显示仿真对象,在一些实施例中,电子设备100还可以配置有光学装置,该光学装置可以投射光学信号到用户的视网膜上,使得用户看到仿真对象,即用户可以通过电子设备100看到图3C所示的图像。
在一些实施例中,随着AR/MR技术的发展、电子设备硬件装置的提升,用户通过电子设备100看到的仿真对象的真实感会越来越强,用户有可能无法区分电子设备100呈现的实体对象和仿真对象。在本申请实施例中,电子设备100提供AR/MR显示场景时,可以提示用户当前显示的图像中哪些是实体对象,哪些是仿真对象。
在本申请实施例中,电子设备100可以通过以下任意一种方式来判断所显示的对象是实体对象或者仿真对象:
(1)电子设备100可以通过显示的图像中各个对象的来源,判断该对象是实体对象或者仿真对象。具体的,电子设备100显示的图像中,通过电子设备100的摄像头190捕获到的图像为物理世界中真实存在的实体对象,而仅由电子设备100通过GPU来渲染生成的图像为仿真对象。这里,电子设备100通过GPU渲染生成的图像可以是电子设备100本地端利用计算机图形技术、计算机仿真技术等技术生成的图像,也可以是电子设备100接收到的其他计算设备200生成的图像后再利用本地端的GPU渲染而成的。
需要注意的是,在AR场景下,电子设备可以将仿真对象叠加到实体对象之上,用户看到的仿真对象和实体对象均由GPU渲染呈现。在这种情况下,由摄像头捕获后再经过GPU渲染的图像为实体对象,而仅通过GPU渲染呈现的图像为仿真对象。
(2)电子设备100可以提取显示的图像中各个对象的特征,通过各个对象的特征来判断该对象是实体对象或者仿真对象。对象特征可包括颜色特征、纹理特征、形状特征等。一般情况下,实体对象的特征比仿真对象的特征更加丰富。
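As an illustrative sketch of approach (1) above (deciding by the source of each displayed object), the following Python snippet tags every item in a frame as an entity object or a simulation object according to its provenance. The data structure, field names, and example identifiers are assumptions made for this example and are not taken from the embodiments.

```python
from dataclasses import dataclass
from enum import Enum

class Source(Enum):
    CAMERA = "camera"       # captured by camera 190 (possibly re-rendered by the GPU)
    GPU_ONLY = "gpu_only"   # produced purely by computer graphics / simulation

@dataclass
class DisplayItem:
    object_id: str
    source: Source

def classify(item: DisplayItem) -> str:
    # Images that originate from the camera correspond to entity objects in the
    # physical world; images rendered only by the GPU are simulation objects.
    return "entity" if item.source is Source.CAMERA else "simulation"

# Hypothetical frame content used only to exercise the classifier.
frame = [DisplayItem("athlete_301", Source.CAMERA),
         DisplayItem("virtual_person_307", Source.GPU_ONLY)]
labels = {item.object_id: classify(item) for item in frame}   # {'athlete_301': 'entity', ...}
```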
在本申请实施例中,电子设备100可以提示用户当前显示的图像中哪些是实体对象,哪些是仿真对象。电子设备100可以在以下情况下提示用户:1.电子设备100可以在为用户提供AR/MR显示场景时持续提示用户当前显示的图像中哪些是实体对象,哪些是仿真对象。2.电子设备100可以在仿真对象初始显示的一段时间之内,提示用户该对象为仿真对象。3.电子设备100可以在用户选中当前显示图像中的某个对象时提示用户该对象为仿真对象或者实体对象。用户选中当前显示图像中的对象的方式可参照后续实施例的相关描述,在此暂不赘述。
参考图4A,图4A示出了一种可能的电子设备100提示用户当前显示的图像中哪些是实体对象,哪些是仿真对象的方式。如图4A所示,电子设备100可以在用户界面31上显示的仿真对象周围添加虚线框,从而提示用户虚线框内显示的内容为仿真对象。参考图4B,图4B为电子设备100显示如图4A所示的用户界面31时用户所看到的图像,用户可以看到仿真对象周围添加了虚线框,从而准确地分辨出所看到图像中的实体对象和仿真对象。
不限于图4A及图4B所示的在仿真对象周围添加虚线框的方式来提示用户,在一些实施例中,电子设备100还可以通过其他的显示方式来提示用户。例如,电子设备100还可以以动画效果显示仿真对象(例如间断而非持续性地显示仿真对象)、在仿真对象上添加图标或者文本、在实体对象上添加图标或者文本等方式,提示用户当前显示的图像中哪些是实体对象,哪些是仿真对象。
不限于视觉上的显示效果,电子设备100还可以通过语音、振动、闪光灯等方式来提示用户当前显示的图像中哪些是实体对象,哪些是仿真对象。例如,电子设备100可以配置有马达,当用户选中电子设备100所显示图像中的某个对象时,若该对象为实体对象,马达可以输出一次振动反馈,若该对象为仿真对象,马达可以输出两次振动反馈。
在本申请实施例中,可以将用于指示电子设备100显示的对象为仿真对象的信息称为第三提示信息。也即是说,第三提示信息可包括但不限于:电子设备100显示的可视化元素、语音、指示灯反馈或者振动反馈。
在本申请实施例中,可以将用于指示电子设备100显示的对象为实体对象的信息称为第四提示信息。也即是说,第四提示信息可包括但不限于:电子设备100显示的可视化元素、语音、指示灯反馈或者振动反馈。
在本申请实施例中,电子设备100可以发起对当前所显示的仿真对象的鉴权,即识别当前所显示的仿真对象是否是合法的仿真对象。电子设备100可以在以下两种情况下发起对当前所显示的仿真对象的鉴权:
(1)第一种情况,电子设备100检测到用于针对所显示的一个或多个仿真对象进行鉴权的操作,响应于该操作,电子设备100对该一个或多个仿真对象进行鉴权。
首先,用户可以在电子设备100所显示的一个或多个仿真对象中选中或者定向到将要鉴权的仿真对象。用户选中或者定向到将要鉴权的仿真对象的方式可以包括以下几种:
1.用户通过手部运动来选中或者定向到将要鉴权的仿真对象。
参考图5A,图5A示出了一种用户通过手部运动来选中或者定向到将要鉴权的仿真对象的方式。如图5A所示,电子设备100可以配合手持设备一起使用,该手持设备在真实物理世界中的运动可以转换成用户和电子设备100所显示的对象之间的交互操作。具体的,用户手部握持该手持设备进行运动,手持设备可以检测及跟踪自身运动信息后发送至电子设备100,使得电子设备100掌握手持设备的运动情况。这里,手持设备可以通过配置的加速度传感器、陀螺仪传感器、磁传感器等传感器检测及跟踪自身运动情况。电子设备100可以根据手持设备的运行情况,在显示屏上显示模拟的该手持设备定向到仿真对象的提示信息。如图5A所示,电子设备100可以在用户界面31上显示提示信息501,提示信息501用于提示用户当前定向到的或者选中的仿真对象为仿真人物307。该提示信息501可以为虚拟的箭头形状的光束。参考图5B,图5B为电子设备100显示如图5A所示的用户界面31时,用户所看到的图像。
不限于如图5A所示的用户通过通过手持设备来定向到仿真对象的方式,在其他一些实施例中,电子设备100还可以将用户手部的运动转换成用户和电子设备100所显示的仿真对象之间的交互操作。例如,电子设备100可以通过摄像头190捕获用户手部运动过程中的图像,并根据用户手部运动过程中的图像,在显示屏上显示模拟的用户手部运动定向到仿真对象的提示信息。
2.用户通过眼球运动来选中或者定向到将要鉴权的仿真对象。
参考图6A,图6A示出了一种用户通过眼球运动来选中或者定向到将要鉴权的仿真对象的方式。如图6A所示,用户眼球可注视电子设备100的显示屏,并转动眼球来选中或者定向到将要鉴权的仿真对象。电子设备100可通过眼动追踪(eye tracking)技术,通过红外设备(如红外发射器)和/或摄像头190来检测到用户的眼球注视到显示屏上的位置。如图6A 所示,电子设备100可以根据检测到的用户眼球注视显示屏的位置,在用户眼球注视到显示屏的位置上显示模拟的用户眼球定向到仿真对象的提示信息。如图6A所示,电子设备100可以在用户界面31上显示提示信息601,提示信息601用于提示用户当前定向到的或者选中的仿真对象为仿真人物307。该提示信息601可以为虚拟的圆形图标。参考图6B,图6B为电子设备100显示如图6A所示的用户界面31时,用户所看到的图像。
不限于通过手部运动或者眼球运动来选中将要鉴权的仿真对象,本申请实施例还可以通过其他的一些方式来选中将要鉴权的仿真对象。例如,用户还可以通过输入语音指令来选中将要鉴权的仿真对象等等。
用户选中或者定向到将要鉴权的仿真对象之后,可以触发电子设备100对该仿真对象进行鉴权。用户触发电子设备100对该选中或者定向到的仿真对象进行鉴权的操作可包括以下几种:
1.用户在电子设备100或者连接到电子设备100的手持设备上设置的实体按键或者虚拟按键上输入用户操作(例如按压操作、触摸操作等),响应于该用户操作,电子设备100对选中或者定向到的仿真对象进行鉴权。
示例性地,可参考图5A,用户可以在手持设备上的实体按键上输入按压操作,手持设备检测到该操作后,将该操作的指示信息发送至电子设备100。响应于该操作,电子设备100对用户选中或者定向到的仿真对象307进行鉴权。
2.用户通过眼球运动触发电子设备100对用户选中或者定向到的仿真对象进行鉴权。该眼球运动可包括:对该仿真对象输入的超过预设时长的注视操作、注视该仿真对象时输入的一次或多次的眨眼操作等等。电子设备100可通过红外设备和/或摄像头190检测到该眼球运动,响应于检测到的该眼球运动,电子设备100对选中或者定向到的仿真对象进行鉴权。
3.用户可以向电子设备100输入语音指令,电子设备100可以通过麦克风170C检测到该语音指令,并响应于该语音指令对选中或者定向到的仿真对象进行鉴权。
不限于上述3种触发电子设备100对仿真对象进行鉴权的方式,用户还可以通过其他方式触发电子设备100对选中或者定向到的仿真对象进行鉴权,本申请实施例对此不做限制。例如,在其他的一些实施例中,用户还可以通过手部运动模拟在仿真对象上的画圈手势,触发电子设备100对其进行鉴权。
在本申请实施例中,用于触发电子设备100对仿真对象进行鉴权的操作,可以被称为第一操作。也就是说,该第一操作可包括但不限于:作用于该仿真对象的手势输入,作用于该仿真对象的眨眼操作,作用于该仿真对象的注视操作(例如超过预设时长的注视操作),或者,用于确定该仿真对象是否合法的语音指令(例如语音指令“鉴权”)。
(2)第二种情况,电子设备100主动发起对当前所显示的一个或多个仿真对象的鉴权。
在一些实施例中,电子设备100可以在每次显示新的仿真对象时,对该仿真对象进行鉴权。在另一些实施例中,电子设备100还可以周期性对显示的一个或多个仿真对象进行鉴权,例如电子设备100可以每半个小时对显示的全部仿真对象进行鉴权等。
不限于在上述两种情况下发起对仿真对象的鉴权,在本申请实施例中,电子设备100还可以在其他情况下发起对当前所显示的仿真对象的鉴权。例如,用户还可以在和电子设备100连接或者配对的手机上触发电子设备100对当前所显示的仿真对象进行鉴权。
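To show how the triggering cases above could fit together, here is a small Python sketch of a dispatcher that starts authentication on an explicit user operation, when a new simulation object is displayed, or periodically. The half-hour interval is the example mentioned above; the class, method names, and authentication callback are assumptions made for this sketch.

```python
import time

AUTH_PERIOD_SECONDS = 30 * 60   # "every half an hour" is the example interval given above

class AuthTrigger:
    def __init__(self, authenticate):
        self.authenticate = authenticate              # callback running the flow of FIG. 13
        self.last_periodic_check = time.monotonic()

    def on_user_operation(self, target_object_id: str) -> None:
        # First operation: gesture, blink, gaze, or a voice command such as "authenticate".
        self.authenticate([target_object_id])

    def on_object_displayed(self, object_id: str) -> None:
        # Device-initiated: authenticate a simulation object as soon as it is displayed.
        self.authenticate([object_id])

    def on_frame(self, displayed_object_ids: list[str]) -> None:
        # Device-initiated: periodically re-authenticate everything currently displayed.
        now = time.monotonic()
        if now - self.last_periodic_check >= AUTH_PERIOD_SECONDS:
            self.last_periodic_check = now
            self.authenticate(list(displayed_object_ids))
```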
电子设备100对仿真对象进行鉴权的过程涉及到和服务器400、鉴权服务器500的交互,该鉴权过程将在后续实施例进行详细的描述,在此暂不赘述。
在本申请实施例中,电子设备100对仿真对象进行鉴权的过程当中,可以提示用户当前正在执行鉴权操作。电子设备100提示用户当前正在执行鉴权操作的方式可包括以下几种:
1.电子设备100可以在显示屏上显示提示信息,该提示信息用于提示用户当前正在执行鉴权操作。
参考图7A,图7A示出了一种可能的电子设备100在显示屏上显示的提示信息。如图7A所示,电子设备100可以在提供的用户界面31上显示窗口701,窗口701中包括文本信息“正在对仿真对象进行鉴权,请等待…”,该文本信息可用于提示用户当前正在执行鉴权操作。参考图7B,图7B为电子设备100显示如图7A所示的用户界面31时,用户所看到的图像。在一些实施例中,窗口701在电子设备100的显示屏上显示一段时间后自动消失,无需用户交互。
不限于图7A所示的窗口701,电子设备100还可以在显示屏上显示其他形式的提示信息,例如图标、动画等等,来提示用户当前正在执行鉴权操作,本申请实施例对此不做限制。
2.电子设备100可以通过语音、闪烁指示灯、振动等方式提示用户当前正在执行鉴权操作。
具体的,电子设备100还可以通过扬声器170A播放语音“正在鉴权”、控制指示灯闪烁、控制马达振动等方式来提示用户当前正在执行鉴权操作。
在本申请实施例中,电子设备100输出的上述任意一种用于提示当前正在执行鉴权操作的信息可以被称为第二提示信息。即,第二提示信息可包括但不限于:电子设备100显示的可视化元素、语音、指示灯反馈或者振动反馈。
电子设备100对显示的仿真对象执行鉴权操作后,可以确定该仿真对象是否合法。
在一些实施例中,电子设备100执行针对一个或多个仿真对象的鉴权操作之后,可以将鉴权结果告知用户,即提示用户该一个或多个仿真对象是否合法。电子设备100提示用户该一个或多个仿真对象是否合法的方式可包括以下几种:
1.电子设备100可以在显示屏上显示提示信息,该提示信息用于提示用户仿真对象是否合法。
参考图8A,图8A示出了一种可能的电子设备100在显示屏上显示的提示信息。如图8A所示,电子设备100可以在提供的用户界面31上显示图标801,该图标801可用于提示用户仿真对象307非法。在一些实施例中,图标801在电子设备100的显示屏上显示一段时间后自动消失,无需用户交互。参考图8B,图8B为电子设备100显示如图8A所示的用户界面时,用户所看到的图像。不限于图8A所示的图标801,电子设备100还可以在显示屏上显示其他形式的提示信息,例如文本、动画等等,从而提示用户仿真对象是否合法,本申请实施例对此不做限制。
2.电子设备100可以通过语音、闪烁指示灯、振动等方式提示用户仿真对象是否合法。
例如,电子设备100可以通过扬声器170C播放语音“合法”来提示用户该仿真对象合法,通过扬声器170C播放语音“非法”来提示用户仿真对象非法。又例如,电子设备100可以控制指示灯闪烁一次来提示用户该仿真对象合法,控制指示灯闪烁两次来提示用户仿真对象非法。又例如,电子设备100可以控制马达输出一次振动来提示用户该仿真对象合法,控制马达输出两次振动来提示用户仿真对象非法。
在本申请实施例中,电子设备100输出的上述任意一种用于提示仿真对象是否合法的信息可以被称为第一提示信息。即,第一提示信息可包括但不限于:电子设备100显示的可视化元素、语音、指示灯反馈或者振动反馈。
通过上述任意一种方式来提示用户仿真对象的鉴权结果,可以使得用户获知当前看到的仿真对象是否合法。如果当前显示的仿真对象是非法的,则可以提高用户的警觉性,避免用户在仿真对象的诱导下泄露个人信息(例如泄露家庭住址、电话号码等)或者执行不安全的操作(例如点击不安全的网站或者链接等),从而保护用户隐私,提升电子设备的使用安全。
电子设备100通过上述任意一种方式提示用户仿真对象的鉴权结果之后,用户可以获知该仿真对象是否是合法的。在一些实施例中,用户可以选择对非法的仿真对象做进一步处理。该处理可包括:屏蔽,和/或,举报。下面介绍几种用户屏蔽或者举报非法的仿真对象的方式。
参考图9A及图9C,在检测到作用于如图9A所示的用户界面31上图标801的用户操作时,响应于该操作,电子设备100在用户界面31上显示以下一个或多个控件:控件901、控件902,即电子设备100显示如图9C所示的用户界面。图9A所示的用户界面31和图8A所示的用户界面31相同,可参照相关描述。这里,作用于图标801上的用户操作可以包括:用户通过手部运动或眼球运动来选中或者定向到图标801上后,在电子设备100或者手持设备上设置的按键上输入的用户操作(例如按压操作、触摸操作等),或者,用户眼睛长时间地注视图标801的操作,或者,用户眼睛注视图标801时输入的一次或多次眨眼操作,或者,用户输入的语音指令等等。用户通过手部运动或眼部运动选中或者定向到图标801上的具体实现,可参照前文实施例中关于用户通过手部运动或眼部运动选中或者定向到仿真对象的相关描述。
参考图9C,图9C所示的用户界面31中,控件901可以接收用户操作,响应于该作用于控件901上的用户操作,电子设备100停止在显示屏上显示仿真对象307,即屏蔽仿真对象307。这里,作用于控件901上的用户操作可以包括:用户通过手部运动或眼球运动来选中或者定向到控件901上后,在电子设备100或者手持设备上设置的按键上输入的用户操作(例如按压操作、触摸操作等),或者,用户眼睛长时间地注视控件901的操作,或者,用户眼睛注视控件901时输入的一次或多次眨眼操作,或者,用户输入的语音指令等等。用户通过手部运动或眼部运动选中或者定向到控件901上的具体实现,可参照前文实施例中关于用户通过手部运动或眼部运动选中或者定向到仿真对象的相关描述。
参考图9C,图9C所示的用户界面31中,控件902可以接收用户操作,响应于该作用于控件902上的用户操作,电子设备100举报仿真对象307。电子设备100举报该仿真对象307是指,电子设备100将仿真对象307的标识发送至鉴权服务器500。在一些实施例中,鉴权服务器500可以存储电子设备100举报的非法仿真对象的标识,在接收到其他电子设备针对仿真对象的鉴权请求时,如果该仿真对象的标识和存储的非法仿真对象的标识相同,可直接向该其他电子设备反馈该仿真对象非法的鉴权结果。
这里,作用于控件902上的用户操作可以包括:用户通过手部运动或眼球运动来选中或者定向到控件902上后,在电子设备100或者手持设备上设置的按键上输入的用户操作(例如按压操作、触摸操作等),或者,用户眼睛长时间地注视控件902的操作,或者,用户眼睛注视控件902时输入的一次或多次眨眼操作,或者,用户输入的语音指令等等。用户通过手部运动或眼部运动选中或者定向到控件902上的具体实现,可参照前文实施例中关于用户通过手部运动或眼部运动选中或者定向到仿真对象的相关描述。
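A possible server-side treatment of reported objects, consistent with the behavior described above, is sketched below in Python: the authentication server records the identifiers of reported illegal simulation objects and answers later authentication requests for those identifiers directly, without rerunning the full flow. The class and method names are assumptions made for this example.

```python
class AuthenticationServer:
    def __init__(self, full_authentication):
        self.full_authentication = full_authentication   # runs the challenge/response flow
        self.reported_ids: set[str] = set()

    def report(self, simulation_object_id: str) -> None:
        # Called when an electronic device reports an object's identifier.
        self.reported_ids.add(simulation_object_id)

    def handle_auth_request(self, simulation_object_id: str) -> str:
        # Identifiers already reported are answered as "illegal" directly,
        # without rerunning the full authentication exchange.
        if simulation_object_id in self.reported_ids:
            return "illegal"
        return self.full_authentication(simulation_object_id)
```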
参考图9B,图9B示出了电子设备100显示如图9A所示的用户界面时,用户所看到的图像。参考图9D,图9D示出了电子设备100显示如图9C所示的用户界面时,用户所看到的图像。
不限于通过图9A及图9C所示的方式来屏蔽或者举报非法的仿真对象,在本申请其他一些实施例中,电子设备100还可以通过其他方式来屏蔽或者举报非法的仿真对象。例如,用户还可以在仿真对象307上输入模拟的画叉手势,触发电子设备100举报仿真对象307。又例如,用户还可以在仿真对象307上输入模拟的画圈手势,触发电子设备100屏蔽仿真对象307。
在本申请实施例中,上述提及的任意一种用于触发电子设备屏蔽非法仿真对象的操作可以被称为第二操作。也就是说,第二操作可以包括但不限于:作用于该非法仿真对象的手势输入,作用于该非法仿真对象的眨眼操作,作用于该非法仿真对象的注视操作(例如超过预设时长的注视操作),或者,用于屏蔽该非法仿真对象的语音指令。
在本申请实施例中,上述提及的任意一种用于触发电子设备举报非法仿真对象的操作可以被称为第三操作。也就是说,第三操作可以包括但不限于:作用于该非法仿真对象的手势输入,作用于该非法仿真对象的眨眼操作,作用于该非法仿真对象的注视操作(例如超过预设时长的注视操作),或者,用于举报该非法仿真对象的语音指令。
在另一些实施例中,电子设备100执行针对一个或多个仿真对象的鉴权操作之后,可以不将鉴权结果告知用户,即电子设备100可以不提示用户该一个或多个仿真对象是否合法,并且主动对非法的仿真对象做进一步处理。该处理可包括:屏蔽,和/或,举报。
参考图10A,图10A示出的用户界面31为电子设备100屏蔽非法的仿真对象后所显示的界面。如图10A所示,电子设备100停止在显示屏上显示非法的仿真对象(即仿真人物307)。
在一些实施例中,电子设备100屏蔽非法的仿真对象之后,可以提示用户电子设备100已经屏蔽了非法的仿真对象。参考图10A,其示出了电子设备100提示用户的一种方式。如图10A所示,电子设备100可以在显示屏上显示窗口1001。窗口1001中包括提示信息(例如文本信息“已为您屏蔽非法的仿真对象,请继续体验AR/MR场景!”)。参考图10B,图10B示出了电子设备100显示如图10A所示的用户界面31时,用户所看到的图像。不限于图10A所示的窗口1001,电子设备100还可以在显示屏上显示其他形式的提示信息,例如图标、动画等等。不限于在显示屏上显示的提示信息,电子设备100还可以通过语音、振动、闪光灯等来提示用户电子设备100已经屏蔽了非法的仿真对象,本申请实施例对此不做限制。
在一些实施例中,电子设备100举报非法的仿真对象之后,可以提示用户电子设备100已经举报了非法的仿真对象。参考图11A,其示出了电子设备100提示用户的一种方式。如图11A所示,电子设备100可以在显示屏上显示窗口1101。窗口1101中包括提示信息(例如文本信息“已成功举报非法的仿真对象,请继续体验AR/MR场景!”)。参考图11B,图11B示出了电子设备100显示如图11A所示的用户界面31时,用户所看到的图像。不限于图11A所示的窗口1101,电子设备100还可以在显示屏上显示其他形式的提示信息,例如图标、动画等等。不限于在显示屏上显示的提示信息,电子设备100还可以通过语音、振动、闪光灯等来提示用户电子设备100已经举报了非法的仿真对象,本申请实施例对此不做限制。
不限于上述图3A-图3C、图4A-图4B、图5A-图5B、图6A-图6B、图7A-图7B、图8A-图8B、图9A-图9D、图10A-图10B、图11A-图11B所示的AR/MR显示场景,本申请实施例的仿真对象的识别方法还可以应用到VR显示场景中。也就是说,电子设备100为用户提 供VR显示场景时,也可以实施本申请实施例提供的仿真对象的识别方法。
不限于上述图3A-图3C、图4A-图4B、图5A-图5B、图6A-图6B、图7A-图7B、图8A-图8B、图9A-图9D、图10A-图10B、图11A-图11B所示的在显示屏上直接显示仿真对象,在一些实施例中,电子设备可以配置有光学装置,该光学装置可以投射光学信号到用户的视网膜上,使得用户可以通过电子设备100看到这些仿真对象。例如,参考图5B,图5B所示的用于提示用户定向到仿真对象的箭头形状的光束,可以是由光学装置投射光学信号到用户的视网膜上之后,用户所看到的对象。又例如,参考图8A,图8A所示的用于提示用户仿真对象307非法的图标801,可以是由光学装置投射光学信号到用户的视网膜上之后,用户所看到的对象。
下面介绍本申请实施例提供的电子设备100对仿真对象进行鉴权的过程,即电子设备100识别仿真对象是否合法的过程。
在本申请实施例中,合法的仿真对象均预先注册到注册设备,由注册设备为合法的仿真对象分配标识,该标识用于鉴权。图12示出了本申请实施例示例性提供的合法仿真对象的注册流程。如图12所示,该注册流程可包括如下步骤:
步骤S110、电子设备100或计算设备200生成仿真对象。
步骤S120、电子设备100或计算设备200向注册设备300发送注册请求。
步骤S130、注册设备300响应于该注册请求,为该仿真对象分配标识。
在一些实施例中,注册设备300为仿真对象分配的标识可以是通用唯一识别码(universally unique identifier,UUID)。
步骤S140、注册设备300使用第一算法对该标识进行加密后得到密钥值,并将该标识和密钥值分别发送给电子设备100和服务器400。
在一些实施例中,第一算法可以为K4算法,使用第一算法对仿真对象的标识加密后得到的密钥值为鉴权键(Ki)。
步骤S140中注册设备300传输的为具体的数值,可以避免泄露第一算法。
步骤S150、电子设备100、服务器400关联存储该仿真对象的标识和密钥值。
通过上述注册流程,电子设备100、服务器400中都关联存储有合法仿真对象的标识以及对应的密钥值。
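As a rough sketch of the registration flow of steps S110–S150, the Python snippet below assigns a UUID to a simulation object, derives a key value Ki from the identifier with a keyed hash that stands in for the first algorithm (the K4 algorithm itself is not reproduced here), and stores the identifier/Ki pair on both the electronic device 100 and the server 400. The secret, function names, and data structures are assumptions for illustration only.

```python
import hashlib
import hmac
import uuid

REGISTRATION_SECRET = b"registration-device-secret"   # hypothetical key material standing in for K4

def first_algorithm(identifier: str) -> bytes:
    # Placeholder for the first algorithm (K4 in the description): derives the
    # key value Ki from the simulation object's identifier.
    return hmac.new(REGISTRATION_SECRET, identifier.encode(), hashlib.sha256).digest()

def register_simulation_object(device_store: dict, server_store: dict) -> str:
    object_id = str(uuid.uuid4())       # S130: assign a UUID to the legal simulation object
    ki = first_algorithm(object_id)     # S140: encrypt the identifier to obtain Ki
    device_store[object_id] = ki        # S150: electronic device 100 stores (identifier, Ki)
    server_store[object_id] = ki        # S150: server 400 stores (identifier, Ki)
    return object_id
```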
参考图13,图13示出了本申请实施例提供的电子设备100对仿真对象进行鉴权的过程。电子设备100对仿真对象进行鉴权的时机可参照前文实施例的相关描述。如图13所示,该鉴权过程可包括如下步骤:
步骤S210、电子设备100向鉴权服务器500发送鉴权请求,该鉴权请求携带有电子设备100请求鉴权的仿真对象的标识。
该仿真对象的标识可以是通用唯一识别码(universally unique identifier,UUID)。如果该仿真对象合法,则仿真对象的UUID可以是注册设备为该仿真对象分配的。
步骤S220、鉴权服务器500接收到鉴权请求,将该仿真对象的标识发送给服务器400。
步骤S230、服务器400查找该仿真对象的标识对应的密钥值,并至少通过第二算法和/或第三算法对该密钥值做计算,生成第一鉴权响应值。
服务器400中存储了合法仿真对象的标识及对应的密钥值。合法仿真对象的标识及对应的密钥值可以是注册设备下发给服务器400的。该密钥值可以是通过第一算法对该标识做计算之后得到的值。合法仿真对象的标识所对应的密钥值可以为鉴权键Ki。
第一鉴权响应值可以为符号响应(sign response,SRES)。
步骤S240、服务器400将第一鉴权响应值发送给鉴权服务器500。
步骤S250、鉴权服务器500接收到第一鉴权响应值,向电子设备100发送鉴权请求。
步骤S260、电子设备100接收到鉴权请求,查找该仿真对象的标识对应的密钥值,并通过第二算法和第三算法对该密钥值做计算,生成第二鉴权响应值。
电子设备100和服务器400相同,同样存储有合法仿真对象的标识及对应的密钥值。合法仿真对象的标识及对应的密钥值可以是注册设备下发给电子设备100的。该密钥值可以是通过第一算法对该标识做计算之后得到的值。在一些实施例中,该第一算法可以为K4算法,合法仿真对象的标识所对应的密钥值可以为鉴权键Ki。
在一些实施例中,第二算法可以为A3算法,第三算法可以为A8算法。第二鉴权响应值可以为符号响应(sign response,SRES)。
步骤S270、电子设备100将第二鉴权响应值发送给鉴权服务器500。
步骤S280、鉴权服务器500接收到第二鉴权响应值,对比第一鉴权响应值和第二鉴权响应值,若两者相同,则该仿真对象合法,若两者不同,则该仿真对象非法。
上述步骤S240和步骤S280传输的为具体的数值,可以避免泄露第二算法和第三算法。
步骤S290、鉴权服务器500将鉴权结果发送给电子设备100。
可理解的,上述提及的第一算法、第二算法、第三算法仅为示例,在本申请其他的一些实施例中,第一算法、第二算法、第三算法也可以被替换为其他算法。
电子设备100在对非法的仿真对象进行鉴权时,该非法的仿真对象有可能也有对应的标识。但是,由于该非法的仿真对象没有经过注册流程,并且电子设备100无法获知第一算法,因此电子设备100中存储的该非法的仿真对象的密钥值是无效的。类似的,由于非法的仿真对象没有经过注册流程,并且服务器400无法获知第一算法,因此服务器400中存储的该非法的仿真对象的密钥值也是无效的。在鉴权过程中,针对非法的仿真对象,服务器400计算得到的第一鉴权响应值和电子设备100计算得到的第二鉴权响应值不同,鉴权服务器500可判定该仿真对象非法。
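Putting steps S210–S290 together, a simplified end-to-end check could look like the Python sketch below. A single keyed hash stands in for the second and third algorithms (A3/A8 in the description), and an object that never went through registration has no valid Ki on either side, so its two response values do not match. Everything concrete here is an assumption made for illustration, not the patented implementation.

```python
import hashlib
import hmac

def second_and_third_algorithms(ki: bytes, object_id: str) -> bytes:
    # Placeholder for computing an authentication response value (SRES) from Ki;
    # stands in for the A3/A8 computations named in the description.
    return hmac.new(ki, object_id.encode(), hashlib.sha256).digest()

def authenticate(object_id: str, device_store: dict, server_store: dict) -> str:
    # S230: server 400 computes the first response value from its stored key value.
    # S260: electronic device 100 computes the second response value from its stored key value.
    # Unregistered identifiers fall back to two different placeholder keys, which
    # mirrors the case where the stored key values are invalid.
    server_ki = server_store.get(object_id, b"invalid-server-key")
    device_ki = device_store.get(object_id, b"invalid-device-key")
    first_response = second_and_third_algorithms(server_ki, object_id)
    second_response = second_and_third_algorithms(device_ki, object_id)
    # S280: the authentication server 500 compares the two values.
    legal = hmac.compare_digest(first_response, second_response)
    # S290: the result is sent back to the electronic device 100.
    return "legal" if legal else "illegal"
```

With the stores filled by the register_simulation_object sketch above, a registered identifier authenticates as legal, while any other identifier comes back illegal.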
不限于如图13所示的鉴权过程,还可以通过其他方式来对仿真对象进行鉴权,本申请实施例对此不做限制。例如,还可以通过类似通用移动通讯系统(universal mobile telecommunications system,UMTS)、长期演进(long term evolution,LTE)等技术中有效国际移动用户识别码(international mobile subscriber identification number,IMSI)的鉴权方式来对仿真对象进行鉴权等。
通过上述实施例可知,实施本申请提供的技术方案,可以提高用户的警觉性,保障用户操作的安全,避免泄露用户隐私,从而提升电子设备的使用安全。
本申请的各实施方式可以任意进行组合,以实现不同的技术效果。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算 机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘Solid State Disk)等。
总之,以上所述仅为本发明技术方案的实施例而已,并非用于限定本发明的保护范围。凡根据本发明的揭露,所作的任何修改、等同替换、改进等,均应包含在本发明的保护范围之内。

Claims (23)

  1. 一种仿真对象的身份识别方法,其特征在于,所述方法包括:
    电子设备通过显示装置显示仿真对象,所述仿真对象是利用计算机图形技术生成的虚拟图像;
    所述电子设备确定所述仿真对象是否合法;
    响应于所述仿真对象是否合法的确定结果,所述电子设备输出第一提示信息,所述第一提示信息用于指示所述仿真对象是否合法;
    其中,在所述电子设备对应的注册设备注册过的仿真对象是合法的仿真对象,未在所述电子设备对应的注册设备注册过的仿真对象是非法的仿真对象。
  2. 根据权利要求1所述的身份识别方法,其特征在于,
    所述显示装置为显示屏,所述电子设备通过显示装置显示仿真对象具体包括:所述电子设备通过所述显示屏显示仿真对象;或者,
    所述显示装置包括光学装置,所述电子设备通过显示装置显示仿真对象具体包括:所述电子设备通过所述光学装置投射和所述仿真对象所对应的光学信号。
  3. 根据权利要求1或2所述的身份识别方法,其特征在于,所述仿真对象是由所述电子设备生成的,或者,所述仿真对象是由计算设备生成后发送给所述电子设备的。
  4. 根据权利要求1-3任一项所述的身份识别方法,其特征在于,所述第一提示信息,包括以下一项或多项:所述显示装置显示的可视化元素、语音、指示灯反馈或者振动反馈。
  5. 根据权利要求1-4任一项所述的身份识别方法,其特征在于,所述电子设备确定所述仿真对象是否合法,包括:
    所述电子设备检测到作用于所述仿真对象的第一操作,响应于所述第一操作,所述电子设备确定所述仿真对象是否合法;或者,
    所述电子设备在显示所述仿真对象的同时,确定所述仿真对象是否合法;或者,
    所述电子设备周期性地确定所述仿真对象是否合法。
  6. 根据权利要求5所述的身份识别方法,其特征在于,所述作用于所述仿真对象的第一操作包括以下任意一项:作用于所述仿真对象的手势输入,作用于所述仿真对象的眨眼操作,作用于所述仿真对象的注视操作,或者,用于确定所述仿真对象是否合法的语音指令。
  7. 根据权利要求1-6任一项所述的身份识别方法,其特征在于,所述电子设备确定所述仿真对象是否合法时,所述方法还包括:所述电子设备输出第二提示信息,所述第二提示信息用于指示所述电子设备正在确定所述仿真对象是否合法。
  8. 根据权利要求7所述的身份识别方法,其特征在于,所述第二提示信息,包括以下一项或多项:所述显示装置显示的可视化元素、语音、指示灯反馈或者振动反馈。
  9. 根据权利要求1-8任一项所述的身份识别方法,其特征在于,所述方法还包括:
    响应于所述仿真对象非法的确定结果,所述电子设备停止显示所述仿真对象。
  10. 根据权利要求1-8任一项所述的身份识别方法,其特征在于,所述方法还包括:
    若所述仿真对象为非法的仿真对象,所述电子设备检测作用于所述仿真对象的第二操作,响应于所述第二操作,所述电子设备停止显示所述仿真对象。
  11. 根据权利要求1-10任一项所述的身份识别方法,其特征在于,所述方法还包括:
    所述电子设备显示实体对象;
    所述电子设备输出第三提示信息,所述第三提示信息用于指示所述仿真对象是利用计算 机图形技术生成的虚拟图像;和/或,
    所述电子设备输出第四提示信息,所述第四提示信息用于指示所述实体对象对应的物体存在于真实世界中。
  12. 根据权利要求1-11任一项所述的身份识别方法,其特征在于,所述电子设备确定所述仿真对象是否合法,具体包括:
    所述电子设备将所述仿真对象的标识发送给鉴权服务器,以使得所述鉴权服务器判断所述仿真对象是否合法;
    所述电子设备接收所述鉴权服务器返回的鉴权结果,所述鉴权结果指示所述仿真对象是否合法;
    所述电子设备根据所述鉴权结果确定所述仿真对象是否合法。
  13. 根据权利要求1-12任一项所述的身份识别方法,其特征在于,所述仿真对象包括以下一项或多项:仿真人物、仿真动物、仿真树木或仿真建筑物。
  14. 一种仿真对象的识别方法,其特征在于,所述方法包括:
    头戴式设备通过显示装置显示仿真对象和实体对象,所述仿真对象是利用计算机图形技术生成的虚拟图像;
    所述头戴式设备检测到作用于所述仿真对象的第一操作,响应于所述第一操作,所述头戴式设备确定所述仿真对象是否合法;
    所述头戴式设备输出第二提示信息,所述第二提示信息用于指示所述头戴式设备正在确定所述仿真对象是否合法;
    响应于所述仿真对象是否合法的确定结果,所述头戴式设备输出第一提示信息,所述第一提示信息用于指示所述仿真对象是否合法;
    若所述仿真对象为非法的仿真对象,所述头戴式设备检测作用于所述仿真对象的第二操作,响应于所述第二操作,所述头戴式设备停止显示所述仿真对象;
    其中,在所述头戴式设备对应的注册设备注册过的仿真对象是合法的仿真对象,未在所述头戴式设备对应的注册设备注册过的仿真对象是非法的仿真对象;
    所述仿真对象包括仿真人物或仿真动物。
  15. 一种电子设备上的图形用户界面,所述电子设备包括显示装置、存储器、以及一个或多个处理器,所述一个或多个处理器用于执行存储在所述存储器中的一个或多个计算机程序,其特征在于,所述图形用户界面包括:
    显示仿真对象,所述仿真对象是利用计算机图形技术生成的虚拟图像;
    响应于所述仿真对象是否合法的确定结果,输出第一提示信息,所述第一提示信息用于指示所述仿真对象是否合法;
    其中,在所述电子设备对应的注册设备注册过的仿真对象是合法的仿真对象,未在所述电子设备对应的注册设备注册过的仿真对象是非法的仿真对象。
  16. 根据权利要求15所述的图形用户界面,其特征在于,
    所述显示装置为显示屏,显示仿真对象具体包括:触发所述显示屏显示仿真对象;或者,
    所述显示装置包括光学装置,显示仿真对象具体包括:触发所述光学装置投射和所述仿真对象所对应的光学信号。
  17. 根据权利要求15或16所述的图形用户界面,其特征在于,所述图形用户界面还包括:在输出所述第一提示信息之前,输出第二提示信息,所述第二提示信息用于指示所述电子设备正在确定所述仿真对象是否合法。
  18. 根据权利要求15-17任一项所述的图形用户界面,其特征在于,所述图形用户界面还包括:响应于所述仿真对象非法的确定结果,停止显示所述仿真对象。
  19. 根据权利要求15-17任一项所述的图形用户界面,其特征在于,所述图形用户界面还包括:
    若所述仿真对象为非法的仿真对象,检测作用于所述仿真对象的第二操作,响应于所述第二操作,停止显示所述仿真对象。
  20. 根据权利要求15-19任一项所述的图形用户界面,其特征在于,所述图形用户界面还包括:
    显示实体对象;
    输出第三提示信息,所述第三提示信息用于指示所述仿真对象是利用计算机图形技术生成的虚拟图像;和/或,
    输出第四提示信息,所述第四提示信息用于指示所述实体对象对应的物体存在于真实世界中。
  21. 一种电子设备,所述电子设备包括:一个或多个处理器、存储器和显示装置;
    所述存储器与所述一个或多个处理器耦合,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,所述一个或多个处理器调用所述计算机指令以使得所述电子设备执行如权利要求1-13中任一项所述的方法。
  22. 一种仿真对象的身份识别系统,其特征在于,所述系统包括:电子设备、注册设备、鉴权服务器;其中:
    所述注册设备用于对仿真对象进行注册;
    所述电子设备用于通过显示装置显示仿真对象;所述电子设备为权利要求37-49任一项所述的电子设备;
    所述鉴权服务器用于判断所述电子设备显示的仿真对象是否合法;
    其中,在所述注册设备注册过的仿真对象是合法的仿真对象,未在所述注册设备注册过的仿真对象是非法的仿真对象。
  23. 一种计算机可读存储介质,包括指令,其特征在于,当所述指令在电子设备上运行时,使得所述电子设备执行如权利要求1-13中任一项所述的方法。
PCT/CN2020/096933 2019-06-21 2020-06-19 仿真对象的身份识别方法、相关装置及系统 WO2020253800A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20827624.6A EP3964985A4 (en) 2019-06-21 2020-06-19 Simulation object identity recognition method, and related apparatus and system
US17/620,424 US11887261B2 (en) 2019-06-21 2020-06-19 Simulation object identity recognition method, related apparatus, and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910543117.7 2019-06-21
CN201910543117.7A CN110348198B (zh) 2019-06-21 2019-06-21 仿真对象的身份识别方法、相关装置及系统

Publications (1)

Publication Number Publication Date
WO2020253800A1 true WO2020253800A1 (zh) 2020-12-24

Family

ID=68182759

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/096933 WO2020253800A1 (zh) 2019-06-21 2020-06-19 仿真对象的身份识别方法、相关装置及系统

Country Status (4)

Country Link
US (1) US11887261B2 (zh)
EP (1) EP3964985A4 (zh)
CN (1) CN110348198B (zh)
WO (1) WO2020253800A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110348198B (zh) 2019-06-21 2024-04-12 华为技术有限公司 仿真对象的身份识别方法、相关装置及系统
CN111324209A (zh) * 2020-04-09 2020-06-23 云南电网有限责任公司电力科学研究院 一种增强体感与平衡的vr显示设备及电路
WO2023214905A1 (en) * 2022-05-04 2023-11-09 Telefonaktiebolaget Lm Ericsson (Publ) Methods and devices for selective sharing of information in extended reality
CN117325762A (zh) * 2022-06-27 2024-01-02 深圳市中兴微电子技术有限公司 一种车辆控制方法、系统及ar头显
CN116436723B (zh) * 2023-06-13 2023-09-01 北京集度科技有限公司 总线识别方法、确定方法和执行方法及相关装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102483856A (zh) * 2009-06-25 2012-05-30 三星电子株式会社 虚拟世界处理装置及方法
CN107977834A (zh) * 2016-10-21 2018-05-01 阿里巴巴集团控股有限公司 一种虚拟现实/增强现实空间环境中的数据对象交互方法及装置
US20180261011A1 (en) * 2017-03-09 2018-09-13 Samsung Electronics Co., Ltd. System and method for enhancing augmented reality (ar) experience on user equipment (ue) based on in-device contents
WO2019040065A1 (en) * 2017-08-23 2019-02-28 Visa International Service Association SECURE AUTHORIZATION OF ACCESS TO PRIVATE DATA IN VIRTUAL REALITY
CN110348198A (zh) * 2019-06-21 2019-10-18 华为技术有限公司 仿真对象的身份识别方法、相关装置及系统

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9311883B2 (en) * 2011-11-11 2016-04-12 Microsoft Technology Licensing, Llc Recalibration of a flexible mixed reality device
US9331856B1 (en) * 2014-02-10 2016-05-03 Symantec Corporation Systems and methods for validating digital signatures
US9811650B2 (en) * 2014-12-31 2017-11-07 Hand Held Products, Inc. User authentication system and method
CN105681316B (zh) * 2016-02-02 2019-12-17 腾讯科技(深圳)有限公司 身份验证方法和装置
EP3424039A4 (en) * 2016-04-01 2019-11-06 Incontext Solutions, Inc. VIRTUAL REALITY PLATFORM FOR RETAIL LEVEL SIMULATION
CN107463245A (zh) * 2016-06-02 2017-12-12 上海矩道网络科技有限公司 虚拟现实仿真实验系统
EP3495949B1 (en) 2016-08-04 2022-06-01 Tencent Technology (Shenzhen) Company Limited Virtual reality-based information verification method, device, data storage medium, and virtual reality apparatus
US11269480B2 (en) * 2016-08-23 2022-03-08 Reavire, Inc. Controlling objects using virtual rays
CN109276887B (zh) * 2018-09-21 2020-06-30 腾讯科技(深圳)有限公司 虚拟对象的信息显示方法、装置、设备及存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3964985A4

Also Published As

Publication number Publication date
US20220351469A1 (en) 2022-11-03
CN110348198A (zh) 2019-10-18
US11887261B2 (en) 2024-01-30
CN110348198B (zh) 2024-04-12
EP3964985A4 (en) 2022-06-29
EP3964985A1 (en) 2022-03-09

Similar Documents

Publication Publication Date Title
WO2020253800A1 (zh) 仿真对象的身份识别方法、相关装置及系统
CN110321790B (zh) 一种对抗样本的检测方法及电子设备
EP3926448A1 (en) Device control page display method, related apparatus and system
WO2019128592A1 (zh) 进行直播的方法和装置
CN112003879B (zh) 用于虚拟场景的数据传输方法、计算机设备及存储介质
CN112835445B (zh) 虚拟现实场景中的交互方法、装置及系统
US11798234B2 (en) Interaction method in virtual reality scenario and apparatus
KR102452314B1 (ko) 컨텐츠 재생 방법 및 이를 지원하는 전자 장치
WO2022160991A1 (zh) 权限控制方法和电子设备
CN110365501B (zh) 基于图形码进行群组加入处理的方法及装置
WO2021238821A1 (zh) 快速匹配方法及头戴电子设备
EP3086216A1 (en) Mobile terminal and controlling method thereof
EP4044000A1 (en) Display method, electronic device, and system
CN111045945B (zh) 模拟直播的方法、装置、终端、存储介质及程序产品
CN108837509B (zh) 配置虚拟场景的设置参数的方法、计算机设备及存储介质
CN115017495B (zh) 定时校验方法、电子设备和可读存储介质
KR20140147057A (ko) 안경형 단말기 및 안경형 단말기의 제어방법
KR101622730B1 (ko) 이동 단말기 및 그 제어방법
CN116414500A (zh) 电子设备操作引导信息录制方法、获取方法和终端设备
WO2022042273A1 (zh) 密钥使用方法及相关产品
CN116774836A (zh) 一种气流产生方法、装置及电子设备
KR20170013082A (ko) 이동 단말기 및 그 제어방법
CN117641359A (zh) 数据处理方法及电子设备
KR20160005529A (ko) 글래스 타입의 이동 단말기 및 그 제어방법
CN116055233A (zh) 一种物联网设备配网方法、终端和系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20827624

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020827624

Country of ref document: EP

Effective date: 20211201

NENP Non-entry into the national phase

Ref country code: DE