WO2019061798A1 - Display control method and system, and virtual reality device - Google Patents


Info

Publication number
WO2019061798A1
Authority
WO
WIPO (PCT)
Prior art keywords
control
display
display content
near field
virtual reality
Prior art date
Application number
PCT/CN2017/113969
Other languages
English (en)
French (fr)
Inventor
秦文东
Original Assignee
歌尔科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 歌尔科技有限公司
Priority to JP2018555914A (granted as JP6983176B2)
Priority to EP17905916.7A (published as EP3690604A4)
Priority to KR1020187031157A (published as KR20190056348A)
Priority to US16/096,651 (published as US20190130647A1)
Publication of WO2019061798A1


Classifications

    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06T19/006: Mixed reality
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G06K19/0723: Record carriers with integrated circuit chips, the record carrier comprising an arrangement for non-contact communication, e.g. wireless communication circuits on transponder cards, non-contact smart cards or RFIDs
    • G06T19/003: Navigation within 3D models or images
    • H04B5/70: Near-field transmission systems, e.g. inductive or capacitive transmission systems, specially adapted for specific purposes
    • H04B5/77: Near-field transmission systems specially adapted for interrogation
    • G06F2203/012: Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Definitions

  • the present application relates to the field of virtual reality technologies, and in particular, to a display control method, system, and virtual reality device.
  • with the arrival of the data and information age, virtual reality technology has gradually come into wide use in many fields.
  • experiencers view the corresponding virtual reality scenes of those fields through virtual reality technology, for example experiencing apartment layouts in real estate sales, or experiencing teaching content realistically in education.
  • because designs differ, the operations corresponding to controlling the displayed content vary widely, and in most cases the experiencer does not know which operation achieves the control; improper operation causes the device to do useless work and increases power consumption.
  • aspects of the present application provide a display control method, system, and virtual reality device for conveniently controlling content displayed by a virtual reality interface.
  • a display control method provided by an embodiment of the present application includes:
  • if a near field interaction tag is detected, acquiring a control message pre-stored in the near field interaction tag, where the control message includes a control type and a display object; and performing a corresponding operation on the display object according to the control type;
  • a second aspect of the present disclosure provides a virtual reality device, where the device includes:
  • the processor is respectively connected to the memory and the near field interaction reader/writer;
  • the near field interaction reader/writer configured to read a control message pre-stored into the near field interaction tag, and send the control message to the processor;
  • the memory is for storing one or more computer instructions, wherein the one or more computer instructions are executed by the processor to implement the display control method provided by the present application.
  • a display control system provided by the embodiment of the present application includes: a near field interaction tag, and a virtual reality device provided by the embodiment of the present application.
  • in the display control method, system, and virtual reality device provided by the embodiments of the present application, a control message is recorded in a near field interaction tag.
  • when the virtual reality device of an experiencer needs to be controlled to display certain content, the control personnel bring the corresponding near field interaction tag close to the VR device.
  • the VR device can then read the control message in the near field interaction tag and respond accordingly, so that the content displayed by the virtual reality interface can be conveniently controlled even when the experiencer does not know how to perform the operation.
  • FIG. 1 is a flowchart of a display control method according to an embodiment of the present application
  • FIG. 2 is another flowchart of a display control method according to an embodiment of the present application.
  • FIG. 3 is still another flowchart of a display control method according to an embodiment of the present application.
  • FIG. 4 is still another flowchart of a display control method according to an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of a display control apparatus according to an embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of a virtual reality device according to an embodiment of the present disclosure.
  • the display control method provided by the embodiment of the present application is applied to a virtual reality device.
  • the virtual reality device may include a virtual reality product in the form of monocular glasses, binocular glasses, and the like.
  • the display control method provided by the embodiment of the present application is applied to a display control device.
  • the device may be an application software dedicated to display control, and may also be a function plug-in for an operating system, a control application, or the like.
  • a display control method provided by an embodiment of the present application includes the following steps:
  • near field communication (NFC) is a technology for short-range wireless interaction: when the distance between the two interacting parties is within a preset range, they can communicate wirelessly; the content to be transmitted is written into one party in advance, and the other party reads it once the two are close enough.
  • the interacting parties in the embodiment of the present application may be a near field interaction tag and a virtual reality device, to achieve the purpose of controlling the content displayed in the virtual reality device through the near field interaction tag.
  • the control message is written to the near field interaction tag in advance; when the near field interaction tag comes close to the virtual reality device, the device can detect the tag, read the message, and thereby complete the communication with the tag.
  • control messages include, but are not limited to, control types and display objects.
  • the virtual reality device can know what needs to be done according to the type of control.
  • Control types can include adjusting display brightness, switching display content, setting default parameters, and the like. For example, if the control type is to adjust the brightness, the virtual reality device knows that the operation of adjusting the display brightness should be performed when receiving the control type for adjusting the display brightness.
  • the display object may be a brightness value to be displayed, a content to be displayed, a target value of the set parameter, and the like. For example, if the virtual reality device knows that the control type is to adjust the display brightness and knows that the display brightness value is 90%, the virtual reality device adjusts the display brightness value to 90%.
  • after learning the type of operation from the control type and identifying the display object to be operated on, the device can operate on the display object according to the control type, for example adjusting the display brightness to 90%.
  • in the embodiment of the present application, a control message written in advance into a near field interaction tag placed nearby by the controller of the virtual reality device is read, and the display content is controlled according to the control message; the content displayed by the virtual reality interface can thus be conveniently controlled even when the experiencer of the virtual reality scene does not know how to perform the operation.
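The read-then-dispatch flow described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the `"<control_type>:<display_object>"` message layout is inferred from the examples in this document, and the `VrDisplay` class, handler names, and `brightness` control type are assumptions for illustration.

```python
# Sketch: dispatch a control message read from a near field interaction tag.
# Assumed format: "<control_type>:<display_object>", e.g. "guide:com.android.vr.test:A"
# (only the FIRST ":" separates the two fields).

class VrDisplay:
    """Hypothetical stand-in for the virtual reality device's display state."""
    def __init__(self):
        self.brightness = 0.5
        self.scene = None

    def set_brightness(self, value: str) -> str:
        # Display object is a brightness value such as "90%".
        self.brightness = float(value.rstrip("%")) / 100
        return f"brightness={self.brightness:.2f}"

    def switch_scene(self, scene_id: str) -> str:
        # Display object is a virtual reality scene identifier.
        self.scene = scene_id
        return f"scene={self.scene}"

def handle_control_message(display: VrDisplay, message: str) -> str:
    """Split the message into control type and display object, then dispatch."""
    control_type, _, display_object = message.partition(":")
    handlers = {
        "brightness": display.set_brightness,
        "guide": display.switch_scene,
    }
    if control_type not in handlers:
        raise ValueError(f"unknown control type: {control_type!r}")
    return handlers[control_type](display_object)

d = VrDisplay()
print(handle_control_message(d, "brightness:90%"))               # brightness=0.90
print(handle_control_message(d, "guide:com.android.vr.test:A"))  # scene=com.android.vr.test:A
```

The lookup-table dispatch keeps new control types cheap to add: registering one more handler is enough, which matches the document's open-ended list of control types (brightness, switching, defaults).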
  • in practical applications, the user may need to switch the displayed scene of the virtual reality device.
  • for example, when an experiencer tours apartment types at a sales office and finishes visiting apartment type A through the virtual reality device, the virtual scene corresponding to type A needs to be switched to the virtual scene corresponding to type B.
  • to meet this need, in the embodiment of the present application the control type is a switching control and the display object is the identifier of the virtual reality scene to switch to.
  • in this embodiment the control type is a switching control, which may be embodied as guide;
  • the display object is the identifier of the virtual reality scene to switch to, and may be embodied as com.android.vr.test:A, i.e., scene A in the application com.android.vr.test.
  • the control message is then guide:com.android.vr.test:A, which at the application level means: switch to scene A in the application named com.android.vr.test.
  • S202 Switch to display a virtual reality scene corresponding to the virtual reality scene identifier.
  • the virtual reality scene identifier may be a name of the virtual reality scene.
  • in actual use, if the experiencer finishes viewing apartment type A and needs to view apartment type B next, the control personnel bring the near field interaction tag of type B, into which the message "switch to the virtual scene corresponding to type B" has been written, close to the virtual reality device.
  • the virtual reality device can then read the message and carry out the corresponding control operation, that is, switch the scene to apartment type B.
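Taking the guide:com.android.vr.test:A message apart requires two splits, because the scene identifier itself contains a colon. The sketch below splits off the control type at the first colon and separates the application name from the scene name at the last one; the assumption that application names do not end in a colon is mine, and the format is inferred from the example above rather than specified normatively.

```python
def parse_guide_message(message: str) -> tuple:
    """Return (application, scene) from a 'guide:<app>:<scene>' control message."""
    control_type, _, scene_id = message.partition(":")
    if control_type != "guide":
        raise ValueError(f"not a switching control: {control_type!r}")
    # The scene identifier "com.android.vr.test:A" contains ":" itself,
    # so split from the right to separate application from scene.
    app, _, scene = scene_id.rpartition(":")
    return app, scene

print(parse_guide_message("guide:com.android.vr.test:A"))
# ('com.android.vr.test', 'A')
```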
  • control type is a default display content configuration control, the display object being a display content identification that needs to be configured as the default display content.
  • the embodiment of the present application includes:
  • the control type in the embodiment of the present application is a default display content configuration control
  • the display object is a display content identifier that needs to be configured as a default display content.
  • the display content identifier may be a name of the display content.
  • the control type can be embodied as default_guide
  • the display object can be embodied as com.android.vr.test:A.
  • the control message is then default_guide:com.android.vr.test:A,
  • which at the application level means: set scene A in the application named com.android.vr.test as the default scene.
  • control type can be embodied as default_app
  • display object can be embodied as com.android.vr.test.
  • the control message is default_app:com.android.vr.test.
  • the message at the application level means that the application with the application name com.android.vr.test is set as the default program.
  • S302 Assign the display content identifier to the preset default display content attribute, so that the display content corresponding to the display content identifier becomes the default display content.
  • assigning the display content identifier to the preset default display content attribute may be implemented by: determining a target attribute according to a type of the default display content configuration control, and the target attribute is one of a plurality of preset default display content attributes; If the display content identifier corresponds to the target attribute, the display content identifier is assigned to the target attribute. For example, if the default display content configuration control is default_guide, it can be determined that the default scene attribute needs to be assigned; if the default display content configuration control is default_app, it can be determined that the default application attribute needs to be assigned.
  • in real applications the control message may be set incorrectly, so that the display content identifier does not correspond to the determined target attribute: for example, the type of the default display content configuration control is a default display scene configuration control while the display content identifier is an application identifier, or the type is a default display application configuration control while the identifier is a scene identifier.
  • it is therefore necessary to check whether the display content identifier corresponds to the target attribute, and to assign the identifier to the target attribute only when they correspond.
  • the types of the default display content configuration control include, but are not limited to, a default display scene configuration control, and a default display application configuration control.
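The attribute selection and correspondence check of this embodiment might look like the following sketch. The heuristic that a scene identifier contains a colon while an application identifier does not is an assumption drawn from the com.android.vr.test:A and com.android.vr.test examples above, and the `settings` dictionary is a stand-in for the device's preset default display content attributes.

```python
# Preset default display content attributes, modeled as a settings dict.
settings = {"default_scene": None, "default_app": None}

# Determine the target attribute from the configuration control's type.
TARGET_ATTRIBUTE = {
    "default_guide": "default_scene",  # default display scene configuration control
    "default_app": "default_app",      # default display application configuration control
}

def looks_like_scene_id(identifier: str) -> bool:
    # Assumption: scene identifiers are "<app>:<scene>"; app identifiers have no ":".
    return ":" in identifier

def configure_default(control_type: str, identifier: str) -> None:
    target = TARGET_ATTRIBUTE[control_type]
    # Reject mismatches such as a default_guide control carrying an app identifier.
    if (target == "default_scene") != looks_like_scene_id(identifier):
        raise ValueError(f"{identifier!r} does not correspond to {target!r}")
    settings[target] = identifier

configure_default("default_guide", "com.android.vr.test:A")
configure_default("default_app", "com.android.vr.test")
print(settings)
# {'default_scene': 'com.android.vr.test:A', 'default_app': 'com.android.vr.test'}
```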
  • control message received in the above embodiment is pre-written into the near field interaction tag.
  • the control message may be written to the near field interaction tag by: obtaining a control message entered by the user in the tag write interface; writing the control message to the near field interaction tag.
  • the input mode may be keyboard input, voice input, or the like.
  • the control message is written into the near field interaction tag.
  • a typing mistake may occur, so that the entered control message is not the one the user intended; therefore, in an optional embodiment of the present application, before the control message is written to the near field interaction tag, the device waits for a write command issued by the user, and writes the control message to the tag only after the write command is received, which ensures the accuracy of the written control message.
  • an optional embodiment of the present application includes:
  • S401 Display a control type selection list and a display object selection list in the label writing interface.
  • the control type selection list includes multiple control types; the user may first select a control type in the control type selection list, after which the display objects controllable by that type are determined and shown in the display object selection list for the user to continue selecting. For example, if the control type selected by the user is the default display scene configuration control, all available scenes are shown in the display object selection list; if it is the default display application configuration control, all available applications are shown.
  • S402: If the user selects a control type from the control type selection list and a display object from the display object selection list, the control type and the display object are encapsulated according to the control message encapsulation format corresponding to the control type, to obtain the control message.
  • it can be understood that, after the control type and the display object are selected, they are merely two strings shown together on the display and cannot be recognized by the virtual reality device; they therefore need to be converted, according to a specific control message encapsulation format, into a machine-readable message, so that the virtual reality device can identify the corresponding content and then change the displayed content.
  • S403: Write the control message to the near field interaction tag. In this embodiment, the user completes the pre-writing of the control message into the near field interaction tag by selection, thereby completing the near field interaction; because input is performed by selecting from lists rather than typing character by character on a keyboard, this embodiment improves the accuracy of control message input while keeping display control convenient.
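The select-encapsulate-write flow described above can be sketched as below. The in-memory `FakeTag` stands in for a real writable NFC tag, the list of selectable control types is illustrative, and the colon-joined encapsulation format is inferred from the message examples earlier in this document rather than specified by it.

```python
# Control types offered in the tag write interface (illustrative labels).
CONTROL_TYPES = {
    "switching control": "guide",
    "default display scene configuration": "default_guide",
    "default display application configuration": "default_app",
}

class FakeTag:
    """In-memory stand-in for a writable near field interaction tag."""
    def __init__(self):
        self.payload = None
    def write(self, message: str) -> None:
        self.payload = message

def encapsulate(control_type: str, display_object: str) -> str:
    """Encapsulate the selected control type and display object into one message."""
    return f"{CONTROL_TYPES[control_type]}:{display_object}"

def write_after_confirmation(tag: FakeTag, message: str, confirmed: bool) -> bool:
    """Write only once the user has issued the write command (the S403 safeguard)."""
    if not confirmed:
        return False
    tag.write(message)
    return True

tag = FakeTag()
msg = encapsulate("switching control", "com.android.vr.test:B")
write_after_confirmation(tag, msg, confirmed=True)
print(tag.payload)  # guide:com.android.vr.test:B
```

Keeping the confirmation step outside `encapsulate` mirrors the document's point that the message is built first and committed to the tag only on an explicit write command.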
  • the embodiment of the present application further provides a display control apparatus, including:
  • the first obtaining module 510 is configured to: if the near field interaction tag is detected, acquire a control message pre-stored in the near field interaction tag, where the control message includes a control type and a display object;
  • the processing module 520 is configured to perform corresponding operations on the display object according to the control type.
  • in the embodiment of the present application, a control message written in advance into a near field interaction tag placed nearby by the controller of the virtual reality device is read, and the display content is controlled according to the control message; the content displayed by the virtual reality interface can thus be conveniently controlled even when the experiencer of the virtual reality scene does not know how to operate.
  • control type is a handover control
  • display object is a virtual reality scene identifier that needs to be switched to.
  • the processing module 520 is specifically configured to:
  • the virtual reality scene corresponding to the virtual reality scene identifier is displayed.
  • control type is a default display content configuration control
  • display object is a display content identifier that needs to be configured as a default display content
  • the processing module 520 is specifically configured to:
  • the device further includes: a second obtaining module 530 and a writing module 540.
  • the second obtaining module 530 is configured to acquire the control message input by the user in the label writing interface.
  • the writing module 540 is configured to write the control message to the near field interaction tag.
  • the second obtaining module 530 includes: a display submodule 531 and a packaging module 532.
  • the display submodule 531 is configured to display a control type selection list and a display object selection list in the label writing interface.
  • the encapsulation submodule 532 is configured to: if the user selects the control type from the control type selection list and the display object from the display object selection list, encapsulate the control type and the display object according to the control message encapsulation format corresponding to the control type, to obtain the control message.
  • the embodiment of the present application further provides a display control system, including: a near field interaction tag, and a virtual reality device provided by the embodiment of the present application.
  • the embodiment of the present application further provides a virtual reality device, where the device includes: a processor 610, a memory 620, and a near field interaction reader/writer 630.
  • the processor 610 is connected to the memory 620 and the near field interaction reader/writer 630, respectively.
  • the near field interaction reader/writer 630 is configured to read a control message pre-stored into the near field interaction tag, and send the control message to the processor 610.
  • the memory 620 is configured to store one or more computer instructions, wherein the one or more computer instructions are executed by the processor 610 to implement a display control method of an embodiment provided by the present application.
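The cooperation between the near field interaction reader/writer 630 and the processor 610 described above could be sketched as a simple polling loop. This is purely illustrative: real NFC stacks are event-driven, and the `FakeReaderWriter` class and its `poll()` method are assumptions standing in for the hardware.

```python
import queue

class FakeReaderWriter:
    """Stand-in for reader/writer 630: yields control messages from detected tags."""
    def __init__(self, payloads):
        self._q = queue.Queue()
        for p in payloads:
            self._q.put(p)

    def poll(self):
        """Return the next detected tag's control message, or None if no tag."""
        try:
            return self._q.get_nowait()
        except queue.Empty:
            return None

def run_device(reader, handle):
    """Processor 610 loop: receive messages from the reader and act on each."""
    handled = []
    while (msg := reader.poll()) is not None:
        handled.append(handle(msg))
    return handled

reader = FakeReaderWriter(["guide:com.android.vr.test:A", "brightness:90%"])
# Here the handler just extracts the control type of each message.
print(run_device(reader, lambda m: m.partition(":")[0]))
# ['guide', 'brightness']
```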
  • embodiments of the present application can be provided as a method, system, or computer program product.
  • the present application can take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment in combination of software and hardware.
  • the application can take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) including computer usable program code.
  • these computer program instructions may also be stored in a computer readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • these computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • the memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in a computer readable medium, such as read only memory (ROM) or flash memory.
  • Memory is an example of a computer readable medium.
  • Computer readable media includes both permanent and non-persistent, removable and non-removable media.
  • Information storage can be implemented by any method or technology.
  • the information can be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic tape cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
  • as defined herein, computer readable media does not include transitory computer readable media, such as modulated data signals and carrier waves.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Graphics (AREA)
  • Signal Processing (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A display control method and system, and a virtual reality device. The method includes: if a near field interaction tag is detected, acquiring a control message pre-stored in the near field interaction tag, the control message including a control type and a display object (S101); and performing a corresponding operation on the display object according to the control type (S102). The method allows convenient control of the content displayed by a virtual reality interface.

Description

Display control method and system, and virtual reality device
Cross-reference
This application is based on and claims priority to Chinese Patent Application No. 2017108902093, filed on September 27, 2017 and entitled "Display control method and system, and virtual reality device", which is incorporated herein by reference in its entirety.
Technical field
The present application relates to the field of virtual reality technologies, and in particular, to a display control method and system and a virtual reality device.
Background
With the arrival of the data and information age, virtual reality technology has gradually come into wide use in many fields. Experiencers view the corresponding virtual reality scenes of those fields through virtual reality technology. For example, in real estate sales, the apartment layouts of a property can be experienced; in teaching, the teaching content can be experienced realistically; and so on. During the experience, the displayed content needs to be controlled to carry out the corresponding experience, for example selecting the displayed experience scene.
Because designs differ, the operations corresponding to controlling the displayed content vary widely, and in most cases the experiencer does not know which operation achieves the control; improper operation causes the device to do useless work and increases power consumption.
Summary
Aspects of the present application provide a display control method and system and a virtual reality device, for conveniently controlling the content displayed by a virtual reality interface.
In a first aspect, a display control method provided by an embodiment of the present application includes:
if a near field interaction tag is detected, acquiring a control message pre-stored in the near field interaction tag, where the control message includes a control type and a display object; and
performing a corresponding operation on the display object according to the control type.
In a second aspect, an embodiment of the present application provides a virtual reality device, the device including:
a processor, a memory, and a near field interaction reader/writer;
the processor is connected to the memory and to the near field interaction reader/writer, respectively;
the near field interaction reader/writer is configured to read a control message pre-stored in a near field interaction tag and send the control message to the processor; and
the memory is configured to store one or more computer instructions, where the one or more computer instructions, when executed by the processor, implement the display control method provided by the present application.
In a third aspect, an embodiment of the present application provides a display control system, including a near field interaction tag and the virtual reality device provided by the embodiments of the present application.
In the display control method, system, and virtual reality device provided by the embodiments of the present application, a control message is recorded in a near field interaction tag. When the virtual reality device of an experiencer needs to be controlled to display certain content, the control personnel bring the corresponding near field interaction tag close to the VR device. The VR device can then read the control message in the tag and respond accordingly, so that the content displayed by the virtual reality interface can be conveniently controlled even when the experiencer does not know how to perform the operation.
Brief description of the drawings
The accompanying drawings described here are provided for further understanding of the present application and constitute a part of the present application. The exemplary embodiments of the present application and their descriptions are used to explain the present application and do not constitute an improper limitation of it. In the drawings:
FIG. 1 is a flowchart of a display control method according to an embodiment of the present application;
FIG. 2 is another flowchart of a display control method according to an embodiment of the present application;
FIG. 3 is still another flowchart of a display control method according to an embodiment of the present application;
FIG. 4 is yet another flowchart of a display control method according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of a display control apparatus according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a virtual reality device according to an embodiment of the present application.
Detailed description
To make the objectives, technical solutions, and advantages of the present application clearer, the technical solutions of the present application are described clearly and completely below with reference to specific embodiments and the corresponding drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
The display control method provided by the embodiments of the present application is applied to a virtual reality device. The virtual reality device may include virtual reality products in the form of monocular head-mounted glasses, binocular head-mounted glasses, and the like. Specifically, the display control method provided by the embodiments of the present application is applied to a display control apparatus. The apparatus may be application software dedicated to display control, or a function plug-in of a related program such as an operating system or a control application.
As shown in FIG. 1, a display control method provided by an embodiment of the present application includes the following steps:
S101: If a near field interaction tag is detected, acquire a control message pre-stored in the near field interaction tag, where the control message includes a control type and a display object.
Near field communication (NFC) is a technology for short-range wireless interaction: when the distance between the two interacting parties is within a preset range, wireless communication is possible. In the embodiments of the present application, the content to be transmitted is written into one of the two parties in advance; then, when the two parties come within a certain range of each other, the other party can read that content, thereby completing the interaction.
In the embodiments of the present application, the two interacting parties may be a near field interaction tag and a virtual reality device, so that the content displayed in the virtual reality device is controlled through the near field interaction tag. To this end, the control message is written into the near field interaction tag in advance; when the near field interaction tag is close to the virtual reality device, the device can detect the tag and read the above content, completing the communication with the near field interaction tag.
Specifically, the virtual reality device reads the control message written in advance into the near field interaction tag, and can then perform display control according to the control message. The control message includes, but is not limited to, a control type and a display object. From the control type, the virtual reality device learns which operation needs to be performed. Control types may include adjusting the display brightness, switching the displayed content, setting default parameters, and the like. For example, if the control type is brightness adjustment, the virtual reality device, upon receiving that control type, knows that a display brightness adjustment operation should be performed.
Corresponding to the control type, the display object may be the brightness value to be displayed, the content to be displayed, the target value of the parameter to be set, and the like. For example, if the virtual reality device learns that the control type is display brightness adjustment and that the brightness value is 90%, it adjusts the display brightness to 90%.
S102: Perform a corresponding operation on the display object according to the control type.
After learning, from the control type, which type of operation to perform and identifying the display object to be operated on, the device can perform the corresponding operation on the display object according to the control type.
For example, the display brightness is adjusted to 90%.
In the embodiments of the present application, a control message written in advance into a near field interaction tag placed nearby by the controller of the virtual reality device is read, and the displayed content is controlled according to the control message. The content displayed by the virtual reality interface can thus be conveniently controlled even when the experiencer of the virtual reality scene does not know how to perform the operation.
In practical applications, the user may need to switch the displayed scene of the virtual reality device. For example, when an experiencer tours apartment types at a sales office and, after visiting apartment type A through the virtual reality device, needs to continue with apartment type B, the virtual scene corresponding to type A needs to be switched to the virtual scene corresponding to type B. As another example, when an experiencer tries the various attractions of an amusement park and wants to try a second attraction after finishing the first, the virtual scene corresponding to the first attraction needs to be switched to the one corresponding to the second. To meet these needs, in the embodiment of the present application the control type is a switching control and the display object is the identifier of the virtual reality scene to switch to. As shown in FIG. 2, the embodiment of the present application includes the following steps:
S201: If a near field interaction tag is detected, acquire a control message pre-stored in the near field interaction tag, where the control message includes a control type and a display object.
In this embodiment the control type is a switching control; optionally, the control type may be embodied as guide. The display object is the identifier of the virtual reality scene to switch to, and may be embodied as com.android.vr.test:A, i.e., scene A in the application com.android.vr.test. The control message is then guide:com.android.vr.test:A, which at the application level means: switch to scene A in the application named com.android.vr.test.
S202: Switch to displaying the virtual reality scene corresponding to the virtual reality scene identifier.
Optionally, the virtual reality scene identifier may be the name of the virtual reality scene.
In actual use, if the experiencer finishes viewing apartment type A of a property and needs to view apartment type B next, the control personnel bring the near field interaction tag of type B, into which the message "switch to the virtual scene corresponding to type B" has been written, close to the virtual reality device. The virtual reality device can then read the message and carry out the corresponding control operation, that is, switch the scene to type B.
实际情况下,用户还可能需要控制默认的显示内容,例如设置开机画面、设置开机后默认的场景等。因此,在另一可选实施例中,控制类型为默认显示内容配置控制,显示对象为需要被配置为默认显示内容的显示内容标识。如图3所示,本申请实施例包括:
S301:若检测到近场交互标签,则获取预先存储至近场交互标签中的控制消息,控制消息中包括控制类型和显示对象。
本申请实施例中的控制类型为默认显示内容配置控制,显示对象为需要被配置为默认显示内容的显示内容标识。进一步地,显示内容标识可以为显示内容的名称。例如,若需要对默认显示场景进行配置控制,则控制类型可以体现为default_guide,显示对象可以体现为com.android.vr.test:A。进而控制消息为default_guide:com.android.vr.test:A,该消息在应用层面的意思为:将应 用名称为com.android.vr.test中场景A设置为默认场景。
再如,若需要对默认显示的应用程序进行配置控制,则控制类型可以体现为default_app,显示对象可以体现为com.android.vr.test。进而控制消息为default_app:com.android.vr.test,该消息在应用层面的意思为:将应用名称为com.android.vr.test的应用程序设置为默认程序。
S302: Assign the display content identifier to a preset default display content attribute, so that the display content corresponding to the display content identifier becomes the default display content.
Optionally, assigning the display content identifier to a preset default display content attribute may be implemented as follows: determine a target attribute according to the type of the default display content configuration control, the target attribute being one of a plurality of preset default display content attributes; if the display content identifier corresponds to the target attribute, assign the display content identifier to the target attribute. For example, if the default display content configuration control is default_guide, it can be determined that the default scene attribute is to be assigned; if it is default_app, it can be determined that the default application attribute is to be assigned. In practice, the control message may have been set incorrectly, so that the display content identifier does not correspond to the determined target attribute: for example, when the default display content configuration control is of the default display scene type, the display content identifier may instead be an application identifier; or, when the control is of the default display application type, the display content identifier may instead be a scene identifier. It is therefore necessary to check whether the display content identifier corresponds to the target attribute and, only if it does, assign the display content identifier to the target attribute.
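A minimal sketch of S302 with the consistency check just described: the target attribute is chosen from the configuration-control subtype, the display content identifier is checked against the form that attribute expects, and only then is the assignment made. The "scene identifiers contain a colon" heuristic and all names here are illustrative assumptions, not part of the claimed method:

```python
DEFAULTS = {"default_scene": None, "default_app": None}

def looks_like_scene_id(content_id: str) -> bool:
    # Scene identifiers in the examples take the form "<app>:<scene>".
    return ":" in content_id

def configure_default(control_type: str, content_id: str) -> None:
    """Determine the target attribute, validate the identifier, then assign."""
    if control_type == "default_guide":
        target, expects_scene = "default_scene", True
    elif control_type == "default_app":
        target, expects_scene = "default_app", False
    else:
        raise ValueError(f"unknown configuration control: {control_type}")
    if looks_like_scene_id(content_id) != expects_scene:
        raise ValueError("display content identifier does not match target attribute")
    DEFAULTS[target] = content_id

configure_default("default_guide", "com.android.vr.test:A")
configure_default("default_app", "com.android.vr.test")
```

A mismatched message, such as configure_default("default_guide", "com.android.vr.test"), would be rejected by the check rather than silently corrupting the default scene attribute.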
It should be noted that the types of the default display content configuration control include, but are not limited to: default display scene configuration control and default display application configuration control.
It can be understood that the control message received in the above embodiments has been written in advance into the near field communication tag. Optionally, in one embodiment, the control message may be written into the near field communication tag as follows: acquire the control message entered by the user in a tag writing interface; write the control message into the near field communication tag. Optionally, the input method may be keyboard input, voice input, and so on. After the entered control message is acquired, if a near field communication tag is detected and a write command issued by the user is received, the control message is written into the near field communication tag. In practice, the user may make a slip when entering the control message, so that the entered control message is not the one intended. Therefore, in an optional embodiment of the present application, before the control message is written into the near field communication tag, the device waits for a write command issued by the user; only when the write command is received is the control message written into the tag, which ensures the accuracy of the written control message.
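The guarded write flow above can be sketched as follows; the class and method names are hypothetical, and the tag is modeled as a plain dictionary purely for illustration:

```python
class TagWriter:
    """Hold the entered message until a tag is in range AND a write command arrives."""

    def __init__(self):
        self.pending_message = None
        self.tag_detected = False

    def enter_message(self, message: str) -> None:
        self.pending_message = message   # from keyboard or voice input

    def detect_tag(self) -> None:
        self.tag_detected = True

    def write_command(self, tag_storage: dict) -> dict:
        """Commit only when a message is pending and a tag is in range."""
        if self.pending_message is None or not self.tag_detected:
            raise RuntimeError("nothing to write or no tag in range")
        tag_storage["control_message"] = self.pending_message
        return tag_storage

w = TagWriter()
w.enter_message("guide:com.android.vr.test:A")
w.detect_tag()
tag = w.write_command({})
# tag == {"control_message": "guide:com.android.vr.test:A"}
```

The point of the explicit write_command step is that a mistyped message can still be corrected (by calling enter_message again) before anything is committed to the tag.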
Specifically, as shown in Fig. 4, an optional embodiment of the present application includes:
S401: Display a control type selection list and a display object selection list in the tag writing interface.
Specifically, the control type selection list contains a plurality of control types. A control type can first be selected from the control type selection list; then the display objects controllable by that type are determined according to the control type, and the display object selection list is displayed for the user to choose from. For example, if the control type selected by the user is a default display scene configuration control, all available scenes are shown in the display object selection list; if it is a default display application configuration control, all available applications are shown.
S402: If the user selects a control type from the control type selection list and a display object from the display object list, encapsulate the control type and the display object according to the control message encapsulation format corresponding to the control type, so as to obtain the control message.
It can be understood that, once the control type and the display object have been selected, they are merely two strings concatenated on the display and cannot be recognized by the virtual reality device as such. They therefore need to be converted, according to a specific control message encapsulation format, into a machine-readable form that the virtual reality device can recognize, so that the device can identify the corresponding content and change the display content accordingly.
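As a sketch of S402 (the template table and the function name are illustrative assumptions; the formats echo the examples given earlier in this description), the two selections can be joined into the wire format the device parses:

```python
# One encapsulation template per control type, keyed by the type string.
FORMATS = {
    "guide": "guide:{object}",
    "default_guide": "default_guide:{object}",
    "default_app": "default_app:{object}",
}

def encapsulate(control_type: str, display_object: str) -> str:
    """Build the control message from the user's two list selections."""
    try:
        return FORMATS[control_type].format(object=display_object)
    except KeyError:
        raise ValueError(f"no encapsulation format for {control_type}") from None

msg = encapsulate("default_app", "com.android.vr.test")
# msg == "default_app:com.android.vr.test"
```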
S403: Write the control message into the near field communication tag.
In this embodiment, the user completes the advance writing of the control message into the near field communication tag by selection, thereby completing the near field interaction. Since the input is made by choosing from lists rather than typing character by character on a keyboard, this embodiment not only controls the display content conveniently but also improves the accuracy of control message input.
Corresponding to the above method embodiments, as shown in Fig. 5, an embodiment of the present application further provides a display control apparatus, including:
a first acquisition module 510, configured to, if a near field communication tag is detected, acquire a control message stored in advance in the near field communication tag, the control message including a control type and a display object;
a processing module 520, configured to perform a corresponding operation on the display object according to the control type.
In the embodiments of the present application, the control message written in advance into a near field communication tag placed nearby by an operator of the virtual reality device is read, and the display content is controlled according to the control message. Even when the person experiencing the virtual reality scene does not know how to operate the device, the content displayed on the virtual reality interface can thus be controlled conveniently.
Optionally, the control type is a switching control, and the display object is an identifier of the virtual reality scene to be switched to.
The processing module 520 is specifically configured to:
switch the display to the virtual reality scene corresponding to the virtual reality scene identifier.
Optionally, the control type is a default display content configuration control, and the display object is an identifier of the display content to be configured as the default display content.
The processing module 520 is specifically configured to:
assign the display content identifier to a preset default display content attribute, so that the display content corresponding to the display content identifier becomes the default display content.
Optionally, the apparatus further includes: a second acquisition module 530 and a writing module 540.
The second acquisition module 530 is configured to acquire the control message entered by the user in a tag writing interface.
The writing module 540 is configured to write the control message into the near field communication tag.
Optionally, the second acquisition module 530 includes: a display submodule 531 and an encapsulation submodule 532.
The display submodule 531 is configured to display a control type selection list and a display object selection list in the tag writing interface.
The encapsulation submodule 532 is configured to, if the user selects the control type from the control type selection list and the display object from the display object list, encapsulate the control type and the display object according to the control message encapsulation format corresponding to the control type, so as to obtain the control message.
An embodiment of the present application further provides a display control system, including: a near field communication tag, and the virtual reality device provided by the embodiments of the present application.
It should be noted that, since the apparatus/system embodiments are basically similar to the method embodiments, they are described relatively briefly; for relevant details, reference may be made to the description of the method embodiments.
As shown in Fig. 6, an embodiment of the present application further provides a virtual reality device, including: a processor 610, a memory 620, and a near field communication reader/writer 630.
The processor 610 is connected to the memory 620 and the near field communication reader/writer 630, respectively.
The near field communication reader/writer 630 is configured to read the control message stored in advance in a near field communication tag and send the control message to the processor 610.
The memory 620 is configured to store one or more computer instructions, wherein the one or more computer instructions, when executed by the processor 610, implement the display control method provided by the embodiments of the present application.
Those skilled in the art will appreciate that the embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) containing computer-usable program code.
The present application is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, such that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include non-persistent storage in a computer-readable medium, random access memory (RAM), and/or non-volatile memory such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include persistent and non-persistent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprise", "include", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes the element.
The above are merely embodiments of the present application and are not intended to limit the present application. Various modifications and changes may be made to the present application by those skilled in the art. Any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the present application shall fall within the scope of the claims of the present application.

Claims (10)

  1. A display control method, comprising:
    if a near field communication tag is detected, acquiring a control message stored in advance in the near field communication tag, the control message including a control type and a display object;
    performing a corresponding operation on the display object according to the control type.
  2. The method according to claim 1, wherein the control type is a switching control, and the display object is an identifier of a virtual reality scene to be switched to;
    the performing a corresponding operation on the display object according to the control type comprises:
    switching the display to the virtual reality scene corresponding to the virtual reality scene identifier.
  3. The method according to claim 1 or 2, wherein the control type is a default display content configuration control, and the display object is an identifier of display content to be configured as default display content;
    the performing a corresponding operation on the display object according to the control type comprises:
    assigning the display content identifier to a preset default display content attribute, so that the display content corresponding to the display content identifier becomes the default display content.
  4. The method according to claim 3, wherein the assigning the display content identifier to a preset default display content attribute comprises:
    determining a target attribute according to a type of the default display content configuration control, the target attribute being one of a plurality of preset default display content attributes;
    if the display content identifier corresponds to the target attribute, assigning the display content identifier to the target attribute.
  5. The method according to claim 4, wherein types of the preset default display content attribute include: default display scene configuration control and default display application configuration control.
  6. The method according to any one of claims 1 to 5, further comprising:
    acquiring the control message entered by a user in a tag writing interface;
    writing the control message into the near field communication tag.
  7. The method according to claim 6, wherein the acquiring the control message entered by the user in the tag writing interface comprises:
    displaying a control type selection list and a display object selection list in the tag writing interface;
    if the user selects the control type from the control type selection list and the display object from the display object list, encapsulating the control type and the display object according to a control message encapsulation format corresponding to the control type, so as to obtain the control message.
  8. The method according to claim 6 or 7, before the writing the control message into the near field communication tag, further comprising:
    if a write command is received, executing the writing of the control message into the near field communication tag.
  9. A virtual reality device, comprising:
    a processor, a memory, and a near field communication reader/writer;
    the processor being connected to the memory and the near field communication reader/writer, respectively;
    the near field communication reader/writer being configured to read a control message stored in advance in a near field communication tag and send the control message to the processor;
    the memory being configured to store one or more computer instructions, wherein the one or more computer instructions, when executed by the processor, implement the display control method according to any one of claims 1 to 8.
  10. A display control system, comprising: a near field communication tag, and the virtual reality device according to claim 9.
PCT/CN2017/113969 2017-09-27 2017-11-30 Display control method, system and virtual reality device WO2019061798A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2018555914A JP6983176B2 (ja) 2017-09-27 2017-11-30 Display control method, system and virtual reality device
EP17905916.7A EP3690604A4 (en) 2017-09-27 2017-11-30 DISPLAY CONTROL PROCESS AND SYSTEM AND DEVICE OF VIRTUAL REALITY
KR1020187031157A KR20190056348A (ko) 2017-09-27 2017-11-30 Display control method, system and virtual reality device
US16/096,651 US20190130647A1 (en) 2017-09-27 2017-11-30 Display control method and system, and virtual reality device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710890209.3 2017-09-27
CN201710890209.3A CN107678548A (zh) 2017-09-27 2017-09-27 Display control method, system and virtual reality device

Publications (1)

Publication Number Publication Date
WO2019061798A1 true WO2019061798A1 (zh) 2019-04-04

Family

ID=61137531

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/113969 WO2019061798A1 (zh) 2017-09-27 2017-11-30 Display control method, system and virtual reality device

Country Status (6)

Country Link
US (1) US20190130647A1 (zh)
EP (1) EP3690604A4 (zh)
JP (1) JP6983176B2 (zh)
KR (1) KR20190056348A (zh)
CN (1) CN107678548A (zh)
WO (1) WO2019061798A1 (zh)

Also Published As

Publication number Publication date
US20190130647A1 (en) 2019-05-02
JP6983176B2 (ja) 2021-12-17
EP3690604A4 (en) 2020-11-04
EP3690604A1 (en) 2020-08-05
KR20190056348A (ko) 2019-05-24
CN107678548A (zh) 2018-02-09
JP2019533844A (ja) 2019-11-21

Legal Events

Date Code Title Description
ENP Entry into the national phase: Ref document number: 2018555914; Country of ref document: JP; Kind code of ref document: A
ENP Entry into the national phase: Ref document number: 20187031157; Country of ref document: KR; Kind code of ref document: A
121 Ep: the epo has been informed by wipo that ep was designated in this application: Ref document number: 17905916; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase: Ref country code: DE
ENP Entry into the national phase: Ref document number: 2017905916; Country of ref document: EP; Effective date: 20200428