WO2023142755A1 - Device control method and apparatus, user equipment and computer-readable storage medium - Google Patents

Device control method and apparatus, user equipment and computer-readable storage medium

Info

Publication number
WO2023142755A1
Authority
WO
WIPO (PCT)
Prior art keywords
interface
user equipment
connectable device
control
connectable
Prior art date
Application number
PCT/CN2022/139285
Other languages
English (en)
French (fr)
Inventor
刘维维
Original Assignee
Oppo广东移动通信有限公司
Priority date
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司
Publication of WO2023142755A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72469 User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72409 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M 1/72412 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces

Definitions

  • the present application relates to the technical field of terminals, and more specifically, relates to a device control method, device, user equipment, and computer-readable storage medium.
  • Embodiments of the present application provide a device control method, device, user equipment, and computer-readable storage medium, which can reduce the complexity of device control.
  • a device control method includes:
  • the first interface includes a real scene image and a control sub-interface of a connectable device in the real scene superimposed on the real scene image;
  • the trigger operation is obtained based on the control sub-interface, and the target connectable device is controlled according to the trigger operation.
  • an equipment control device includes:
  • the display module is used to display the first interface in response to the image acquisition instruction;
  • the first interface includes a real scene image and a control sub-interface of the connectable device in the real scene superimposed on the real scene image;
  • the control module is configured to obtain a trigger operation based on the control sub-interface, and control the target connectable device according to the trigger operation.
  • a user device includes a memory and a processor, wherein a computer program is stored in the memory, and when the computer program is executed by the processor, the processor is made to execute the steps of the above device control method.
  • a computer-readable storage medium stores a computer program thereon, and when the computer program is executed by a processor, the steps of the above device control method are implemented.
  • a computer program product includes a computer program, and when the computer program is executed by a processor, the steps of the above device control method are implemented.
  • FIG. 1 is an application environment diagram of a device control method in an embodiment of the present application
  • FIG. 2 is a flowchart of a device control method in an embodiment of the present application
  • Fig. 3 is a schematic diagram of the first interface in an embodiment of the present application.
  • Fig. 4 is a schematic diagram of the first interface in an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a control sub-interface in an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a second operation sub-interface in an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a control sub-interface in an embodiment of the present application.
  • FIG. 8 is a flowchart of a device control method in an embodiment of the present application.
  • FIG. 9 is a flowchart of a device control method in an embodiment of the present application.
  • FIG. 10 is a flowchart of a device control method in an embodiment of the present application.
  • FIG. 11 is a schematic diagram of a device control method in an embodiment of the present application.
  • FIG. 12 is a schematic diagram of a device control method in an embodiment of the present application.
  • FIG. 13 is a flowchart of a device control method in an embodiment of the present application.
  • FIG. 14 is a schematic diagram of a device control method in an embodiment of the present application.
  • FIG. 15 is a schematic diagram of a device control method in an embodiment of the present application.
  • FIG. 16 is a schematic diagram of a device control method in an embodiment of the present application.
  • FIG. 17 is a flowchart of a device control method in an embodiment of the present application.
  • Fig. 18 is a structural block diagram of an equipment control device in an embodiment of the present application.
  • Fig. 19 is a structural block diagram of an equipment control device in an embodiment of the present application.
  • Fig. 20 is a schematic structural diagram of a user equipment in an embodiment of the present application.
  • the device control method provided in the embodiment of the present application may be applied in the application environment shown in FIG. 1 .
  • the user equipment 100 may communicate with the connectable device 200 in the current scene through a network.
  • the aforementioned user equipment 100 may be various personal computers, notebook computers, smart phones, tablet computers, and portable wearable devices; the aforementioned portable wearable devices may be smart watches, smart bracelets, head-mounted devices, and the like.
  • the above-mentioned connectable device 200 may be an IoT device such as a smart speaker, a smart TV, a smart air conditioner, and a smart vehicle device.
  • a device control method is provided.
  • the method is applied to the user equipment 100 in FIG. 1 as an example for illustration, including:
  • S102 In response to an image acquisition instruction, display a first interface; the first interface includes a real scene image and a control sub-interface of a connectable device in the real scene superimposed on the real scene image.
  • the above-mentioned image acquisition instruction may be an instruction triggered by the user on the camera operation control of the user equipment.
  • the aforementioned camera operation control may be a physical control on the user equipment, or may be a virtual control displayed on a display screen of the user equipment, which is not limited here.
  • the camera operation control described above may be a control set in an application program of the user equipment, or may be a control set in an operation window of the user equipment, which is not limited here.
  • the above-mentioned camera operation control may be a control set in an application program in the user equipment for connecting to an external device.
  • After the user issues the image acquisition instruction, the user device can collect the real scene image within the field of view of the camera, superimpose the control sub-interfaces of the connectable devices in the real scene on the real scene image, generate the first interface, and present it to the user.
  • The above-mentioned connectable device may be an Internet of Things device in the current scene, such as a smart speaker, a smart TV, a smart air conditioner, a smart desk lamp, or a smart curtain; the above-mentioned Internet of Things device may also be a vehicle-mounted device or in-vehicle system in a smart car, and the type of IoT device is not limited here.
  • the above connectable device may establish a communication connection with the user equipment, and the above communication connection may be a Bluetooth connection, a cellular network connection, or a WiFi connection, which is not limited herein.
  • the above control sub-interface can be used to control connectable devices.
  • The above-mentioned control sub-interface may include controls for controlling the connectable device, and may also include a remaining battery indicator, a network connection status indicator, the current working status of the connectable device, and so on; the display manner of the control sub-interface is not limited here. It should be noted that the control sub-interfaces corresponding to different types of connectable devices may differ.
  • The user equipment can receive broadcast messages sent by the connectable devices and determine from them which connectable devices are present in the current scene; further, the user equipment can determine the corresponding control sub-interface according to the content carried in each broadcast message, superimpose the control sub-interfaces of the connectable devices on the real scene image captured by the camera, and generate and display the first interface.
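  • As a minimal illustrative sketch of this discovery step (the broadcast payload format, the ControlSubInterface structure, and the per-type control sets below are assumptions for illustration; the document does not specify them), the mapping from received broadcast messages to control sub-interfaces could look like this:

```python
from dataclasses import dataclass, field

# Hypothetical control sets per device type; the document only states that
# different device types may have different control sub-interfaces.
CONTROLS_BY_TYPE = {
    "smart_tv": ["connect", "input_source", "volume", "brightness"],
    "smart_lamp": ["connect", "power", "dim"],
    "smart_speaker": ["connect", "volume", "play_pause"],
}

@dataclass
class ControlSubInterface:
    device_id: str
    device_name: str
    controls: list = field(default_factory=list)

def build_sub_interfaces(broadcasts):
    """Map each received broadcast message to a control sub-interface.

    `broadcasts` is assumed to be a list of dicts such as
    {"id": "...", "name": "...", "type": "smart_tv"} decoded from BLE/WiFi
    advertisements; the real payload format is not given in the source.
    """
    interfaces = []
    for msg in broadcasts:
        controls = CONTROLS_BY_TYPE.get(msg.get("type"), ["connect"])
        interfaces.append(
            ControlSubInterface(msg["id"], msg["name"], list(controls)))
    return interfaces

# Example: two devices announced themselves in the current scene
print(build_sub_interfaces([
    {"id": "a1", "name": "Device A", "type": "smart_lamp"},
    {"id": "b2", "name": "Device B", "type": "smart_tv"},
]))
```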
  • the user equipment can superimpose and display the control sub-interfaces of all connectable devices in the current scene on the real scene image, or display the control sub-interfaces of some connectable devices, which is not limited here.
  • the control sub-interface displayed on the first interface may be a control sub-interface corresponding to a connectable device within the field of view of the camera.
  • the aforementioned real scene image may be a static image collected and stored by the user equipment, or may be a preview image displayed in an image collection window of the user equipment, which is not limited here.
  • For example, after entering the current scene, the user may trigger an image acquisition instruction so that the first interface is displayed on the user device; the first interface may display a preview image of the camera's field of view, with the connectable devices in the current field of view, say device A and device B, superimposed on the preview image; when the user moves the user device and adjusts the field of view of the camera to another position, the first interface may display the connectable devices within the adjusted field of view, say device B and device C.
  • the control sub-interfaces of each connectable device can be superimposed and displayed on the real scene image in the form of a list, as shown in FIG. 3 .
  • the display position of the control sub-interface in the real scene image may correspond to the position of the connectable device in the real scene.
  • The connectable devices included in the real scene may be device A and device B; the control sub-interface of device A may be located at the position of device A in the real scene image, and the control sub-interface of device B at the position of device B in the real scene image.
  • the user equipment may acquire, based on the control sub-interface, a trigger operation performed by the user on the control sub-interface.
  • the user equipment may determine the connectable device corresponding to the control sub-interface operated by the user as the target device, and control the target connectable device according to the trigger operation.
  • the control type may include connection control, disconnection, device operation, system update, sleep, etc., which are not limited here.
  • The user equipment can control the target connectable device through signaling interaction with the target connectable device; alternatively, it can control the target connectable device through another device such as a server. The control method is not limited here.
  • In the above device control method, the user device responds to the image acquisition instruction and displays the first interface, which includes the real scene image and, superimposed on it, the control sub-interfaces of the connectable devices in the real scene; a trigger operation is then acquired based on a control sub-interface, and the target connectable device is controlled according to the trigger operation. Because the first interface displayed by the user equipment includes the real scene image and the connectable devices superimposed on it, the user can, after the real scene image is collected, quickly perceive through the first interface which connectable devices exist in the current scene; further, because the first interface includes the control sub-interfaces of the connectable devices, the user can complete control of a target connectable device by performing a trigger operation on its control sub-interface, which simplifies the control of connectable devices.
  • this embodiment relates to a control sub-interface displayed on the first interface.
  • The above-mentioned control sub-interface may include device information of the connectable device and a connection control for connecting to the connectable device.
  • The above device information may be an icon representing the device type of the connectable device, or text describing the connectable device; this is not limited here.
  • the above device information may include a device name of a connectable device, and/or a device type of a connectable device.
  • the user can trigger the above-mentioned connection control by clicking, double-clicking, dragging, and the like. If the user equipment detects that the connection control is triggered, a connection request is generated; and the above connection request is sent to the target connectable device.
  • The above-mentioned connection request is used to request establishment of a communication connection with the target connectable device; it may include the name of the user equipment, and may also include the name of the target connectable device, the IP address of the target connectable device, and so on, which is not limited here.
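  • A short sketch of assembling such a connection request (JSON and the exact field names are illustrative choices only; the document lists possible fields without fixing a wire format):

```python
import json

def make_connection_request(ue_name, target_name, target_ip=None):
    """Build a connection request carrying the fields the text mentions:
    the user equipment's name, the target device's name, and optionally
    the target device's IP address."""
    request = {"from": ue_name, "to": target_name}
    if target_ip is not None:
        request["target_ip"] = target_ip
    return json.dumps(request).encode("utf-8")

# Example payload to send once the connection control is triggered
print(make_connection_request("my-phone", "living-room-lamp", "192.168.1.20"))
```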
  • After the connection control is triggered and the user equipment establishes a communication connection with the target connectable device, the control sub-interface of the target connectable device may present a connected state. For example, the connection control on the control sub-interface may be switched to a non-triggerable state; or the control sub-interface may switch to a connected-state prompt interface; or the background color of the control sub-interface may be switched to a target color, where the target color identifies that the connectable device has established a connection with the user device; or the control sub-interface may display a disconnection control, through which the user's disconnection instruction can be obtained. The presentation of the connected state is not limited here.
  • the user equipment may switch from the first interface to the second interface.
  • the above-mentioned second interface includes a real scene image and a second operation sub-interface of the target connectable device superimposed on the real scene image, as shown in FIG. 6 .
  • the above-mentioned second operation sub-interface includes operation items matching the device type of the target connectable device.
  • the above-mentioned operation item may be used to perform operation control on the target connectable device.
  • the above-mentioned operation control can control the switch of the connectable device, such as the switch of the smart desk lamp; it can also control the working mode of the connectable device, and the types of the above-mentioned operation items are not limited here.
  • Operation items corresponding to different types of connectable devices may be different.
  • the operation items of the smart TV can be used to select the input source, TV program selection, volume control, screen brightness control, etc.
  • the control items of the smart desk lamp can be used to select the on and off status of the desk lamp, and adjust the brightness of the desk lamp.
  • In the above device control method, the control sub-interface of the connectable device includes a connection control, so that the user can establish a communication connection with the connectable device based on the connection control without first searching for the name of the connectable device, which simplifies the operation steps of device connection; in particular, when the user enters an unfamiliar scene, the control sub-interface on the first interface allows the device to be connected without knowing the device name in advance, improving the user's operating experience.
  • Further, after the user equipment establishes a communication connection with the target connectable device, the first interface is switched to the second interface, so that the user can operate and control the connected device on the second interface; the interface switch both shows the user that the device is connected and allows the user's operations on the connected device to be further obtained on the second interface, which further simplifies the user's operation and control steps.
  • this embodiment relates to a display manner of a control sub-interface.
  • the control sub-interface may further include operation controls corresponding to connectable devices. If the user equipment detects that the operation control is triggered, it may display the first operation sub-interface of the target connectable device.
  • the above-mentioned first operation sub-interface may include operation items matching the device type of the target connectable device. For specific limitations of the foregoing operation items, reference may be made to the foregoing embodiments, and details are not described here.
  • For the same connectable device, the operation items displayed on its first operation sub-interface and its second operation sub-interface may be the same. The difference may be that the second operation sub-interface is superimposed on the real scene image after the user equipment establishes a communication connection with the target connectable device, so that the user can select an operation item on it; at that point the user may or may not need to further operate the target connectable device. The first operation sub-interface, by contrast, is the interface displayed after the user triggers the operation control of the connectable device and it is determined that the user needs to perform operation control on the device; the first operation sub-interface can be superimposed on the real scene image, or the first interface containing the real scene image can switch directly to the first operation sub-interface.
  • Because the control sub-interface includes both a connection control and an operation control, the operation control can obtain the user's trigger operation and perform operation control on the target connectable device whether or not the user equipment has established a communication connection with it.
  • For example, the above-mentioned target connectable device may be a smart desk lamp. In one case, the user equipment has established a Bluetooth connection with the smart desk lamp, and after the user triggers the operation control, the user equipment can switch the lamp on or off directly; in another case, where the user equipment has not established a Bluetooth connection with the smart desk lamp, the user equipment can still switch the lamp through another device such as a server after the user triggers the operation control.
  • In the above device control method, the control sub-interface on the first interface includes both connection controls and operation controls, providing users with multiple control options and making control of connectable devices more flexible.
  • Fig. 8 is a schematic flowchart of a device control method in an embodiment. This embodiment concerns how the user device processes the user's trigger operation obtained through the operation items on the first operation sub-interface or the second operation sub-interface. On the basis of the foregoing embodiments, as shown in FIG. 8, the method further includes:
  • S202: If it is detected that an operation item is triggered, generate an operation control signal corresponding to the triggered operation item.
  • the above-mentioned operation control signal may have a one-to-one correspondence with the operation item. For example, if the switch operation item is triggered, a switch control signal is generated; if the volume operation item is triggered, a volume control signal is generated.
  • S204: The user equipment performs operation control on the target connectable device according to the operation control signal. In one implementation, when the user equipment has established a communication connection with the target connectable device, it can send the operation control signal to the target connectable device through that communication connection.
  • the above operation control method can reduce data interaction between multiple devices and simplify the control process.
  • In another implementation, regardless of whether a communication connection has been established with the target connectable device, an operation control signal can be sent to the server, so that operation control of the target connectable device is performed through the server.
  • the user equipment can be connected to the server, and the target connectable device can also be connected to the server; after the user equipment generates the operation control signal, it can send the operation control signal to the server, and the server can transmit the operation control signal to the target connectable device, and the target connectable device Take operational control.
  • the above operation control method can realize the operation control of the target connectable device when the user equipment has not established a communication connection with the target connectable device, so that the operation control of the target connectable device is more convenient.
  • The above operation control signal can also be sent to the target connectable device over multiple paths: the user device can send the operation control signal to the server and to the target connectable device simultaneously, and the server relays the received signal to the target connectable device.
  • In the above device control method, the user device can generate an operation control signal according to the user's trigger operation and then transmit it to the target connectable device over different paths, allowing connectable devices in different connection states to be controlled and making operation control more flexible.
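  • A sketch of this routing decision (the `target` and `server` objects are placeholder interfaces standing in for the transports the text mentions; they are not a real API):

```python
def send_operation_control(signal, target, server=None):
    """Route an operation control signal to the target connectable device.

    Direct path when a communication connection exists (fewer hops, less
    inter-device data exchange); otherwise relay via the server, which
    works even without a direct connection. Both paths may also be used
    simultaneously, per the text above.
    """
    if target.is_connected():
        target.send(signal)                      # direct communication connection
    elif server is not None:
        server.relay(target.device_id, signal)   # server-mediated control
    else:
        raise RuntimeError("no route to the target connectable device")
```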
  • Fig. 9 is a schematic flowchart of a device control method in an embodiment. This embodiment relates to a way for a user device to generate a first interface. On the basis of the above embodiment, as shown in Fig. 9, before the above S102, it also includes:
  • the user equipment may calculate the location information of each connectable device through measurement, or may directly receive the location information of each connectable device, which is not limited here.
  • the above location information may include the distance between the connectable device and the user equipment, and the orientation information of the connectable device relative to the user equipment.
  • the above orientation information may be an angle between a line connecting the connectable device and the user equipment and a focal plane normal vector of the user equipment.
  • the user equipment may acquire the coordinates of the user equipment itself, the coordinates of the connectable device, and the focal plane normal vector of the user equipment, and determine the position information of the connectable device.
  • the aforementioned coordinates may be coordinates in a coordinate system with the user equipment as a reference, coordinates in a coordinate system with the current scene as a reference, or earth coordinates, etc., which are not limited here.
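  • A simplified 2-D sketch of deriving this position information from coordinates (the function and frame conventions are ours; the document only defines the quantities: the distance, and the angle between the device-to-UE line and the focal plane normal vector):

```python
import math

def position_info(ue_xy, device_xy, normal_xy):
    """Return (distance, angle in degrees) of a device relative to the UE.

    `ue_xy` and `device_xy` are coordinates in any shared frame;
    `normal_xy` is the camera's focal plane normal projected onto the
    horizontal plane.
    """
    dx, dy = device_xy[0] - ue_xy[0], device_xy[1] - ue_xy[1]
    distance = math.hypot(dx, dy)
    nx, ny = normal_xy
    # Clamp against floating-point drift outside [-1, 1].
    cos_a = max(-1.0, min(1.0,
                (dx * nx + dy * ny) / (distance * math.hypot(nx, ny))))
    return distance, math.degrees(math.acos(cos_a))

# Example: a device 45 degrees off the camera axis, about 4.24 m away
print(position_info((0, 0), (3, 3), (0, 1)))
```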
  • In another implementation, if the user equipment has a high-precision angle measurement capability, such as Ultra Wide Band (UWB) angle measurement, and the angle-measurement-sensitive direction of the user equipment is the same as the direction of the focal plane normal vector of the camera, then, when a UWB beacon is set on the connectable device, the user device can measure the relative position between itself and the connectable device without knowing its own coordinates, and thereby obtain the position information of the connectable device.
  • the user equipment may determine which connectable devices are candidate connectable devices within the view range according to the current view range.
  • On the basis of the above steps, the user equipment can map the actual spatial coordinates of the candidate connectable devices to image coordinates on the real scene image, and superimpose the control sub-interfaces of the candidate connectable devices on the real scene image at those image coordinates to generate the first interface.
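  • A simplified sketch of this mapping for the horizontal axis (a pinhole projection; the field-of-view value and image width are illustrative assumptions, and a signed bearing, negative meaning left of the camera axis, is assumed):

```python
import math

def place_in_image(bearing_deg, fov_deg=60.0, image_width=1080):
    """Map a candidate device's bearing to a horizontal pixel column.

    Devices whose bearing exceeds half the field of view are off-screen
    and thus not candidate connectable devices; otherwise the bearing is
    projected onto the image plane.
    """
    half_fov = fov_deg / 2.0
    if abs(bearing_deg) > half_fov:
        return None  # outside the field of view
    focal_px = (image_width / 2.0) / math.tan(math.radians(half_fov))
    return int(round(image_width / 2.0
                     + focal_px * math.tan(math.radians(bearing_deg))))

print(place_in_image(0.0))    # 540: centre column of a 1080-px-wide image
print(place_in_image(20.0))   # right of centre
print(place_in_image(40.0))   # None: outside a 60-degree FOV
```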
  • the user device superimposes and displays the control sub-interface on the real scene image according to the location information, so that the user can more intuitively see which locations are equipped with connectable devices, and then can control the connectable devices that need to be controlled. control.
  • FIG. 10 is a schematic flowchart of a device control method in an embodiment. This embodiment relates to a way for a user equipment to obtain location information of a connectable device.
  • the above S302 includes:
  • Multiple UWB beacons may be set in the current scenario, and the user equipment obtains the positioning coordinates of the multiple UWB beacons.
  • the user equipment can obtain the positioning coordinates of each UWB beacon in the current scene by accessing the server, and can also receive the broadcast message sent by each UWB beacon, and obtain the above positioning coordinates from the broadcast message.
  • the method of obtaining the positioning coordinates is not limited here.
  • The foregoing broadcast message may be a Bluetooth Low Energy (BLE) broadcast or a WiFi broadcast, which is not limited here.
  • the user equipment may measure the distance between the user equipment and each UWB beacon, and then calculate the first coordinate of the user equipment itself. Taking FIG. 11 as an example, the user equipment can determine the coordinates of the user equipment on the two-dimensional plane through the positioning coordinates of the three UWB beacons and the distances R1, R2, and R3 from the above-mentioned UWB beacons. For the three-dimensional space coordinates, the user equipment needs to obtain the positioning coordinates of at least four UWB beacons and the distances to the four UWB beacons.
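  • A sketch of the two-dimensional case in FIG. 11 (solving the linearised circle equations is a standard approach, but the document itself does not prescribe a particular algorithm):

```python
import math

def trilaterate_2d(beacons, distances):
    """Solve the UE's 2-D coordinate from three UWB beacons.

    `beacons` are the (x, y) positioning coordinates of three beacons and
    `distances` the measured ranges R1..R3. Subtracting the circle
    equations pairwise yields a 2x2 linear system in (x, y).
    """
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = distances
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    if math.isclose(det, 0.0):
        raise ValueError("beacons are collinear; another beacon is needed")
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Beacons at three known points; the true position is (1, 1)
print(trilaterate_2d([(0, 0), (4, 0), (0, 4)],
                     [math.hypot(1, 1), math.hypot(3, 1), math.hypot(1, 3)]))
```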
  • the above-mentioned connectable devices may include connectable devices with fixed positions and connectable devices with variable positions.
  • For a connectable device with a fixed location, its coordinates are also fixed; the user equipment can obtain the second coordinates of the connectable device by receiving the broadcast message sent by the connectable device, or by accessing the server and obtaining the second coordinates from positioning information sent by the server.
  • the connectable device can calculate its own second coordinates through multiple UWB beacons in the same way as the user equipment, and can send the above-mentioned second coordinates to the user equipment through broadcast information, or The above-mentioned second coordinates may be sent to the server, so that the user equipment may obtain the above-mentioned second coordinates from the server.
  • After the user equipment obtains its own coordinates, it also needs to obtain the focal plane normal vector of the camera.
  • In one implementation, the user equipment may display prompt information prompting the user to move the user equipment along a preset path; it then obtains multiple moving coordinates of the user equipment during the movement, determines the camera orientation of the user equipment from those moving coordinates, and finally obtains the focal plane normal vector from the camera orientation and the pitch angle of the camera.
  • The aforementioned preset path may be centered on the user: with the camera of the user equipment facing the side away from the user, the user equipment is rotated around the user, as shown in FIG. 12.
  • In the figure, the user equipment moves from an initial position to a final position. The user equipment can calculate the movement trajectory and, from the trajectory, determine its inner and outer sides, that is, the side facing the user and the side facing away from the user; the side facing away from the user is determined to be the camera orientation.
  • After the user equipment determines the orientation of the camera, it can obtain the pitch angle of the camera from a sensor on the user equipment, such as a rotation vector sensor. With the camera orientation and pitch angle, the user equipment can determine the focal plane normal vector of the camera.
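  • A rough sketch of this estimation (the centroid-of-arc approximation of the user's position and the vector layout are our simplifications; smoothing and calibration are omitted):

```python
import math

def focal_plane_normal(move_xy, pitch_deg):
    """Estimate the camera's focal plane normal from a rotation sweep.

    `move_xy` are horizontal coordinates logged while the user rotates
    the device around themselves (FIG. 12). The user is taken to be
    roughly at the centroid of the arc, so the outward direction at the
    final position (the side away from the user) is the camera heading;
    combined with the pitch angle from a rotation vector sensor, this
    gives a 3-D unit normal.
    """
    cx = sum(p[0] for p in move_xy) / len(move_xy)
    cy = sum(p[1] for p in move_xy) / len(move_xy)
    hx, hy = move_xy[-1][0] - cx, move_xy[-1][1] - cy
    norm = math.hypot(hx, hy)
    pitch = math.radians(pitch_deg)
    return (math.cos(pitch) * hx / norm,
            math.cos(pitch) * hy / norm,
            math.sin(pitch))

# Quarter-circle sweep of radius 0.5 m ending east of the user
arc = [(0.5 * math.cos(a), 0.5 * math.sin(a))
       for a in (math.pi / 2, math.pi / 4, 0.0)]
print(focal_plane_normal(arc, pitch_deg=10.0))
```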
  • S410 Determine the position information of the connectable device according to the first coordinate, the second coordinate, and the focal plane normal vector.
  • the user equipment may determine the position information of the connectable equipment according to the first coordinate, the second coordinate and the focal plane normal vector.
  • In the above device control method, when positioning beacons are set in the current scene, the user device can obtain its own first coordinate, the second coordinate of the connectable device, and the focal plane normal vector of the camera, yielding more accurate position information of the connectable device and improving the accuracy of superimposing control sub-interfaces on the real scene image.
  • FIG. 13 is a schematic flowchart of a device control method in an embodiment. This embodiment relates to a way for a user equipment to obtain location information of a connectable device.
  • the above S302 includes:
  • S502. Collect a real scene image, and identify a positioning reference line in the real scene image; the positioning reference line includes a vertical reference line.
  • After the user equipment obtains the image acquisition instruction, it can collect the real scene image. Further, the user equipment can use AR scene understanding to identify the positioning reference lines in the real scene image.
  • the above-mentioned positioning reference line may be a boundary line between adjacent walls, such as a connection line between a ceiling, a floor, and a wall.
  • the positioning reference line may include a vertical reference line, and the vertical reference line may be a wall connection line perpendicular to the ground.
  • For most indoor scenes, the indoor space is approximately a cuboid, and the model parameters of the indoor scene can be preset in the server.
  • the above model parameters may include various dimensions of the indoor space, such as wall height, floor length, and the like.
  • the user equipment can obtain the model parameters of the current scene from the server.
  • the user equipment may correspond the model parameters with the identified positioning reference line to obtain the size of the positioning reference line.
  • S506. Determine the third coordinate of the user equipment and the focal plane normal vector of the camera of the user equipment according to the preset camera parameters, the size of the reference line, and the pitch angle of the camera.
  • the user equipment can determine the third coordinate of the user equipment according to the preset camera parameters, the size of the reference line and the pitch angle of the camera.
  • the user equipment can determine the height of the user equipment and the distance between the user equipment and each vertical reference line according to the pitch angle of the camera and the parameters of the camera.
  • As shown in FIG. 14, O in the figure denotes the camera of the user equipment, AB is a vertical reference line in the current scene, and A'B'C' represents the camera's imaging. Knowing the focal length f and the length of the vertical reference line AB, and computing the size of A'B' in the captured image, the distance OC between the camera and the vertical reference line AB is (f × AB)/A'B', and the height of the user equipment is (B'C' × AB)/A'B'.
  • the user equipment may determine the coordinates of the user equipment on the horizontal plane according to the distance between the user equipment and at least two vertical reference lines. As shown in FIG. 15 , the user equipment can capture two vertical reference lines at the same time, and calculate the coordinates of the user equipment on the horizontal plane according to the distances R1 and R2 between the user equipment and the two vertical reference lines. On the basis of obtaining the horizontal coordinates of the user equipment and the height of the user equipment, the user equipment obtains the third coordinates of the user equipment according to the coordinates of the horizontal plane and the height of the user equipment.
  • The user equipment can determine the focal plane normal vector of the camera according to the preset camera parameters, the size of the reference line, and the pitch angle of the camera. As shown in FIG. 16, the user equipment can calculate the horizontal deflection of the focal plane normal vector relative to the vertical reference line from the distance of that line's image from the imaging center: when the vertical reference line AB deviates from the axis of the camera lens, its image A'B' deviates from the imaging center by a certain distance.
  • The deflection angle of the camera axis relative to the vertical reference line AB may be arctan(PC/OC), wherein point P is the intersection point between the image of the vertical reference line and the imaging center. If the object distance is much greater than the image distance, OC approximates the focal length f of the camera, and the deflection angle can be arctan(PC/f).
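  • The three relations above, collected into a small sketch (the numeric values in the example are illustrative only):

```python
import math

def distance_to_reference_line(f, AB, A_B_img):
    """OC = (f * AB) / A'B': camera-to-vertical-reference-line distance."""
    return f * AB / A_B_img

def device_height(B_C_img, AB, A_B_img):
    """Height of the user equipment: (B'C' * AB) / A'B'."""
    return B_C_img * AB / A_B_img

def deflection_angle(PC_img, f):
    """Deflection of the camera axis from the vertical reference line,
    using the far-field approximation in the text (object distance much
    greater than image distance, so OC is taken as the focal length f):
    arctan(PC / f)."""
    return math.degrees(math.atan2(PC_img, f))

# Illustrative numbers: 26 mm focal length, a 2.6 m wall edge imaged at
# 13 mm, its foot 6 mm below the image centre, 4 mm lateral offset.
f = 0.026
print(distance_to_reference_line(f, 2.6, 0.013))  # 5.2 m
print(device_height(0.006, 2.6, 0.013))           # 1.2 m
print(deflection_angle(0.004, f))                 # about 8.75 degrees
```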
  • S508. Determine position information of the connectable device according to the third coordinate, the fourth coordinate of the connectable device, and the focal plane normal vector.
  • the user equipment may also acquire the fourth coordinates of the connectable equipment.
  • the above-mentioned connectable devices may include connectable devices with fixed positions and connectable devices with variable positions.
  • For a connectable device with a fixed location, its coordinates are also fixed; the user equipment can obtain the fourth coordinate of the connectable device by receiving the broadcast message sent by the connectable device, or by accessing the server and obtaining the fourth coordinate from positioning information sent by the server.
  • For a connectable device whose position can change, a UWB beacon can be set on the device; the user equipment can measure the distance to the connectable device at multiple spatial positions and then determine the fourth coordinate of the connectable device from the multiple measured distance values.
  • the user equipment can determine the position information of the connectable device according to the third coordinate, the fourth coordinate of the connectable device, and the focal plane normal vector.
  • the user equipment can acquire the location information of the connectable device in a scene where there is no UWB beacon, which improves the applicability of the device control method.
  • In one embodiment, a device control method is provided. As shown in FIG. 17, the method includes:
  • S602: Obtain the position information of each connectable device in the current scene.
  • S604: Determine, according to the position information and the field of view of the user equipment, candidate connectable devices within the field of view.
  • S606: Superimpose the control sub-interfaces of the candidate connectable devices, according to the position information, on the real scene image within the field of view to generate the first interface.
  • S608: In response to an image acquisition instruction, display the first interface.
  • S610: If it is detected that a connection control on the first interface is triggered, generate a connection request.
  • S612: Send the connection request to the target connectable device.
  • S614: If it is detected that the operation control is triggered, display the first operation sub-interface of the target connectable device.
  • S616: If it is detected that an operation item on the first operation sub-interface is triggered, generate an operation control signal corresponding to the triggered operation item.
  • S618: Perform operation control on the target connectable device according to the operation control signal.
  • It should be understood that although the steps in the flowcharts involved in the above embodiments are displayed in the order indicated by the arrows, these steps are not necessarily executed in that order. Unless otherwise specified herein, there is no strict order restriction on the execution of these steps, and they can be executed in other orders. Moreover, at least some of the steps in those flowcharts may include multiple sub-steps or stages, which are not necessarily completed at the same time but may be executed at different times; the execution order of these sub-steps or stages is not necessarily sequential, and they may be executed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
  • an embodiment of the present application further provides an equipment control device for implementing the above-mentioned equipment control method.
  • The solution provided by the device is similar to the implementation described in the above method; therefore, for the specific limitations in the one or more device control device embodiments provided below, refer to the above limitations on the device control method, which are not repeated here.
  • In one embodiment, as shown in FIG. 18, a device control device is provided, including:
  • the display module 10 is configured to display a first interface in response to an image acquisition instruction; the first interface includes a real scene image and a control sub-interface of a connectable device in a real scene superimposed on the real scene image;
  • the control module 20 is configured to acquire a trigger operation based on the control sub-interface, and control the target connectable device according to the trigger operation.
  • control sub-interface includes device information of connectable devices and connection controls for connecting connectable devices.
  • the foregoing device information includes device names of connectable devices, and/or device types of connectable devices.
  • The above-mentioned control module 20 is specifically configured to: generate a connection request if it is detected that the connection control is triggered, where the connection request is used to request establishment of a communication connection with the target connectable device; and send the connection request to the target connectable device.
  • The control sub-interface further includes operation controls corresponding to the connectable device; the above-mentioned control module 20 is specifically configured to: if it is detected that the operation control is triggered, display the first operation sub-interface of the target connectable device.
  • the above control module 20 is further configured to: switch from the first interface to the second interface after establishing a communication connection with the target connectable device; the second interface includes a real scene image And a second operation sub-interface of the target connectable device superimposed and displayed on the real scene image; the second operation sub-interface includes operation items matching the device type of the target connectable device.
  • The control module 20 is further configured to: if it is detected that an operation item is triggered, generate an operation control signal corresponding to the triggered operation item, and perform operation control on the target connectable device according to the operation control signal.
  • control module 20 is specifically configured to: send an operation control signal to a target connectable device through a communication connection with the connectable device.
  • control module 20 is specifically configured to: send an operation control signal to the server, so as to perform operation control on the target connectable device through the server.
  • the display position of the control sub-interface in the real scene image corresponds to the position of the connectable device in the real scene.
  • The above apparatus further includes a generation module 30, configured to: obtain the location information of each connectable device in the current scene, the location information including the distance between the connectable device and the user equipment and the orientation of the connectable device relative to the user equipment; determine, according to the position information and the field of view of the user equipment, the candidate connectable devices within the field of view; and superimpose the control sub-interfaces of the candidate connectable devices, according to the position information, on the real scene image within the field of view to generate the first interface.
  • The generation module 30 is specifically configured to: obtain multiple positioning coordinates sent by multiple positioning beacons set in the current scene; determine the first coordinate of the user equipment according to the multiple positioning coordinates; obtain the second coordinate of the connectable device, the second coordinate being determined according to broadcast information sent by the connectable device or according to positioning information sent by the server; obtain the focal plane normal vector of the camera of the user device; and determine the position information of the connectable device according to the first coordinate, the second coordinate, and the focal plane normal vector.
  • The generation module 30 is specifically configured to: display prompt information, the prompt information being used to prompt the user to move the user equipment along a preset path; obtain multiple moving coordinates of the user equipment during the movement and determine the camera orientation of the user equipment according to the multiple moving coordinates; and obtain the focal plane normal vector according to the camera orientation and the pitch angle of the camera.
  • The generation module 30 is specifically configured to: collect a real scene image and identify a positioning reference line in the real scene image, the positioning reference line including a vertical reference line; determine the size of the positioning reference line according to preset model parameters of the current scene; determine the third coordinate of the user equipment and the focal plane normal vector of the camera of the user equipment according to preset camera parameters, the size of the reference line, and the pitch angle of the camera; and determine the position information of the connectable device according to the third coordinate, the fourth coordinate of the connectable device, and the focal plane normal vector.
  • The generating module 30 is specifically configured to: determine the height of the user equipment and the distance between the user equipment and each vertical reference line according to the pitch angle of the camera and the camera parameters; determine the coordinates of the user equipment on the horizontal plane according to the distances between the user equipment and at least two vertical reference lines, and obtain the third coordinate of the user equipment from the horizontal-plane coordinates and the height of the user equipment; and calculate the focal plane normal vector of the camera according to the distance between the image of the vertical reference line on the imaging plane and the imaging center.
  • Each module in the above-mentioned equipment control device can be fully or partially realized by software, hardware and a combination thereof.
  • the above-mentioned modules can be embedded in or independent of the processor in the computer device in the form of hardware, and can also be stored in the memory of the computer device in the form of software, so that the processor can invoke and execute the corresponding operations of the above-mentioned modules.
  • a user equipment is provided, and its internal structure diagram may be as shown in FIG. 20 .
  • the user equipment includes a processor, a memory, an input/output interface, a communication interface, a display unit and an input device.
  • the processor, the memory and the input/output interface are connected through the system bus, and the communication interface, the display unit and the input device are connected to the system bus through the input/output interface.
  • the processor of the user equipment is used to provide calculation and control capabilities.
  • the memory of the user equipment includes a non-volatile storage medium and an internal memory.
  • the non-volatile storage medium stores an operating system and computer programs.
  • the internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage medium.
  • the input/output interface of the user equipment is used for exchanging information between the processor and external equipment.
  • the communication interface of the user equipment is used to communicate with an external terminal in a wired or wireless manner, and the wireless manner can be realized through WIFI, mobile cellular network, NFC (Near Field Communication) or other technologies.
  • When the computer program is executed by the processor, a device control method is implemented.
  • The display unit of the user equipment is used to form a visible picture, and may be a display screen, a projection device, or a virtual reality imaging device.
  • The display screen may be a liquid crystal display screen or an electronic ink display screen; the input device of the computer device may be a touch layer covering the display screen, a button, trackball, or touchpad set on the casing of the computer device, or an external keyboard, touchpad, or mouse.
  • Figure 20 is only a block diagram of a partial structure related to the solution of this application and does not constitute a limitation on the computer equipment to which the solution of this application is applied; a specific computer device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
  • the embodiment of the present application also provides a computer-readable storage medium.
  • One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the device control method.
  • the embodiment of the present application also provides a computer program product including instructions, which, when running on a computer, causes the computer to execute the device control method.
  • any reference to storage, database or other media used in the various embodiments provided in the present application may include at least one of non-volatile and volatile storage.
  • Non-volatile memory can include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like.
  • the volatile memory may include random access memory (Random Access Memory, RAM) or external cache memory, etc.
  • RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM).
  • the databases involved in the various embodiments provided in this application may include at least one of a relational database and a non-relational database.
  • the non-relational database may include a blockchain-based distributed database, etc., but is not limited thereto.
  • the processors involved in the various embodiments provided by this application can be general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, etc., and are not limited to this.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Selective Calling Equipment (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to a device control method and apparatus, user equipment, and computer-readable storage medium. In response to an image acquisition instruction, the user equipment displays a first interface that includes a real scene image and, superimposed on it, the control sub-interfaces of the connectable devices in the real scene; a trigger operation is then obtained based on a control sub-interface, and the target connectable device is controlled according to the trigger operation. Because the first interface displayed by the user equipment includes the real scene image and the connectable devices superimposed on it, the user can, after the real scene image is collected, quickly perceive through the first interface which connectable devices exist in the current scene; further, because the first interface includes the control sub-interfaces of the connectable devices, the user can complete control of a target connectable device by performing a trigger operation on its control sub-interface.

Description

Device control method and apparatus, user equipment and computer-readable storage medium
Related Application
This application claims priority to Chinese Patent Application No. 2022100846289, filed on January 25, 2022 and entitled "Device control method and apparatus, user equipment and computer-readable storage medium", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of terminal technology, and more specifically, to a device control method and apparatus, user equipment, and computer-readable storage medium.
Background
At present, when using devices such as Bluetooth speakers and smart TVs, a user can look up the name of a connectable device on the device search interface of a mobile phone and then connect to it.
However, as more and more smart devices become connectable, users may need to connect to some devices temporarily.
Summary
Embodiments of the present application provide a device control method and apparatus, user equipment, and computer-readable storage medium, which can reduce the complexity of device control.
In a first aspect, a device control method includes:
in response to an image acquisition instruction, displaying a first interface, where the first interface includes a real scene image and control sub-interfaces, superimposed on the real scene image, of connectable devices in the real scene;
obtaining a trigger operation based on a control sub-interface, and controlling a target connectable device according to the trigger operation.
In a second aspect, a device control apparatus includes:
a display module, configured to display a first interface in response to an image acquisition instruction, where the first interface includes a real scene image and control sub-interfaces, superimposed on the real scene image, of connectable devices in the real scene;
a control module, configured to obtain a trigger operation based on a control sub-interface, and control a target connectable device according to the trigger operation.
In a third aspect, a user equipment includes a memory and a processor, where the memory stores a computer program that, when executed by the processor, causes the processor to execute the steps of the above device control method.
In a fourth aspect, a computer-readable storage medium has a computer program stored thereon that, when executed by a processor, implements the steps of the above device control method.
In a fifth aspect, a computer program product includes a computer program that, when executed by a processor, implements the steps of the above device control method.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are merely embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from the disclosed drawings without creative effort.
FIG. 1 is a diagram of an application environment of a device control method in an embodiment of the present application;
FIG. 2 is a flowchart of a device control method in an embodiment of the present application;
FIG. 3 is a schematic diagram of the first interface in an embodiment of the present application;
FIG. 4 is a schematic diagram of the first interface in an embodiment of the present application;
FIG. 5 is a schematic diagram of a control sub-interface in an embodiment of the present application;
FIG. 6 is a schematic diagram of a second operation sub-interface in an embodiment of the present application;
FIG. 7 is a schematic diagram of a control sub-interface in an embodiment of the present application;
FIG. 8 is a flowchart of a device control method in an embodiment of the present application;
FIG. 9 is a flowchart of a device control method in an embodiment of the present application;
FIG. 10 is a flowchart of a device control method in an embodiment of the present application;
FIG. 11 is a schematic diagram of a device control method in an embodiment of the present application;
FIG. 12 is a schematic diagram of a device control method in an embodiment of the present application;
FIG. 13 is a flowchart of a device control method in an embodiment of the present application;
FIG. 14 is a schematic diagram of a device control method in an embodiment of the present application;
FIG. 15 is a schematic diagram of a device control method in an embodiment of the present application;
FIG. 16 is a schematic diagram of a device control method in an embodiment of the present application;
FIG. 17 is a flowchart of a device control method in an embodiment of the present application;
FIG. 18 is a structural block diagram of a device control apparatus in an embodiment of the present application;
FIG. 19 is a structural block diagram of a device control apparatus in an embodiment of the present application;
FIG. 20 is a schematic structural diagram of a user equipment in an embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the present application is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are merely intended to explain the present application and are not intended to limit it.
The device control method provided in the embodiments of the present application can be applied in the application environment shown in FIG. 1, in which the user equipment 100 can communicate with connectable devices 200 in the current scene over a network. The user equipment 100 may be any of various personal computers, notebook computers, smartphones, tablet computers, and portable wearable devices; the portable wearable device may be a smart watch, smart band, head-mounted device, or the like. The connectable device 200 may be an IoT device such as a smart speaker, smart TV, smart air conditioner, or smart in-vehicle device.
In one embodiment, as shown in FIG. 2, a device control method is provided; the method is described by taking its application to the user equipment 100 in FIG. 1 as an example, and includes:
S102: In response to an image acquisition instruction, display a first interface; the first interface includes a real scene image and control sub-interfaces, superimposed on the real scene image, of connectable devices in the real scene.
The image acquisition instruction may be an instruction triggered by the user on a camera operation control of the user equipment. The camera operation control may be a physical control on the user equipment or a virtual control displayed on its display screen; this is not limited here. The camera operation control may be a control set in an application program of the user equipment, or a control set in an operation window of the user equipment; this is not limited here. For example, the camera operation control may be a control in an application program, installed on the user equipment, for connecting to external devices.
After the user issues the image acquisition instruction, the user equipment can collect the real scene image within the field of view of the camera, superimpose the control sub-interfaces of the connectable devices in the real scene on the real scene image, generate the first interface, and present it to the user.
The connectable device may be an IoT device in the current scene, such as a smart speaker, smart TV, smart air conditioner, smart desk lamp, or smart curtain; the IoT device may also be a vehicle-mounted device or in-vehicle system in a smart car, and the type of IoT device is not limited here. The connectable device can establish a communication connection with the user equipment; the communication connection may be a Bluetooth connection, a cellular network connection, or a WiFi connection, which is not limited here.
The control sub-interface can be used to control the connectable device. It may include controls for controlling the connectable device, and may also include a remaining battery indicator, a network connection status indicator, the current working status of the connectable device, and so on; the display manner of the control sub-interface is not limited here. It should be noted that the control sub-interfaces corresponding to different types of connectable devices may differ.
The user equipment can receive broadcast messages sent by the connectable devices and determine from them which connectable devices are present in the current scene; further, the user equipment can determine the corresponding control sub-interface according to the content carried in each broadcast message, superimpose the control sub-interfaces of the connectable devices on the real scene image captured by the camera, and generate and display the first interface. The user equipment may superimpose the control sub-interfaces of all connectable devices in the current scene on the real scene image, or display the control sub-interfaces of only some of them; this is not limited here. In one implementation, the control sub-interfaces displayed on the first interface may be those of the connectable devices within the field of view of the camera.
The real scene image may be a static image collected and stored by the user equipment, or a preview image displayed in an image collection window of the user equipment; this is not limited here. For example, after entering the current scene, the user may trigger an image acquisition instruction so that the first interface is displayed on the user equipment; the first interface may display a preview image of the camera's field of view, with the connectable devices in the current field of view, say device A and device B, superimposed on it; when the user moves the user equipment and adjusts the field of view to another position, the first interface may display the connectable devices within the adjusted field of view, say device B and device C.
The control sub-interfaces of the connectable devices may be superimposed on the real scene image in the form of a list, as shown in FIG. 3. Optionally, the display position of a control sub-interface in the real scene image may correspond to the position of the connectable device in the real scene. As shown in FIG. 4, the connectable devices in the real scene may be device A and device B; the control sub-interface of device A may be located at the position of device A in the real scene image, and the control sub-interface of device B at the position of device B.
S104: Obtain a trigger operation based on a control sub-interface, and control the target connectable device according to the trigger operation.
Based on the control sub-interface, the user equipment can obtain the trigger operation performed by the user on it. The user equipment can determine the connectable device corresponding to the operated control sub-interface as the target device and control the target connectable device according to the trigger operation. When the user equipment controls the connectable device, the control type may include connection control, disconnection, device operation, system update, sleep, and so on; this is not limited here.
The user equipment can control the target connectable device through signaling interaction with it, or through another device such as a server; the control method is not limited here.
In the above device control method, the user equipment responds to the image acquisition instruction and displays the first interface, which includes the real scene image and the control sub-interfaces of the connectable devices in the real scene superimposed on it; a trigger operation is then obtained based on a control sub-interface, and the target connectable device is controlled accordingly. Because the first interface includes the real scene image and the connectable devices superimposed on it, the user can, after the real scene image is collected, quickly perceive through the first interface which connectable devices exist in the current scene; further, because the first interface includes the control sub-interfaces of the connectable devices, the user can control a target connectable device by performing a trigger operation on its control sub-interface, simplifying the control of connectable devices.
In one embodiment, as shown in FIG. 5, this embodiment concerns a control sub-interface displayed on the first interface; the control sub-interface may include device information of the connectable device and a connection control for connecting to it.
The device information may be an icon representing the device type of the connectable device, or text describing the connectable device; this is not limited here. Optionally, the device information may include the device name and/or the device type of the connectable device.
The user can trigger the connection control by clicking, double-clicking, dragging, and the like. If the user equipment detects that the connection control is triggered, it generates a connection request and sends it to the target connectable device. The connection request is used to request establishment of a communication connection with the target connectable device; it may include the name of the user equipment, and may also include the name of the target connectable device, its IP address, and so on, which is not limited here.
After the connection control is triggered and the user equipment establishes a communication connection with the target connectable device, the control sub-interface of the target connectable device may present a connected state. For example, the connection control may be switched to a non-triggerable state; or the control sub-interface may switch to a connected-state prompt interface; or its background color may be switched to a target color that identifies that the connectable device has established a connection with the user equipment; or a disconnection control may be displayed, through which the user's disconnection instruction can be obtained. The presentation of the connected state is not limited here.
In one implementation, after the user equipment establishes a communication connection with the target connectable device, the user equipment may switch from the first interface to a second interface. The second interface includes the real scene image and a second operation sub-interface of the target connectable device superimposed on it, as shown in FIG. 6. The second operation sub-interface includes operation items matching the device type of the target connectable device.
The operation items can be used to perform operation control on the target connectable device. The operation control may switch the connectable device on or off, such as switching a smart desk lamp, or control the working mode of the connectable device; the types of operation items are not limited here. The operation items corresponding to different types of connectable devices may differ. For example, the operation items of a smart TV may be used for input-source selection, TV program selection, volume control, and screen brightness control; the control items of a smart desk lamp may be used to turn the lamp on or off and adjust its brightness.
In the above device control method, the control sub-interface of the connectable device includes a connection control, so the user can establish a communication connection with the connectable device based on that control without first searching for the device's name, which simplifies the operation steps of device connection; in particular, when the user enters an unfamiliar scene, the control sub-interface on the first interface allows the device to be connected without knowing the device name in advance, improving the user's operating experience. Further, after the user equipment establishes a communication connection with the target connectable device, switching from the first interface to the second interface lets the user operate and control the connected device on the second interface; the interface switch both shows the user that the device is connected and allows the user's further operations on the connected device to be obtained on the second interface, further simplifying the user's operation and control steps.
In one embodiment, on the basis of the above embodiments, this embodiment concerns a display manner of the control sub-interface. As shown in FIG. 7, the control sub-interface may further include an operation control corresponding to the connectable device. If the user equipment detects that the operation control is triggered, it may display a first operation sub-interface of the target connectable device, which may include operation items matching the device type of the target connectable device; for the specific description of the operation items, refer to the above embodiments. For the same connectable device, the operation items displayed on its first and second operation sub-interfaces may be the same; the difference may be that the second operation sub-interface is superimposed on the real scene image after the user equipment establishes a communication connection with the target connectable device, so that the user can select an operation item on it (at which point the user may or may not need to further operate the device), whereas the first operation sub-interface is displayed after the user triggers the operation control of the connectable device and it is determined that the user needs to perform operation control on it; the first operation sub-interface may be superimposed on the real scene image, or the first interface may switch directly to it.
Because the control sub-interface includes both a connection control and an operation control, the operation control can obtain the user's trigger operation and perform operation control on the target connectable device whether or not the user equipment has established a communication connection with it. For example, the target connectable device may be a smart desk lamp: in one case the user equipment has established a Bluetooth connection with the lamp, and after the user triggers the operation control, the user equipment can switch the lamp directly; in another case, where no Bluetooth connection exists, the user equipment can still switch the lamp through another device such as a server.
In the above device control method, the control sub-interface on the first interface includes both connection controls and operation controls, providing the user with multiple control options and making control of connectable devices more flexible.
FIG. 8 is a schematic flowchart of a device control method in an embodiment. This embodiment concerns how the user equipment processes the user's trigger operation obtained through the operation items on the first or second operation sub-interface. On the basis of the above embodiments, as shown in FIG. 8, the method further includes:
S202: If it is detected that an operation item is triggered, generate an operation control signal corresponding to the triggered operation item.
The operation control signals may correspond to the operation items one to one; for example, if the switch operation item is triggered, a switch control signal is generated, and if the volume operation item is triggered, a volume control signal is generated.
S204: Perform operation control on the target connectable device according to the operation control signal.
The user equipment performs operation control on the target connectable device according to the operation control signal. In one implementation, when the user equipment has established a communication connection with the target connectable device, it can send the operation control signal to the device through that connection. This method reduces data interaction between multiple devices and simplifies the control process.
In another implementation, regardless of whether the user equipment has established a communication connection with the target connectable device, the operation control signal can be sent to the server, so that operation control of the target connectable device is performed through the server. For example, the user equipment can connect to the server, and the target connectable device can also connect to the server; after generating the operation control signal, the user equipment can send it to the server, which relays it to the target connectable device to perform operation control. This method enables operation control of the target connectable device even when no communication connection exists between the user equipment and the device, making such control more convenient.
The operation control signal can also be sent to the target connectable device over multiple paths: the user equipment can send the signal to the server and to the target connectable device simultaneously, with the server relaying the received signal to the target connectable device.
In the above device control method, the user equipment can generate an operation control signal according to the user's trigger operation and transmit it to the target connectable device over different paths, allowing connectable devices in different connection states to be controlled and making operation control more flexible.
FIG. 9 is a schematic flowchart of a device control method in an embodiment. This embodiment concerns a way for the user equipment to generate the first interface. On the basis of the above embodiments, as shown in FIG. 9, before S102 the method further includes:
S302: Obtain the position information of each connectable device in the current scene; the position information includes the distance between the connectable device and the user equipment, and the orientation of the connectable device relative to the user equipment.
The user equipment may calculate the position information of each connectable device through measurement, or may directly receive it; this is not limited here.
The position information may include the distance between the connectable device and the user equipment and the orientation of the connectable device relative to the user equipment. The orientation may be the angle between the line connecting the connectable device and the user equipment and the focal plane normal vector of the user equipment.
In one implementation, the user equipment may obtain its own coordinates, the coordinates of the connectable device, and the focal plane normal vector of the user equipment, and determine the position information of the connectable device. The coordinates may be in a coordinate system referenced to the user equipment, a coordinate system referenced to the current scene, or earth coordinates; this is not limited here.
In another implementation, if the user equipment has a high-precision angle measurement capability, such as Ultra Wide Band (UWB) angle measurement, and its angle-measurement-sensitive direction coincides with the direction of the camera's focal plane normal vector, then, when a UWB beacon is set on the connectable device, the user equipment can measure the relative position between itself and the connectable device without knowing its own coordinates, and thereby obtain the device's position information.
S304: Determine, according to the position information and the field of view of the user equipment, the candidate connectable devices within the field of view.
Having determined the position information of each connectable device, the user equipment can determine, according to the current field of view, which connectable devices are candidate devices within it.
S306: Superimpose the control sub-interfaces of the candidate connectable devices, according to the position information, on the real scene image within the field of view to generate the first interface.
On the basis of the above steps, the user equipment can map the actual spatial coordinates of the candidate connectable devices to image coordinates on the real scene image and superimpose their control sub-interfaces on the real scene image at those image coordinates to generate the first interface.
In the above device control method, the user equipment superimposes the control sub-interfaces on the real scene image according to the position information, so the user can see more intuitively where connectable devices are located and then control the ones that need to be controlled.
图10为一个实施例中设备控制方法的流程示意图,本实施例涉及用户设备获取可连接设备的位置信息的一种方式,在上述实施例的基础上,如图10所示,上述S302包括:
S402、获取设置于当前场景中的多个定位信标发送的多个定位坐标。
当前场景下可以设置多个UWB信标,用户设备获取上述多个UWB信标的定位坐标。用户设备可以通过访问服务器获取当前场景下各个UWB信标的定位坐标,也可以接收各个UWB信标发送的广播消息,从广播消息中获取上述定位坐标,对于定位坐标的获取方式在此不做限定。上述广播消息可以是低功耗蓝牙(Bluetooh Low Energy,简称BLE)广播,也可以是WiFi广播,在此不做限定。
S404、根据多个定位坐标确定用户设备的第一坐标。
用户设备可以测量用户设备与各个UWB信标之间的距离,进而解算用户设备自身的第一坐标。以图11为例,用户设备可以通过3个UWB信标的定位坐标,以及与上述UWB信标之间的距离R1、R2、R3,确定用户设备在二维平面上的坐标。对于三维空间坐标,用户设备需获取至少4个UWB信标的定位坐标以及与该4个UWB信标之间的距离。
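A minimal two-dimensional sketch of this solving step, assuming three beacon coordinates and the measured distances R1, R2, R3 of FIG. 11; the function name and representation are illustrative.

    # Two-dimensional trilateration: b1, b2, b3 are (x, y) positioning
    # coordinates of the beacons; r1, r2, r3 the measured distances.
    # Subtracting the circle equations pairwise gives a linear 2x2
    # system in the unknown position.
    def trilaterate(b1, b2, b3, r1, r2, r3):
        (x1, y1), (x2, y2), (x3, y3) = b1, b2, b3
        a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
        a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
        c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
        c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
        det = a11 * a22 - a12 * a21  # zero if the beacons are collinear
        return ((c1 * a22 - c2 * a12) / det,
                (a11 * c2 - a21 * c1) / det)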
S406: Obtain second coordinates of the connectable device; the second coordinates are determined according to broadcast information sent by the connectable device, or according to positioning information sent by the server.
The connectable devices may include devices with fixed positions and devices whose positions can change. For a device with a fixed position, its coordinates are also fixed; the user equipment may obtain its second coordinates from a broadcast message sent by the device, or query the server and extract the second coordinates from positioning information the server returns. A device whose position can change may solve for its own second coordinates from multiple UWB beacons in the same way as the user equipment, and may then either broadcast the second coordinates to the user equipment or send them to the server, so that the user equipment can fetch them from the server.
S408: Obtain the focal-plane normal vector of the camera of the user equipment.
After obtaining its own coordinates, the user equipment also needs the focal-plane normal vector of its camera. In one implementation, the user equipment may display prompt information asking the user to move the user equipment along a preset path, obtain multiple coordinates of the user equipment during the movement, determine the camera orientation from those coordinates, and finally obtain the focal-plane normal vector from the camera orientation and the camera's pitch angle.
The preset path may be a rotation of the user equipment around the user, with the camera facing the side away from the user, as shown in FIG. 12. In the figure, the user equipment moves from an initial position to a final position; the user equipment can solve for the movement trajectory, use the trajectory to distinguish its inner side (facing the user) from its outer side (away from the user), and then take the side away from the user as the camera orientation.
After determining the camera orientation, the user equipment can obtain the camera's pitch angle from a sensor on the user equipment, for example a rotation vector sensor. With both the camera orientation and the pitch angle, the user equipment can determine the camera's focal-plane normal vector.
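A minimal sketch of assembling the focal-plane normal vector from the camera heading and pitch angle; the angle conventions here are assumptions.

    import math

    # Unit focal-plane normal vector from the camera heading (azimuth
    # in the horizontal plane, measured from north) and pitch angle;
    # x east, y north, z up is an assumed convention.
    def focal_plane_normal(heading_deg, pitch_deg):
        h, p = math.radians(heading_deg), math.radians(pitch_deg)
        return (math.cos(p) * math.sin(h),
                math.cos(p) * math.cos(h),
                math.sin(p))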
S410: Determine the position information of the connectable device according to the first coordinates, the second coordinates, and the focal-plane normal vector.
On the basis of the above steps, the user equipment can determine the position information of the connectable device from the first coordinates, the second coordinates, and the focal-plane normal vector.
With the above device control method, when positioning beacons are arranged in the current scene, the user equipment can obtain its own first coordinates, the second coordinates of the connectable devices, and the focal-plane normal vector of the camera; this yields more accurate position information for the connectable devices and improves the accuracy with which the control sub-interfaces are superimposed on the real scene image.
FIG. 13 is a schematic flowchart of a device control method in one embodiment. This embodiment relates to another way in which the user equipment obtains the position information of the connectable devices. On the basis of the above embodiments, as shown in FIG. 13, S302 includes:
S502: Capture a real scene image and identify positioning reference lines in the real scene image; the positioning reference lines include vertical reference lines.
The user equipment may capture the real scene image after receiving the image acquisition instruction. Further, the user equipment may apply AR scene understanding to identify positioning reference lines in the real scene image. Taking an indoor scene as an example, the positioning reference lines may be the boundary lines between adjacent surfaces, such as the lines where the ceiling, floor, and walls meet. The positioning reference lines may include vertical reference lines, which may be wall junction lines perpendicular to the floor.
S504: Determine the dimensions of the positioning reference lines according to preset model parameters of the current scene.
For most indoor scenes, the indoor space is approximately a cuboid, and model parameters of the indoor scene may be preset on the server. The model parameters may include the dimensions of the indoor space, such as wall height and floor length. The user equipment may obtain the model parameters of the current scene from the server.
The user equipment can match the model parameters to the identified positioning reference lines and so obtain the dimensions of the reference lines.
S506: Determine the third coordinates of the user equipment and the focal-plane normal vector of the camera of the user equipment according to preset camera parameters, the dimensions of the reference lines, and the pitch angle of the camera.
On the basis of the above steps, the user equipment can determine its third coordinates from the preset camera parameters, the dimensions of the reference lines, and the camera's pitch angle.
The user equipment can determine its height, and its distance to each vertical reference line, from the camera's pitch angle and the camera parameters. As shown in FIG. 14, O may denote the camera of the user equipment, AB may be a vertical reference line in the current scene, and A′B′C′ may represent the camera's image. Given the focal length f and the length of the vertical reference line AB, and computing the size of A′B′ in the image, the distance OC from the camera to the vertical reference line AB may be (f×AB)/A′B′; the height of the user equipment may be obtained as (B′C′×AB)/A′B′.
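As a worked numeric example of these relations, with assumed values f = 4 mm, AB = 2.5 m, A′B′ = 1 mm, and B′C′ = 0.6 mm (illustrative only):

    # Worked example of OC = (f x AB) / A'B' and the height relation,
    # with assumed values (all lengths in metres; illustrative only).
    f, AB, A_B, B_C = 0.004, 2.5, 0.001, 0.0006
    OC = f * AB / A_B        # 10.0 m from the camera to reference line AB
    height = B_C * AB / A_B  # 1.5 m: height of the user equipment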
Further, the user equipment can determine its coordinates in the horizontal plane from its distances to at least two vertical reference lines. As shown in FIG. 15, the user equipment can capture two vertical reference lines at the same time and compute its horizontal-plane coordinates from the distances R1 and R2 to those two lines. With the horizontal-plane coordinates and the height of the user equipment obtained, the user equipment derives its third coordinates from the two together.
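A minimal sketch of this two-distance computation as a circle intersection; the caller keeps the intersection lying on the camera's side of the baseline, and all names are illustrative.

    import math

    # Horizontal position from distances r1, r2 to two vertical
    # reference lines whose floor positions p1, p2 are known (the R1
    # and R2 of FIG. 15). Returns both circle intersections.
    def horizontal_position(p1, p2, r1, r2):
        (x1, y1), (x2, y2) = p1, p2
        d = math.hypot(x2 - x1, y2 - y1)
        a = (r1**2 - r2**2 + d**2) / (2 * d)   # offset along the baseline
        h = math.sqrt(max(r1**2 - a**2, 0.0))  # offset perpendicular to it
        ex, ey = (x2 - x1) / d, (y2 - y1) / d
        return ((x1 + a * ex - h * ey, y1 + a * ey + h * ex),
                (x1 + a * ex + h * ey, y1 + a * ey - h * ex))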
The user equipment can determine the focal-plane normal vector of its camera from the preset camera parameters, the dimensions of the reference lines, and the camera's pitch angle. As shown in FIG. 16, the user equipment can compute the horizontal deflection of the focal-plane normal vector relative to a vertical reference line from the distance between that line's image and the image center. Once the vertical reference line AB is off the camera's lens axis, its image A′B′ is offset from the image center by some distance. The deflection angle of the camera axis relative to the vertical reference line AB may be arctan(PC/OC), where P is the point at which the image of the vertical reference line meets the line through the image center, so that PC is that offset. If the object distance is much larger than the image distance, OC equals the camera's focal length f, and the deflection angle may be arctan(PC/f).
S508: Determine the position information of the connectable device according to the third coordinates, fourth coordinates of the connectable device, and the focal-plane normal vector.
The user equipment may also obtain fourth coordinates of the connectable device. The connectable devices may include devices with fixed positions and devices whose positions can change. For a device with a fixed position, its coordinates are also fixed; the user equipment may obtain its fourth coordinates from a broadcast message sent by the device, or query the server and extract the fourth coordinates from positioning information the server returns. A device whose position can change may carry a UWB beacon; the user equipment can measure its distance to the device from multiple spatial positions and determine the device's fourth coordinates from the measured distances.
On the basis of the above steps, the user equipment can determine the position information of the connectable device from the third coordinates, the fourth coordinates of the connectable device, and the focal-plane normal vector.
With the above device control method, the user equipment can obtain the position information of connectable devices in scenes where no UWB beacons are present, which improves the applicability of the device control method.
In one embodiment, a device control method is provided. As shown in FIG. 17, the method includes:
S602: Obtain position information of each connectable device in the current scene.
S604: Determine, according to each piece of position information and the field of view of the user equipment, candidate connectable devices located within the field of view.
S606: Superimpose the control sub-interfaces of the candidate connectable devices on the real scene image within the field of view according to the position information, to generate the first interface.
S608: Display the first interface in response to an image acquisition instruction.
S610: If the connection control on the first interface is detected to be triggered, generate a connection request.
S612: Send the connection request to the target connectable device.
S614: If the operation control is detected to be triggered, display the first operation sub-interface of the target connectable device.
S616: If an operation item on the first operation sub-interface is detected to be triggered, generate an operation control signal corresponding to the triggered operation item.
S618: Perform operation control on the target connectable device according to the operation control signal.
For the implementation principles and technical effects of the above device control method, refer to the above embodiments; they are not repeated here.
It should be understood that although the steps in the flowcharts of the above embodiments are displayed in sequence as indicated by the arrows, they are not necessarily executed in that order. Unless explicitly stated herein, there is no strict ordering constraint on the execution of these steps, and they may be executed in other orders. Moreover, at least some of the steps in the flowcharts of the above embodiments may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different moments; their execution order is not necessarily sequential, and they may be executed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
Based on the same inventive concept, an embodiment of the present application further provides a device control apparatus for implementing the device control method described above. The solution to the problem provided by this apparatus is similar to the solution described in the above method; therefore, for the specific limitations in the one or more device control apparatus embodiments below, refer to the limitations on the device control method above, which are not repeated here.
In one embodiment, as shown in FIG. 18, a device control apparatus is provided, including:
a display module 10, configured to display a first interface in response to an image acquisition instruction, the first interface including a real scene image and a control sub-interface of a connectable device in the real scene superimposed on the real scene image; and
a control module 20, configured to obtain a trigger operation based on the control sub-interface and control a target connectable device according to the trigger operation.
In one embodiment, on the basis of the above embodiments, the control sub-interface includes device information of the connectable device and a connection control for connecting the connectable device.
In one embodiment, on the basis of the above embodiments, the device information includes the device name of the connectable device and/or the device type of the connectable device.
In one embodiment, on the basis of the above embodiments, the control module 20 is specifically configured to: if the connection control is detected to be triggered, generate a connection request, the connection request being used to request establishment of a communication connection with the target connectable device; and send the connection request to the target connectable device.
In one embodiment, on the basis of the above embodiments, the control sub-interface further includes an operation control corresponding to the connectable device; the control module 20 is specifically configured to: if the operation control is detected to be triggered, display a first operation sub-interface of the target connectable device, the first operation sub-interface including operation items matching the device type of the target connectable device.
In one embodiment, on the basis of the above embodiments, the control module 20 is further configured to: after a communication connection is established with the target connectable device, switch from the first interface to a second interface, the second interface including the real scene image and a second operation sub-interface of the target connectable device superimposed on the real scene image, the second operation sub-interface including operation items matching the device type of the target connectable device.
In one embodiment, on the basis of the above embodiments, the control module 20 is further configured to: if an operation item is detected to be triggered, generate an operation control signal corresponding to the triggered operation item, and perform operation control on the target connectable device according to the operation control signal.
In one embodiment, on the basis of the above embodiments, the control module 20 is specifically configured to send the operation control signal to the target connectable device through the communication connection with the connectable device.
In one embodiment, on the basis of the above embodiments, the control module 20 is specifically configured to send the operation control signal to a server, so that the target connectable device is controlled through the server.
In one embodiment, on the basis of the above embodiments, the display position of the control sub-interface in the real scene image corresponds to the position of the connectable device in the real scene.
In one embodiment, on the basis of the above embodiments, as shown in FIG. 19, the apparatus further includes a generation module 30, configured to: obtain position information of each connectable device in the current scene, the position information including the distance between the connectable device and the user equipment and the bearing of the connectable device relative to the user equipment; determine, according to each piece of position information and the field of view of the user equipment, candidate connectable devices located within the field of view; and superimpose the control sub-interfaces of the candidate connectable devices on the real scene image within the field of view according to the position information, to generate the first interface.
In one embodiment, on the basis of the above embodiments, the generation module 30 is specifically configured to: obtain multiple positioning coordinates sent by multiple positioning beacons arranged in the current scene; determine the first coordinates of the user equipment according to the multiple positioning coordinates; obtain second coordinates of the connectable device, the second coordinates being determined according to broadcast information sent by the connectable device or according to positioning information sent by the server; obtain the focal-plane normal vector of the camera of the user equipment; and determine the position information of the connectable device according to the first coordinates, the second coordinates, and the focal-plane normal vector.
In one embodiment, on the basis of the above embodiments, the generation module 30 is specifically configured to: display prompt information, the prompt information being used to prompt the user to move the user equipment along a preset path; obtain multiple coordinates of the user equipment during the movement and determine the camera orientation of the user equipment from them; and obtain the focal-plane normal vector from the camera orientation and the camera's pitch angle.
In one embodiment, on the basis of the above embodiments, the generation module 30 is specifically configured to: capture the real scene image and identify positioning reference lines in it, the positioning reference lines including vertical reference lines; determine the dimensions of the positioning reference lines according to preset model parameters of the current scene; determine the third coordinates of the user equipment and the focal-plane normal vector of the camera of the user equipment according to preset camera parameters, the dimensions of the reference lines, and the camera's pitch angle; and determine the position information of the connectable device according to the third coordinates, fourth coordinates of the connectable device, and the focal-plane normal vector.
In one embodiment, on the basis of the above embodiments, the generation module 30 is specifically configured to: determine the height of the user equipment, and its distance to the vertical reference lines, from the camera's pitch angle and the camera parameters; determine the coordinates of the user equipment in the horizontal plane from its distances to at least two vertical reference lines, and obtain the third coordinates of the user equipment from the horizontal-plane coordinates and the height of the user equipment; and compute the focal-plane normal vector of the camera from the distance of the vertical reference line's image to the image center on the camera's imaging plane.
For the implementation principles and technical effects of the above device control apparatus, refer to the above method embodiments; they are not repeated here.
Each module in the above device control apparatus may be implemented wholly or partly by software, hardware, or a combination of the two. The modules may be embedded in, or independent of, a processor in a computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke and execute the operations corresponding to each module.
In one embodiment, a user equipment is provided, whose internal structure may be as shown in FIG. 20. The user equipment includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input apparatus. The processor, the memory, and the input/output interface are connected through a system bus, while the communication interface, the display unit, and the input apparatus are connected to the system bus through the input/output interface. The processor of the user equipment provides computing and control capability. The memory of the user equipment includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program, and the internal memory provides an environment in which the operating system and the computer program in the non-volatile storage medium run. The input/output interface of the user equipment is used to exchange information between the processor and external devices. The communication interface of the user equipment is used for wired or wireless communication with external terminals; the wireless mode may be implemented through WIFI, a mobile cellular network, NFC (Near Field Communication), or other technologies. When executed by the processor, the computer program implements a device control method. The display unit of the user equipment is used to form a visually visible picture and may be a display screen, a projection apparatus, or a virtual reality imaging apparatus; the display screen may be a liquid crystal display or an electronic ink display. The input apparatus of the computer device may be a touch layer covering the display screen, a key, trackball, or touchpad provided on the housing of the computer device, or an external keyboard, touchpad, mouse, or the like.
Those skilled in the art will understand that the structure shown in FIG. 20 is merely a block diagram of part of the structure related to the solution of the present application and does not limit the computer device to which the solution is applied; a specific computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
An embodiment of the present application further provides a computer-readable storage medium: one or more non-volatile computer-readable storage media containing computer-executable instructions which, when executed by one or more processors, cause the processors to perform the steps of the device control method.
An embodiment of the present application further provides a computer program product containing instructions which, when run on a computer, cause the computer to perform the device control method.
Those of ordinary skill in the art will understand that all or part of the procedures of the above method embodiments can be completed by a computer program instructing the relevant hardware. The computer program may be stored in a non-volatile computer-readable storage medium, and when executed, may include the procedures of the above method embodiments. Any reference to memory, database, or other media used in the embodiments provided in the present application may include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory may include Random Access Memory (RAM), an external cache, or the like. By way of illustration and not limitation, RAM may take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases involved in the embodiments provided in the present application may include at least one of relational and non-relational databases; non-relational databases may include, without limitation, blockchain-based distributed databases. The processors involved in the embodiments provided in the present application may be, without limitation, general-purpose processors, central processing units, graphics processing units, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, and the like.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features involves no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the patent scope of the present application. It should be noted that those of ordinary skill in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (20)

  1. A device control method, comprising:
    displaying a first interface in response to an image acquisition instruction, the first interface comprising a real scene image and a control sub-interface of a connectable device in the real scene superimposed on the real scene image; and
    obtaining a trigger operation based on the control sub-interface, and controlling a target connectable device according to the trigger operation.
  2. The method according to claim 1, wherein the control sub-interface comprises device information of the connectable device and a connection control for connecting the connectable device.
  3. The method according to claim 2, wherein the device information comprises a device name of the connectable device and/or a device type of the connectable device.
  4. The method according to claim 2, wherein the obtaining a trigger operation based on the control sub-interface, and controlling a target connectable device according to the trigger operation, comprises:
    if the connection control is detected to be triggered, generating a connection request, the connection request being used to request establishment of a communication connection with the target connectable device; and
    sending the connection request to the target connectable device.
  5. The method according to claim 2, wherein the control sub-interface further comprises an operation control corresponding to the connectable device; and the obtaining a trigger operation based on the control sub-interface, and controlling a target connectable device according to the trigger operation, comprises:
    if the operation control is detected to be triggered, displaying a first operation sub-interface of the target connectable device, the first operation sub-interface comprising operation items matching the device type of the target connectable device.
  6. The method according to claim 4, further comprising:
    after establishing the communication connection with the target connectable device, switching from the first interface to a second interface, the second interface comprising the real scene image and a second operation sub-interface of the target connectable device superimposed on the real scene image, the second operation sub-interface comprising operation items matching the device type of the target connectable device.
  7. The method according to claim 5 or 6, further comprising:
    if an operation item is detected to be triggered, generating an operation control signal corresponding to the triggered operation item; and
    performing operation control on the target connectable device according to the operation control signal.
  8. The method according to claim 7, wherein the performing operation control on the target connectable device according to the operation control signal comprises:
    sending the operation control signal to the target connectable device through the communication connection with the connectable device.
  9. The method according to claim 7, wherein the performing operation control on the target connectable device according to the operation control signal comprises:
    sending the operation control signal to a server, so as to perform operation control on the target connectable device through the server.
  10. The method according to any one of claims 1 to 6, wherein a display position of the control sub-interface in the real scene image corresponds to a position of the connectable device in the real scene.
  11. The method according to claim 10, wherein before the displaying a first interface in response to an image acquisition instruction, the method further comprises:
    obtaining position information of each connectable device in a current scene, the position information comprising a distance between the connectable device and a user equipment, and a bearing of the connectable device relative to the user equipment;
    determining, according to each piece of position information and a field of view of the user equipment, candidate connectable devices located within the field of view; and
    superimposing control sub-interfaces of the candidate connectable devices on the real scene image within the field of view according to the position information, to generate the first interface.
  12. The method according to claim 11, wherein the superimposing control sub-interfaces of the candidate connectable devices on the real scene image within the field of view according to the position information, to generate the first interface, comprises:
    superimposing the control sub-interfaces of the candidate connectable devices on the real scene image within the field of view in a list manner according to the position information, to generate the first interface.
  13. The method according to claim 12, wherein the obtaining position information of each connectable device in a current scene comprises:
    obtaining multiple positioning coordinates sent by multiple positioning beacons arranged in the current scene;
    determining first coordinates of the user equipment according to the multiple positioning coordinates;
    obtaining second coordinates of the connectable device, the second coordinates being determined according to broadcast information sent by the connectable device, or according to positioning information sent by a server;
    obtaining a focal-plane normal vector of a camera of the user equipment; and
    determining the position information of the connectable device according to the first coordinates, the second coordinates, and the focal-plane normal vector.
  14. The method according to claim 13, wherein the obtaining a focal-plane normal vector of a camera of the user equipment comprises:
    displaying prompt information, the prompt information being used to prompt a user to move the user equipment along a preset path;
    obtaining multiple movement coordinates of the user equipment during the movement, and determining a camera orientation of the user equipment according to the multiple movement coordinates; and
    obtaining the focal-plane normal vector according to the camera orientation and a pitch angle of the camera.
  15. The method according to claim 11, wherein the obtaining position information of each connectable device in a current scene comprises:
    capturing the real scene image, and identifying positioning reference lines in the real scene image, the positioning reference lines comprising vertical reference lines;
    determining dimensions of the positioning reference lines according to preset model parameters of the current scene;
    determining third coordinates of the user equipment and a focal-plane normal vector of a camera of the user equipment according to preset camera parameters, the dimensions of the reference lines, and a pitch angle of the camera; and
    determining the position information of the connectable device according to the third coordinates, fourth coordinates of the connectable device, and the focal-plane normal vector.
  16. The method according to claim 15, wherein the determining third coordinates of the user equipment and a focal-plane normal vector of a camera of the user equipment according to preset camera parameters, the dimensions of the reference lines, and a pitch angle of the camera comprises:
    determining a height of the user equipment, and distances between the user equipment and the vertical reference lines, according to the pitch angle of the camera and the camera parameters;
    determining coordinates of the user equipment in a horizontal plane according to the distances between the user equipment and at least two vertical reference lines, and obtaining the third coordinates of the user equipment according to the horizontal-plane coordinates and the height of the user equipment; and
    computing the focal-plane normal vector of the camera according to a distance of the vertical reference line's image from an image center on an imaging plane of the camera.
  17. A device control apparatus, applied to a user equipment, comprising:
    a display module, configured to display a first interface in response to an image acquisition instruction, the first interface comprising a real scene image and a control sub-interface of a connectable device in the real scene superimposed on the real scene image; and
    a control module, configured to obtain a trigger operation based on the control sub-interface, and control a target connectable device according to the trigger operation.
  18. A user equipment, comprising a memory and a processor, the memory storing a computer program, wherein the computer program, when executed by the processor, causes the processor to perform the steps of the device control method according to any one of claims 1 to 16.
  19. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 16.
  20. A computer program product, comprising a computer program, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 16.
PCT/CN2022/139285 2022-01-25 2022-12-15 Device control method and apparatus, user equipment, and computer-readable storage medium WO2023142755A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210084628.9A 2022-01-25 2022-01-25 Device control method and apparatus, user equipment, and computer-readable storage medium
CN202210084628.9 2022-01-25

Publications (1)

Publication Number Publication Date
WO2023142755A1 true WO2023142755A1 (zh) 2023-08-03

Family

ID=81277393

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/139285 WO2023142755A1 (zh) Device control method and apparatus, user equipment, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN114422644A (zh)
WO (1) WO2023142755A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114422644A (zh) 2022-01-25 2022-04-29 Oppo广东移动通信有限公司 Device control method and apparatus, user equipment, and computer-readable storage medium
CN116193017A (zh) 2022-11-23 2023-05-30 珠海格力电器股份有限公司 Interaction method and apparatus, electronic device, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150138069A1 (en) * 2012-05-17 2015-05-21 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for unified scene acquisition and pose tracking in a wearable display
CN110780598A * 2019-10-24 2020-02-11 深圳传音控股股份有限公司 Smart device control method and apparatus, electronic device, and readable storage medium
CN111045344A * 2019-12-31 2020-04-21 维沃移动通信有限公司 Home appliance control method and electronic device
CN113852646A * 2020-06-10 2021-12-28 漳州立达信光电子科技有限公司 Smart device control method and apparatus, electronic device, and system
CN114422644A * 2022-01-25 2022-04-29 Oppo广东移动通信有限公司 Device control method and apparatus, user equipment, and computer-readable storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6973233B2 (ja) * 2017-05-17 2021-11-24 オムロン株式会社 Image processing system, image processing apparatus, and image processing program
CN108520552A (zh) * 2018-03-26 2018-09-11 广东欧珀移动通信有限公司 Image processing method and apparatus, storage medium, and electronic device
CN108550190A (zh) * 2018-04-19 2018-09-18 腾讯科技(深圳)有限公司 Augmented reality data processing method and apparatus, computer device, and storage medium
CN109218610A (zh) * 2018-08-15 2019-01-15 北京天元创新科技有限公司 Augmented-reality-based operator network resource display method and apparatus
CN111815786A (zh) * 2020-06-30 2020-10-23 北京市商汤科技开发有限公司 Information display method, apparatus, device, and storage medium
CN111880657B (zh) * 2020-07-30 2023-04-11 北京市商汤科技开发有限公司 Virtual object control method and apparatus, electronic device, and storage medium
CN113885345B (zh) * 2021-10-29 2024-03-19 广州市技师学院(广州市高级技工学校、广州市高级职业技术培训学院、广州市农业干部学校) Interaction method, apparatus, and device based on a smart home simulation control system

Also Published As

Publication number Publication date
CN114422644A (zh) 2022-04-29

Legal Events

Code Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 22923541; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)