WO2023051459A1 - Conference system, method, and related device (一种会议系统、方法及相关设备) - Google Patents

Conference system, method, and related device (一种会议系统、方法及相关设备)

Info

Publication number
WO2023051459A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display device
screen
remote control
light spot
Prior art date
Application number
PCT/CN2022/121334
Other languages
English (en)
French (fr)
Inventor
王伟杰
吕跃强
杨阳
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2023051459A1

Classifications

    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • H04L12/18 Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04N7/15 Conference systems

Definitions

  • the present application relates to the field of computer technology, in particular to a conference system, method and related equipment.
  • display devices such as electronic whiteboards are usually used to display the same content on display devices in different places, so that participants in different places can see the content currently being explained.
  • when giving an explanation, the user usually uses a remote control pen such as a laser pen or an infrared pen to point at the screen of the local display device and project a light spot on the screen to indicate the area that currently needs attention.
  • however, the light spot cannot be displayed at the corresponding position on the display devices located in other places (remote display devices) to indicate that area; therefore, how to indicate the area that currently needs attention synchronously on display devices in different places is a technical problem that urgently needs to be solved.
  • the application discloses a conference system, method, and related equipment; by integrating an image acquisition module into a remote control pen, information synchronization for remote conferences in mobile office scenarios can be achieved, broadening the applicable scenarios of remote conferencing.
  • the present application provides a conference system; the conference system includes a first display device, a remote control pen, a processing module, and one or more second display devices, and the remote control pen includes an emitter and an image acquisition module.
  • the remote control pen is used to emit visible light toward the first display device through the emitter to present a light spot on the screen of the first display device, and to collect a first image through the image acquisition module, where the first image includes the light spot presented on the screen of the first display device.
  • the processing module is used to acquire the first image collected by the above-mentioned image acquisition module and determine the position information of the light spot in the first image on the screen of the first display device; the above-mentioned one or more second display devices are used to display the light spot on their respective screens at the position indicated by the position information.
  • the image acquisition module is integrated into the remote control pen; when the user uses the remote control pen to indicate an area of the display device that needs attention, the image acquisition module on the remote control pen can collect an image of the display device, and the position of the light spot on the screen can be determined based on the collected image so that the light spot can be displayed on the remote display device.
  • the above processing module is located in the remote control pen; when the processing module is located in the remote control pen, after the remote control pen determines the above position information through the processing module, it is also used to send the determined position information to the one or more second display devices.
  • the above-mentioned processing module is located in the first display device; when the processing module is located in the first display device, after collecting the first image through the image acquisition module, the above-mentioned remote control pen is also used to send the first image to the first display device; the first display device is used to receive the first image sent by the remote control pen, determine, through the processing module, the position information of the light spot in the first image on the screen of the first display device, and send the position information to the one or more second display devices.
  • because the remote control pen is small and usually battery-powered, placing the processing module in the local first display device allows the pen, after collecting an image, to send it to the processing module in the nearby display device for further processing to determine the above position information, which reduces the pen's energy consumption and makes its operation more stable.
  • the above conference system further includes a server, and the processing module is located in the server; when the processing module is located in the server, after collecting the first image through the image collection module, the above-mentioned remote control pen is also used to send the first image to the server; the server is used to receive the first image sent by the remote control pen, determine, through the processing module, the position information of the light spot in the first image on the screen of the first display device, and send the determined position information to the above-mentioned one or more second display devices.
  • because the remote control pen is small and usually battery-powered, sending the collected image to the server for further processing to determine the above position information reduces the pen's energy consumption and makes its operation more stable.
  • at the same time, the data processing capability on the server side is stronger, which speeds up data processing, prevents stuttering caused by the pen's insufficient data processing capability, and improves the quality of the meeting.
  • the above-mentioned processing module is specifically configured to: determine a region of interest (region of interest, ROI) in the first image, where the region of interest includes the above-mentioned light spot; then acquire a second image corresponding to the picture currently displayed on the screen of the first display device, and search the second image for the region of interest to obtain a target region matching the region of interest; then determine the position information of the target region on the second image, and determine the position information of the light spot on the screen of the first display device according to the position information of the target region on the second image.
  • the position information of the target area on the second image refers to the coordinates of the target area on the second image, and the position information of the light spot on the screen of the first display device refers to the distance between the light spot and the edges of the actual screen.
  • based on the coordinates of the target area on the second image, the resolution of the second image, and the width and height of the screen of the first display device, the position information of the light spot on the screen of the first display device, that is, its distance from the edges of the actual screen, can be determined.
  • as the user's position and the direction of the remote control pen change, the angle from which the pen captures images also changes; by cropping an ROI that includes the light spot from the image and searching the image for a region matching the ROI to determine the position information of the light spot on the screen of the first display device, calibration of the image acquisition module and calculation of transformation matrices after the shooting angle of the pen changes can be avoided, thereby improving the data processing speed and the applicability of the conference system.
  • the processing module is specifically configured to: if the first image includes the complete screen of the first display device, determine the position of the above-mentioned light spot in the first image and the position of the complete screen in the first image; then determine the position information of the above-mentioned light spot on the screen of the first display device according to the position of the light spot in the first image and the position of the complete screen in the first image.
  • the position of the complete screen of the first display device in the first image is determined by the pixel coordinates of the four corners of the frame of the complete screen in the first image; the position of the light spot in the first image is determined by the pixel coordinates of the light spot in the first image.
  • as the user's position and the direction of the remote control pen change, the angle from which the pen captures images also changes; by using the coordinates of the complete screen in the image and the coordinates of the light spot in the image to determine the position information of the light spot on the screen of the first display device, calibration of the image acquisition module and calculation of transformation matrices after the shooting angle of the pen changes can be avoided, thereby improving the data processing speed and the applicability of the conference system.
  • the present application provides a remote control pen, which is used in a conference system including a first display device, the remote control pen, a processing module, and one or more second display devices; the remote control pen includes an emitter and an image acquisition module, wherein the emitter is used to emit visible light toward the first display device to present a light spot on the screen of the first display device, and the image acquisition module is used to collect a first image that includes the light spot presented on the screen of the first display device, so that the processing module determines the position information of the light spot on the screen of the first display device according to the first image.
  • the above-mentioned remote control pen further includes the above-mentioned processing module, which is used to acquire the first image collected by the image acquisition module; determine the region of interest including the above-mentioned light spot in the first image; then acquire a second image corresponding to the picture currently displayed on the screen of the first display device and search the second image for the region of interest to obtain a target region matching the region of interest; and finally determine the position information of the target region on the second image, and determine the position information of the light spot on the screen of the first display device according to the position information of the target region on the second image.
  • the above-mentioned remote control pen further includes a processing module, which is configured to acquire the first image collected by the image acquisition module, and, when the first image includes the complete screen of the first display device, determine the position of the light spot in the first image and the position of the complete screen in the first image, and then determine the position information of the above-mentioned light spot on the screen of the first display device according to the position of the light spot in the first image and the position of the complete screen in the first image.
  • the position of the complete screen of the first display device in the first image is determined by the pixel coordinates of the four corners of the frame of the complete screen in the first image; the position of the light spot in the first image is determined by the pixel coordinates of the light spot in the first image.
  • the present application provides a remote conference method, which is used in a conference system including a remote control pen and a processing module, wherein the remote control pen emits visible light toward the first display device to present a light spot on the screen of the first display device; the remote control pen collects a first image that includes the light spot presented on the screen of the first display device; and the processing module acquires the first image and determines the position information of the light spot in the first image on the screen of the first display device, so that one or more second display devices display the light spot on their respective screens according to the position information.
  • the above processing module is located in the remote control pen; after the remote control pen determines the above position information through the processing module, the above method further includes: the remote control pen sends the above position information to one or more second display devices.
  • the above processing module is located in the first display device; after the remote control pen collects the first image, it sends the collected first image to the first display device; the first display device receives the first image sent by the remote control pen, determines, through the processing module, the position information of the light spot in the first image on the screen of the first display device, and sends the position information to one or more second display devices.
  • the conference system above further includes a server, and the processing module is located at the server; after the remote control pen collects the first image, it sends the first image to the server; the server receives the first image sent by the remote control pen, and passes the processing module Determining position information of the light spot in the first image on the screen of the first display device; and sending the determined position information to one or more second display devices.
  • the determination of the position information of the light spot in the first image on the screen of the first display device includes: the processing module determines the region of interest including the light spot in the first image; then acquires a second image corresponding to the picture currently displayed on the screen of the first display device and searches the second image for the region of interest to obtain a target region matching the region of interest; and finally determines the position information of the target region on the second image, and determines the position information of the light spot on the screen of the first display device according to the position information of the target region on the second image.
  • the processing module determining the position information of the light spot in the first image on the screen of the first display device includes: when the processing module determines that the first image includes the complete screen of the first display device, it determines the position of the light spot in the first image and the position of the complete screen in the first image, and then determines the position information of the light spot on the screen of the first display device according to the position of the light spot in the first image and the position of the complete screen in the first image.
  • the relative position of the complete screen in the first image is determined by the pixel coordinates of the four corners of the frame of the complete screen in the first image; the relative position of the light spot in the first image is determined by The pixel coordinates of the light spots in the first image are determined.
  • FIG. 1 is a schematic diagram of a system architecture of a conference system provided by an embodiment of the present application
  • FIG. 2 is a schematic structural diagram of a conference system provided by an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of another conference system provided by an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of another conference system provided by an embodiment of the present application.
  • FIG. 5 is a schematic flowchart of a conference method provided in an embodiment of the present application.
  • FIG. 6 is a schematic diagram of determining the position information of the light spot provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of another method for determining the position information of the light spot provided by an embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of a remote control pen provided by an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of another remote control pen provided by an embodiment of the present application.
  • FIG. 1 is a schematic diagram of a system architecture of a conference system provided in an embodiment of the present application.
  • the conference system includes a first display device 10 , one or more second display devices 20 , a remote control pen 30 and a processing module 40 .
  • the first display device 10 and the second display device 20 are used to display content such as text and pictures and to play video.
  • in this embodiment, the first display device 10 serves as the local display device, that is, the display device in the area where the current user is located in scenarios such as remote conferencing or distance education, while the one or more second display devices 20 are remote display devices.
  • the remote control pen 30 is used to emit visible light to the screen of the first display device to project a light spot on the screen to indicate the area that needs attention at present, and collect an image including the light spot on the screen.
  • the processing module 40 is used to obtain the image collected by the remote control pen, and determine the position information of the light spot on the screen according to the image.
  • the second display device 20 is configured to obtain the position information determined by the processing module 40, and display light spots at corresponding positions on the screen according to the position information, so as to indicate the current area that needs attention.
  • the processing module 40 can be located in the remote control pen 30, in the display device, or in a computing device (such as a server).
  • FIG. 2 is a schematic structural diagram of a conference system provided by an embodiment of the present application.
  • the remote control pen 30 includes a transmitter 310 , an image acquisition module 320 , a processing module 40 and a communication module 330 .
  • the emitting window of the emitter 310 and the camera of the image acquisition module 320 are arranged on the same side of the remote control pen 30; the emitter 310 can be a laser emitter or an infrared emitter, etc., and is used to generate well-focused visible light.
  • the emitter 310 can point to the screen of the first display device under the control of the user, and emit visible light to project a light spot on the screen of the first display device, indicating a current area that needs attention.
  • the image acquisition module 320 is configured to acquire images at a preset acquisition frequency when the emitter 310 emits visible light, and the acquired image includes the light spot projected by the emitter 310 on the screen.
  • the processing module 40 is configured to acquire the image collected by the image collection module 320 and determine the position information of the light spot on the screen of the first display device 10 .
  • the communication module 330 is configured to send the position information of the light spot determined by the processing module 40 to one or more second display devices.
  • the display devices include a display module 110 and a communication module 120 .
  • the display module 110 is used to display content such as text, images, and playing videos;
  • the communication module 120 is used to receive the data that needs to be displayed; for example, when the display device is a remote second display device, the communication module 120 is used to receive the above-mentioned position information.
  • the communication module 330 of the remote control pen 30 can send the position information of the light spot directly to the one or more remote second display devices 20, or it can first send the position information to the local first display device 10, which then sends it to the one or more second display devices 20 through the communication module 120, as shown in FIG. 2.
  • when the remote control pen 30 only needs to send the position information of the light spot to the first display device, the communication module 330 in the remote control pen 30 can be a communication module supporting a short-range communication protocol, such as a Bluetooth communication module.
  • the processing module 40 is located in a display device.
  • FIG. 3 is a schematic structural diagram of another conference system provided by an embodiment of the present application.
  • the remote control pen 30 includes a transmitter 310 , an image acquisition module 320 and a communication module 330 .
  • for the functions of the transmitter 310 and the image acquisition module 320, reference can be made to the relevant description of the remote control pen 30 in FIG. 2; the communication module 330 is used to send the images collected by the image acquisition module 320 to the local first display device 10, so that the processing module 40 in the first display device 10 determines the position information of the light spot in the image.
  • the display device (including the first display device 10 and one or more second display devices 20 ) includes a display module 110 , a communication module 120 and a processing module 40 .
  • when the display device is the local first display device, the display module 110 is used to display content such as text and images; the communication module 120 is used to receive the images sent by the communication module 330 of the remote control pen 30; the processing module 40 is used to process the images received by the communication module 120 and determine the position information of the light spot in the image; and the communication module 120 is also used to send the position information of the light spot to the one or more remote second display devices 20.
  • when the display device is a remote second display device 20, the communication module 120 is used to receive the position information of the light spot sent by the first display device 10, and the display module 110 is used to display the light spot at the corresponding position on the screen according to the position information.
  • when the processing module 40 is located in the display device, the remote control pen 30 only needs to send the collected images to the first display device 10, and the communication module 330 in the remote control pen 30 can be a communication module supporting a short-range communication protocol, such as a Bluetooth communication module.
  • because the remote control pen is small and usually battery-powered, placing the processing module in the local first display device allows the pen, after collecting an image, to send it to the processing module in the nearby display device for further processing to determine the above position information, which reduces the pen's energy consumption and makes its operation more stable.
  • the above-mentioned processing module 40 may also be located in a server, as shown in FIG. 4 , which is a schematic structural diagram of a conference system provided in an embodiment of the present application.
  • the remote control pen 30 includes a transmitter 310 , an image acquisition module 320 and a communication module 330 .
  • for the functions of the transmitter 310 and the image acquisition module 320, reference can be made to the relevant description of the remote control pen 30 in FIG. 2; the communication module 330 is used to send the images collected by the image acquisition module 320 to the server 50, or to send them to the local first display device 10, whose communication module 120 then forwards the images to the server 50, so that the processing module 40 in the server 50 determines the position information of the light spot in the image.
  • the server 50 includes a communication module 510 and a processing module 40 .
  • the communication module 510 is used to receive the image sent by the communication module 330 of the remote control pen 30, or receive the image sent by the communication module 120 of the first display device 10;
  • the processing module 40 is used to process the image received by the communication module 510 and determine the position information of the light spot in the image;
  • the communication module 510 is also configured to send the position information of the light spot to one or more second display devices 20 at the far end.
  • the display device (including the first display device 10 and one or more second display devices 20 ) includes a display module 110 and a communication module 120 .
  • the communication module 120 is used to receive the position information of the light spot sent by the server 50; the display module 110 is used to display the light spot at a corresponding position according to the position information.
  • because the remote control pen is small and usually battery-powered, sending the collected image to the server for further processing to determine the above position information reduces the pen's energy consumption and makes its operation more stable.
  • at the same time, the data processing capability on the server side is stronger, which speeds up data processing, prevents stuttering caused by the pen's insufficient data processing capability, and improves the quality of the meeting.
  • by integrating the image acquisition module into the remote control pen, when the user uses the pen to indicate an area of the display device that needs attention, an image of the display device can be collected through the image acquisition module on the pen, and the processing module can then determine the position of the light spot on the screen so that the spot can be displayed on the remote display devices.
  • This solution is well suited for mobile office scenarios.
  • FIG. 5 is a schematic flowchart of a conference method provided by an embodiment of the present application, and the conference method includes S501 to S504 .
  • the remote control pen emits visible light to the first display device, and presents light spots on the screen of the first display device.
  • when explaining the content displayed on the screen of the first display device 10, the user can point the remote control pen 30 at the screen of the first display device 10 and press the trigger button on the remote control pen 30, so that the emitter 310 on the pen emits visible light and projects a light spot on the screen of the first display device 10, indicating the area that currently needs attention.
  • the remote control pen collects a first image including light spots.
  • when the user presses the trigger button so that the emitter 310 emits visible light, the image acquisition module 320 on the pen is started and collects, at a preset acquisition frequency, images that include the projected light spot, so that the processing module 40 can determine the position information of the light spot on the screen of the first display device 10 based on the images collected by the image acquisition module 320; the above-mentioned first image is any image collected by the image acquisition module 320 that includes the light spot.
  • the processing module acquires the first image, and determines position information of the light spot on the screen of the first display device in the first image.
  • the processing module 40 can be located in the remote control pen 30 , the display device or the server. For details, reference can be made to the relevant descriptions in FIGS. 2 to 4 , which will not be repeated here.
  • FIG. 6 is a schematic diagram of determining position information of a light spot provided by an embodiment of the present application.
  • the processing module 40 obtains the first image collected by the image acquisition module 320, wherein the first image may include the complete screen of the first display device 10 or only part of the screen of the first display device 10; the shaded part of the first image in FIG. 6 indicates the portion of the screen that was captured.
  • the processing module 40 first recognizes the light spot from the first image, for example, the light spot is generally a red or green circular light spot, and the processing module 40 can recognize the light spot in the first image through color features and shape features.
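  • The text says only that the spot is recognized by color and shape features; the snippet below is one possible sketch of that idea using OpenCV-style HSV thresholding plus a rough circularity check (OpenCV 4 API assumed; the threshold values are illustrative and not taken from the patent).
```python
import cv2
import numpy as np

def find_spot(bgr_image):
    """Return the pixel centre of a bright red or green, roughly circular spot,
    or None if no such spot is found. Ranges and circularity limit are guesses."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    red_lo = cv2.inRange(hsv, (0, 120, 180), (10, 255, 255))
    red_hi = cv2.inRange(hsv, (170, 120, 180), (180, 255, 255))
    green = cv2.inRange(hsv, (45, 120, 180), (75, 255, 255))
    mask = red_lo | red_hi | green
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in sorted(contours, key=cv2.contourArea, reverse=True):
        (x, y), r = cv2.minEnclosingCircle(c)
        if r > 1 and cv2.contourArea(c) / (np.pi * r * r) > 0.6:  # roughly circular
            return int(x), int(y)
    return None
```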
  • the processing module 40 determines a region of interest (region of interest, ROI) including the above-mentioned light spot, and acquires a second image corresponding to the picture currently displayed on the screen of the first display device 10.
  • the size of the ROI is smaller than the sizes of the first image and the second image.
  • for example, the first image may have a resolution of 2140*1920, the second image may have a resolution of 1920*1080 pixels, and the ROI may be only 10*10 pixels; the ROI can be a rectangle, a circle, or another shape, which is not specifically limited in the embodiments of this application.
  • after acquiring the ROI and the second image, the processing module 40 searches the second image for a target area matching the ROI; the processing module 40 then determines the position information of the target area in the second image and uses the position information of the target area on the second image as the position information of the light spot on the screen of the first display device.
  • when searching for the target area, a first coordinate system as shown in FIG. 6 can be established with the upper left corner of the second image as the coordinate origin O, the horizontal direction of the second image as the x-axis, and the vertical direction as the y-axis; starting from the upper left corner of the image, the region matching the ROI is searched for in the second image with a preset step size.
  • for example, if the ROI is a 120*100 rectangular area, a sliding window with the same size as the ROI is set; the sliding window first takes, from the upper left corner of the image, a first area of the same size as the ROI, whose four vertices have coordinates (0, 0), (0, 100), (120, 0), and (120, 100) in the second image; if the first area does not match the ROI, the sliding window slides along the x-axis by the preset step size to obtain a second area, and if the preset step size is 40, the four vertices of the second area have coordinates (40, 0), (40, 100), (160, 0), and (160, 100); the processing module 40 then determines whether the second area matches the ROI, and if not, the sliding window continues to take the next area and match it against the ROI until a target area matching the ROI is found.
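  • The following is a minimal sketch of the sliding-window search described above, assuming the ROI and the second image are grayscale numpy arrays; the sum-of-absolute-differences score, the match threshold, and all function names are illustrative assumptions, since the text does not specify the matching criterion.
```python
import numpy as np

def find_target_region(roi: np.ndarray, second_image: np.ndarray,
                       step: int = 40, threshold: float = 12.0):
    """Slide a window of the ROI's size over the second image and return the
    (x, y) of the upper-left corner of the first window whose mean absolute
    difference from the ROI falls below `threshold`; if none does, return the
    best-scoring window. The metric and threshold are assumptions."""
    roi_h, roi_w = roi.shape[:2]
    img_h, img_w = second_image.shape[:2]
    best_score, best_xy = float("inf"), (0, 0)
    for y in range(0, img_h - roi_h + 1, step):
        for x in range(0, img_w - roi_w + 1, step):
            window = second_image[y:y + roi_h, x:x + roi_w]
            score = np.mean(np.abs(window.astype(np.float32) - roi.astype(np.float32)))
            if score < threshold:          # first sufficiently similar window
                return x, y
            if score < best_score:
                best_score, best_xy = score, (x, y)
    return best_xy
```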
  • after determining the target area, the processing module 40 determines the coordinates of the four vertices of the target area in the second image and, based on those coordinates, the resolution of the second image, and the size of the display device screen, determines the position information of the light spot on the screen of the first display device.
  • for example, if the coordinates of one vertex of the target area are (a1, b1), the resolution of the second image is w1*h1, and the screen of the display device is w2 centimeters wide and h2 centimeters high, the position of the light spot on the screen of the first display device satisfies the relationship a2 = (a1 / w1) * w2 and b2 = (b1 / h1) * h2, where a2 represents the distance between the above-mentioned vertex and the left edge of the screen and b2 represents the distance between the above-mentioned vertex and the top edge of the screen; in this way the distances from each of the four vertices of the target area to the screen edges can be obtained, yielding the position information of the light spot on the screen of the first display device.
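  • As a worked illustration of the proportional mapping above (a sketch only; the function and parameter names are chosen here for illustration and do not come from the patent):
```python
def pixel_to_screen(a1, b1, w1, h1, w2_cm, h2_cm):
    """Map a vertex (a1, b1), given in pixels of the second image whose
    resolution is w1 x h1, to physical distances in centimeters from the
    left and top edges of a w2_cm x h2_cm screen."""
    a2 = a1 / w1 * w2_cm  # distance from the left edge of the screen
    b2 = b1 / h1 * h2_cm  # distance from the top edge of the screen
    return a2, b2

# Example: a vertex at pixel (480, 270) of a 1920x1080 frame shown on a
# 160 cm x 90 cm screen lies 40 cm from the left edge and 22.5 cm from the top.
print(pixel_to_screen(480, 270, 1920, 1080, 160.0, 90.0))  # (40.0, 22.5)
```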
  • the above is only one method of searching for the ROI in an image introduced in this embodiment; it should be understood that the processing module 40 can also determine the position information of the ROI in the second image according to other methods.
  • as the user's position and the direction of the remote control pen change, the angle from which the pen captures images also changes; by cropping an ROI that includes the light spot from the image and searching the image for a region matching the ROI to determine the position information of the light spot on the screen of the first display device, calibration of the image acquisition module and calculation of transformation matrices after the shooting angle of the pen changes can be avoided, thereby improving the data processing speed and the applicability of the conference system.
  • in a case where the first image collected by the image acquisition module 320 includes the complete screen of the first display device 10, the processing module 40 takes the upper left corner of the first image as the coordinate origin O, the horizontal direction of the first image as the x-axis, and the vertical direction as the y-axis, and establishes a second coordinate system as shown in FIG. 7.
  • the processing module 40 can first recognize the complete screen of the first display device 10 from the first image and obtain the coordinates of the four vertices of the complete screen (A, B, C, D in FIG. 7), that is, the position of the complete screen in the first image.
  • after the coordinates of the four vertices are obtained, the slopes of the straight lines corresponding to the four borders of the screen in the first image can be calculated from the vertex coordinates; for example, the slope of the straight line AB is calculated as k1, the slope of the straight line CD is k2, the slope of AC is k3, and the slope of the straight line BD is k4; the processing module 40 then calculates the mean k5 = (k1 + k2) / 2 of the slopes of lines AB and CD and the mean k6 = (k3 + k4) / 2 of the slopes of lines AC and BD, and recognizes the light spot to obtain its coordinates M(x0, y0) in the second coordinate system.
  • as shown in FIG. 7, the processing module 40 can determine a straight line L1 according to the slope k5 and the coordinates M(x0, y0), and a straight line L2 according to the slope k6 and the coordinates M(x0, y0); the straight line L1 has an intersection point E with the straight line AC and an intersection point F with the straight line BD, and the straight line L2 has an intersection point G with the straight line AB and an intersection point H with the straight line CD.
  • the processing module 40 can respectively determine the coordinates of the above four intersection points in the second coordinate system.
  • for example, from vertex A and vertex C the equation of the straight line AC can be determined as y = k3*x + b3, and from the light spot coordinates M(x0, y0) and the slope k5 the equation of the straight line L1 can be determined as y = k5*x + b4, where b3 represents the intercept of the straight line AC on the y-axis and b4 represents the intercept of the straight line L1 on the y-axis; the coordinates of the intersection point E of the straight line AC and the straight line L1 in the second coordinate system can be obtained from these two line equations, and the coordinates of the other intersection points in the second coordinate system can be obtained in the same way.
  • after the coordinates M(x0, y0) of the light spot in the second coordinate system and the coordinates of the four intersection points (E, F, G, and H) are determined, the processing module 40 can calculate the straight-line distance between the coordinate M of the light spot and each of the four intersection points, for example obtaining ME = d1, MF = d2, MG = d3, and MH = d4; these four distances represent the position of the light spot on the actual screen, that is, on the actual screen the vertical distance s1 from the light spot to the top border and the vertical distance s2 to the bottom border satisfy s1/s2 = d1/d2, and the horizontal distance s3 to the left border and the horizontal distance s4 to the right border satisfy s3/s4 = d3/d4.
  • the processing module 40 acquires the actual width and height of the screen of the first display device and can then calculate the position information of the light spot on the screen of the first display device; it can be understood that the choice of coordinate system in the above solution is only an example and not a limitation, and under the same principle, if the coordinate system is set differently, the position information can be obtained by transforming the corresponding calculation formulas.
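  • The sketch below pulls the full-screen geometry above into one routine; it is not the patent's implementation. The corner labeling (AB and CD as the top and bottom borders, AC and BD as the left and right borders), the resolution of the stated ratios using the physical screen width and height, and all helper names are assumptions, and the slope-intercept formulation assumes no border appears exactly vertical in the image.
```python
import numpy as np

def line_intersection(p, slope_p, q, slope_q):
    """Intersection of the line through p with slope slope_p and the line
    through q with slope slope_q (both lines assumed non-vertical)."""
    x = (slope_q * q[0] - slope_p * p[0] + p[1] - q[1]) / (slope_q - slope_p)
    y = slope_p * (x - p[0]) + p[1]
    return np.array([x, y])

def spot_position(A, B, C, D, M, screen_w_cm, screen_h_cm):
    """Estimate the spot's distances (cm) from the left and top screen borders
    from the pixel coordinates of the screen corners A, B, C, D and the spot M."""
    A, B, C, D, M = map(np.asarray, (A, B, C, D, M))
    slope = lambda p, q: (q[1] - p[1]) / (q[0] - p[0])
    k1, k2, k3, k4 = slope(A, B), slope(C, D), slope(A, C), slope(B, D)
    k5, k6 = (k1 + k2) / 2, (k3 + k4) / 2      # mean slopes of opposite borders

    E = line_intersection(M, k5, A, k3)        # L1 meets border AC
    F = line_intersection(M, k5, B, k4)        # L1 meets border BD
    G = line_intersection(M, k6, A, k1)        # L2 meets border AB
    H = line_intersection(M, k6, C, k2)        # L2 meets border CD
    d1, d2, d3, d4 = (np.linalg.norm(M - P) for P in (E, F, G, H))

    # Following the stated ratios s1/s2 = d1/d2 and s3/s4 = d3/d4, and assuming
    # the two distances along each direction sum to the screen height/width:
    s1 = screen_h_cm * d1 / (d1 + d2)          # distance from the top border
    s3 = screen_w_cm * d3 / (d3 + d4)          # distance from the left border
    return s3, s1
```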
  • in this embodiment, the method by which the processing module 40 determines the above position information has been introduced by taking one first image collected by the image acquisition module 320 as an example; it should be understood that the same method can be used to process other images collected by the image acquisition module 320 to determine the position information of the light spot in those images.
  • the second display device acquires the location information, and displays the light spots on the screen according to the location information.
  • after determining the position information of the light spot on the screen of the first display device, the processing module 40 sends the position information to the one or more second display devices 20 through the device in which the processing module 40 is located; for the methods of sending the position information, reference may be made to the relevant descriptions in FIG. 2 to FIG. 4, which will not be repeated here.
  • the second display device 20 that has received the location information displays a light spot on its respective screen according to the location indicated by the location information, to indicate the current area that needs attention.
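  • As a hedged illustration of this step: the text does not prescribe how a remote screen converts the received position information into its own pixel coordinates; the sketch below assumes the position information is delivered as distances from the left and top edges together with the source screen's physical size, and that the remote device scales proportionally. All names are chosen for illustration.
```python
def spot_pixel_on_remote(a2_cm, b2_cm, src_w_cm, src_h_cm, dst_w_px, dst_h_px):
    """Scale distances from the left/top edges of the source screen (cm) to a
    pixel position on this display's screen (proportional scaling assumed)."""
    x = round(a2_cm / src_w_cm * dst_w_px)
    y = round(b2_cm / src_h_cm * dst_h_px)
    return x, y  # e.g. hand these to the display module's spot-overlay routine
```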
  • FIG. 8 is a schematic structural diagram of a remote control pen provided in the embodiment of the present application.
  • the remote control pen 800 includes an emitter 810, an image acquisition module 820, a memory 830, and a communication interface 840; the emitter 810, the image acquisition module 820, the memory 830, and the communication interface 840 are connected to each other through a bus 850.
  • the emitter 810 can emit visible light to project a light spot on the screen of the first display device 10 shown in FIG. 1, indicating the area that currently needs attention; the image acquisition module 820 can collect, at a preset acquisition frequency, images including the projected light spot, so that the processing module 40 determines the position information of the light spot on the screen of the first display device 10 based on the images collected by the image acquisition module 820; the memory 830 is used to buffer the images collected by the image acquisition module 820; and the communication interface 840 is used to send the images collected by the image acquisition module 820 to the device where the processing module 40 is located.
  • the memory 830 can be a non-volatile memory, for example, a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or a flash memory.
  • the memory 830 can also be a volatile memory, such as a random access memory (RAM) used as an external cache; by way of example but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), high bandwidth memory (HBM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
  • the communication interface 840 can be a wired interface or a wireless interface for communicating with other modules or devices; the wired interface can be an Ethernet interface, a local interconnect network (LIN) interface, etc., and the wireless interface can be a cellular network interface, a wireless local area network interface, etc.
  • the bus 850 may be a peripheral component interconnect (PCI) bus or an extended industry standard architecture (EISA) bus or the like.
  • the bus 850 can be divided into an address bus, a data bus, a control bus, a memory bus, and the like. For ease of representation, only one thick line is used in FIG. 8 , but it does not mean that there is only one bus or one type of bus.
  • the remote control pen 800 further includes a processor 860 , and the transmitter 810 , the image acquisition module 820 , the memory 830 , the communication interface 840 and the processor 860 are connected to each other through a bus 850 .
  • the processor 860 is configured to implement the functions implemented by the processing module 40 in the method embodiment shown in FIG. 5 above, which will not be repeated here.
  • the processor 860 can have multiple implementation forms; for example, the processor 860 can be a central processing unit (CPU), a neural-network processing unit (NPU), or a graphics processing unit (GPU), and the processor 860 can be a single-core processor or a multi-core processor.
  • the processor 860 can also be another module with computing functions, such as an application-specific integrated circuit (ASIC), a programmable logic device (PLD), a digital signal processor (DSP), or any combination thereof; the PLD can be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), generic array logic (GAL), or any combination thereof.
  • as shown in FIG. 9, when the remote control pen 800 also includes a processor 860, the memory 830 is further used to store program code, and the processor 860 executes the stored program code to perform the operation steps implemented by the processing module 40 in the method embodiment shown in FIG. 5, which will not be repeated here.
  • the remote control pen 800 may contain more or fewer components than those shown in FIG. 8 or FIG. 9 , or have a different arrangement of components.
  • the embodiment of the present application also provides a computer-readable storage medium that stores instructions; when the instructions are run on a processor, the method steps in the above method embodiments can be implemented, and for the specific implementation of the processor of the computer-readable storage medium in executing the above method steps, reference may be made to the specific operations of the above method embodiments, which will not be repeated here.
  • the above-mentioned embodiments may be implemented in whole or in part by software, hardware, firmware or other arbitrary combinations.
  • the above-described embodiments may be implemented in whole or in part in the form of computer program products.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded or executed on the computer, the processes or functions according to the embodiments of the present invention will be generated in whole or in part.
  • the computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, DSL) or wireless (e.g., infrared, radio, microwave) means.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that includes one or more sets of available media.
  • the available media may be magnetic media (eg, floppy disks, hard disks, magnetic tape), optical media, or semiconductor media.
  • the semiconductor medium may be a solid state drive (SSD).
  • the steps in the method of the embodiment of the present application can be adjusted in order, merged or deleted according to actual needs; the modules in the device of the embodiment of the present application can be divided, combined or deleted according to actual needs.

Abstract

The present application provides a conference system, method, and related device. The conference system includes a first display device, a remote control pen, a processing module, and one or more second display devices; the remote control pen includes an emitter and an image acquisition module. The remote control pen emits visible light toward the first display device through the emitter to present a light spot on the screen of the first display device, and collects, through the image acquisition module, a first image that includes the light spot. The processing module is used to determine the position information of the light spot in the first image on the screen of the first display device. The second display devices are used to display the light spot on their respective screens at the position indicated by the position information. By integrating the image acquisition module into the remote control pen, an image of the local display device can be captured through the pen wherever that display device is moved, so that information synchronization for remote conferences in mobile office scenarios is achieved and the applicable scenarios of remote conferencing are broadened.

Description

Conference system, method, and related device
This application claims priority to Chinese Patent Application No. 202111166847.3, filed with the China Patent Office on September 30, 2021 and entitled "一种会议系统、方法及相关设备" ("Conference system, method, and related device"), which is incorporated herein by reference in its entirety.
Technical Field
The present application relates to the field of computer technology, and in particular to a conference system, method, and related device.
Background
With the development of network technology, applications such as remote conferencing and distance education are becoming increasingly widespread. At present, display devices such as electronic whiteboards are typically used to show the same content on display devices at different sites, so that participants in different places can see the content currently being explained. When giving an explanation, the user usually points a remote control pen such as a laser pen or an infrared pen at the screen of the local display device and projects a light spot on the screen to indicate the area that currently needs attention. However, the display devices at other sites (remote display devices) cannot display a light spot at the corresponding position to indicate that area. How to indicate the area that currently needs attention synchronously on display devices at different sites is therefore a technical problem that urgently needs to be solved.
Summary
The present application discloses a conference system, method, and related device. By integrating an image acquisition module into a remote control pen, information synchronization for remote conferences in mobile office scenarios can be achieved, broadening the applicable scenarios of remote conferencing.
According to a first aspect, the present application provides a conference system. The conference system includes a first display device, a remote control pen, a processing module, and one or more second display devices. The remote control pen includes an emitter and an image acquisition module. The remote control pen is configured to emit visible light toward the first display device through the emitter to present a light spot on the screen of the first display device, and to collect a first image through the image acquisition module, where the first image includes the light spot presented on the screen of the first display device. The processing module is configured to acquire the first image collected by the image acquisition module and determine the position information of the light spot in the first image on the screen of the first display device. The one or more second display devices are configured to display the light spot on their respective screens at the position indicated by the position information.
In the present application, the image acquisition module is integrated into the remote control pen. When the user uses the remote control pen to indicate an area of the display device that needs attention, an image of the display device can be collected through the image acquisition module on the pen, and the position of the light spot on the screen is determined based on the collected image, so that the light spot can be displayed on the remote display devices.
In a possible implementation, the processing module is located in the remote control pen. When the processing module is located in the remote control pen, after determining the position information through the processing module, the remote control pen is further configured to send the determined position information to the one or more second display devices.
In a possible implementation, the processing module is located in the first display device. When the processing module is located in the first display device, after collecting the first image through the image acquisition module, the remote control pen is further configured to send the first image to the first display device. The first display device is configured to receive the first image sent by the remote control pen, determine, through the processing module, the position information of the light spot in the first image on the screen of the first display device, and send the position information to the one or more second display devices.
Because the remote control pen is small and usually battery-powered, placing the processing module in the local first display device allows the pen, after collecting an image, to send it to the processing module in the nearby display device for further processing to determine the position information. This reduces the energy consumption of the pen and makes its operation more stable.
In a possible implementation, the conference system further includes a server, and the processing module is located in the server. When the processing module is located in the server, after collecting the first image through the image acquisition module, the remote control pen is further configured to send the first image to the server. The server is configured to receive the first image sent by the remote control pen, determine, through the processing module, the position information of the light spot in the first image on the screen of the first display device, and send the determined position information to the one or more second display devices.
Because the remote control pen is small and usually battery-powered, sending the collected image to the server for further processing to determine the position information reduces the pen's energy consumption and makes its operation more stable. At the same time, the data processing capability on the server side is stronger, which speeds up data processing, prevents stuttering caused by the pen's limited processing capability, and improves the quality of the conference.
In a possible implementation, the processing module is specifically configured to: determine a region of interest (ROI) in the first image, where the region of interest includes the light spot; acquire a second image corresponding to the picture currently displayed on the screen of the first display device, and search the second image for the region of interest to obtain a target region matching the region of interest; and then determine the position information of the target region on the second image and determine the position information of the light spot on the screen of the first display device from the position information of the target region on the second image. Here, the position information of the target region on the second image refers to the coordinates of the target region on the second image, and the position information of the light spot on the screen of the first display device refers to the distance between the light spot and the edges of the actual screen. Based on the coordinates of the target region on the second image, the resolution of the second image, and the width and height of the screen of the first display device, the position information of the light spot on the screen of the first display device, that is, the distance between the light spot and the edges of the actual screen, can be determined.
As the user's position and the direction in which the remote control pen points change, the angle from which the pen captures images also changes. By cropping an ROI containing the light spot from the image and searching the image for a region matching the ROI to determine the position information of the light spot on the screen of the first display device, it is possible to avoid recalibrating the image acquisition module and computing transformation matrices after the shooting angle of the pen changes, which improves data processing speed and the applicability of the conference system.
In a possible implementation, the processing module is specifically configured to: when the first image includes the complete screen of the first display device, determine the position of the light spot in the first image and the position of the complete screen in the first image; and then determine the position information of the light spot on the screen of the first display device according to the position of the light spot in the first image and the position of the complete screen in the first image.
In a possible implementation, the position of the complete screen of the first display device in the first image is determined by the pixel coordinates, in the first image, of the four corners of the border of the complete screen; the position of the light spot in the first image is determined by the pixel coordinates of the light spot in the first image.
As the user's position and the direction in which the remote control pen points change, the angle from which the pen captures images also changes. By using the coordinates of the complete screen in the image and the coordinates of the light spot in the image to determine the position information of the light spot on the screen of the first display device, it is possible to avoid recalibrating the image acquisition module and computing transformation matrices after the shooting angle of the pen changes, which improves data processing speed and the applicability of the conference system.
According to a second aspect, the present application provides a remote control pen for use in a conference system that includes a first display device, the remote control pen, a processing module, and one or more second display devices. The remote control pen includes an emitter and an image acquisition module. The emitter is configured to emit visible light toward the first display device to present a light spot on the screen of the first display device. The image acquisition module is configured to collect a first image that includes the light spot presented on the screen of the first display device, so that the processing module determines, according to the first image, the position information of the light spot on the screen of the first display device.
In a possible implementation, the remote control pen further includes the processing module, which is configured to: acquire the first image collected by the image acquisition module; determine, in the first image, a region of interest that includes the light spot; acquire a second image corresponding to the picture currently displayed on the screen of the first display device and search the second image for the region of interest to obtain a target region matching the region of interest; and finally determine the position information of the target region on the second image and determine the position information of the light spot on the screen of the first display device from the position information of the target region on the second image.
In a possible implementation, the remote control pen further includes a processing module configured to: acquire the first image collected by the image acquisition module; when the first image includes the complete screen of the first display device, determine the position of the light spot in the first image and the position of the complete screen in the first image; and determine the position information of the light spot on the screen of the first display device according to the position of the light spot in the first image and the position of the complete screen in the first image.
In a possible implementation, the position of the complete screen of the first display device in the first image is determined by the pixel coordinates, in the first image, of the four corners of the border of the complete screen; the position of the light spot in the first image is determined by the pixel coordinates of the light spot in the first image.
According to a third aspect, the present application provides a remote conference method for a conference system that includes a remote control pen and a processing module. The remote control pen emits visible light toward a first display device to present a light spot on the screen of the first display device; the remote control pen collects a first image that includes the light spot presented on the screen of the first display device; and the processing module acquires the first image and determines the position information of the light spot in the first image on the screen of the first display device, so that one or more second display devices display the light spot on their respective screens according to the position information.
In a possible implementation, the processing module is located in the remote control pen. After the remote control pen determines the position information through the processing module, the method further includes: the remote control pen sends the position information to the one or more second display devices.
In a possible implementation, the processing module is located in the first display device. After collecting the first image, the remote control pen sends the collected first image to the first display device; the first display device receives the first image sent by the remote control pen, determines, through the processing module, the position information of the light spot in the first image on the screen of the first display device, and sends the position information to the one or more second display devices.
In a possible implementation, the conference system further includes a server, and the processing module is located in the server. After collecting the first image, the remote control pen sends the first image to the server; the server receives the first image sent by the remote control pen, determines, through the processing module, the position information of the light spot in the first image on the screen of the first display device, and sends the determined position information to the one or more second display devices.
In a possible implementation, determining the position information of the light spot in the first image on the screen of the first display device includes: the processing module determines, in the first image, a region of interest that includes the light spot; then acquires a second image corresponding to the picture currently displayed on the screen of the first display device and searches the second image for the region of interest to obtain a target region matching the region of interest; and finally determines the position information of the target region on the second image and determines the position information of the light spot on the screen of the first display device from the position information of the target region on the second image.
In a possible implementation, the processing module determining the position information of the light spot in the first image on the screen of the first display device includes: when determining that the first image includes the complete screen of the first display device, the processing module determines the position of the light spot in the first image and the position of the complete screen in the first image, and determines the position information of the light spot on the screen of the first display device according to the position of the light spot in the first image and the position of the complete screen in the first image. In a possible implementation, the relative position of the complete screen in the first image is determined by the pixel coordinates, in the first image, of the four corners of the border of the complete screen; the relative position of the light spot in the first image is determined by the pixel coordinates of the light spot in the first image.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments. Apparently, the accompanying drawings described below show some embodiments of the present application, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of the system architecture of a conference system provided by an embodiment of the present application;
FIG. 2 is a schematic structural diagram of a conference system provided by an embodiment of the present application;
FIG. 3 is a schematic structural diagram of another conference system provided by an embodiment of the present application;
FIG. 4 is a schematic structural diagram of another conference system provided by an embodiment of the present application;
FIG. 5 is a schematic flowchart of a conference method provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of determining the position information of a light spot provided by an embodiment of the present application;
FIG. 7 is another schematic diagram of determining the position information of a light spot provided by an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a remote control pen provided by an embodiment of the present application;
FIG. 9 is a schematic structural diagram of another remote control pen provided by an embodiment of the present application.
Detailed Description
The related devices and methods of the conference system provided by the present application are described in detail below with reference to the accompanying drawings.
As shown in FIG. 1, FIG. 1 is a schematic diagram of the system architecture of a conference system provided by an embodiment of the present application. The conference system includes a first display device 10, one or more second display devices 20, a remote control pen 30, and a processing module 40. The first display device 10 and the second display devices 20 are used to display content such as text and pictures and to play video. In the embodiments of the present application, the first display device 10 is the local display device, that is, the display device in the area where the current user is located in scenarios such as remote conferencing or distance education, and the one or more second display devices 20 are remote display devices. The remote control pen 30 is used to emit visible light toward the screen of the first display device to project a light spot on the screen, indicating the area that currently needs attention, and to collect an image that includes the light spot on the screen. The processing module 40 is used to acquire the image collected by the remote control pen and determine, based on the image, the position information of the light spot on the screen. The second display devices 20 are used to acquire the position information determined by the processing module 40 and display a light spot at the corresponding position on their screens according to the position information, so as to indicate the area that currently needs attention.
In the present application, the processing module 40 may be located in the remote control pen 30, in a display device, or in a computing device (for example, a server). The workflow of the conference system provided by the present application is described below for each of these three cases.
In a possible implementation, the processing module 40 is located in the remote control pen 30. As shown in FIG. 2, FIG. 2 is a schematic structural diagram of a conference system provided by an embodiment of the present application. The remote control pen 30 includes an emitter 310, an image acquisition module 320, a processing module 40, and a communication module 330. The emitting window of the emitter 310 and the camera of the image acquisition module 320 are arranged on the same side of the remote control pen 30. The emitter 310 may be a laser emitter, an infrared emitter, or the like, and is used to generate well-focused visible light. Under the user's control, the emitter 310 can be pointed at the screen of the first display device and emit visible light to project a light spot on the screen of the first display device, indicating the area that currently needs attention. The image acquisition module 320 is used to collect images at a preset acquisition frequency while the emitter 310 is emitting visible light, and the collected images include the light spot projected by the emitter 310 onto the screen. The processing module 40 is used to acquire the images collected by the image acquisition module 320 and determine the position information of the light spot on the screen of the first display device 10. The communication module 330 is used to send the position information of the light spot determined by the processing module 40 to the one or more second display devices.
The display devices (including the first display device 10 and the second display devices 20) include a display module 110 and a communication module 120. The display module 110 is used to display content such as text and images and to play video; the communication module 120 is used to receive the data to be displayed. For example, when the display device is a remote second display device, the communication module 120 is used to receive the above position information.
It should be noted that the communication module 330 of the remote control pen 30 may send the position information of the light spot directly to the one or more remote second display devices 20, or may first send it to the local first display device 10, which then sends the position information of the light spot to the one or more second display devices 20 through its communication module 120, as shown in FIG. 2. When the remote control pen 30 only needs to send the position information of the light spot to the first display device, the communication module 330 in the remote control pen 30 may be a communication module supporting a short-range communication protocol, such as a Bluetooth communication module.
In a possible implementation, the processing module 40 is located in a display device. As shown in FIG. 3, FIG. 3 is a schematic structural diagram of another conference system provided by an embodiment of the present application. The remote control pen 30 includes an emitter 310, an image acquisition module 320, and a communication module 330. For the functions of the emitter 310 and the image acquisition module 320, reference may be made to the description of the remote control pen 30 in FIG. 2. The communication module 330 is used to send the images collected by the image acquisition module 320 to the local first display device 10, so that the processing module 40 in the first display device 10 determines the position information of the light spot in the image.
The display devices (including the first display device 10 and the one or more second display devices 20) include a display module 110, a communication module 120, and a processing module 40. When the display device is the local first display device, the display module 110 is used to display content such as text and images; the communication module 120 is used to receive the images sent by the communication module 330 of the remote control pen 30; the processing module 40 is used to process the images received by the communication module 120 and determine the position information of the light spot in the image; and the communication module 120 is further used to send the position information of the light spot to the one or more remote second display devices 20.
When the display device is a remote second display device 20, the communication module 120 is used to receive the position information of the light spot sent by the first display device 10, and the display module 110 is used to display the light spot at the corresponding position on the screen according to the position information.
It should be noted that when the processing module 40 is located in the display device, the remote control pen 30 only needs to send the collected images to the first display device 10, and the communication module 330 in the remote control pen 30 may be a communication module supporting a short-range communication protocol, such as a Bluetooth communication module.
Because the remote control pen is small and usually battery-powered, placing the processing module in the local first display device allows the pen, after collecting an image, to send it to the processing module in the nearby display device for further processing to determine the position information. This reduces the energy consumption of the pen and makes its operation more stable.
In a possible implementation, the processing module 40 may also be located in a server, as shown in FIG. 4, which is a schematic structural diagram of a conference system provided by an embodiment of the present application. The remote control pen 30 includes an emitter 310, an image acquisition module 320, and a communication module 330. For the functions of the emitter 310 and the image acquisition module 320, reference may be made to the description of the remote control pen 30 in FIG. 2. The communication module 330 is used to send the images collected by the image acquisition module 320 to the server 50, or to send them to the local first display device 10, whose communication module 120 then forwards the images to the server 50, so that the processing module 40 in the server 50 determines the position information of the light spot in the image.
The server 50 includes a communication module 510 and a processing module 40. The communication module 510 is used to receive the images sent by the communication module 330 of the remote control pen 30, or to receive the images sent by the communication module 120 of the first display device 10; the processing module 40 is used to process the images received by the communication module 510 and determine the position information of the light spot in the image; and the communication module 510 is further used to send the position information of the light spot to the one or more remote second display devices 20.
The display devices (including the first display device 10 and the one or more second display devices 20) include a display module 110 and a communication module 120. When the display device is a remote second display device 20, the communication module 120 is used to receive the position information of the light spot sent by the server 50, and the display module 110 is used to display the light spot at the corresponding position according to the position information.
Because the remote control pen is small and usually battery-powered, sending the collected image to the server for further processing to determine the position information reduces the pen's energy consumption and makes its operation more stable. At the same time, the data processing capability on the server side is stronger, which speeds up data processing, prevents stuttering caused by the pen's limited processing capability, and improves the quality of the conference.
By integrating the image acquisition module into the remote control pen, when the user uses the pen to indicate an area of the display device that needs attention, an image of the display device can be collected through the image acquisition module on the pen, and the processing module can then determine the position of the light spot on the screen so that the light spot can be displayed on the remote display devices. This solution is well suited to mobile office scenarios.
The conference method provided by the embodiments of the present application is described below with reference to the accompanying drawings. The conference method can be applied to any of the conference systems in FIG. 2 to FIG. 4. As shown in FIG. 5, FIG. 5 is a schematic flowchart of a conference method provided by an embodiment of the present application; the conference method includes S501 to S504.
S501. The remote control pen emits visible light toward the first display device to present a light spot on the screen of the first display device.
When explaining the content displayed on the screen of the first display device 10, the user can point the remote control pen 30 at the screen of the first display device 10 and press the trigger button on the remote control pen 30, so that the emitter 310 on the pen emits visible light and projects a light spot on the screen of the first display device 10, indicating the area that currently needs attention.
S502. The remote control pen collects a first image that includes the light spot.
When the user presses the trigger button on the remote control pen 30 so that the emitter 310 emits visible light, the image acquisition module 320 on the pen can be started, so that the image acquisition module 320 collects, at a preset acquisition frequency, images that include the light spot projected by the emitter, enabling the processing module 40 to determine the position information of the light spot on the screen of the first display device 10 based on the images collected by the image acquisition module 320. The first image is any image collected by the image acquisition module 320 that includes the light spot.
S503. The processing module acquires the first image and determines the position information of the light spot in the first image on the screen of the first display device.
The processing module 40 may be located in the remote control pen 30, in a display device, or in the server; for details, reference may be made to the descriptions of FIG. 2 to FIG. 4, which are not repeated here.
如图6所示,图6是本申请实施例提供的一种确定光斑的位置信息的示意图。处理模块40获取图像采集模块320采集的第一图像,其中,第一图像中可以包括第一显示设备10的完整屏幕,也可以只包括第一显示设备10屏幕的部分,图6中第一图像中的阴影部分表示拍摄到的屏幕的部分。处理模块40首先从第一图像中识别出光斑,例如光斑一般为红色或绿色的圆形光斑,处理模块40能够通过颜色特征和形状特征识别出第一图像中的光斑。然后处理模块40确定一个包括上述光斑的感兴趣区域(region of interest,ROI),并获取第一显示设备10的屏幕当前显示的画面对应的第二图像。其中,ROI的尺寸小于第一图像以及第二图像的尺寸。例如第一图像为分辨率为2140*1920的图像,第二图像为分辨率为1920*1080像素的图像,ROI可能仅为10*10像素,其中,ROI可以为矩形或圆形等形状,本申请实施例不做具体限定。
After obtaining the ROI and the second image, the processing module 40 searches the second image for a region matching the ROI and obtains a target region matching the ROI. The processing module 40 then determines the position information of the target region in the second image and uses the position information of the target region in the second image as the position information of the light spot on the screen of the first display device. When searching for the target region, a first coordinate system as shown in FIG. 6 can be established with the upper-left corner of the second image as the coordinate origin O, the horizontal direction of the second image as the x-axis, and the vertical direction as the y-axis, and the search for a region matching the ROI starts from the upper-left corner of the image and proceeds with a preset step. For example, if the ROI is a 120*100 rectangular region, a sliding window of the same size as the ROI is set. Using the sliding window, a first region of the same size as the ROI is first taken from the upper-left corner of the image; the coordinates of its four vertices in the second image are (0,0), (0,100), (120,0), and (120,100). The processing module determines whether the first region matches the ROI; if not, the sliding window is slid along the x-axis by the preset step to obtain a second region. If the preset step is 40, the coordinates of the four vertices of the second region in the second image are (40,0), (40,100), (160,0), and (160,100). The processing module 40 then determines whether the second region matches the ROI; if not, the sliding window continues to take the next region and match it against the ROI, until the target region matching the ROI is found.
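The sliding-window search can be sketched as follows, assuming a normalized-correlation score as the matching criterion (the description above does not fix a particular similarity measure); the step size, the score threshold, and the function name are illustrative assumptions.

```python
# Minimal sketch: slide a ROI-sized window over the second image with a fixed
# step and keep the best-scoring window, mirroring the procedure in the text.
import cv2

def find_target_region(second_image_bgr, roi_bgr, step=40, min_score=0.8):
    """Return the best-matching window as (x, y, w, h), or None if no window
    scores at least min_score."""
    h_img, w_img = second_image_bgr.shape[:2]
    h_roi, w_roi = roi_bgr.shape[:2]
    best_region, best_score = None, -1.0
    for y in range(0, h_img - h_roi + 1, step):
        for x in range(0, w_img - w_roi + 1, step):
            window = second_image_bgr[y:y + h_roi, x:x + w_roi]
            # Normalized cross-correlation between the window and the ROI.
            score = cv2.matchTemplate(window, roi_bgr, cv2.TM_CCOEFF_NORMED)[0, 0]
            if score > best_score:
                best_region, best_score = (x, y, w_roi, h_roi), score
    return best_region if best_score >= min_score else None
```

In practice a single cv2.matchTemplate call over the whole second image followed by cv2.minMaxLoc would be more efficient; the explicit loop above simply mirrors the step-by-step sliding-window procedure described in the text.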
After determining the target region, the processing module 40 determines the coordinates of the four vertices of the target region in the second image, and determines the position information of the light spot on the screen of the first display device based on the coordinates of the four vertices of the target region, the resolution of the second image, and the size of the screen of the display device. For example, if the coordinates of one vertex of the target region are (a1, b1), the resolution of the second image is w1*h1, and the screen of the display device is w2 centimeters wide and h2 centimeters high, the position of the light spot on the screen of the first display device satisfies the following relationship:
a2 = (a1 / w1) × w2,  b2 = (b1 / h1) × h2
where a2 denotes the distance from the vertex to the left edge of the screen and b2 denotes the distance from the vertex to the top edge of the screen. Using the above method, the distances of the four vertices of the target region from the screen edges can be obtained, thereby giving the position information of the light spot on the screen of the first display device.
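The mapping above amounts to proportional scaling from pixel coordinates of the second image to physical coordinates on the screen, as in the following minimal sketch (the example screen dimensions are illustrative):

```python
# Minimal sketch of the pixel-to-screen mapping; pure Python, no dependencies.
# Variable names follow the text: (a1, b1) is a pixel coordinate in a w1*h1
# image, and (w2, h2) is the screen size in centimeters.
def pixel_to_screen(a1, b1, w1, h1, w2, h2):
    """Map pixel (a1, b1) to distances (a2, b2) in cm from the left/top edges."""
    a2 = a1 / w1 * w2
    b2 = b1 / h1 * h2
    return a2, b2

# Example: pixel (960, 540) in a 1920*1080 frame on a 165 cm x 93 cm screen
# maps to roughly (82.5, 46.5) cm, i.e. the centre of the screen.
```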
The above is only one method, introduced in the embodiments of this application, of searching for the ROI in an image; it should be understood that the processing module 40 can also determine the position information of the ROI in the second image by other methods.
As the user's position and the pointing direction of the remote control pen change, the angle from which the pen captures images also changes. By cropping a ROI that includes the light spot from the image and searching for a region matching the ROI to determine the position information of the light spot on the screen of the first display device, calibration of the image capture module and computation of transformation matrices after the capture angle changes can be avoided, which speeds up data processing and improves the applicability of the conference system.
In a possible implementation, when the first image captured by the image capture module 320 includes the complete screen of the first display device 10, the processing module 40 establishes a second coordinate system as shown in FIG. 7, with the upper-left corner of the first image as the coordinate origin O, the horizontal direction of the first image as the x-axis, and the vertical direction as the y-axis. The processing module 40 first identifies the complete screen of the first display device 10 in the first image and obtains the coordinates of the four vertices of the complete screen (A, B, C, and D in FIG. 7), that is, the position of the complete screen in the first image. After the coordinates of the four vertices are obtained, the slopes in the first image of the straight lines corresponding to the four borders of the screen can be calculated from the vertex coordinates; for example, the slope of line AB is k1, the slope of line CD is k2, the slope of line AC is k3, and the slope of line BD is k4. The processing module 40 calculates the mean slope of lines AB and CD, k5 = (k1 + k2)/2, and the mean slope of lines AC and BD, k6 = (k3 + k4)/2, identifies the light spot, and obtains the coordinates M(x0, y0) of the light spot in the second coordinate system. As shown in FIG. 7, the processing module 40 can determine a line L1 from the slope k5 and the coordinates M(x0, y0), and a line L2 from the slope k6 and the coordinates M(x0, y0). Line L1 intersects line AC at a point E and line BD at a point F; line L2 intersects line AB at a point G and line CD at a point H. The processing module 40 can determine the coordinates of these four intersection points in the second coordinate system.
For example, from vertices A and C, the equation of line AC can be determined as y = k3*x + b3, and from the light-spot coordinates M(x0, y0) and the slope k5, the equation of line L1 can be determined as y = k5*x + b4, where b3 is the y-intercept of line AC and b4 is the y-intercept of line L1. From these two line equations, the coordinates of the intersection point E of line AC and line L1 in the second coordinate system can be obtained. The coordinates of the other intersection points in the second coordinate system can be obtained in the same way.
After determining the coordinates M(x0, y0) of the light spot in the second coordinate system and the coordinates of the four intersection points (E, F, G, and H), the processing module 40 can calculate the straight-line distances between the light-spot coordinate M and the four intersection points, for example ME = d1, MF = d2, MG = d3, and MH = d4. These four distances represent the position of the light spot on the actual screen: on the actual screen, the vertical distance s1 to the top border and the vertical distance s2 to the bottom border satisfy s1/s2 = d1/d2, and the horizontal distance s3 to the left border and the horizontal distance s4 to the right border satisfy s3/s4 = d3/d4. The processing module 40 obtains the actual width and height of the screen of the first display device and can then calculate the position information of the light spot on the screen of the first display device. It can be understood that the choice of coordinate system in the above solution is only an example and not a limitation; under the same principle, if a different coordinate system is used, the position information can be obtained by transforming the corresponding calculation formulas accordingly.
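A minimal sketch of this full-screen approach is given below. The corner labeling (A top-left, B bottom-left, C top-right, D bottom-right), the slope-intercept intersection computation, and the helper names are illustrative assumptions chosen to be consistent with the ratios s1/s2 = d1/d2 and s3/s4 = d3/d4 described above; near-vertical borders (infinite slope) are not handled, for brevity.

```python
# Minimal sketch: locate the spot on the physical screen from the four screen
# corners and the spot coordinate in the first image. Illustrative only.
import math

def slope(p, q):
    return (q[1] - p[1]) / (q[0] - p[0])

def intersect(m1, p1, m2, p2):
    """Intersection of y = m1*(x - p1.x) + p1.y and y = m2*(x - p2.x) + p2.y."""
    x = (m1 * p1[0] - m2 * p2[0] + p2[1] - p1[1]) / (m1 - m2)
    y = m1 * (x - p1[0]) + p1[1]
    return (x, y)

def spot_position(A, B, C, D, M, screen_w_cm, screen_h_cm):
    k1, k2 = slope(A, B), slope(C, D)      # left and right borders (AB, CD)
    k3, k4 = slope(A, C), slope(B, D)      # top and bottom borders (AC, BD)
    k5, k6 = (k1 + k2) / 2, (k3 + k4) / 2
    E = intersect(k5, M, k3, A)            # L1 with top border AC
    F = intersect(k5, M, k4, B)            # L1 with bottom border BD
    G = intersect(k6, M, k1, A)            # L2 with left border AB
    H = intersect(k6, M, k2, C)            # L2 with right border CD
    d1, d2 = math.dist(M, E), math.dist(M, F)
    d3, d4 = math.dist(M, G), math.dist(M, H)
    top_cm = screen_h_cm * d1 / (d1 + d2)  # distance from the top border
    left_cm = screen_w_cm * d3 / (d3 + d4) # distance from the left border
    return left_cm, top_cm
```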
In the embodiments of this application, the method by which the processing module 40 determines the above position information is described using one first image captured by the image capture module 320 as an example; it should be understood that the other images captured by the image capture module 320 can be processed in the same way to determine the position information of the light spot in each image.
S504. The second display device obtains the position information and displays a light spot on its screen based on the position information.
After determining the position information of the light spot on the screen of the first display device, the processing module 40 sends the position information to the one or more second display devices 20 through the device in which the processing module 40 is located; for the method of sending the position information to the one or more second display devices 20, refer to the descriptions in FIG. 2 to FIG. 4 above, which are not repeated here.
Each second display device 20 that receives the position information displays a light spot on its own screen at the position indicated by the position information, so as to indicate the area currently requiring attention.
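Assuming the position information is expressed as physical distances from the left and top edges of the first display's screen, a second display device might scale it to its own pixel grid as in the following illustrative sketch; proportional scaling is an assumption here, since the description only states that the spot is displayed at the corresponding position.

```python
# Minimal sketch: convert a physical position on the source screen (cm from
# the left/top edges) into the local display's pixel coordinates.
def to_local_pixels(left_cm, top_cm, src_w_cm, src_h_cm, dst_w_px, dst_h_px):
    """Scale a source-screen position (in cm) to this display's pixels."""
    x = round(left_cm / src_w_cm * dst_w_px)
    y = round(top_cm / src_h_cm * dst_h_px)
    return x, y

# Example: a spot 82.5 cm from the left of a 165 cm x 93 cm source screen
# lands at the horizontal centre of a 3840 x 2160 remote display.
```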
For the foregoing method embodiments, for simplicity of description, they are all expressed as a series of action combinations. However, those skilled in the art should know that the present invention is not limited by the described order of actions, and should also know that the actions involved in the embodiments described in the specification are not necessarily required by the present invention.
Other reasonable combinations of steps that those skilled in the art can conceive based on the above description also fall within the protection scope of the present invention.
As shown in FIG. 8, FIG. 8 is a schematic structural diagram of a remote control pen provided by an embodiment of this application. The remote control pen 800 includes a transmitter 810, an image capture module 820, a memory 830, and a communication interface 840, and the transmitter 810, the image capture module 820, the memory 830, and the communication interface 840 are connected to each other through a bus 850.
The transmitter 810 can emit visible light so as to project a light spot on the screen of the first display device 10 shown in FIG. 1, indicating the area currently requiring attention; the image capture module 820 can capture, at a preset capture frequency, images that include the light spot projected by the transmitter 810, so that the processing module 40 determines the position information of the light spot on the screen of the first display device 10 based on the images captured by the image capture module 820; the memory 830 is used to buffer the images captured by the image capture module 820; and the communication interface 840 is used to send the images captured by the image capture module 820 to the device in which the processing module 40 is located.
The memory 830 may be a non-volatile memory, for example, a read-only memory (ROM), a programmable read-only memory (programmable ROM, PROM), an erasable programmable read-only memory (erasable PROM, EPROM), an electrically erasable programmable read-only memory (electrically EPROM, EEPROM), or a flash memory. The memory 830 may also be a volatile memory; the volatile memory may be a random access memory (RAM) used as an external cache. By way of example rather than limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), high bandwidth memory (HBM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
The communication interface 840 may be a wired interface or a wireless interface for communicating with other modules or devices; the wired interface may be an Ethernet interface, a local interconnect network (LIN) interface, or the like, and the wireless interface may be a cellular network interface, a wireless local area network interface, or the like.
The bus 850 may be a peripheral component interconnect (PCI) bus, an extended industry standard architecture (EISA) bus, or the like. The bus 850 can be divided into an address bus, a data bus, a control bus, a memory bus, and so on. For ease of representation, only one thick line is used in FIG. 8, but this does not mean that there is only one bus or one type of bus.
In a possible implementation, as shown in FIG. 9, the remote control pen 800 further includes a processor 860, and the transmitter 810, the image capture module 820, the memory 830, the communication interface 840, and the processor 860 are connected to each other through the bus 850. The processor 860 is configured to implement the functions implemented by the processing module 40 in the method embodiment shown in FIG. 5, which are not repeated here.
The processor 860 may take many forms; for example, the processor 860 may be a central processing unit (CPU), a neural-network processing unit (NPU), a graphics processing unit (GPU), or the like, and the processor 860 may be a single-core or multi-core processor.
The processor 860 may also be another module with computing capability, such as an application-specific integrated circuit (ASIC), a programmable logic device (PLD), a digital signal processor (DSP), or any combination thereof; the PLD may be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), generic array logic (GAL), or any combination thereof.
When the remote control pen 800 further includes the processor 860, the memory 830 is also used to store program code, so that the processor 860 can invoke the stored program code to perform the operation steps implemented by the processing module 40 in the method embodiment described in FIG. 5. In addition, the remote control pen 800 may include more or fewer components than those shown in FIG. 8 or FIG. 9, or have a different component configuration.
Specifically, for the method implemented by the remote control pen 800, refer to the relevant operations performed by the remote control pen 30 in the foregoing method embodiments, which are not repeated here.
An embodiment of this application further provides a computer-readable storage medium that stores instructions; when the instructions are run on a processor, the method steps in the foregoing method embodiments can be implemented. For the specific implementation by which the processor executing the instructions on the computer-readable storage medium performs the method steps, refer to the specific operations of the foregoing method embodiments, which are not repeated here.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts not described in detail in one embodiment, refer to the relevant descriptions of other embodiments.
The foregoing embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented by software, the foregoing embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded or executed on a computer, the processes or functions according to the embodiments of the present invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired means (for example, coaxial cable, optical fiber, or digital subscriber line) or wireless means (for example, infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible by the computer, or a data storage device such as a server or data center that contains one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium, or a semiconductor medium. The semiconductor medium may be a solid state drive (SSD).
The steps in the methods of the embodiments of this application can be reordered, combined, or deleted according to actual needs; the modules in the apparatuses of the embodiments of this application can be divided, combined, or deleted according to actual needs.
The embodiments of this application are described in detail above. Specific examples are used herein to explain the principles and implementations of this application, and the descriptions of the above embodiments are only intended to help understand the method of this application and its core idea. At the same time, those of ordinary skill in the art may make changes to the specific implementations and the scope of application based on the idea of this application. In summary, the content of this specification should not be construed as a limitation of this application.

Claims (20)

  1. A conference system, comprising a first display device, a remote control pen, a processing module, and one or more second display devices, wherein the remote control pen comprises a transmitter and an image capture module; wherein
    the remote control pen is configured to emit visible light toward the first display device through the transmitter, so as to present a light spot on a screen of the first display device;
    and to capture a first image through the image capture module, the first image comprising the light spot presented on the screen of the first display device;
    the processing module is configured to obtain the first image and determine position information of the light spot in the first image on the screen of the first display device; and
    the one or more second display devices are configured to display a light spot on their screens based on the position information.
  2. The system according to claim 1, wherein the processing module is located in the remote control pen; and
    the remote control pen is further configured to send the position information to the one or more second display devices.
  3. The system according to claim 1, wherein the processing module is located in the first display device;
    the remote control pen is further configured to send the first image to the first display device; and
    the first display device is configured to receive the first image sent by the remote control pen, determine, through the processing module, the position information of the light spot in the first image on the screen of the first display device,
    and send the position information to the one or more second display devices.
  4. The system according to claim 1, wherein the conference system further comprises a server and the processing module is located in the server;
    the remote control pen is further configured to send the first image to the server; and
    the server is configured to receive the first image sent by the remote control pen, determine, through the processing module, the position information of the light spot in the first image on the screen of the first display device,
    and send the position information to the one or more second display devices.
  5. The system according to any one of claims 1 to 4, wherein the processing module is specifically configured to:
    determine a region of interest in the first image, the region of interest comprising the light spot;
    obtain a second image corresponding to the picture currently displayed on the screen of the first display device, and search the second image for the region of interest to obtain a target region matching the region of interest; and
    determine position information of the target region in the second image, and determine the position information of the light spot on the screen of the first display device based on the position information of the target region in the second image.
  6. The system according to any one of claims 1 to 4, wherein the processing module is specifically configured to:
    when the first image comprises the complete screen of the first display device, determine the position of the light spot in the first image and determine the position of the complete screen in the first image; and
    determine the position information of the light spot on the screen of the first display device based on the position of the light spot in the first image and the position of the complete screen in the first image.
  7. The system according to claim 6, wherein the position of the complete screen in the first image is determined by the pixel coordinates of the four corners of the border of the complete screen in the first image, and the position of the light spot in the first image is determined by the pixel coordinates of the light spot in the first image.
  8. A remote control pen, used in a conference system comprising a first display device, the remote control pen, a processing module, and one or more second display devices, the remote control pen comprising a transmitter and an image capture module, wherein
    the transmitter is configured to emit visible light toward the first display device so as to present a light spot on a screen of the first display device; and
    the image capture module is configured to capture a first image, the first image comprising the light spot presented on the screen of the first display device, so that the processing module determines position information of the light spot on the screen of the first display device based on the first image.
  9. The remote control pen according to claim 8, wherein the remote control pen further comprises the processing module, and the processing module is configured to:
    obtain the first image;
    determine a region of interest in the first image, the region of interest comprising the light spot;
    obtain a second image corresponding to the picture currently displayed on the screen of the first display device, and search the second image for the region of interest to obtain a target region matching the region of interest; and
    determine position information of the target region in the second image, and determine the position information of the light spot on the screen of the first display device based on the position information of the target region in the second image.
  10. The remote control pen according to claim 8, wherein the remote control pen further comprises the processing module, and the processing module is configured to:
    obtain the first image, and, when the first image comprises the complete screen of the first display device, determine the position of the light spot in the first image and determine the position of the complete screen in the first image; and
    determine the position information of the light spot on the screen of the first display device based on the position of the light spot in the first image and the position of the complete screen in the first image.
  11. The remote control pen according to claim 10, wherein the position of the complete screen in the first image is determined by the pixel coordinates of the four corners of the border of the complete screen in the first image, and the position of the light spot in the first image is determined by the pixel coordinates of the light spot in the first image.
  12. The remote control pen according to any one of claims 8 to 11, wherein the remote control pen further comprises a communication module, and the communication module is configured to send the position information to the one or more second display devices, so that the one or more second display devices display a light spot on their screens based on the position information.
  13. A conference method, used in a conference system comprising a remote control pen and a processing module, the method comprising:
    emitting, by the remote control pen, visible light toward a first display device so as to present a light spot on a screen of the first display device;
    capturing, by the remote control pen, a first image, the first image comprising the light spot presented on the screen of the first display device; and
    obtaining, by the processing module, the first image, and determining position information of the light spot in the first image on the screen of the first display device, so that one or more second display devices display a light spot on their screens based on the position information.
  14. The method according to claim 13, wherein the processing module is located in the remote control pen, and the method further comprises:
    sending, by the remote control pen, the position information to the one or more second display devices.
  15. The method according to claim 13, wherein the processing module is located in the first display device; before the processing module obtains the first image, the method further comprises:
    sending, by the remote control pen, the first image to the first display device; and
    receiving, by the first display device, the first image sent by the remote control pen;
    and after the processing module obtains the first image and determines the position information of the light spot in the first image on the screen of the first display device, the method further comprises:
    sending, by the first display device, the position information to the one or more second display devices.
  16. The method according to claim 13, wherein the conference system further comprises a server and the processing module is located in the server; before the processing module obtains the first image, the method further comprises:
    sending, by the remote control pen, the first image to the server; and
    receiving, by the server, the first image sent by the remote control pen;
    and after the processing module obtains the first image and determines the position information of the light spot in the first image on the screen of the first display device, the method further comprises:
    sending, by the server, the position information to the one or more second display devices.
  17. The method according to any one of claims 13 to 16, wherein the determining of the position information of the light spot in the first image on the screen of the first display device comprises:
    determining a region of interest in the first image, the region of interest comprising the light spot;
    obtaining a second image corresponding to the picture currently displayed on the screen of the first display device, and searching the second image for the region of interest to obtain a target region matching the region of interest; and
    determining position information of the target region in the second image, and determining the position information of the light spot on the screen of the first display device based on the position information of the target region in the second image.
  18. The method according to any one of claims 13 to 16, wherein the determining of the position information of the light spot in the first image on the screen of the first display device comprises:
    when the first image comprises the complete screen of the first display device, determining the position of the light spot in the first image and determining the position of the complete screen in the first image; and
    determining the position information of the light spot on the screen of the first display device based on the position of the light spot in the first image and the position of the complete screen in the first image.
  19. The method according to claim 18, wherein the position of the complete screen in the first image is determined by the pixel coordinates of the four corners of the border of the complete screen in the first image, and the position of the light spot in the first image is determined by the pixel coordinates of the light spot in the first image.
  20. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the method performed by the remote control pen in any one of claims 13 to 19 is performed.
PCT/CN2022/121334 2021-09-30 2022-09-26 Conference system and method, and related device WO2023051459A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111166847.3A CN115933927A (zh) 2021-09-30 2021-09-30 Conference system and method, and related device
CN202111166847.3 2021-09-30

Publications (1)

Publication Number Publication Date
WO2023051459A1 true WO2023051459A1 (zh) 2023-04-06

Family

ID=85780435

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/121334 WO2023051459A1 (zh) 2021-09-30 2022-09-26 Conference system and method, and related device

Country Status (2)

Country Link
CN (1) CN115933927A (zh)
WO (1) WO2023051459A1 (zh)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104076949A (zh) * 2013-03-29 2014-10-01 Huawei Technologies Co., Ltd. Laser pointer beam synchronization method, and related device and system
CN107491192A (zh) * 2017-08-08 2017-12-19 Jilin University Camera-mouse laser pointer positioning system and positioning method

Also Published As

Publication number Publication date
CN115933927A (zh) 2023-04-07


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22874844

Country of ref document: EP

Kind code of ref document: A1