CN212361513U - Mounting device - Google Patents

Mounting device

Info

Publication number
CN212361513U
Authority
CN
China
Prior art keywords: terminal device, mounting device, present application, hole, mounting
Legal status: Active
Application number: CN202020647882.1U
Other languages: Chinese (zh)
Inventors: 严尧, 李晓龙
Current assignee: Orbbec Inc
Original assignee: Orbbec Inc
Application filed by Orbbec Inc
Priority to CN202020647882.1U
Application granted
Publication of CN212361513U

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to the technical field of monitoring and provides a mounting device. The mounting device includes a main body bracket, a connecting component, and an auxiliary bracket slidably mounted on the main body bracket through the connecting component. The device solves the technical problem of poor adaptability of mounting devices in the related art.

Description

Mounting device
Technical Field
The present application belongs to the technical field of monitoring, and in particular relates to a mounting device.
Background
With the continuous development of science and technology, smart-classroom technologies that apply information technology to classroom learning have developed rapidly, for example technologies for monitoring and analyzing students' classroom behavior.
At present, students' classroom behavior is mainly monitored and analyzed as follows: a terminal containing a camera collects monitoring data and uploads it to a platform, and the platform analyzes the monitoring data to identify student behavior information, which makes it convenient for educators or administrators to implement management functions.
In this scheme, the terminal containing the camera usually needs to be installed on a wall in order to collect monitoring data. Currently, the terminal is mainly fixed to the wall through a mounting device, but such mounting devices have poor adaptability and cannot meet the requirements of different terminals.
Disclosure of Invention
The embodiments of the present application provide a mounting device that can solve the technical problem of poor adaptability of mounting devices in the related art.
An embodiment of the present application provides a mounting device, including: a main body bracket, a connecting component, and an auxiliary bracket slidably mounted on the main body bracket through the connecting component.
Optionally, the main body bracket comprises a vertical flat plate and a horizontal flat plate which are connected, the auxiliary bracket is slidably mounted on the horizontal flat plate through the connecting component, and the vertical flat plate is used for being fixed on a wall.
Optionally, the main body support further comprises a bending structure, and the vertical flat plate is connected with the horizontal flat plate through the bending structure.
Optionally, a first through hole is formed in the vertical flat plate, and an arc-shaped through groove is formed below the first through hole.
Optionally, a sliding groove is formed in the horizontal plate, and the auxiliary bracket is detachably mounted on the horizontal plate through the connecting part penetrating through the sliding groove.
Optionally, the horizontal plate includes a main body portion and an extension portion extending outward along the main body portion in a horizontal direction, and the sliding groove is opened on the extension portion.
Optionally, the auxiliary support includes a bearing portion, the bearing portion is used for bearing the terminal device to be installed, an installation through hole is formed in the bearing portion, and the connecting component penetrates through the sliding groove and the installation through hole.
Optionally, a groove or a first through hole is formed in the bottom of the terminal device to be installed, and the connecting component penetrates through the sliding groove and the installation through hole and then is inserted into the groove or the first through hole.
Optionally, the auxiliary bracket further comprises a baffle portion disposed at an end portion of the bearing portion.
Optionally, the connecting member is a screw.
The mounting device provided by the embodiments of the present application includes a main body bracket, a connecting component, and an auxiliary bracket slidably mounted on the main body bracket through the connecting component. Because the auxiliary bracket that carries the terminal device to be installed is slidably mounted on the main body bracket through the connecting component, on the one hand, the distance between the auxiliary bracket and the wall can be changed, so that the mounting device can be adapted to terminal devices of different sizes, improving its adaptability; on the other hand, the terminal device can slide on the main body bracket together with the auxiliary bracket, which prevents the terminal device from being blocked by other objects installed on the adjacent wall and greatly improves environmental adaptability.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings based on them without inventive effort.
Fig. 1 is a schematic diagram of a concentration detection system according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a concentration detection system according to another embodiment of the present application;
Fig. 3 is a schematic flowchart of a concentration detection method according to an embodiment of the present application;
Fig. 4 is a schematic structural diagram of a concentration detection apparatus according to an embodiment of the present application;
Fig. 5 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
Fig. 6 is a schematic structural diagram of a terminal device according to another embodiment of the present application;
Fig. 7 is a schematic structural diagram of a terminal device according to yet another embodiment of the present application;
Fig. 8 is a schematic structural diagram of a mounting device according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of a terminal device mounted on a wall through a mounting device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
An application scenario of the embodiments of the present application is first illustrated by way of non-limiting example. Fig. 1 shows a concentration detection system according to an embodiment of the present application.
As shown in fig. 1, a concentration detection system is deployed in the application scenario, and the concentration detection system includes a terminal device 11, a server 12, and a central control platform 13. The terminal device 11 and the server 12 are connected in communication via a wired or wireless communication network. The server 12 is connected to the central control platform 13 in a communication manner through a wired or wireless communication network.
The wireless communication network includes, but is not limited to: Wireless Fidelity (Wi-Fi), ZigBee, Bluetooth, Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), fifth-generation mobile networks (5G), and future communication networks.
The central control platform 13 is used for controlling the server 12 to configure online contents. The server 12 is configured to transmit the online content to the terminal device 11 under the control of the central control platform 13. The terminal device 11 is configured to display the online content, collect image data of the viewer, and calculate concentration information of the viewer based on the image data. The terminal device 11 is further configured to upload the calculated concentration degree information of the viewer to the central control platform 13 through the server 12. It should be noted that the image data of the viewer collected by the terminal device 11 includes one or more of a 2D image, a depth image, video data, and the like. The terminal device 11 may be a terminal device comprising a depth camera.
It should be noted that, on the basis of the application scenario shown in fig. 1, in some embodiments of the present application the central control platform 13 may display or publish the concentration information on a central control screen, so that a manager or user (also referred to as an operator) can view the concentration information of the detected subjects more intuitively. In some embodiments of the present application, the server 12 may be a local server or a cloud server. In some embodiments of the present application, the server 12 in the application scenario shown in fig. 1 may be replaced by a plurality of servers. As shown in fig. 2, the main server 121 is connected to the central control platform 13, the main server 121 is connected to one or more node servers 122, and the node servers 122 are connected to the terminal devices 11. The central control platform 13 is used to control the main server 121 to configure online content on the terminal devices 11 through the node servers 122; under the control of the central control platform 13, the main server 121 delivers the online content to the terminal devices 11 through the node servers 122.
The concentration detection system provided by the embodiments of the present application completes a bidirectional information flow through the interaction of multiple devices: multimedia information is transmitted in the forward direction, and concentration data is fed back in the reverse direction, so that a single system handles both teaching and monitoring. On the one hand, separate hardware or systems no longer need to be deployed for teaching and for monitoring, which greatly reduces hardware cost. On the other hand, compared with the related art in which the terminal device merely collects monitoring data, the terminal device in the embodiments of the present application can also compute the concentration information locally in addition to collecting monitoring data; this can be achieved in software without changing the hardware, so the embodiments of the present application are easy to implement and the hardware cost is further controlled.
It can be understood that, in the embodiments of the present application, information such as the viewer's concentration is calculated on the terminal device. This is because the data collected by the depth camera includes video and/or pictures, whose volume is very large, so in consideration of network data transmission it is preferable to run the algorithms that calculate the viewer's concentration on the terminal device. In other embodiments of the present application, the concentration may also be calculated on a server or on the central control platform, which is not specifically limited in this application.
It will be appreciated that the concentration detection system may be applied in a variety of scenarios, such as teaching, training, or conferencing, to detect the concentration of the participants. Accordingly, the terminal device in the concentration detection system should be installed in the corresponding scene to collect the image data of the participant for the calculation of the concentration information.
Detecting students' concentration during teaching is a specific example of the application scenario shown in fig. 1. The central control platform does not have a strong storage function, so the content to be put online, such as teaching videos, can be stored in a cloud server. The central control platform, which may also be referred to as a video configuration platform, configures and puts online specific video content stored in the cloud server, such as a certain teaching video. The cloud server distributes the online content to the terminal devices of each classroom through the campus node server. For example, an operator (or user) of the central control platform determines the online content of the terminal devices through the server, and one or more pieces of video content can be put online at a time. An operator of the terminal device, such as a teacher, selects a piece of video content that is already online on the terminal device for playing, and a display device communicatively coupled to the terminal device plays the video content. The terminal device collects image data of the students in the classroom, such as 2D images and/or depth images and/or video data, calculates the students' concentration information based on this data, and uploads the concentration information to the central control platform through the server.
It should be noted that the online content need not be input by the operator of the central control platform. The server may store a large amount of content, such as videos, documents, or pictures. The operator of the central control platform decides which content is distributed to students for viewing, and the selected content is the online content.
It should be understood that, in the embodiments of the present application, the server includes, but is not limited to, a stand-alone server, a distributed server, a server cluster or a cloud server, and the like.
It should be understood that, in the embodiment of the present application, the terminal device and the display apparatus may be separate devices or may be different components in the same device. This is not limited by the present application.
It should also be understood that the number of terminal devices, the number of servers, the number of central control platforms, and the like illustrated in fig. 1 and 2 should not be construed as specifically limiting the present application. Those skilled in the art will understand that the number of the terminal devices, the number of the servers, the number of the central control platforms, and the like can be selectively set according to actual requirements.
Fig. 3 shows a flowchart for implementing a method for detecting concentration according to an embodiment of the present application. By way of example and not limitation, the method may be applied in a terminal device as shown in fig. 1 or fig. 2. The method includes steps S310 to S340.
S310, the terminal device receives online content transmitted by the server and displays the online content; the online content is sent to the terminal device by the server under the control of the central control platform.
S320, the terminal device collects image data of the viewer.
Wherein the image data includes, but is not limited to: at least one of a 2D image, a depth image, and video data.
In an embodiment of the application, the terminal device acquires image data of a viewer through a depth camera.
S330, the terminal device calculates the concentration information of the viewer according to the image data.
In an embodiment of the present application, the concentration information includes, but is not limited to, one or more of facial information of the viewer, a direction of a line of sight of the viewer, and a length of time the viewer is facing the display device.
The facial information of the viewer mainly includes the viewer's facial expressions, lip movements, degree of eye closure, blinking frequency, and the like. In one embodiment, the control processing device first locates the face region in the video and/or pictures, crops and segments the facial expression, motion, and eye information, performs a preliminary extraction and classification of this information, and processes it to obtain characteristic behavior information. For example, when a viewer shows few facial expressions and little lip movement, the viewer's concentration on the online content is considered to be higher; when the degree of eye closure or the blinking frequency is higher, the viewer's concentration on the online content is considered to be lower.
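As a non-limiting sketch of the heuristic described above, the following Python function maps the facial cues (expression changes, lip movements, eye closure, blink frequency) to a rough concentration score; all thresholds and weights are illustrative assumptions, not values from the present application.

```python
def facial_concentration_score(expression_changes: int,
                               lip_movements: int,
                               eye_closure_ratio: float,
                               blink_rate_hz: float) -> float:
    """Rough concentration score in [0, 1] derived from facial cues.

    Mirrors the heuristic above: fewer expression/lip movements -> higher concentration;
    more eye closure or more frequent blinking -> lower concentration.
    All weights and thresholds below are illustrative assumptions.
    """
    score = 1.0
    score -= 0.05 * expression_changes                 # frequent expression changes lower the score
    score -= 0.05 * lip_movements                      # talking/whispering lowers the score
    score -= 0.5 * max(0.0, eye_closure_ratio - 0.3)   # eyes mostly closed
    score -= 0.2 * max(0.0, blink_rate_hz - 0.5)       # unusually frequent blinking
    return min(1.0, max(0.0, score))
```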
The viewer's line-of-sight direction can be obtained by the control processing device from the face image of each viewer in the video and/or 2D and/or depth images collected by the depth camera. In an embodiment of the present application, the depth face image is preferentially used to determine the viewer's line-of-sight direction. Specifically, first, 3D information of the face (such as a 3D point cloud) is calculated from the depth image, and information such as the face orientation and the 3D coordinates of key points can be obtained from this 3D information. Second, fine eye features are identified from the 2D image, such as the pupil center, the glint (the fixed spot formed on the cornea by reflected infrared light when an infrared camera is used), the pupil, and the iris; the 3D coordinates of these eye features are then obtained from the 3D face information and the relationship between the 2D image and the depth image, for example when the two images coincide, or through the pixel correspondence between the two images after registration. Finally, the eye gaze direction is calculated by combining the 3D coordinates of one or more of these eye features.
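The gaze computation described above can be sketched as follows, assuming a pinhole camera model and a depth image registered to the 2D image; the function names and the choice of eye features (pupil centre plus an eye-centre proxy) are illustrative assumptions.

```python
import numpy as np

def pixel_to_3d(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with depth depth_m into camera space (pinhole model)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

def gaze_direction(pupil_uv, eye_center_uv, depth_image, intrinsics):
    """Approximate unit gaze vector from the eye-centre feature towards the pupil centre.

    pupil_uv and eye_center_uv are (u, v) detections from the 2D (infrared/colour) image,
    e.g. the pupil centre and the midpoint of the eye corners used as a proxy for the
    eyeball centre (an illustrative simplification). depth_image is registered to the
    same pixel grid, so the 3D coordinates of both features can be read off directly.
    """
    fx, fy, cx, cy = intrinsics
    pu, pv = pupil_uv
    eu, ev = eye_center_uv
    pupil_3d = pixel_to_3d(pu, pv, float(depth_image[pv, pu]), fx, fy, cx, cy)
    center_3d = pixel_to_3d(eu, ev, float(depth_image[ev, eu]), fx, fy, cx, cy)
    direction = pupil_3d - center_3d
    return direction / np.linalg.norm(direction)   # unit gaze vector in camera space
```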
As for the length of time the viewer faces the display device, the duration for which the viewer's gaze is directed at the display device is counted based on the gaze direction obtained above.
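A minimal sketch of that counting step, assuming per-frame gaze vectors from the previous sketch and an illustrative alignment threshold:

```python
import numpy as np

def facing_duration_seconds(gaze_vectors, screen_directions, frame_dt, cos_threshold=0.9):
    """Total time (in seconds) the viewer's gaze is directed at the display.

    gaze_vectors: per-frame unit gaze directions (camera space).
    screen_directions: per-frame unit vectors from the eye towards the display centre.
    A frame counts as "facing the display" when the two vectors are nearly aligned;
    the cosine threshold of 0.9 (roughly 25 degrees) is an illustrative assumption.
    """
    facing_frames = sum(1 for g, s in zip(gaze_vectors, screen_directions)
                        if float(np.dot(g, s)) > cos_threshold)
    return facing_frames * frame_dt
```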
S340, the terminal device sends the concentration information to the central control platform through the server.
Fig. 4 shows a block diagram of the concentration detection apparatus provided in an embodiment of the present application; for convenience of explanation, only the parts relevant to this embodiment are shown. The concentration detection apparatus is deployed on the terminal device.
Referring to fig. 4, the concentration detection apparatus includes the following modules (a minimal code sketch of these modules follows the list):
a receiving module 41, configured to receive online content transmitted by a server, where the online content is sent to the terminal device by the server under the control of a central control platform;
a display device 42 for displaying the online content;
the acquisition and calculation module 43 is configured to acquire image data of a viewer, and calculate concentration information of the viewer according to the image data;
and the sending module 44 is configured to send the concentration degree information to the central control platform through the server.
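Mapped onto code, the four modules above might look like the following skeleton; the collaborator objects (server, depth_camera, display) and their methods are hypothetical placeholders, not APIs defined in the present application.

```python
def compute_concentration(frames):
    """Placeholder for the concentration calculation of step S330 (see the sketches above)."""
    return {"score": 0.0}

class ConcentrationDetectionApparatus:
    """Skeleton mirroring modules 41-44 of fig. 4; collaborator APIs are hypothetical."""

    def __init__(self, server, depth_camera, display):
        self.server = server
        self.depth_camera = depth_camera
        self.display = display

    def receive(self):
        """Module 41: receive online content sent by the server under central-platform control."""
        return self.server.fetch_online_content()    # hypothetical server call

    def show(self, content):
        """Module 42: display the online content."""
        self.display.render(content)                 # hypothetical display call

    def acquire_and_compute(self):
        """Module 43: collect viewer image data and compute the concentration information."""
        frames = self.depth_camera.capture()         # hypothetical camera call
        return compute_concentration(frames)

    def send(self, concentration_info):
        """Module 44: upload the concentration information to the central control platform via the server."""
        self.server.upload(concentration_info)
```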
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Fig. 5 shows a schematic structural diagram of a terminal device 5 according to an embodiment of the present application. As shown in fig. 5, the terminal device 5 includes: at least one processor 50 (only one is shown in fig. 5), a memory 51, and a computer program 52 stored in the memory 51 and executable on the at least one processor 50. When executing the computer program 52, the processor 50 implements the steps of the concentration detection method, such as steps S310 to S340 described with reference to fig. 3. Alternatively, when executing the computer program 52, the processor 50 implements the functions of the modules/units in the concentration detection apparatus, such as modules 41 to 44 shown in fig. 4.
Illustratively, the computer program 52 may be divided into one or more modules/units, which are stored in the memory 51 and executed by the processor 50 to implement the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments are used to describe the execution process of the computer program 52 in the terminal device 5.
The terminal device 5 may include, but is not limited to, the processor 50 and the memory 51. Those skilled in the art will understand that fig. 5 is only an example of the terminal device 5 and does not constitute a limitation on the terminal device 5; it may include more or fewer components than shown, or combine certain components, or use different components. For example, the terminal device 5 may further include input/output devices, network access devices, a bus, and the like. Specifically, in an embodiment of the present application, the terminal device may further include a depth camera for acquiring image data of the viewer. In an embodiment of the present application, the terminal device may further include a display device for displaying the online content.
The processor 50 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 51 may be an internal storage unit of the terminal device 5, such as a hard disk or a memory of the terminal device 5. The memory 51 may also be an external storage device of the terminal device 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 5. Further, the memory 51 may also include both an internal storage unit and an external storage device of the terminal device 5. The memory 51 is used for storing the computer program and other programs and data required by the terminal device 5. The memory 51 may also be used to temporarily store data that has been output or is to be output.
Fig. 6 is a schematic structural diagram of a terminal device according to another embodiment of the present application, and as shown in fig. 6, the terminal device includes a depth camera 111 and a control processing device 112 connected to the depth camera 111.
The depth camera 111 is used to acquire at least one of 2D images, depth images, and video data.
The control processing device 112 is configured to control the depth camera 111 to capture at least one of a 2D image, a depth image, and video data of the viewer, and receive and process the at least one of the 2D image, the depth image, and the video data to calculate concentration degree information of the viewer, and upload the concentration degree information to the server. The 2D image includes, but is not limited to, an infrared image, a color image, and the like.
In an embodiment of the present application, with continued reference to fig. 6, the depth camera 111 includes a projection module 1111, an RGB module 1112, and an imaging module 1113. The projection module 1111 is used to emit a structured light beam towards the viewer; the imaging module 1113 is used to image the structured light beam so as to acquire a 3D image of the viewer; and the RGB module 1112 is used to collect a color image of the viewer.
Generally, the projection module 1111 may project a visible-light pattern or an invisible-light pattern, such as an infrared speckle pattern. When the projection module 1111 projects an invisible-light pattern, the imaging module 1113 may correspondingly be an infrared camera. It should be understood that the infrared camera can capture infrared speckle images while the projection module 1111 projects them, and can also capture plain infrared images when the projection module 1111 is turned off.
As one non-limiting example of the present application, the baseline of the depth camera 111 may be 75 mm.
The control processing device 112 may be an ARM control processor integrated with specific functions, or may be a field-programmable gate array (FPGA) integrated in a system-on-chip (SoC), a digital signal processor (DSP), another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
In an embodiment of the present application, the control processing device 112 in the terminal device is connected to and controls the depth camera 111 through a circuit board (not shown in fig. 6) and a connector (not shown in fig. 6). The circuit board may be a flexible printed circuit (FPC), a printed circuit board (PCB), a rigid-flex board, or the like, and the connector may take any form, such as a board-to-board (BTB) connector or a zero insertion force (ZIF) connector.
In some embodiments of the present application, the terminal device includes a display device in addition to the depth camera 111 and the control processing device 112. The display device is communicatively coupled to the control processing device and is used for displaying the online content.
The display device and the control processing device are communicatively coupled, including but not limited to wired or wireless connections. The display device is, for example, a projector or a display screen.
It should be understood that, in the embodiment of the present application, the depth camera, the control processing device, and the display device may be independent devices, may also be disposed in the same device, and may also be disposed in two devices, which is not limited in this application.
Fig. 7 is a structural exploded view of the terminal device shown in fig. 6. As shown in fig. 7, the terminal device further includes a body 113, and the body 113 has a hollow cavity to form an accommodating space. The control processing device 112 is accommodated in the hollow cavity of the body 113.
In an embodiment of the present application, with continued reference to fig. 7, the body 113 includes a semi-enclosed housing 1131 with an opening at the bottom, and a bottom cover 1132 adapted to fit the opening of the semi-enclosed housing 1131. The semi-enclosed housing 1131 and the bottom cover 1132 form a cavity having an accommodating space, and the control processing device 112 is accommodated in the cavity.
The semi-enclosed housing 1131 and the bottom cover 1132 may be joined by one or more of snap-fitting, riveting, screw fastening, and the like. For example, they may be joined by a slot and a clip, by an interference fit, or by a combination of clamping and screw fixation.
It should be understood that fig. 7 is only an exemplary illustration, and the present application does not limit the manner in which the semi-enclosed housing 1131 and the bottom cover 1132 are joined.
It should also be understood that fig. 7 is only exemplary in this regard, and that in other embodiments of the present application, the body 113 may also be integrally formed, or cooperatively formed by three or more housings. The body 113 may be closed or may have one or more openings. The structure and shape of the body 113 are not particularly limited in this application.
In an embodiment of the present application, with reference to fig. 7, the two sides of the semi-enclosed housing 1131 are provided with a plurality of side heat dissipation holes 11311, and the bottom cover 1132 is provided with a plurality of bottom heat dissipation holes 11321.
In an embodiment of the present application, as shown in fig. 7, the terminal device further includes a heat sink 114. The heat sink 114 is disposed on the control processing device 112.
In an embodiment of the present application, the heat sink 114 may further have a plurality of fins thereon, so as to further enhance the heat dissipation effect.
In an embodiment of the present application, the heat sink 114 may be an aluminum heat sink.
The heat sink 114 conducts the heat of the control processing device 112 to its fins; the heat on the fins warms the air in the cavity, and the hot air rises and is led out through the side heat dissipation holes 11311 of the semi-enclosed housing 1131, while, due to air convection, cold air enters the cavity through the bottom heat dissipation holes 11321 to replace it. The accommodating space in the cavity provides sufficient room for this air convection, which ensures good heat dissipation of the control processing device 112 and allows it to work in parallel for long periods, for example driving the depth camera to collect data while running the concentration algorithm and other processes.
It should also be understood that, in the embodiments of the present application, fig. 5, fig. 6, and fig. 7 are only examples of the terminal device, and are not to be construed as specific limitations on the structure of the terminal device, and other structures of the terminal device may also be provided, and/or further components may be included.
As described above, the terminal device in the concentration detection system needs to be installed in the actual application scene; a mounting device that can be used to mount the terminal device is described below.
It should be appreciated that in some embodiments of the present application, the concentration detection system may further include a mounting device, which may be used to mount the terminal device.
Referring to fig. 8, the present application also provides a mounting device 14, wherein the mounting device 14 includes a main body bracket 141, a connecting member 142, and an auxiliary bracket 143 slidably mounted on the main body bracket 141 through the connecting member 142.
The auxiliary bracket 143 may be used to carry a terminal device. Generally, the main body bracket 141 is mounted on a vertical wall, the terminal device is mounted on the auxiliary bracket 143, and the auxiliary bracket 143 is mounted horizontally on the main body bracket 141, as shown in fig. 9, which is a schematic structural view of the terminal device 11 mounted on a wall through the mounting device 14.
With continued reference to fig. 8, the main body bracket 141 includes a vertical plate 1411, a bending structure 1412, and a horizontal plate 1413 integrally formed with the vertical plate 1411 through the bending structure 1412. The auxiliary bracket is slidably mounted on the horizontal plate 1413 through the connecting component. The vertical plate 1411 is intended to be fixed to a vertical wall. The vertical plate 1411 is provided with a first through hole 14111, and an arc-shaped through groove 14112 is arranged below the first through hole 14111.
During installation, the vertical plate is placed against the wall surface. First, one screw is driven through the first through hole to fix the vertical plate; the inclination angle of the vertical plate is then adjusted, and once the angle is determined, a second screw is driven through the arc-shaped through groove to fix the vertical plate (and thus the main body bracket). In this way, the vertical plate (the main body bracket) is fixed by the two screws, and its inclination angle in the vertical plane is determined.
It should be understood that fig. 8 is only exemplary. In other embodiments of the present application, the main body bracket 141 may omit the bending structure 1412 and simply include the vertical plate 1411 and the horizontal plate 1413 connected to each other. In other embodiments of the present application, the vertical plate 1411 and the horizontal plate 1413 may be integrally formed, or may be formed separately and then fixed together, the fixing manner including, but not limited to, welding and/or riveting.
In an embodiment of the present application, the distance between the first through hole 14111 and the lowest point of the arc-shaped through groove 14112 may be 20 to 30 mm, preferably 25 mm. The included angle between the line segment from the first through hole 14111 to the lowest point of the arc-shaped through groove 14112 and the line segment from the first through hole 14111 to either end point of the arc-shaped through groove 14112 is the same for both end points, and may be 25 to 35 degrees, preferably 30 degrees. With this arrangement, when the angle is set to 30 degrees, the main body bracket 141 can be deflected by ±30° in the vertical direction, that is, the terminal device can be deflected by ±30° in the vertical direction.
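For illustration, the following short Python sketch computes where the end points of the arc-shaped through groove fall relative to the first through hole for the preferred values (25 mm radius, 30° half-angle); the coordinate convention is an assumption, not part of the present application.

```python
import math

def arc_slot_endpoints(radius_mm=25.0, half_angle_deg=30.0):
    """End points and lowest point of the arc-shaped through groove.

    The first through hole is taken as the pivot at the origin; the lowest point of the
    groove lies straight below it at radius_mm. Driving the second screw to either end
    point tilts the bracket by +/- half_angle_deg in the vertical plane, i.e. the
    +/-30 degree deflection described above. The x-right / y-up convention is an assumption.
    """
    a = math.radians(half_angle_deg)
    lowest = (0.0, -radius_mm)
    left_end = (-radius_mm * math.sin(a), -radius_mm * math.cos(a))
    right_end = (radius_mm * math.sin(a), -radius_mm * math.cos(a))
    return lowest, left_end, right_end

# Preferred values: lowest point at (0, -25) mm, end points at (+/-12.5, -21.65) mm.
print(arc_slot_endpoints())
```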
In an embodiment of the present application, referring to fig. 8, the horizontal plate 1413 is a Y-shaped structure, and includes a main body portion 14131 and an extension portion 14132 extending horizontally outward along the main body portion 14131.
In an embodiment of the present application, referring to fig. 8, the main body portion 14131 is formed with a plurality of regularly or irregularly arranged heat dissipating through holes 141311, and the extension portion 14132 is formed with a strip-shaped sliding slot 141321. The auxiliary bracket 143 is detachably mounted on the extension 14132, and can slide back and forth on the strip-shaped slide groove 141321.
In an embodiment of the present application, referring to fig. 8, the auxiliary support 143 includes a bearing part 1431 and a blocking part 1432. In other embodiments of the present application, the auxiliary bracket 143 may not include the blocking portion 1432.
The bearing portion 1431 is used to carry the terminal device. A mounting through hole 14311 is formed in the bearing portion 1431 and aligns with the sliding slot 141321, so that the connecting component 142 can pass through the sliding slot 141321 and the mounting through hole 14311 up to the terminal device, and the terminal device can then slide back and forth along the sliding slot. It should be noted that the bottom of the terminal device may be provided with a groove or a through hole fitting the connecting component 142, into which one end of the connecting component 142 can be inserted. In an embodiment of the present application, the end of the connecting component 142 inserted into the groove or through hole may be threaded to match the groove or through hole, so as to prevent the terminal device from slipping off.
The baffle portion 1432 is disposed at an end of the bearing portion 1431, and is used for abutting against the terminal device, so as to prevent the terminal device from sliding down. The number of the baffle portions 1432 may be one or more, and when the number of the baffle portions 1432 is plural, the plurality of baffle portions 1432 may be provided at different end portions of the bearing portion 1431, or may be provided at the same end portion of the bearing portion 1431 as shown in fig. 8.
In one embodiment of the present application, the connecting component 142 may be a screw. To allow the screw to slide along the sliding slot 141321, the width of the sliding slot 141321 should be slightly larger than the diameter of the screw's shank and slightly smaller than the diameter of the screw's head.
In an embodiment of the present application, the auxiliary bracket 143 can be deflected horizontally by ±45° about the connecting component 142 as a pivot. In other embodiments of the present application, the horizontal deflection angle may also be another angle, for example any angle in the range of 30 to 50 degrees, and may be set according to the actual application scenario and requirements, which is not limited in this application.
As can be seen from the above, installing the terminal device through the above mounting device allows the terminal device to be deflected both vertically and horizontally, so the terminal device can be installed at various positions in a classroom (or another scene) and the depth camera can be adjusted to an angle from which the main area of the classroom can be recognized. In addition, the terminal device can slide back and forth in the sliding slot to adjust its front-to-back distance, which ensures that the field of view (FOV) of the depth camera is not blocked by other objects installed on the adjacent wall (such as a wall-mounted speaker or a projector). This greatly improves the accuracy and environmental adaptability of the concentration detection scheme provided by the embodiments of the present application.
Embodiments of the present application further provide a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the steps in the method for detecting concentration may be implemented.
Embodiments of the present application further provide a computer program product which, when run on a terminal device, causes the terminal device to implement the steps in the above concentration detection method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments can be implemented by a computer program; the computer program can be stored in a computer-readable storage medium, and when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to the network terminal, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, or a magnetic or optical disk. In some jurisdictions, in accordance with legislation and patent practice, computer-readable media may not include electrical carrier signals or telecommunication signals.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A mounting device, comprising: a main body bracket, a connecting component, and an auxiliary bracket slidably mounted on the main body bracket through the connecting component, wherein the auxiliary bracket is used for carrying a terminal device and the main body bracket is used for being mounted on a wall.
2. The mounting device according to claim 1, wherein the main body bracket comprises a vertical plate and a horizontal plate connected to each other, the auxiliary bracket is slidably mounted on the horizontal plate through the connecting component, and the vertical plate is adapted to be fixed to a wall.
3. The mounting device according to claim 2, wherein the main body bracket further comprises a bending structure, and the vertical plate and the horizontal plate are connected through the bending structure.
4. The mounting device according to claim 2 or 3, wherein the vertical plate is provided with a first through hole, and an arc-shaped through groove is arranged below the first through hole.
5. The mounting device according to claim 2 or 3, wherein the horizontal plate is provided with a sliding slot, and the auxiliary bracket is detachably mounted on the horizontal plate through the connecting component penetrating the sliding slot.
6. The mounting device according to claim 5, wherein the horizontal plate comprises a main body portion and an extension portion extending horizontally outward along the main body portion, and the sliding slot is formed in the extension portion.
7. The mounting device according to claim 5, wherein the auxiliary bracket comprises a bearing portion for carrying the terminal device to be mounted, the bearing portion is provided with a mounting through hole, and the connecting component passes through the sliding slot and the mounting through hole.
8. The mounting device according to claim 7, wherein a groove or a first through hole is provided at the bottom of the terminal device to be mounted, and the connecting component is inserted into the groove or the first through hole after passing through the sliding slot and the mounting through hole.
9. The mounting device according to claim 7 or 8, wherein the auxiliary bracket further comprises a baffle portion provided at an end portion of the bearing portion.
10. The mounting device according to any one of claims 1 to 3, wherein the connecting component is a screw.
CN202020647882.1U (priority 2020-04-24, filed 2020-04-24) — Mounting device — Active — CN212361513U (en)

Priority Applications (1)

Application Number: CN202020647882.1U (published as CN212361513U (en)) — Priority Date: 2020-04-24 — Filing Date: 2020-04-24 — Title: Mounting device

Publications (1)

Publication Number: CN212361513U — Publication Date: 2021-01-15

Family

ID=74133990

Family Applications (1)

Application Number: CN202020647882.1U — Status: Active — Publication: CN212361513U (en)

Country Status (1)

Country: CN — CN212361513U (en)


Legal Events

GR01 — Patent grant