CN113822167A - Method, apparatus, and medium for classifying target object based on machine vision recognition
- Publication number
- CN113822167A (application CN202110993734.4A)
- Authority
- CN
- China
- Prior art keywords
- electronic fence
- target object
- triangular
- margin
- coordinates
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Image Analysis (AREA)
Abstract
The present application relates to a method, an apparatus, and a medium for classifying a target object based on machine vision recognition. The classification method for a target object identified based on machine vision includes the following steps: acquiring coordinates of the target object; calculating the position of the coordinates of the target object on a frame plane, and determining whether the coordinates of the target object are within an electronic fence of a preconfigured attribution range; and if so, determining that the target object belongs to the corresponding electronic fence. The method provided by the present application can determine the classification of a target object identified based on machine vision, thereby endowing machine vision recognition technology with practical business capability.
Description
Technical Field
The present application relates to the field of vision recognition technology, and in particular, to a method, an apparatus, and a medium for classifying a target object based on machine vision recognition.
Background
Machine vision is a rapidly developing branch of artificial intelligence. In brief, machine vision uses machines in place of human eyes to perform measurement and judgment. Owing to its advantages such as high accuracy and freedom from human intervention, machine vision is widely used in medical services.
In the related art, determining the classification of a target object recognized based on machine vision has become an obstacle to efficiently combining machine vision technology with business functions.
At present, no effective solution has been proposed in the related art for determining the classification of a target object identified based on machine vision.
Disclosure of Invention
Embodiments of the present application provide a method, an apparatus, and a medium for classifying a target object based on machine vision recognition, so as to solve the problem in the related art of determining the classification of a target object identified based on machine vision.
In a first aspect, an embodiment of the present application provides a method for classifying a target object identified based on machine vision, where the method includes:
acquiring coordinates of a target object;
calculating the position of the coordinate of the target object on a frame plane, and judging whether the coordinate of the target object is in an electronic fence in a pre-configured attribution range;
if yes, determining that the target object belongs to the electronic fence.
In some of these embodiments, said calculating the location of the coordinates of the target object on the frame plane comprises:
establishing a planar rectangular coordinate system based on a frame where a target object is located, wherein the planar rectangular coordinate system takes a vertex at the upper left corner as an origin, an abscissa is positive towards the right, and an ordinate is positive downwards;
selecting a central point of the target object as a coordinate of the target object, and calculating a coordinate position (X, Y) of the central point.
In some embodiments, in a case that the electronic fence is a rectangular electronic fence, the determining whether the coordinates of the target object are within an electronic fence of a preconfigured home range includes:
if (X, Y) satisfies left ≤ X ≤ right and top ≤ Y ≤ bottom, the target object belongs to the rectangular electronic fence, wherein left is the abscissa of the left margin of the rectangular electronic fence, right is the abscissa of the right margin, top is the ordinate of the upper margin, and bottom is the ordinate of the lower margin.
In some embodiments, in a case that the electronic fence is a left triangle electronic fence, the determining whether the coordinates of the target object are within an electronic fence of a preconfigured home range includes:
determining whether (X, Y) satisfies (X - left)/(Y - top) ≤ (right - left)/(bottom - top); if so, the target object belongs to the left triangular electronic fence, wherein left is the abscissa of the left margin of the left triangular electronic fence, right is the abscissa of the right margin, top is the ordinate of the upper margin, and bottom is the ordinate of the lower margin.
In some embodiments, in a case that the electronic fence is a right triangle electronic fence, the determining whether the coordinates of the target object are within an electronic fence of a preconfigured home range includes:
determining whether (X, Y) satisfies (X - left)/(Y - top) > (right - left)/(bottom - top); if so, the target object is attributed to the right triangular electronic fence, wherein left is the abscissa of the left margin of the right triangular electronic fence, right is the abscissa of the right margin, top is the ordinate of the upper margin, and bottom is the ordinate of the lower margin.
In some embodiments, in a case that the electronic fence is an upper triangular electronic fence, the determining whether the coordinates of the target object are within an electronic fence of a preconfigured home range includes:
determining whether (X, Y) satisfies (right - X)/(Y - top) ≤ (right - left)/(bottom - top); if so, the target object is attributed to the upper triangular electronic fence, wherein left is the abscissa of the left margin of the upper triangular electronic fence, right is the abscissa of the right margin, top is the ordinate of the upper margin, and bottom is the ordinate of the lower margin.
In some embodiments, in a case that the electronic fence is a lower triangular electronic fence, the determining whether the coordinates of the target object are within an electronic fence of a preconfigured home range includes:
determining whether (X, Y) satisfies (right - X)/(Y - top) > (right - left)/(bottom - top); if so, the target object is attributed to the lower triangular electronic fence, wherein left is the abscissa of the left margin of the lower triangular electronic fence, right is the abscissa of the right margin, top is the ordinate of the upper margin, and bottom is the ordinate of the lower margin.
In some embodiments, in the case that the electronic fence is an electronic fence combined by a rectangle and a triangle, the determining whether the coordinates of the target object are within an electronic fence of a preconfigured home range includes:
determining whether the coordinates of the target object are within any of the rectangles or triangles; if so, the target object belongs to the combined electronic fence.
In a second aspect, the present application provides an electronic apparatus, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor, when executing the computer program, implements the classification method for a target object identified based on machine vision according to the first aspect.
In a third aspect, the present application provides a storage medium, on which a computer program is stored, which when executed by a processor, implements the classification method for a target object identified based on machine vision as described in the first aspect above.
Compared with the related art, the beneficial effects of the present application are as follows. A planar rectangular coordinate system is established based on the frame in which the machine-vision-identified target object is located; the coordinates of the target object are acquired; the position of those coordinates on the frame plane is calculated; and it is determined whether the coordinates fall within an electronic fence of a preconfigured attribution range. If so, the target object is determined to belong to that electronic fence, and accordingly the classification of the target object is determined from the classification of the fence. The method provided by the present application can determine the classification of a target object identified based on machine vision, thereby endowing machine vision recognition technology with practical business capability.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a flow chart of a method for classifying a target object based on machine vision recognition according to an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating a target object identified based on machine vision in accordance with an embodiment of the present application;
FIG. 3 is a schematic diagram of coordinates of a target object belonging to a rectangular electronic fence according to an embodiment of the present application;
FIG. 4 is a schematic diagram of coordinates of a target object belonging to a left triangular electronic fence according to an embodiment of the present application;
FIG. 5 is a schematic diagram of coordinates of a target object belonging to a right triangular electronic fence according to an embodiment of the present application;
FIG. 6 is a schematic diagram of coordinates of a target object belonging to an upper triangular electronic fence according to an embodiment of the present application;
FIG. 7 is a schematic diagram of coordinates of a target object belonging to a lower triangular electronic fence according to an embodiment of the present application;
fig. 8 is an internal structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms referred to herein shall have the ordinary meaning as understood by those of ordinary skill in the art to which this application belongs. Reference to "a," "an," "the," and similar words throughout this application are not to be construed as limiting in number, and may refer to the singular or the plural. The present application is directed to the use of the terms "including," "comprising," "having," and any variations thereof, which are intended to cover non-exclusive inclusions; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. Reference to "connected," "coupled," and the like in this application is not intended to be limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. Reference herein to "a plurality" means greater than or equal to two. "and/or" describes an association relationship of associated objects, meaning that three relationships may exist, for example, "A and/or B" may mean: a exists alone, A and B exist simultaneously, and B exists alone. Reference herein to the terms "first," "second," "third," and the like, are merely to distinguish similar objects and do not denote a particular ordering for the objects.
The embodiment of the application provides a classification method of a target object based on machine vision recognition, which aims to solve the problem of determining the classification of the target object based on machine vision recognition.
Fig. 1 is a flowchart of a classification method for a target object identified based on machine vision according to an embodiment of the present application, and referring to fig. 1, the method may include steps S101 to S103.
Step S101, coordinates of the target object are acquired.
Step S102, calculating the position of the coordinates of the target object on the frame plane.
Step S103, determining whether the coordinates of the target object are in the electronic fence in the pre-configured attribution range, if yes, executing step S104.
And step S104, determining that the target object belongs to the electronic fence.
In summary, the coordinates of the target object are obtained, the position of the coordinates of the target object on the frame plane is calculated, whether the coordinates of the target object are in the electronic fence in the pre-configured attribution range is judged, if yes, the target object is determined to belong to the electronic fence, and accordingly, the classification of the target object is determined based on the classification of the electronic fence. The method provided by the application can determine the classification of the target object identified based on the machine vision, and further endows the machine vision identification technology with actual business capability.
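The flow of steps S101 through S104 can be sketched as follows. This is an illustrative sketch only: the `RectFence` record, its `contains` test, and the `label` field are assumptions introduced here for demonstration, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class RectFence:
    """Hypothetical fence record: margins in frame coordinates
    (origin at the top-left vertex, y grows downward) plus a
    business label used as the classification."""
    left: float
    top: float
    right: float
    bottom: float
    label: str

    def contains(self, x, y):
        # Rectangular membership: left <= X <= right and top <= Y <= bottom
        return self.left <= x <= self.right and self.top <= y <= self.bottom

def classify_target(center, fences):
    """Steps S101-S104: given the (X, Y) center of a detected target and
    the preconfigured fences, return the label of the first fence that
    contains the point, or None if no fence matches."""
    x, y = center                 # S101/S102: coordinates on the frame plane
    for fence in fences:          # S103: attribution-range check
        if fence.contains(x, y):
            return fence.label    # S104: target belongs to this fence
    return None
```

For example, in the medical scenario described below, a fence labeled "infusion" covering (100, 100) to (300, 300) would classify a target centered at (150, 200) as an infusion action.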
Fig. 2 is a schematic diagram of a target object identified based on machine vision according to an embodiment of the present disclosure. As shown in Fig. 2, the target object identified based on machine vision is completely included within an electronic fence by taking its center point as the reference, so that the classification of the target object can be determined accurately.
It should be noted that the electronic fence according to the embodiment of the present application is used for determining whether the coordinates of the target object are within the pre-configured attribution range, and the shape of the electronic fence may be a rectangle, a triangle, or any combination of a rectangle and a triangle. Here, for example, the medical service scenario may be taken as an example, the target object is the movement of a nurse or a patient, and an electronic fence corresponding to the movement of the nurse is preset, for example, the nurse sets an electronic fence range for the movement of the patient for infusion, and by determining that the coordinate position corresponding to the movement captured by the machine recognition is within the infusion electronic fence range, the target object corresponding to the movement can be determined as the movement of the nurse.
As an alternative embodiment, in step S102, a planar rectangular coordinate system is established based on the frame where the target object is located, where the planar rectangular coordinate system uses the top-left vertex as the origin, the abscissa is positive to the right, and the ordinate is positive downward; the center point of the target object is selected as the coordinates of the target object, and the coordinate position (X, Y) of the center point is calculated. By taking the center point of the target object as its coordinates, the target object can be placed within the attribution range of the electronic fence as a whole, and accordingly the attribution determined for the target object is more accurate.
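In the coordinate system just described (origin at the top-left vertex, x positive rightward, y positive downward), the center point of a detection can be computed from its bounding box. The box representation (left, top, width, height) is an assumption for illustration:

```python
def center_point(box_left, box_top, box_width, box_height):
    """Return the (X, Y) center of a detection bounding box in an
    image coordinate system whose origin is the frame's top-left
    vertex, with x positive to the right and y positive downward."""
    return (box_left + box_width / 2.0, box_top + box_height / 2.0)
```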
In some embodiments, Fig. 3 is a schematic diagram illustrating that the coordinates of the target object belong to a rectangular electronic fence according to an embodiment of the present application. In the case that the electronic fence is a rectangular electronic fence, as shown in Fig. 3, when the coordinate position (X, Y) of the target object satisfies left ≤ X ≤ right and top ≤ Y ≤ bottom, the target object belongs to the rectangular electronic fence, where left is the abscissa of the left margin of the rectangular electronic fence, right is the abscissa of the right margin, top is the ordinate of the upper margin, and bottom is the ordinate of the lower margin. If the coordinates of the target object are within the rectangular electronic fence, the classification of the target object is the same as that of the rectangular electronic fence; otherwise, the two classifications are different.
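The rectangular-fence condition above can be sketched directly; the function name is illustrative:

```python
def in_rect_fence(x, y, left, top, right, bottom):
    """Rectangular electronic fence test: the point (X, Y) belongs to
    the fence iff left <= X <= right and top <= Y <= bottom
    (ordinates grow downward)."""
    return left <= x <= right and top <= y <= bottom
```

Boundary points (on a margin of the fence) count as inside, matching the non-strict inequalities in the text.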
In some embodiments, Fig. 4 is a schematic diagram illustrating that the coordinates of the target object belong to a left triangular electronic fence according to an embodiment of the present application. As shown in Fig. 4, in the case that the fence is a left triangular electronic fence, the angle (α) formed by the coordinates (X, Y) of the target object with the vertical or horizontal axis and the corresponding angle (β) of the triangle need to satisfy tan(α) ≤ tan(β), embodied in the coordinate relationship: if the coordinate position (X, Y) of the target object satisfies (X - left)/(Y - top) ≤ (right - left)/(bottom - top), the target object is attributed to the left triangular fence, i.e., the classification of the target object is the same as that of the left triangular fence. If (X, Y) does not satisfy (X - left)/(Y - top) ≤ (right - left)/(bottom - top), the target object does not belong to the left triangular fence, i.e., their classifications are different.
In some embodiments, Fig. 5 is a schematic diagram illustrating that the coordinates of the target object belong to a right triangular electronic fence according to an embodiment of the present application. As shown in Fig. 5, in the case that the fence is a right triangular electronic fence, the angle (α) formed by the coordinates (X, Y) of the target object with the vertical or horizontal axis and the corresponding angle (β) of the triangle need to satisfy tan(α) > tan(β), embodied in the coordinate relationship: if (X, Y) satisfies (X - left)/(Y - top) > (right - left)/(bottom - top), the target object belongs to the right triangular electronic fence, i.e., the target object is classified the same as the right triangular electronic fence; if not, the target object does not belong to the right triangular electronic fence, i.e., the classification of the target object is different from that of the right triangular electronic fence.
In some embodiments, Fig. 6 is a schematic diagram illustrating that the coordinates of the target object belong to an upper triangular electronic fence according to an embodiment of the present application. As shown in Fig. 6, in the case that the fence is an upper triangular electronic fence, the angle (α) formed by the coordinates (X, Y) of the target object with the vertical or horizontal axis and the corresponding angle (β) of the triangle need to satisfy tan(α) ≤ tan(β), embodied in the coordinate relationship: if (X, Y) satisfies (right - X)/(Y - top) ≤ (right - left)/(bottom - top), the target object is attributed to the upper triangular electronic fence, i.e., the target object is classified the same as the upper triangular electronic fence.
In some embodiments, Fig. 7 is a schematic diagram illustrating that the coordinates of the target object belong to a lower triangular electronic fence according to an embodiment of the present application. As shown in Fig. 7, in the case that the electronic fence is a lower triangular electronic fence, the angle (α) formed by the coordinates (X, Y) of the target object with the vertical or horizontal axis and the corresponding angle (β) of the triangle need to satisfy tan(α) > tan(β), embodied in the coordinate relationship: if (X, Y) satisfies (right - X)/(Y - top) > (right - left)/(bottom - top), the target object is attributed to the lower triangular electronic fence, i.e., the classification of the target object is the same as that of the lower triangular electronic fence.
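The four triangular tests above all compare a point's ratio against the diagonal slope (right - left)/(bottom - top). A minimal sketch follows, with the inequalities taken verbatim from the text; the cross-multiplied form is an implementation choice that avoids division by zero when Y == top, and it preserves the inequality direction under the assumption that (X, Y) lies within the fence's bounding rectangle, so that bottom - top > 0 and Y - top >= 0. Function names are illustrative:

```python
def in_left_triangle(x, y, left, top, right, bottom):
    # (X - left)/(Y - top) <= (right - left)/(bottom - top), cross-multiplied
    return (x - left) * (bottom - top) <= (right - left) * (y - top)

def in_right_triangle(x, y, left, top, right, bottom):
    # (X - left)/(Y - top) > (right - left)/(bottom - top), cross-multiplied
    return (x - left) * (bottom - top) > (right - left) * (y - top)

def in_upper_triangle(x, y, left, top, right, bottom):
    # (right - X)/(Y - top) <= (right - left)/(bottom - top), cross-multiplied
    return (right - x) * (bottom - top) <= (right - left) * (y - top)

def in_lower_triangle(x, y, left, top, right, bottom):
    # (right - X)/(Y - top) > (right - left)/(bottom - top), cross-multiplied
    return (right - x) * (bottom - top) > (right - left) * (y - top)
```

Note that, within a given bounding rectangle, the left/right pair (and likewise the upper/lower pair) are complementary: every point falls on exactly one side of the corresponding diagonal.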
Further, in the case that the electronic fence is formed by combining rectangles and triangles, it is determined whether the coordinates of the target object are within any of the rectangles or triangles; if so, the target object belongs to the combined electronic fence, i.e., the classification of the target object is the same as the classification of the combined fence. Specifically, if the electronic fence corresponding to a classification is a set of N geometric figures (rectangles or triangles), the coordinates (X, Y) of a target object are determined to belong to that classification when:
(X, Y) ∈ shape1 ∨ (X, Y) ∈ shape2 ∨ … ∨ (X, Y) ∈ shapeN
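The union rule (X, Y) ∈ shape1 ∨ (X, Y) ∈ shape2 ∨ … can be sketched as a logical OR over per-shape membership predicates; representing each member shape as a callable is an assumption for illustration:

```python
def in_combined_fence(x, y, shape_tests):
    """A combined fence is a set of N member shapes (rectangles or
    triangles), each given as a predicate test(x, y) -> bool; the
    point belongs to the fence if it lies in ANY member shape."""
    return any(test(x, y) for test in shape_tests)
```

`any` short-circuits, so membership testing stops at the first matching shape.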
the above modules may be functional modules or program modules, and may be implemented by software or hardware. For a module implemented by hardware, the modules may be located in the same processor; or the modules can be respectively positioned in different processors in any combination.
It should be noted that, for specific examples in this embodiment, reference may be made to examples described in the foregoing embodiments and optional implementations, and details of this embodiment are not described herein again.
In addition, in combination with the classification method of the target object identified based on machine vision in the foregoing embodiments, the embodiments of the present application may provide a storage medium to implement. The storage medium having stored thereon a computer program; the computer program, when executed by a processor, implements any one of the above embodiments of a method for classifying a target object based on machine vision recognition.
An embodiment of the present application also provides an electronic device, which may be a terminal. The electronic device comprises a processor, a memory, a network interface, a display screen and an input device which are connected through a system bus. Wherein the processor of the electronic device is configured to provide computing and control capabilities. The memory of the electronic equipment comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the electronic device is used for connecting and communicating with an external terminal through a network. The computer program is executed by a processor to implement a method for classification of a target object identified based on machine vision. The display screen of the electronic equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the electronic equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the electronic equipment, an external keyboard, a touch pad or a mouse and the like.
In an embodiment, fig. 8 is a schematic internal structure diagram of an electronic device according to an embodiment of the present application, and as shown in fig. 8, there is provided an electronic device, which may be a server, and its internal structure diagram may be as shown in fig. 8. The electronic device comprises a processor, a network interface, an internal memory and a non-volatile memory connected by an internal bus, wherein the non-volatile memory stores an operating system, a computer program and a database. The processor is used for providing calculation and control capability, the network interface is used for communicating with an external terminal through network connection, the internal memory is used for providing an environment for an operating system and the running of a computer program, the computer program is executed by the processor to realize a classification method of a target object identified based on machine vision, and the database is used for storing data.
Those skilled in the art will appreciate that the structure shown in fig. 8 is a block diagram of only a portion of the structure relevant to the present disclosure, and does not constitute a limitation on the electronic device to which the present disclosure may be applied, and that a particular electronic device may include more or less components than those shown, or combine certain components, or have a different arrangement of components.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the program can be stored in a non-volatile computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory, among others. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
It should be understood by those skilled in the art that various features of the above-described embodiments can be combined in any combination, and for the sake of brevity, all possible combinations of features in the above-described embodiments are not described in detail, but rather, all combinations of features which are not inconsistent with each other should be construed as being within the scope of the present disclosure.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.
Claims (10)
1. A method for classifying a target object identified based on machine vision, the method comprising:
acquiring coordinates of a target object;
calculating the position of the coordinate of the target object on a frame plane, and judging whether the coordinate of the target object is in an electronic fence in a pre-configured attribution range;
if yes, determining that the target object belongs to the electronic fence.
2. The method of claim 1, wherein the calculating the location of the coordinates of the target object on a frame plane comprises:
establishing a planar rectangular coordinate system based on a frame where a target object is located, wherein the planar rectangular coordinate system takes a vertex at the upper left corner as an origin, an abscissa is positive towards the right, and an ordinate is positive downwards;
selecting a central point of the target object as a coordinate of the target object, and calculating a coordinate position (X, Y) of the central point.
3. The method of claim 2, wherein in the case that the electronic fence is a rectangular electronic fence, the determining whether the coordinates of the target object are within an electronic fence of a preconfigured home range comprises:
if (X, Y) satisfies left ≤ X ≤ right and top ≤ Y ≤ bottom, the target object belongs to the rectangular electronic fence, wherein left is the abscissa of the left margin of the rectangular electronic fence, right is the abscissa of the right margin, top is the ordinate of the upper margin, and bottom is the ordinate of the lower margin.
4. The method of claim 2, wherein in the case that the electronic fence is a left triangle electronic fence, the determining whether the coordinates of the target object are within an electronic fence of a preconfigured home range comprises:
determining whether X and Y satisfy (X − left)/(Y − top) ≤ (right − left)/(bottom − top); if so, the target object belongs to the left triangular electronic fence, wherein left is the abscissa of the left margin of the left triangular electronic fence, right is the abscissa of the right margin of the left triangular electronic fence, top is the ordinate of the upper margin of the left triangular electronic fence, and bottom is the ordinate of the lower margin of the left triangular electronic fence.
5. The method of claim 2, wherein in the case that the electronic fence is a right triangle electronic fence, the determining whether the coordinates of the target object are within an electronic fence of a preconfigured home range comprises:
determining whether X and Y satisfy (X − left)/(Y − top) > (right − left)/(bottom − top); if so, the target object belongs to the right triangular electronic fence, wherein left is the abscissa of the left margin of the right triangular electronic fence, right is the abscissa of the right margin of the right triangular electronic fence, top is the ordinate of the upper margin of the right triangular electronic fence, and bottom is the ordinate of the lower margin of the right triangular electronic fence.
6. The method of claim 2, wherein in the case that the electronic fence is an upper triangular electronic fence, the determining whether the coordinates of the target object are within an electronic fence of a pre-configured home range comprises:
determining whether X and Y satisfy (right − X)/(Y − top) ≤ (right − left)/(bottom − top); if so, the target object belongs to the upper triangular electronic fence, wherein left is the abscissa of the left margin of the upper triangular electronic fence, right is the abscissa of the right margin of the upper triangular electronic fence, top is the ordinate of the upper margin of the upper triangular electronic fence, and bottom is the ordinate of the lower margin of the upper triangular electronic fence.
7. The method of claim 2, wherein, in the case that the electronic fence is a lower triangular electronic fence, the determining whether the coordinates of the target object are within an electronic fence of a pre-configured home range comprises:
determining whether X and Y satisfy (right − X)/(Y − top) > (right − left)/(bottom − top); if so, the target object belongs to the lower triangular electronic fence, wherein left is the abscissa of the left margin of the lower triangular electronic fence, right is the abscissa of the right margin of the lower triangular electronic fence, top is the ordinate of the upper margin of the lower triangular electronic fence, and bottom is the ordinate of the lower margin of the lower triangular electronic fence.
8. The method according to claim 1 or 2, wherein, in a case where the electronic fence is a fence combined from a rectangle and a triangle, the determining whether the coordinates of the target object are within an electronic fence of a pre-configured attribution range comprises:
determining whether the coordinates of the target object are within the rectangle or within the triangle; if so, the target object belongs to the combined electronic fence.
9. An electronic device comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program to perform the method for classifying a target object identified based on machine vision according to any one of claims 1 to 8.
10. A storage medium in which a computer program is stored, wherein the computer program, when run, is configured to perform the method for classifying a target object identified based on machine vision according to any one of claims 1 to 8.
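Taken together, claims 2 through 8 describe simple point-in-region tests in an image coordinate system whose origin is the top-left vertex, with Y increasing downward. The tests can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: all function names are invented, and the division-based conditions from the claims are cross-multiplied (valid inside the bounding box, where bottom > top and Y ≥ top, so both denominators are non-negative and the inequality direction is preserved).

```python
def in_rectangle(x, y, left, right, top, bottom):
    # Claim 3: point lies inside the axis-aligned rectangular fence.
    return left <= x <= right and top <= y <= bottom

def in_left_triangle(x, y, left, right, top, bottom):
    # Claim 4: (X - left)/(Y - top) <= (right - left)/(bottom - top),
    # i.e. the point is on the left side of the diagonal from
    # (left, top) to (right, bottom), cross-multiplied to avoid
    # division by zero when y == top.
    return (x - left) * (bottom - top) <= (right - left) * (y - top)

def in_right_triangle(x, y, left, right, top, bottom):
    # Claim 5: the complementary (strict) side of the same diagonal.
    return (x - left) * (bottom - top) > (right - left) * (y - top)

def in_upper_triangle(x, y, left, right, top, bottom):
    # Claim 6: (right - X)/(Y - top) <= (right - left)/(bottom - top),
    # measured against the anti-diagonal from (right, top) to (left, bottom).
    return (right - x) * (bottom - top) <= (right - left) * (y - top)

def in_lower_triangle(x, y, left, right, top, bottom):
    # Claim 7: the complementary (strict) side of the anti-diagonal.
    return (right - x) * (bottom - top) > (right - left) * (y - top)

def in_combined_fence(x, y, rect, tri):
    # Claim 8: a fence combined from a rectangle and a triangle; the
    # triangle is assumed here (for illustration) to be a left triangle.
    # rect and tri are (left, right, top, bottom) tuples.
    return in_rectangle(x, y, *rect) or in_left_triangle(x, y, *tri)
```

For example, with a bounding box (left, right, top, bottom) = (0, 10, 0, 10), the point (1, 9) falls in the left triangle and (9, 1) in the right triangle, since the two triangles partition the box along its main diagonal.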
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110993734.4A CN113822167A (en) | 2021-08-27 | 2021-08-27 | Method, apparatus, and medium for classifying target object based on machine vision recognition |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113822167A true CN113822167A (en) | 2021-12-21 |
Family
ID=78913646
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110993734.4A Pending CN113822167A (en) | 2021-08-27 | 2021-08-27 | Method, apparatus, and medium for classifying target object based on machine vision recognition |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113822167A (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109803231A (en) * | 2019-01-25 | 2019-05-24 | 广东电网有限责任公司信息中心 | Electric apparatus monitoring method, device and computer equipment in fence |
CN111144232A (en) * | 2019-12-09 | 2020-05-12 | 国网智能科技股份有限公司 | Transformer substation electronic fence monitoring method based on intelligent video monitoring, storage medium and equipment |
WO2020107433A1 (en) * | 2018-11-28 | 2020-06-04 | Beijing Didi Infinity Technology And Development Co., Ltd. | System and method for determining whether object belongs to target geo-fence |
Non-Patent Citations (1)
Title |
---|
Column: JAVASCRIPT: "Determining whether a user click is within a specified area", pages 1 - 5, Retrieved from the Internet <URL:https://codercto.com/a/66252.html> *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107633208B (en) | Electronic device, the method for face tracking and storage medium | |
CN105528576A (en) | Method and device for inputting fingerprint | |
CN112102340B (en) | Image processing method, apparatus, electronic device, and computer-readable storage medium | |
CN110287836B (en) | Image classification method and device, computer equipment and storage medium | |
CN110992243B (en) | Intervertebral disc cross-section image construction method, device, computer equipment and storage medium | |
CN113635311B (en) | Method and system for out-of-hand calibration of eye for fixing calibration plate | |
CN111914783A (en) | Method and device for determining human face deflection angle, computer equipment and medium | |
CN112417985A (en) | Face feature point tracking method, system, electronic equipment and storage medium | |
CN111754429A (en) | Motion vector post-processing method and device, electronic device and storage medium | |
CN116385745A (en) | Image recognition method, device, electronic equipment and storage medium | |
CN113538291B (en) | Card image inclination correction method, device, computer equipment and storage medium | |
CN115019382A (en) | Region determination method, apparatus, device, storage medium, and program product | |
CN113256735B (en) | Camera calibration method and system based on binocular calibration | |
CN111080697A (en) | Method, device, computer equipment and storage medium for detecting direction of target object | |
CN113822167A (en) | Method, apparatus, and medium for classifying target object based on machine vision recognition | |
CN108882191A (en) | Object positioning method, device, computer equipment and storage medium | |
CN111340788A (en) | Hardware trojan layout detection method and device, electronic equipment and readable storage medium | |
CN109063601B (en) | Lip print detection method and device, computer equipment and storage medium | |
CN112579810A (en) | Printed circuit board classification method and device, computer equipment and storage medium | |
CN108596127B (en) | Fingerprint identification method, identity verification method and device and identity verification machine | |
CN114332297A (en) | Image drawing method and device, computer equipment and storage medium | |
CN111881907B (en) | Frame regression positioning method and device and electronic equipment | |
CN113140042B (en) | Three-dimensional scanning splicing method and device, electronic device and computer equipment | |
CN115457308A (en) | Fine-grained image recognition method and device and computer equipment | |
CN114494052A (en) | Book counting method and device, computer equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||