CN110779567A - Indoor and outdoor scene recognition method based on multi-module fusion - Google Patents

Indoor and outdoor scene recognition method based on multi-module fusion

Info

Publication number
CN110779567A
CN110779567A (application CN201911066345.6A)
Authority
CN
China
Prior art keywords
indoor
module
sequence
calculating
calculation formula
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911066345.6A
Other languages
Chinese (zh)
Other versions
CN110779567B (en)
Inventor
毛科技
华子雯
汪敏豪
徐瑞吉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University of Technology ZJUT
Original Assignee
Zhejiang University of Technology ZJUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University of Technology ZJUT filed Critical Zhejiang University of Technology ZJUT
Priority to CN201911066345.6A priority Critical patent/CN110779567B/en
Publication of CN110779567A publication Critical patent/CN110779567A/en
Application granted granted Critical
Publication of CN110779567B publication Critical patent/CN110779567B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 Measuring or testing not otherwise provided for
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes

Abstract

An indoor and outdoor scene recognition method based on multi-module fusion comprises the following steps: step 1, calculating the indoor confidence of each detection module; and step 2, judging the indoor or outdoor scene according to the indoor confidences. The invention ensures the accuracy of indoor and outdoor scene recognition while maintaining strong universality. First, the corresponding indoor confidence is calculated from information such as light intensity, temperature, humidity and geomagnetism acquired by sensors; then, the indoor and outdoor vote counts are calculated by a function from the indoor confidence computed by each module; finally, whether the current environment is indoor or outdoor is judged according to the indoor and outdoor vote counts.

Description

Indoor and outdoor scene recognition method based on multi-module fusion
Technical Field
The invention relates to an indoor and outdoor scene recognition method.
Background
A wearable mobile environment monitoring system, owing to its portability, can provide people with real-time information about the air quality of the surrounding environment. However, the air quality concerns of indoor and outdoor environments differ: formaldehyde is typically monitored indoors, while PM2.5 is monitored outdoors. Accurately distinguishing indoor scenes from outdoor scenes is therefore a challenge.
Most common indoor and outdoor scene recognition algorithms are based on image recognition, environmental-information features, or pre-deployed equipment. These algorithms can distinguish indoor from outdoor scenes accurately according to the characteristics of the environment, but when the environment is particularly complex their accuracy drops sharply. The invention weakens the influence of complex environments on indoor/outdoor discrimination, discriminates indoor and outdoor scenes accurately, and has strong universality.
Disclosure of Invention
The invention provides a multi-module fusion indoor and outdoor scene recognition method, which aims to overcome the defects in the prior art.
The technical scheme adopted by the invention for solving the technical problems is as follows:
A multi-module fusion indoor and outdoor scene recognition method comprises the following steps:
Step 1: calculating the indoor confidence of the light intensity module, the temperature module, the humidity module and the geomagnetic module;
Step 2: judging the indoor or outdoor scene according to the calculated indoor confidence of each module.
The step 1 specifically comprises the following steps:
1) Light intensity module indoor confidence calculation. The time zone corresponding to the current area is calculated from its longitude, with 15 degrees of longitude per time zone; the calculation formula is as follows:
[Equation (2-1), shown as an image in the original publication]
where Z is the current time zone and l is the current longitude.
Calculating the local noon time according to the difference between the local longitude and the local time zone longitude, wherein the calculation formula is as follows:
N=12-(l/15-Z) (2-2)
wherein N is noon time.
The two of the four solar terms between which the current date falls are determined, and the solar incidence angle β of the current date is calculated with corresponding weights; the calculation formula is as follows:
[Equation (2-3), shown as an image in the original publication]
where x is the current date, d_1 and d_2 are the dates of the two solar terms, α and γ are the solar incidence angles at those two solar terms, and day is a function that calculates the number of days between two dates.
The daytime duration D is then calculated; the calculation formula is as follows:
[Equation (2-4), shown as an image in the original publication]
where β is the solar incidence angle and q is the latitude of the current location.
The daytime duration is split evenly around the noon time to obtain the sunrise time R and the sunset time S; the calculation formulas are as follows:
R=N-D/2 (2-5)
S=N+D/2 (2-6)
Finally, the light-intensity indoor confidence C_L is calculated from the light intensity L and the light-intensity thresholds; the calculation formula is as follows:
[Equation (2-7), shown as an image in the original publication]
wherein T_1 = 2000 − 10 × H, H is the current humidity, and T_2 = 200.
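By way of illustration, a minimal Python sketch of this light-intensity step is given below. Equations (2-1), (2-3), (2-4) and (2-7) appear only as images in the original publication, so the time-zone rounding, the day-length relation and the threshold rule used here are conventional stand-ins rather than the patented formulas, and all function and variable names are illustrative.

    from math import acos, tan, radians, degrees

    def sunrise_sunset(lon_deg, lat_deg, sun_angle_deg):
        # (2-1) assumed: the nearest 15-degree meridian gives the time zone Z
        Z = round(lon_deg / 15.0)
        # (2-2) local noon time N
        N = 12.0 - (lon_deg / 15.0 - Z)
        # (2-4) assumed: standard day-length relation between solar angle and latitude
        D = 2.0 / 15.0 * degrees(acos(-tan(radians(sun_angle_deg)) * tan(radians(lat_deg))))
        # (2-5), (2-6): sunrise R and sunset S centred on noon
        return N - D / 2.0, N + D / 2.0

    def light_confidence(light, humidity, hour, sunrise, sunset):
        T1 = 2000 - 10 * humidity   # daytime threshold stated in the text
        T2 = 200                    # night-time threshold stated in the text
        if sunrise <= hour <= sunset:
            # Daytime: very strong light suggests outdoors (indoor confidence 0)
            return 0.0 if light > T1 else 1.0
        # Night: light above the night threshold suggests artificial indoor lighting
        return 1.0 if light > T2 else 0.0

Under this rule, for example, a reading of 5000 lux at 14:00 with 40% relative humidity (T_1 = 1600) would give an indoor confidence of 0.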
2) Temperature module indoor confidence calculation. According to the temperature sequence [T_1, T_2, …, T_n], where n is the length of the temperature sequence, the average value A_T of the temperature sequence is calculated; the calculation formula is as follows:
A_T = (T_1 + T_2 + … + T_n) / n  (2-8)
According to the calculated mean value A_T of the temperature sequence and the current temperature t, the variance of the temperature sequence is calculated; the calculation formula is as follows:
[Equation (2-9), shown as an image in the original publication]
wherein V_T represents the variance of the temperature sequence.
According to the mean value A_T and the variance V_T of the temperature sequence, the temperature-based indoor confidence C_T is calculated; the calculation formula is as follows:
[Equation (2-10), shown as an image in the original publication]
where σ denotes a threshold; σ_0 is set to 4 and σ_1 is set to 0.03.
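As a sketch, the temperature confidence can be computed with an ordinary mean and variance and the thresholds given above (σ_0 = 4, σ_1 = 0.03). Because (2-9) and (2-10) are shown only as images, the use of the deviation of the current temperature t from the sequence mean and the averaging of the two checks below are assumptions, not the patented formulas.

    def stability_confidence(values, current, dev_threshold, var_threshold):
        # Mean of the sensor sequence, cf. (2-8)
        n = len(values)
        mean = sum(values) / n
        # Ordinary sequence variance, an assumed stand-in for (2-9)
        variance = sum((v - mean) ** 2 for v in values) / n
        # Indoor readings tend to stay near the recent mean and vary little,
        # so both checks push the indoor confidence toward 1.
        c_dev = 1.0 if abs(current - mean) < dev_threshold else 0.0
        c_var = 1.0 if variance < var_threshold else 0.0
        return (c_dev + c_var) / 2.0

    # Temperature module: n = 20 samples, sigma_0 = 4, sigma_1 = 0.03
    C_T = stability_confidence([22.1] * 20, 22.3, dev_threshold=4.0, var_threshold=0.03)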
3) Humidity module indoor confidence calculation. According to the humidity sequence [H_1, H_2, …, H_p], where p is the length of the humidity sequence, the average value A_H of the humidity sequence is calculated; the calculation formula is as follows:
A_H = (H_1 + H_2 + … + H_p) / p  (2-11)
According to the calculated mean value A_H of the humidity sequence and the current humidity h, the variance of the humidity sequence is calculated; the calculation formula is as follows:
[Equation (2-12), shown as an image in the original publication]
wherein V_H represents the variance of the humidity sequence.
According to the mean value A_H and the variance V_H of the humidity sequence, the humidity-based indoor confidence C_H is calculated; the calculation formula is as follows:
[Equation (2-13), shown as an image in the original publication]
where σ denotes a threshold; σ_2 is set to 5 and σ_3 is set to 1.
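The humidity confidence follows the same pattern; with the sketch above it can be obtained by reusing stability_confidence with the humidity thresholds σ_2 = 5 and σ_3 = 1 (again an assumption about the piecewise form of (2-13), which is shown only as an image).

    # Humidity module: p samples, sigma_2 = 5, sigma_3 = 1
    C_H = stability_confidence([48.0, 49.0, 48.5, 48.2], 48.4, dev_threshold=5.0, var_threshold=1.0)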
4) Geomagnetic module indoor confidence calculation. According to the geomagnetic intensity sequence [M_1, M_2, …, M_q], where q is the length of the geomagnetic intensity sequence, the geomagnetic variance sequence [V_1, V_2, …, V_m] is calculated, where m is the length of the geomagnetic variance sequence; the calculation formula is shown in (2-14).
V_u = ((M_u − A_Mu)^2 + (M_(u+1) − A_Mu)^2 + … + (M_(u+k−1) − A_Mu)^2) / k  (2-14)
wherein V_u is the u-th geomagnetic variance in the geomagnetic variance sequence, u ranges from 1 to m, k is the size of the sliding window used to calculate the geomagnetic variance and is set to 20, and A_Mu is the average geomagnetic intensity within the sliding window; its calculation formula is shown in (2-15).
A_Mu = (M_u + M_(u+1) + … + M_(u+k−1)) / k  (2-15)
After the geomagnetic variance sequence [V_1, V_2, …, V_m] is calculated, the maximum geomagnetic variance V_max in the sequence is determined; the calculation formula is shown in (2-16).
V_max = max([V_1, V_2, …, V_m])  (2-16)
Where max is a function that calculates the maximum value in the sequence of the geomagnetic variances.
After the maximum geomagnetic variance V_max is calculated, V_max is compared with V_t, where V_t is set to 14. If V_max is greater than V_t, the current scene is judged to be indoor; otherwise, it is judged to be outdoor. The geomagnetism-based indoor confidence C_M can be calculated from equation (2-17).
[Equation (2-17), shown as an image in the original publication]
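A corresponding sketch of the geomagnetic step is shown below. It writes out the sliding-window variance and the binary mapping of the V_max > V_t comparison as the text describes them; since (2-14), (2-15) and (2-17) are shown only as images, the exact forms remain assumptions.

    def geomagnetic_confidence(mags, k=20, v_t=14.0):
        # Sliding-window statistics over the geomagnetic intensity sequence;
        # assumes len(mags) >= k.
        variances = []
        for u in range(len(mags) - k + 1):
            window = mags[u:u + k]
            A_Mu = sum(window) / k                                       # window mean, cf. (2-15)
            variances.append(sum((m - A_Mu) ** 2 for m in window) / k)   # cf. (2-14)
        V_max = max(variances)                                           # cf. (2-16)
        # Indoor steel structures disturb the magnetic field, so a large maximum
        # variance is taken as evidence for indoor (assumed binary form of (2-17)).
        return 1.0 if V_max > v_t else 0.0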
The step 2 specifically comprises the following steps:
1) After the indoor confidence C_T of the temperature module, the indoor confidence C_L of the light intensity module, the indoor confidence C_H of the humidity module, and the indoor confidence C_M of the geomagnetic module have been obtained, voting is carried out according to the confidence of each module, and the indoor vote count count_i and the outdoor vote count count_o are calculated; the calculation formulas are shown in (2-18) and (2-19).
count_i = class(C_L) + class(C_T) + class(C_H) + class(C_M)  (2-18)
count_o = class(1 − C_L) + class(1 − C_T) + class(1 − C_H) + class(1 − C_M)  (2-19)
The class function judges whether the confidence of the current module is greater than a threshold, which is set to 0.5; if the confidence is greater than the threshold, the result is 1, otherwise the result is 0, as shown in (2-20).
class(C) = 1 if C > 0.5; otherwise class(C) = 0  (2-20)
2) After the values of count_i and count_o have been calculated, count_i is compared with count_o. If count_i is greater than count_o, the current scene is judged to be indoor; if count_i is less than count_o, the current scene is judged to be outdoor; if count_i equals count_o, the multi-module-fusion indoor confidence C_F is calculated, as shown in (2-21).
[Equation (2-21), shown as an image in the original publication]
After the value of the multi-module-fusion indoor confidence C_F is calculated, C_F is compared with the threshold. If C_F is greater than the threshold, the current scene is judged to be indoor; if C_F is less than the threshold, the current scene is judged to be outdoor; if C_F equals the threshold, the category of the current scene cannot be determined.
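The voting step can be summarised by the short sketch below. The class function follows (2-20) exactly as described in the text; the tie-break value C_F simply averages the four confidences, which is only one plausible reading of equation (2-21), shown as an image in the original.

    def classify_scene(c_l, c_t, c_h, c_m, threshold=0.5):
        confs = [c_l, c_t, c_h, c_m]
        cls = lambda c: 1 if c > threshold else 0      # (2-20)
        count_i = sum(cls(c) for c in confs)           # (2-18) indoor votes
        count_o = sum(cls(1 - c) for c in confs)       # (2-19) outdoor votes
        if count_i > count_o:
            return "indoor"
        if count_i < count_o:
            return "outdoor"
        C_F = sum(confs) / len(confs)                  # assumed stand-in for (2-21)
        if C_F > threshold:
            return "indoor"
        if C_F < threshold:
            return "outdoor"
        return "unknown"                               # category cannot be judged

    # Example: three modules vote indoor, one votes outdoor -> "indoor"
    print(classify_scene(0.8, 1.0, 1.0, 0.0))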
Preferably, n is set to 20; m is set to 5.
Common indoor and outdoor scene recognition methods rely on image recognition and environmental-information features, and they can identify indoor and outdoor scenes accurately. However, when the environment is complex, the accuracy of these methods drops, making indoor and outdoor scenes difficult to distinguish.
The invention has the advantages that indoor and outdoor scene recognition accuracy is guaranteed while strong universality is maintained. First, the corresponding indoor confidence is calculated from information such as light intensity, temperature, humidity and geomagnetism acquired by sensors; then, the indoor and outdoor vote counts are calculated by a function from the indoor confidence computed by each module; finally, whether the current environment is indoor or outdoor is judged according to the indoor and outdoor vote counts.
Drawings
FIG. 1 is a general flow diagram of the present invention.
FIG. 2 is a block diagram of the detection module of the present invention.
Detailed Description
The technical solution of the invention is further explained below with reference to the accompanying drawings.
An indoor and outdoor scene recognition method based on multi-module fusion comprises the following steps:
Step 1: calculate the indoor confidence of the light intensity module, the temperature module, the humidity module and the geomagnetic module, specifically including:
11) Light intensity module indoor confidence calculation. The time zone corresponding to the current area is calculated from its longitude, with 15 degrees of longitude per time zone; the calculation formula is as follows:
[Equation (2-1), shown as an image in the original publication]
where Z is the current time zone and l is the current longitude.
Calculating the local noon time N according to the difference between the local longitude and the local time zone longitude, wherein the calculation formula is as follows:
N=12-(l/15-Z) (2-2)
The two of the four solar terms between which the current date falls are determined, and the solar incidence angle β of the current date is calculated with corresponding weights; the calculation formula is as follows:
[Equation (2-3), shown as an image in the original publication]
where x is the current date, d_1 and d_2 are the dates of the two solar terms, α and γ are the solar incidence angles at those two solar terms, and day is a function that calculates the number of days between two dates.
The daytime duration D is then calculated; the calculation formula is as follows:
[Equation (2-4), shown as an image in the original publication]
where β is the solar incidence angle and q is the latitude of the current location.
The daytime duration is split evenly around the noon time to obtain the sunrise time R and the sunset time S; the calculation formulas are as follows:
R=N-D/2 (2-5)
S=N+D/2 (2-6)
Finally, the light-intensity indoor confidence C_L is calculated from the light intensity L and the light-intensity thresholds; the calculation formula is as follows:
[Equation (2-7), shown as an image in the original publication]
wherein T_1 = 2000 − 10 × H, H is the current humidity, and T_2 = 200.
12) Temperature module indoor confidence calculation. According to the temperature sequence [T_1, T_2, …, T_n], where n is the length of the temperature sequence, the average value A_T of the temperature sequence is calculated; the calculation formula is as follows:
A_T = (T_1 + T_2 + … + T_n) / n  (2-8)
According to the calculated mean value A_T of the temperature sequence and the current temperature t, the variance of the temperature sequence is calculated; the calculation formula is as follows:
[Equation (2-9), shown as an image in the original publication]
wherein V_T represents the variance of the temperature sequence.
According to the mean value A_T and the variance V_T of the temperature sequence, the temperature-based indoor confidence C_T is calculated; the calculation formula is as follows:
[Equation (2-10), shown as an image in the original publication]
where σ denotes a threshold; σ_0 is set to 4 and σ_1 is set to 0.03.
13) Humidity module indoor confidence calculation. According to the humidity sequence [H_1, H_2, …, H_p], where p is the length of the humidity sequence, the average value A_H of the humidity sequence is calculated; the calculation formula is as follows:
A_H = (H_1 + H_2 + … + H_p) / p  (2-11)
According to the calculated mean value A_H of the humidity sequence and the current humidity h, the variance of the humidity sequence is calculated; the calculation formula is as follows:
[Equation (2-12), shown as an image in the original publication]
wherein V_H represents the variance of the humidity sequence.
According to the mean value A_H and the variance V_H of the humidity sequence, the humidity-based indoor confidence C_H is calculated; the calculation formula is as follows:
[Equation (2-13), shown as an image in the original publication]
where σ denotes a threshold; σ_2 is set to 5 and σ_3 is set to 1.
14) Geomagnetic module indoor confidence calculation. According to the geomagnetic intensity sequence [M_1, M_2, …, M_q], where q is the length of the geomagnetic intensity sequence, the geomagnetic variance sequence [V_1, V_2, …, V_m] is calculated, where m is the length of the geomagnetic variance sequence; the calculation formula is shown in (2-14).
V_u = ((M_u − A_Mu)^2 + (M_(u+1) − A_Mu)^2 + … + (M_(u+k−1) − A_Mu)^2) / k  (2-14)
wherein V_u is the u-th geomagnetic variance in the geomagnetic variance sequence, u ranges from 1 to m, k is the size of the sliding window used to calculate the geomagnetic variance and is set to 20, and A_Mu is the average geomagnetic intensity within the sliding window; its calculation formula is shown in (2-15).
A_Mu = (M_u + M_(u+1) + … + M_(u+k−1)) / k  (2-15)
After the geomagnetic variance sequence [V_1, V_2, …, V_m] is calculated, the maximum geomagnetic variance V_max in the sequence is determined; the calculation formula is shown in (2-16).
V_max = max([V_1, V_2, …, V_m])  (2-16)
Where max is a function that calculates the maximum value in the sequence of the geomagnetic variances.
After the maximum geomagnetic variance V_max is calculated, V_max is compared with V_t, where V_t is set to 14. If V_max is greater than V_t, the current scene is judged to be indoor; otherwise, it is judged to be outdoor. The geomagnetism-based indoor confidence C_M can be calculated from equation (2-17).
[Equation (2-17), shown as an image in the original publication]
Step 2: according to the calculated indoor confidence of each module, the indoor and outdoor scenes are judged, and the method specifically comprises the following steps:
21) After the indoor confidence C_T of the temperature module, the indoor confidence C_L of the light intensity module, the indoor confidence C_H of the humidity module, and the indoor confidence C_M of the geomagnetic module have been obtained, voting is carried out according to the confidence of each module, and the indoor vote count count_i and the outdoor vote count count_o are calculated; the calculation formulas are shown in (2-18) and (2-19).
count_i = class(C_L) + class(C_T) + class(C_H) + class(C_M)  (2-18)
count_o = class(1 − C_L) + class(1 − C_T) + class(1 − C_H) + class(1 − C_M)  (2-19)
The class function judges whether the confidence of the current module is greater than a threshold, which is set to 0.5; if the confidence is greater than the threshold, the result is 1, otherwise the result is 0, as shown in (2-20).
class(C) = 1 if C > 0.5; otherwise class(C) = 0  (2-20)
22) After the values of count_i and count_o have been calculated, count_i is compared with count_o. If count_i is greater than count_o, the current scene is judged to be indoor; if count_i is less than count_o, the current scene is judged to be outdoor; if count_i equals count_o, the multi-module-fusion indoor confidence C_F is calculated, as shown in (2-21).
[Equation (2-21), shown as an image in the original publication]
After the value of the multi-module-fusion indoor confidence C_F is calculated, C_F is compared with the threshold. If C_F is greater than the threshold, the current scene is judged to be indoor; if C_F is less than the threshold, the current scene is judged to be outdoor; if C_F equals the threshold, the category of the current scene cannot be determined.
n is set to 20; m is set to 5.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (2)

1. An indoor and outdoor scene recognition method based on multi-module fusion comprises the following steps:
Step 1: calculate the indoor confidence of the light intensity module, the temperature module, the humidity module and the geomagnetic module, specifically including:
11) Light intensity module indoor confidence calculation. The time zone corresponding to the current area is calculated from its longitude, with 15 degrees of longitude per time zone; the calculation formula is as follows:
[Equation (2-1), shown as an image in the original publication]
where Z is the current time zone and l is the current longitude.
Calculating the local noon time N according to the difference between the local longitude and the local time zone longitude, wherein the calculation formula is as follows:
N=12-(l/15-Z) (2-2)
The two of the four solar terms between which the current date falls are determined, and the solar incidence angle β of the current date is calculated with corresponding weights; the calculation formula is as follows:
[Equation (2-3), shown as an image in the original publication]
where x is the current date, d_1 and d_2 are the dates of the two solar terms, α and γ are the solar incidence angles at those two solar terms, and day is a function that calculates the number of days between two dates.
The daytime duration D is then calculated; the calculation formula is as follows:
[Equation (2-4), shown as an image in the original publication]
where β is the solar incidence angle and q is the latitude of the current location.
The daytime duration is split evenly around the noon time to obtain the sunrise time R and the sunset time S; the calculation formulas are as follows:
R=N-D/2 (2-5)
S=N+D/2 (2-6)
Finally, the light-intensity indoor confidence C_L is calculated from the light intensity L and the light-intensity thresholds; the calculation formula is as follows:
[Equation (2-7), shown as an image in the original publication]
wherein T_1 = 2000 − 10 × H, H is the current humidity, and T_2 = 200.
12) Temperature module indoor confidence calculation. According to the temperature sequence [T_1, T_2, …, T_n], where n is the length of the temperature sequence, the average value A_T of the temperature sequence is calculated; the calculation formula is as follows:
A_T = (T_1 + T_2 + … + T_n) / n  (2-8)
According to the calculated mean value A_T of the temperature sequence and the current temperature t, the variance of the temperature sequence is calculated; the calculation formula is as follows:
[Equation (2-9), shown as an image in the original publication]
wherein V_T represents the variance of the temperature sequence.
According to the mean value A_T and the variance V_T of the temperature sequence, the temperature-based indoor confidence C_T is calculated; the calculation formula is as follows:
[Equation (2-10), shown as an image in the original publication]
where σ denotes a threshold; σ_0 is set to 4 and σ_1 is set to 0.03.
13) Humidity module indoor confidence calculation. According to the humidity sequence [H_1, H_2, …, H_p], where p is the length of the humidity sequence, the average value A_H of the humidity sequence is calculated; the calculation formula is as follows:
A_H = (H_1 + H_2 + … + H_p) / p  (2-11)
According to the calculated mean value A_H of the humidity sequence and the current humidity h, the variance of the humidity sequence is calculated; the calculation formula is as follows:
[Equation (2-12), shown as an image in the original publication]
wherein V_H represents the variance of the humidity sequence.
According to the mean value A_H and the variance V_H of the humidity sequence, the humidity-based indoor confidence C_H is calculated; the calculation formula is as follows:
[Equation (2-13), shown as an image in the original publication]
where σ denotes a threshold; σ_2 is set to 5 and σ_3 is set to 1.
14) Geomagnetic module indoor confidence calculation. According to the geomagnetic intensity sequence [M_1, M_2, …, M_q], where q is the length of the geomagnetic intensity sequence, the geomagnetic variance sequence [V_1, V_2, …, V_m] is calculated, where m is the length of the geomagnetic variance sequence; the calculation formula is shown in (2-14).
V_u = ((M_u − A_Mu)^2 + (M_(u+1) − A_Mu)^2 + … + (M_(u+k−1) − A_Mu)^2) / k  (2-14)
wherein V_u is the u-th geomagnetic variance in the geomagnetic variance sequence, u ranges from 1 to m, k is the size of the sliding window used to calculate the geomagnetic variance and is set to 20, and A_Mu is the average geomagnetic intensity within the sliding window; its calculation formula is shown in (2-15).
A_Mu = (M_u + M_(u+1) + … + M_(u+k−1)) / k  (2-15)
After the geomagnetic variance sequence [V_1, V_2, …, V_m] is calculated, the maximum geomagnetic variance V_max in the sequence is determined; the calculation formula is shown in (2-16).
V_max = max([V_1, V_2, …, V_m])  (2-16)
Where max is a function that calculates the maximum value in the sequence of the geomagnetic variances.
After the maximum geomagnetic variance V_max is calculated, V_max is compared with V_t, where V_t is set to 14. If V_max is greater than V_t, the current scene is judged to be indoor; otherwise, it is judged to be outdoor. The geomagnetism-based indoor confidence C_M can be calculated from equation (2-17).
[Equation (2-17), shown as an image in the original publication]
Step 2: according to the calculated indoor confidence of each module, the indoor and outdoor scenes are judged, and the method specifically comprises the following steps:
21) After the indoor confidence C_T of the temperature module, the indoor confidence C_L of the light intensity module, the indoor confidence C_H of the humidity module, and the indoor confidence C_M of the geomagnetic module have been obtained, voting is carried out according to the confidence of each module, and the indoor vote count count_i and the outdoor vote count count_o are calculated; the calculation formulas are shown in (2-18) and (2-19).
count_i = class(C_L) + class(C_T) + class(C_H) + class(C_M)  (2-18)
count_o = class(1 − C_L) + class(1 − C_T) + class(1 − C_H) + class(1 − C_M)  (2-19)
The class function judges whether the confidence of the current module is greater than a threshold, which is set to 0.5; if the confidence is greater than the threshold, the result is 1, otherwise the result is 0, as shown in (2-20).
class(C) = 1 if C > 0.5; otherwise class(C) = 0  (2-20)
22) After the values of count_i and count_o have been calculated, count_i is compared with count_o. If count_i is greater than count_o, the current scene is judged to be indoor; if count_i is less than count_o, the current scene is judged to be outdoor; if count_i equals count_o, the multi-module-fusion indoor confidence C_F is calculated, as shown in (2-21).
[Equation (2-21), shown as an image in the original publication]
After the value of the multi-module-fusion indoor confidence C_F is calculated, C_F is compared with the threshold. If C_F is greater than the threshold, the current scene is judged to be indoor; if C_F is less than the threshold, the current scene is judged to be outdoor; if C_F equals the threshold, the category of the current scene cannot be determined.
2. The indoor and outdoor scene recognition method based on multi-module fusion as claimed in claim 1, characterized in that: n is set to 20; m is set to 5.
CN201911066345.6A 2019-11-04 2019-11-04 Indoor and outdoor scene recognition method based on multi-module fusion Active CN110779567B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911066345.6A CN110779567B (en) 2019-11-04 2019-11-04 Indoor and outdoor scene recognition method based on multi-module fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911066345.6A CN110779567B (en) 2019-11-04 2019-11-04 Indoor and outdoor scene recognition method based on multi-module fusion

Publications (2)

Publication Number Publication Date
CN110779567A true CN110779567A (en) 2020-02-11
CN110779567B CN110779567B (en) 2021-07-27

Family

ID=69388876

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911066345.6A Active CN110779567B (en) 2019-11-04 2019-11-04 Indoor and outdoor scene recognition method based on multi-module fusion

Country Status (1)

Country Link
CN (1) CN110779567B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114200988A (en) * 2021-12-06 2022-03-18 深圳市时誉高精科技有限公司 Indoor thermostat management system based on big data

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050006763A (en) * 2003-07-10 2005-01-17 엘지전자 주식회사 An image display device having function for displaying environmental information and system thereof
CN102859990A (en) * 2010-04-29 2013-01-02 伊斯曼柯达公司 Indoor/outdoor scene detection using GPS
CN104457751A (en) * 2014-11-19 2015-03-25 中国科学院计算技术研究所 Method and system for recognizing indoor and outdoor scenes
CN105025440A (en) * 2015-07-09 2015-11-04 深圳天珑无线科技有限公司 Indoor/outdoor scene detection method and device
CN107076561A (en) * 2014-09-16 2017-08-18 微软技术许可有限责任公司 Indoor and outdoor transition is considered during position is determined
CN107655564A (en) * 2017-05-11 2018-02-02 南京邮电大学 A kind of indoor and outdoor surroundingses detection method of the multiple technologies fusion based on intelligent terminal
CN108268821A (en) * 2016-12-30 2018-07-10 中国移动通信集团黑龙江有限公司 A kind of indoor and outdoor scene recognition method and device
CN109871641A (en) * 2019-03-07 2019-06-11 浙江工业大学 A method of the indoor and outdoor scene Recognition based on multidimensional heat transfer agent time series
CN110220550A (en) * 2018-03-02 2019-09-10 罗伯特·博世有限公司 Method and apparatus, control unit and portable equipment for indoor/outdoor detection

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050006763A (en) * 2003-07-10 2005-01-17 엘지전자 주식회사 An image display device having function for displaying environmental information and system thereof
CN102859990A (en) * 2010-04-29 2013-01-02 伊斯曼柯达公司 Indoor/outdoor scene detection using GPS
CN107076561A (en) * 2014-09-16 2017-08-18 微软技术许可有限责任公司 Indoor and outdoor transition is considered during position is determined
CN104457751A (en) * 2014-11-19 2015-03-25 中国科学院计算技术研究所 Method and system for recognizing indoor and outdoor scenes
CN104457751B (en) * 2014-11-19 2017-10-10 中国科学院计算技术研究所 Indoor and outdoor scene recognition method and system
CN105025440A (en) * 2015-07-09 2015-11-04 深圳天珑无线科技有限公司 Indoor/outdoor scene detection method and device
CN108268821A (en) * 2016-12-30 2018-07-10 中国移动通信集团黑龙江有限公司 A kind of indoor and outdoor scene recognition method and device
CN107655564A (en) * 2017-05-11 2018-02-02 南京邮电大学 A kind of indoor and outdoor surroundingses detection method of the multiple technologies fusion based on intelligent terminal
CN110220550A (en) * 2018-03-02 2019-09-10 罗伯特·博世有限公司 Method and apparatus, control unit and portable equipment for indoor/outdoor detection
CN109871641A (en) * 2019-03-07 2019-06-11 浙江工业大学 A method of the indoor and outdoor scene Recognition based on multidimensional heat transfer agent time series

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
S. Aust et al.: "Seamless Indoor/Outdoor Location Cognition with Confidence in Wireless Systems", 4th IEEE Workshop on User Mobility and Vehicular Networks *
Zhang Yang et al.: "A pervasive indoor and outdoor scenario identification algorithm based on the sensing data and human activity", IEEE UPINLBS 2016 *
张扬: "Research and Implementation of High-Performance Indoor/Outdoor Scene Detection Technology for Intelligent Terminals", China Master's Theses Full-text Database, Information Science and Technology *
苏帅: "Research on High-Precision Indoor/Outdoor Scene Recognition Technology Based on Multi-modal Fusion", China Master's Theses Full-text Database, Information Science and Technology *
蒋超: "Research on Indoor/Outdoor Scene Recognition Technology Based on User Behavior Patterns", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114200988A (en) * 2021-12-06 2022-03-18 深圳市时誉高精科技有限公司 Indoor thermostat management system based on big data
CN114200988B (en) * 2021-12-06 2023-01-10 深圳市时誉高精科技有限公司 Indoor thermostat management system based on big data

Also Published As

Publication number Publication date
CN110779567B (en) 2021-07-27

Similar Documents

Publication Publication Date Title
US10565955B2 (en) Display status adjustment method, display status adjustment device and display device
CN104457751B (en) Indoor and outdoor scene recognition method and system
CN108921068B (en) Automobile appearance automatic damage assessment method and system based on deep neural network
CN107131883B (en) Full-automatic mobile terminal indoor positioning system based on vision
CN110110642A (en) A kind of pedestrian's recognition methods again based on multichannel attention feature
CN111461053A (en) System for identifying multiple growth period wheat lodging regions based on transfer learning
CN112560547A (en) Abnormal behavior judgment method and device, terminal and readable storage medium
CN104394588B (en) Indoor orientation method based on Wi Fi fingerprints and Multidimensional Scaling
CN108960145A (en) Facial image detection method, device, storage medium and electronic equipment
CN107655564A (en) A kind of indoor and outdoor surroundingses detection method of the multiple technologies fusion based on intelligent terminal
CN107133685B (en) Method and system for predicting power generation capacity of photovoltaic power generation system
CN110779567B (en) Indoor and outdoor scene recognition method based on multi-module fusion
US20220148292A1 (en) Method for glass detection in real scenes
US20190114333A1 (en) System and method for species and object recognition
US20200196268A1 (en) System and method for positioning a gateway of an architecture
CN109143408A (en) Combine short-term precipitation forecasting procedure in dynamic area based on MLP
CN112784740A (en) Gait data acquisition and labeling method and application
CN112183287A (en) People counting method of mobile robot under complex background
CN108985131A (en) A kind of target identification method and image processing equipment
CN109982239A (en) Store floor positioning system and method based on machine vision
CN109815773A (en) A kind of low slow small aircraft detection method of view-based access control model
US20040013300A1 (en) Algorithm selector
CN114674317A (en) Self-correcting dead reckoning system and method based on activity recognition and fusion filtering
CN111860331A (en) Unmanned aerial vehicle is at face identification system in unknown territory of security protection
CN112784703A (en) Multispectral-based personnel action track determination method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant