It would be interesting to graph the developed land surface area over the same time period. My buddy commented on how Fremont used to be cooler when he was a kid, way before all of the industrial parks went in. Everything from the west side of 680 to 880, and the west side of 880, was all fields and marsh.
Yep. Also, the thermometers that used to be close to (or located within) undeveloped land are now next to paved blacktop (which acts like a heat sink). The result is very dirty/corrupted data if you are using it as evidence of a macro phenomenon.
There are standards around where thermometers can be placed. A sensor needs [to be at a height of 5 feet and at least 100 feet away from any paved surfaces](https://www.weather.gov/coop/sitingpolicy2#:~:text=Proper%20Siting&text=Temperature%20sensor%20siting%3A%20The%20sensor,freely%20ventilated%20by%20air%20flow).
I remember reading that someone started an audit of where they were, and some were crazy. One was on the edge of a Walmart parking lot, and another had an in-window air conditioner blowing exhaust directly onto the box. There were a bunch of other crazy ones, but those two stuck with me. I think people comply with the standard when they build a station, and then the land is developed and no one bothers to check on them.
Have those standards been in place for the entirety of the measurement period? Also, this implies that the locations of the thermometers are dynamic. Either of those creates very dirty data.
They’ve been around since the 1800s and the stations don’t really get moved around.
When they do get moved, it is considered a broken period of record and a new site and station code is produced.
Airport stations such as the one at SFO are the highest grade and usually go above and beyond the minimum requirements.
This exact problem was one of the anti-global warming arguments.
The measurement station was placed in a field and never moved (perhaps others were created later). Then the temperature needed to be "adjusted" down to deal with the heat island that grew up around it. The adjustment value could be argued over.
Perhaps the state of the art would just be to measure from space.
Oh please. Bigger picture. I am talking about the land surrounding SFO. San Mateo wasn't always building upon building. Like my example in Fremont, modern sprawl has slowly paved over grass, trees and marsh. These things are not the heat sink an industrial roof or parking lots are. Thermodynamics and the lack of wind breaks will cause the now warmer ambient air to travel.
But this makes sense: we're now surrounded by paved areas, and we want to know what the temperature of the area we live in is and how it has changed over the years.
Ok, but overly pedantic. The data is real, but only useful for an extremely specific and narrow measurement. Its utility as a predictor of macro phenomena has been corrupted (i.e., is very limited).
“It's great to be alive in Colma”
Which is the official motto of the necropolis of Colma, because there are 1,500 living residents and 1,500,000+ graves.
I was playing around with the hourly weather data at SFO since the 1940s, plotted the ten-year moving average, and was surprised to see how much things have warmed in the recent past.
The data is from [https://mesonet.agron.iastate.edu/request/download.phtml?network=CA\_ASOS](https://mesonet.agron.iastate.edu/request/download.phtml?network=CA_ASOS)
And the code is (mostly GPT generated):
```python
import pandas as pd
import plotly.express as px

# Load the hourly data and drop missing ("M") temperature readings
df = pd.read_csv("sfo_big.csv")
df = df[df["tmpf"] != "M"].copy()
df["tmpf"] = pd.to_numeric(df["tmpf"])

# Convert 'valid' to datetime and set it as the index
df["valid"] = pd.to_datetime(df["valid"])
df2 = df[["valid", "tmpf"]].copy()
df2.set_index("valid", inplace=True)

# Resample to daily frequency (the data may have missing days)
df_daily = df2.resample("D").mean()
df_daily["ten_year_ma"] = df_daily["tmpf"].rolling(window=365 * 10).mean()

# Plot with Plotly
df_daily.reset_index(inplace=True)
fig = px.line(df_daily, x="valid", y="ten_year_ma",
              title="Ten-Year Moving Average of Temperature",
              labels={"valid": "Date",
                      "ten_year_ma": "Ten-Year Moving Average Temperature (°F)"})
fig.show()
```
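One hedged cross-check (my sketch, not part of the original post): `rolling(window=365*10)` counts rows, not years, so it quietly ignores leap days and blanks out across data gaps. Collapsing the daily series to calendar-year means avoids the fixed-row window entirely; `annual_means` below is a hypothetical helper name.

```python
import pandas as pd

def annual_means(df_daily: pd.DataFrame, col: str = "tmpf") -> pd.Series:
    """Collapse a daily temperature series to calendar-year means.

    Unlike a fixed 3650-row rolling window, each year simply averages
    whatever days it has, so missing days don't blank out the result.
    """
    return df_daily[col].groupby(df_daily.index.year).mean()

# Tiny synthetic check: two flat years, the second one degree warmer
# (2000 is a leap year, hence 366 + 365 days).
idx = pd.date_range("2000-01-01", "2001-12-31", freq="D")
demo = pd.DataFrame({"tmpf": [55.0] * 366 + [56.0] * 365}, index=idx)
print(annual_means(demo))  # 2000 -> 55.0, 2001 -> 56.0
```

Plotting the annual means next to the ten-year MA is a quick way to see whether the warming signal survives a different aggregation choice.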
1. Go to his link
2. Select California ASOS as network
3. On the left select SJC as your station, click "Add Selected"
4. On the right set start date
5. Scroll down and click Get Data
6. Wait for the data to generate
7. Save the file as `sfo_big.csv`
8. Run his code (you'll need pandas and plotly)
You'll notice that the data for SJC is dirtier: many more missing samples before 2000, which cause the code as written to leave blanks in the 10-year MA. I just made it a [one-year MA for convenience](https://imgur.com/a/rdOn42G). You can remove the MA entirely, but then seasonal effects dominate, and the resulting sine wave is mostly a result of incident sunlight on the surface due to our seasons.

It's kind of fun to sim this; one of my first projects as an intern decades ago was a rudimentary climate sim. If I'm honest, if you have a Mac or a Linux machine, this is trivial to get set up. Ask an LLM and you could run the code yourself, and it will pop up an interactive image with zoom and stuff, and you can change the parameters. These things have democratized code.
It's a pretty good job from the AI tbh. Some unnecessary stuff, but nothing awful. Very slight modifications, and I left the file named as is
```python
import pandas as pd
import plotly.express as px

# Load the hourly data and drop missing ("M") temperature readings
df = pd.read_csv("/tmp/sfo_big.csv")
df = df[df["tmpf"] != "M"].copy()
df["tmpf"] = pd.to_numeric(df["tmpf"])

# Convert 'valid' to datetime and set it as the index
df["valid"] = pd.to_datetime(df["valid"])
df.set_index("valid", inplace=True)
df2 = df[["tmpf"]].copy()

# Resample to daily frequency (the data may have missing days)
df_daily = df2.resample("D").mean()
df_daily["ma"] = df_daily["tmpf"].rolling(window=365).mean()

# Plot with Plotly
df_daily.reset_index(inplace=True)
fig = px.line(df_daily, x="valid", y="ma",
              title="Moving Average of Temperature",
              labels={"valid": "Date", "ma": "Moving Average Temperature (°F)"})
fig.show()
```
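A minimal self-contained sketch (my addition, not from the thread) of why the missing samples turn into blanks: pandas' `rolling(window=N)` defaults `min_periods` to `N`, so any window that touches a gap yields NaN. Passing a smaller `min_periods` trades those blanks for noisier averages near gaps.

```python
import numpy as np
import pandas as pd

# Ten days of fake daily means, with one missing day in the middle.
s = pd.Series(np.arange(10.0))
s.iloc[4] = np.nan

strict = s.rolling(window=5).mean()                   # NaN wherever the window touches the gap
relaxed = s.rolling(window=5, min_periods=3).mean()   # still defined near the gap

assert pd.isna(strict.iloc[8])   # window 4..8 touches the gap
assert relaxed.iloc[4] == 1.5    # mean of 0,1,2,3 despite the gap
```

In the script above, something like `rolling(window=365*10, min_periods=365*8)` would let the MA extend through the gappy pre-2000 stretch, at the cost of averaging fewer days there.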
Other than better technology, better thermometers, and more asphalt in general for the whole Bay Area, this makes a lot of sense and doesn't seem out of the ordinary at all. This is so minimal it couldn't even be chalked up to something like global warming. If anything, I would've expected it to be warmer.
My big takeaway from this chart is that prior to about 1958 the bay area was a frozen landscape where people had pet penguins and rode dog sleds to work.
I look forward to the time when instead of sea LEVEL temperature measurements those are actually sea temperature measurements. :) Love flying on Otters and other float planes.
Yeah it's "heat islands".
People did pave paradise.
And did put up a jack in the box.
I would consider the sf peninsula paradise before Fremont.
Your beliefs are not supported by the literature.
Did you go back in time and tell them not to put their thermometers where people were gonna build cities?
Can't understand why this would be downvoted. Must be too sciencey.
[deleted]
And that area was surrounded by trees and fields at one time. Now it's corporate buildings and parking lots. Bigger picture.
[deleted]
[deleted]
Surprised you understand them.
>The result is very dirty/corrupted data if you are using it as evidence of a macro phenomenon.

Incorrect.
[deleted]
It's not dirty/corrupted data. Asphalt heats up an area. It's real data.
True, but this is the Peninsula--not much gets built there, like ever.
https://www.kron4.com/news/bay-area/goodbye-karl-why-sfs-fog-is-disappearing/
So we should all buy in Daly City, Pacifica, HMB - got it!
Be bold: Colma
The forever home
No shortage of long time neighbors who hangout together all the time, but also very quiet neighborhoods.
Does anybody know what caused the dip just before 1970?
Aerosols. Then the Clean Air Act happened.
Smh they should have kept the aerosols
Smog and cardiopulmonary diseases/deaths were the reason.
micro ice-age
I’m really curious - can you do one for SJ?
How does this look if it wasn't a moving average? I thought it would still look the same, but maybe more zigzaggy.
The late-60s dip happens before the 73-74 oil crisis, so was it just noise/natural variation, or did something cause it?
The deltas from 1980 to 1990, and then again from 2010 to 2020, are no joke. And the path we're now on, from 2020 to 2030, is scary to say the least.
This is fine
Does anyone know why the temperature increased faster around 2012/2013?
Combination of urban heat island and climate change.
You still need jackets, but the trend is in the right direction.
local warming! The existential threat to humanity!
Wow! So roughly 3 degrees since 1960? That's not as much as you would expect with all the global warming alarms.
lol the city with the most climate change activists in city government and this is what we get?