Don't forget to head over to listserv.transportation.org to sign up for the new CTPP news list. This one will be discontinued soon. You have likely received an invitation to the new list - 156 of you have accepted, and there's a new message there today.
Penelope Weinberger
She/They
CTPP Program Manager
AASHTO
Ctpp.transportation.org
Hello researchers!
CTPP is hosting a Census Data for Transportation Planning conference (a call for presentations is coming), but attached please find a call for commissioned papers. Letters of interest with qualifications and an abstract are due Sept. 10. Please be aware that a successful submission carries the expectation that you will attend our conference in Reno, NV, June 7-9, 2022.
Penelope Weinberger
She/They
CTPP Program Manager
AASHTO
Ctpp.transportation.org
Hello CTPPers!
At long last the CTPP list is being migrated. The new list address will be:
ctpp@listserv.transportation.org
You will be migrated automatically.
The list lives at listserv.transportation.org, and it is called CTPP.
Penelope Weinberger
She/They
CTPP Program Manager
AASHTO
Ctpp.transportation.org
Hi,
I am new to the list and I want to post 2 questions:
1. Since CTPP data is based on ACS data, and the lowest level of geography for ACS data is the block group, how was CTPP data generated at the TAZ level?
2. Since TAZ-level CTPP data will be discontinued, what will be the best practice for getting TAZ-level demographic, socioeconomic, and housing characteristics data as input layers to a travel demand model?
Thanks!
Best,
Xinbo Mi
Transportation Engineer
Evansville Metropolitan Planning Organization
1 NW Martin Luther King Jr. Blvd., Room 316
Evansville, Indiana 47708
Office Main Line 812.436.7833
www.evansvillempo.com
Hello CTPP-News:
Well, the newest news is that Census 2020 PL94-171 data is re-scheduled for release this coming Thursday, August 12th. Hope everyone is ready.
I did a lot of messing around with the R package TIGRIS this past spring. I’m not sure why I didn’t share it at that time, but here it is now. Hope it helps.
https://www.rdocumentation.org/packages/tigris/versions/1.4.1
from the author’s page:
tigris is an R package that allows users to directly download and use TIGER/Line shapefiles (https://www.census.gov/geographies/mapping-files/time-series/geo/tiger-line…) from the US Census Bureau.
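For anyone who hasn’t used it yet, here’s a minimal sketch of what a tigris call looks like. The state, county, and year below are just placeholders I picked for illustration, not anything tied to a particular dataset:

# Minimal tigris sketch: download county boundaries and one county's tracts.
# (Illustrative only; state, county, and year are placeholders.)
library(tigris)
library(sf)

options(tigris_use_cache = TRUE)   # cache downloads so repeated runs are fast

# Generalized (cartographic boundary) county polygons for one state
ca_counties <- counties(state = "CA", cb = TRUE, year = 2020)

# Detailed TIGER/Line census tracts for a single county
alameda_tracts <- tracts(state = "CA", county = "Alameda", year = 2020)

plot(st_geometry(ca_counties))   # quick look at the downloaded geometry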
Here is Kyle Walker’s March 2021 lecture on “Spatial Analysis of US Census Data” on youtube.com. Watch it!
https://youtu.be/GqC1HjAKui4
TIGRIS is working and ready for Census 2020 geographies!! So when the PL94-171 data is available this coming Thursday, I’m expecting a lot of people to get really busy with it!
(Refer back to my 7/23/21 post regarding the R package PL94171. My next example might be stitching the TIGER files together with the PL 94-171 demographic data!)
I’m attaching two of my R scripts that test a ton of TIGRIS’s capabilities. I’m not “attaching” any demographic data to the geographic layers, just yet. The purpose here is to get hold of these TIGER/Line shapefiles and get ready for newer (or older) census data from the decennial census or the American Community Survey.
My Part I script includes the following examples (a rough sketch of a few of these calls appears right after the list):
# 1.1 -- County Boundaries in California (Detailed TIGER/Highest Resolution)
# 1.2 -- Exporting SHP files to a local computer drive
# 2 -- County Boundaries in California (Medium Resolution)
# 3 -- County Boundaries in California (Lowest Resolution)
# 4.1 -- Tract Boundaries for One County
# 4.2 -- Tract Boundaries for One County, Multiple Census Years (1990-2020)
# 5 -- Tract Boundaries for Multiple Counties in a State
# 6 -- Block Group Boundaries for Multiple Counties in a State
# 7 -- Block Boundaries for Multiple Counties in a State
# 8 -- Place Boundaries within a State
# 9 -- PUMA Boundaries within a State
# 10 -- Consolidated Statistical Areas in the USA
# 11 -- Core-Based Statistical Areas in the USA
# 12 -- Congressional Districts in the USA, single state, and filtering states
# 13 -- Urbanized Areas in the USA
# 14 -- State Legislative Districts (Upper/Lower House) for a State
# 15 -- Zip Code Tabulation Areas, Selected for a State and within state
# 16 -- State Boundaries in the USA
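This is not the attached script itself, just a rough sketch of what a few of the Part I items look like as tigris calls (the state, county, and output path are placeholders):

# Rough sketch of a few Part I items (placeholder state, county, and path).
library(tigris)
library(sf)
options(tigris_use_cache = TRUE)

# 1.1 / 2 / 3 -- County boundaries at three resolutions
ca_detailed <- counties(state = "CA", cb = FALSE, year = 2020)                   # full TIGER/Line
ca_medium   <- counties(state = "CA", cb = TRUE, resolution = "5m",  year = 2020)
ca_coarse   <- counties(state = "CA", cb = TRUE, resolution = "20m", year = 2020)

# 1.2 -- Export a layer to a local shapefile (path is a placeholder)
st_write(ca_detailed, "ca_counties_2020.shp", delete_layer = TRUE)

# 4.2 -- Tract boundaries for one county, multiple census years
tract_years <- lapply(c(1990, 2000, 2010, 2020), function(yr)
  tracts(state = "CA", county = "Alameda", year = yr))

# 9 / 16 -- PUMA boundaries within a state; state boundaries for the USA
ca_pumas  <- pumas(state = "CA")
us_states <- states(cb = TRUE)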
My Part II script includes the following examples (again, a short sketch follows the list):
# 1 -- Roads for Counties, Multiple Counties within State
# 2 -- Primary & Secondary Roads within a State
# 3 -- Rails for the USA
# 4.1 -- Water Areas for Multiple Counties within a State
# 4.2 -- Water Lines for Multiple Counties within a State
# 5 -- Point & Area Landmarks within a State
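And a similarly rough sketch of a few Part II items (same placeholder state and county as above):

# Rough sketch of a few Part II items (placeholder state and county).
library(tigris)
options(tigris_use_cache = TRUE)

# 1 / 2 -- Roads for one county; primary & secondary roads statewide
alameda_roads <- roads(state = "CA", county = "Alameda", year = 2020)
ca_prisec     <- primary_secondary_roads(state = "CA", year = 2020)

# 3 -- Rail lines for the whole USA
us_rails <- rails(year = 2020)

# 4.1 / 4.2 -- Water areas and water lines for one county
alameda_water_area <- area_water(state = "CA", county = "Alameda", year = 2020)
alameda_water_line <- linear_water(state = "CA", county = "Alameda", year = 2020)

# 5 -- Point and area landmarks within a state
ca_landmarks_pt   <- landmarks(state = "CA", type = "point", year = 2020)
ca_landmarks_area <- landmarks(state = "CA", type = "area",  year = 2020)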
That’s all for now. Happy Saturday afternoon, and IPA Day!
Chuck Purvis,
Hayward, California
One of the great things about the Beyond 2020 web software for the CTPP 2012/16 is the ability to export GIS-ready “shapefiles” for use in your GIS applications. This works very well for limited numbers of geographic areas, say anywhere from one to a couple of thousand pieces of geography. I’m not sure what the upper limit is on the number of geographic areas it can export; perhaps 5,000?
But the downside of the shapefiles exported from Beyond 2020 is that the output variables are given names like F0 to F35. Maybe there’s a magic button in Beyond 2020 that provides “mnemonic” variable names in the exported shapefiles? I can’t seem to find it.
Mnemonic variable names are basically memory clues as to what the variable is about: “transit” is probably transit commuters. “Total” is probably the total number of commuters, persons, households, etc., depending on the data universe. “VHH1_WHH0_est” is (obvious to me) “estimate of households with one vehicle and zero workers."
I wrote an R script that imports the shapefile’s DBF file (exported from Beyond 2020); renames the variables to something more useful; adds a few extra variables; and exports a new DBF file. I’m also writing out csv and Excel workbooks, since I might use them for other analytical tasks.
Attached is my R script. Its basic purpose is to read a DBF file, rename the variables, add a few new variables, and write out DBF, csv, and XLSX files.
It uses California counties (n=58) for the workplace table A202105: Means of Transportation to Work (18 categories). The process works, and I’m able to import these GIS files into QGIS with the renamed variables.
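For anyone who wants the gist without opening the attachment, here’s a bare-bones sketch of the read-rename-write pattern the script follows. The file names, the two renamed fields, and the writexl package are placeholders chosen for illustration; the attached script has the full A202105 field list and may use a different Excel writer:

# Bare-bones sketch of the rename workflow (placeholder file and field names).
library(foreign)    # read.dbf() / write.dbf()
library(dplyr)      # rename(), mutate()
library(writexl)    # write_xlsx()

# Read the DBF exported from Beyond 2020 (file name is a placeholder)
raw <- read.dbf("A202105_ca_counties.dbf", as.is = TRUE)

# Rename the generic F0..F35 fields to mnemonic names (only two shown here;
# the F1/F7 positions are illustrative, not the real table layout)
renamed <- raw %>%
  rename(total_est   = F1,
         transit_est = F7) %>%
  mutate(transit_share = transit_est / total_est)   # example derived variable

# Write out DBF, csv, and XLSX versions
write.dbf(as.data.frame(renamed), "A202105_ca_counties_renamed.dbf")
write.csv(renamed, "A202105_ca_counties_renamed.csv", row.names = FALSE)
write_xlsx(renamed, "A202105_ca_counties_renamed.xlsx")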
Well, only a few more hundred tables to go!!
Any advice or hints on how to improve on this process would be welcome!
cheers,
Chuck Purvis
Hayward, California
I’m still trying to understand what’s going on with the year 2020 American Community Survey (ACS). I went through the Friday, 7/29/21, PDF of the Census Bureau’s powerpoint presentation.
https://www.census.gov/newsroom/press-kits/2021/impact-pandemic-2020-acs-1-…
https://www.census.gov/content/dam/Census/newsroom/press-kits/2021/acs-1-ye…
Census had this to say about “non-response”:
- All surveys typically have some nonresponse bias because those who do not respond tend to be different from those who do respond.
- Our standard methods for mitigating the nonresponse bias are insufficient for this data year.
- The 2020 ACS data collection had the lowest response rate ever for the survey, at 71%, down from 86% in 2019 and 92% in 2018.
- This rate is an average across the entire data collection year.
- Response rates during the peak pandemic months [March-June 2020] were significantly lower.
The big “wow” is the decline in the response rate from 86 percent in 2019 to 71 percent in 2020. Of course, my follow-up question is: does this mean that 29 percent of the sample provided “incomplete data” that required their information to be edited / imputed / allocated? Or does it mean that 29 percent of the sample was “totally nonrespondent”?
Unfortunately, the 29 percent is “totally nonrespondent”.
Here’s the Census Bureau page that shows overall response rates in the ACS from 2005 to 2019. Nonresponse here means NO useful information (I think) was obtained from the selected sample.
https://www.census.gov/acs/www/methodology/sample-size-and-data-quality/res…
This table also shows, amazingly, that the “best” year for the ACS in terms of response rates was 2009, with a 98.0 percent response rate.
These are response rates for the American Community Survey, not for the decennial (the “short form”) Census.
# # #