My experience handling large data sets (climate data, on Linux,
with parallel computers, and with R and Python) has been with free
software: MySQL (or PostgreSQL, though I mostly use MySQL). For easy
access and management, use MySQL Workbench (
http://www.mysql.com/products/workbench/ ).
If you have already installed MOVES 2010, you can use the MOVES MySQL
engine to load your data. I used MySQL to process vehicle registration
data (~9 million records of Michigan vehicles); it handled them quickly,
whereas MS Access could not handle them at all. And it runs on Windows
(I have another version on Linux, too).
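The same idea works with any SQL database: load the raw file into a table in batches rather than all at once. Here is a minimal sketch in Python, using the built-in sqlite3 module as a stand-in (for MySQL you would swap in a MySQL client library); the file name, table name, and columns are hypothetical examples, not from the actual CTPP data:

```python
# Sketch: load a large CSV into a SQL database in chunks, instead of
# forcing it into MS Access. Uses Python's built-in sqlite3 as a
# stand-in; for MySQL, swap in a MySQL client library.
# The table and columns below are hypothetical examples.
import csv
import io
import sqlite3

def load_csv(conn, csvfile, table, batch_size=50000):
    """Insert rows from an open CSV file into `table`, committing in batches."""
    reader = csv.reader(csvfile)
    header = next(reader)
    cols = ", ".join(header)
    placeholders = ", ".join("?" for _ in header)
    conn.execute(f"CREATE TABLE IF NOT EXISTS {table} ({cols})")
    insert_sql = f"INSERT INTO {table} VALUES ({placeholders})"
    batch, total = [], 0
    for row in reader:
        batch.append(row)
        if len(batch) >= batch_size:
            conn.executemany(insert_sql, batch)
            conn.commit()          # commit per batch to bound memory use
            total += len(batch)
            batch = []
    if batch:
        conn.executemany(insert_sql, batch)
        conn.commit()
        total += len(batch)
    return total

# Tiny in-memory example (a real run would open the multi-GB file):
conn = sqlite3.connect(":memory:")
sample = io.StringIO("vin,model_year,county\nA1,2005,Ingham\nB2,2010,Wayne\n")
n = load_csv(conn, sample, "vehicles")
print(n)  # 2
```

Once the data is in the database, the joins and transposes that choke Access become ordinary SQL queries.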
I'm interested in loading the Michigan data too. How big is it, Penelope?
Hary(ono) Prawiranata
Transportation Analyst/Modeler
Tri-County Regional Planning Commission
3135 Pine Tree Rd. Ste 2C
Lansing, MI 48911
On Thu, Nov 14, 2013 at 2:19 PM, Mara Kaminowitz
<mkaminowitz(a)baltometro.org> wrote:
I downloaded the Maryland state raw data (the whole enchilada) that
Penelope was good enough to provide me. It came with documentation that
clearly explains what needs to be done, but I am being hampered by the
sheer size of the dataset. It's 10 GB, and that's before joining tables,
transposing them to meet my needs, etc. Even after breaking the parts into
different databases, it can't be handled in Access. I can fit Part 1 into
an ESRI geodatabase, but I don't have the flexibility in linking tables
that Access has.
Does anyone have any suggestions for dealing with large databases? SQL
Server is one option. Are there others?
Mara Kaminowitz, GISP
GIS Coordinator
.........................................................................
Baltimore Metropolitan Council
Offices @ McHenry Row
1500 Whetstone Way
Suite 300
Baltimore, MD 21230
410-732-0500 ext. 1030
mkaminowitz(a)baltometro.org
www.baltometro.org
_______________________________________________
ctpp-news mailing list
ctpp-news(a)ryoko.chrispy.net
http://ryoko.chrispy.net/mailman/listinfo/ctpp-news