e-mail from Jim on 2012-11-16:
-----
As a quick but hopefully sufficient way of transferring the geoscrub results back to you, I dumped my geoscrub output table out to CSV and stuck it on vegbiendev at /tmp/public.2012-11-04-07-34-10.r5984.geoscrub_output.csv.
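
(A dump like that is a one-liner in psql; the following is only a hypothetical sketch, assuming the output table is literally named geoscrub_output and that a header row was written:)

    \copy geoscrub_output to '/tmp/public.2012-11-04-07-34-10.r5984.geoscrub_output.csv' with (format csv, header)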

Here is the essential schema info:

decimallatitude double precision
decimallongitude double precision
country text
stateprovince text
county text
countrystd text
stateprovincestd text
countystd text
latlonvalidity integer
countryvalidity integer
stateprovincevalidity integer
countyvalidity integer
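
(To import the dump on the receiving end, something along these lines should work in psql; this is a sketch only, and assumes the CSV includes a header row:)

    -- Recreate the table from the schema above, then load the dump.
    CREATE TABLE geoscrub_output (
        decimallatitude       double precision,
        decimallongitude      double precision,
        country               text,
        stateprovince         text,
        county                text,
        countrystd            text,
        stateprovincestd      text,
        countystd             text,
        latlonvalidity        integer,
        countryvalidity       integer,
        stateprovincevalidity integer,
        countyvalidity        integer
    );
    \copy geoscrub_output from '/tmp/public.2012-11-04-07-34-10.r5984.geoscrub_output.csv' with (format csv, header)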

The first 6 columns are identical to what you provided me as input, and if you do a projection over them, you should be able to recover the geoscrub_input table exactly. I confirmed this is the case in my database, but after importing on your end you should double-check that nothing screwy happened during the dump to CSV.
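
(One way to do that double check, sketched here with the asserted-name and coordinate columns from the schema above; both EXCEPTs return zero rows when the projection and geoscrub_input agree as sets:)

    (SELECT decimallatitude, decimallongitude, country, stateprovince, county
     FROM geoscrub_output
     EXCEPT
     SELECT decimallatitude, decimallongitude, country, stateprovince, county
     FROM geoscrub_input)
    UNION ALL
    (SELECT decimallatitude, decimallongitude, country, stateprovince, county
     FROM geoscrub_input
     EXCEPT
     SELECT decimallatitude, decimallongitude, country, stateprovince, county
     FROM geoscrub_output);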

The added countrystd, stateprovincestd, and countystd columns contain the corresponding GADM place names in cases where the scrubbing procedure yielded a match to GADM. And the four *validity columns contain scores as described in my email to the bien-db list a few minutes ago.
-----

e-mail from Jim on 2012-11-16:
-----
Attached is a tabulation of provisional geo validity scores I generated for the full set of 1707970 geoscrub_input records Aaron provided me a couple of weeks ago (from schema public.2012-11-04-07-34-10.r5984). This goes all the way down to the level of county/parish (i.e., 2nd-order administrative divisions), although I know the scrubbing can still be improved, especially at that lower level. Hence my "provisional" qualifier.

To produce these scores, I first passed the data through a geoscrubbing pipeline that attempts to translate asserted names into GADM (http://gadm.org) names with the help of geonames.org data, some custom mappings, and a few other tricks. Then I pushed them through a geovalidation pipeline that assesses the proximity of asserted lat/lon coordinates to their putative administrative areas in cases where scrubbing was successful. All operations happen in a PostGIS database, and the full procedure ran for me in ~2 hours on a virtual server similar to vegbiendev. (This doesn't include the time it takes to set things up by importing GADM and geonames data and building appropriate indexes, but that's a one-time cost anyway.)
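
(The proximity assessment presumably reduces to something like the following PostGIS sketch, shown for the country level only; the scrubbed table, the gadm2 table with its name_0 and SRID-4326 geom columns, and the exact join are all assumptions:)

    -- Score each point against its putative (scrubbed) country polygon:
    -- 3 = inside or on the border, 2 = outside but within 5 km,
    -- 1 = more than 5 km away (the validity codes defined below).
    SELECT s.*,
           CASE
               WHEN ST_Intersects(g.geom, s.pt) THEN 3
               WHEN ST_DWithin(g.geom::geography, s.pt::geography, 5000) THEN 2
               ELSE 1
           END AS countryvalidity
    FROM (SELECT *,
                 ST_SetSRID(ST_MakePoint(decimallongitude, decimallatitude),
                            4326) AS pt
          FROM scrubbed) AS s
    JOIN gadm2 AS g ON g.name_0 = s.countrystd;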

I still need to do a detailed writeup of the geoscrubbing and geovalidation procedures, but I think you have all the context you need for this email.

My validity codes are defined as follows, with the general rule that bigger numbers are better:

For latlonvalidity:
-1: Latitude and/or longitude is null
0: Coordinate is not a valid geographic location
1: Coordinate is a valid geographic location

For countryvalidity/stateprovincevalidity/countyvalidity:
-1: Name is null at this or some higher level
0: Complete name provided, but couldn't be scrubbed to GADM
1: Point is >5km from putative GADM polygon
2: Point is <=5km from putative GADM polygon, but still outside it
3: Point is in (or on the border of) putative GADM polygon

Importantly, note that validity at each administrative level below country is constrained by the validity at higher levels. For example, if a stateprovince name is given but the country name is null, then the stateprovincevalidity score is -1. And of course, if a point doesn't fall within the scrubbed country, it certainly can't fall within the scrubbed stateprovince. To put it another way, the integer validity code at a lower level can never be larger than the code at any higher level.
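
(In SQL this capping is just a LEAST() over the levels above; a sketch, not necessarily how the pipeline implements it. Note that within a single UPDATE every SET expression sees the original column values, hence countyvalidity is capped against both higher levels explicitly:)

    -- No level may score higher than any level above it.
    UPDATE geoscrub_output SET
        stateprovincevalidity = LEAST(stateprovincevalidity, countryvalidity),
        countyvalidity        = LEAST(countyvalidity, stateprovincevalidity,
                                      countryvalidity);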

You could generate this yourself from the attached data (see the query sketch after the tables below), but for convenience, here is a tabulation of the lat/lon validity scores by themselves:

latlonvalidity    count
            -1        0
             0     4981
             1  1702989

... and here are separate tabulations of the scores for each administrative level, in each case considering only locations with a valid coordinate (i.e., where latlonvalidity is 1):

countryvalidity    count
             -1   222078
              0     6521
              1    49137
              2    23444
              3  1401809

stateprovincevalidity    count
                   -1   298969
                    0    19282
                    1   107985
                    2    34634
                    3  1242119

countyvalidity    count
            -1  1429935
             0    61266
             1    24842
             2    12477
             3   174469
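
(Each of the tabulations above is a simple aggregate over the scored table; e.g., for the country level, assuming the import sketched earlier:)

    SELECT countryvalidity, count(*)
    FROM geoscrub_output
    WHERE latlonvalidity = 1    -- valid coordinates only
    GROUP BY countryvalidity
    ORDER BY countryvalidity;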
-----

e-mail from Jim on 2013-01-16:
-----
Back in Nov 2012 I created a local Git repo to manage my geovalidation code as I was developing it and messing around. Technically I'm pretty sure there is a way to graft it into the BIEN SVN repo you've been using, but I don't have time to deal with that now, and anyway there may be advantages to keeping this separate (much as the TNRS development is independent). I'm happy to give you access so you can clone the Git repo yourself if you'd like, but in any event, I just created a 'BIEN Geo' subproject of the main BIEN project in Redmine, and exposed my repo through the browser therein. It's currently set as a private project, but you should be able to see this when logged in:

https://projects.nceas.ucsb.edu/nceas/projects/biengeo/repository

So now you should at least be able to view and download the scripts, and peruse the history. (If you do want to be able to clone the repo, send me a public key and I'll send you further instructions.)

I just spent some time improving comments in the various files, as well as writing up a README.txt file that I hope will get you started. Let me know if (when) you have questions. As I mention in the README, the scripts are probably not 100% runnable end-to-end without some intervention here and there. That was certainly my goal, but given that this was all done in a flurry of a few days with continual changes, it's pretty close but not quite there.

Also, the shell scripts do have the specific wget commands to grab the GADM2 and geonames.org data that need to be imported into the database to geoscrub/geovalidate-enable it. But if you want (or for that matter if you have trouble downloading anything), I can put the files I got back in November somewhere on vegbiendev. In fact, although the GADM2 data should be unchanged (it's versioned at 2.0), the geonames.org downloads come from their daily database dumps IIRC, so what you'd get now might not be the same as what I got. There probably won't be many changes, if any, in the high-level admin units we're using, but who knows.
-----