Installation:
    Check out svn: svn co https://code.nceas.ucsb.edu/code/projects/bien
    cd bien/
    Install: make install
        **WARNING**: This will delete the public schema of your VegBIEN DB!
    Uninstall: make uninstall
        **WARNING**: This will delete your entire VegBIEN DB!
            This includes all archived imports and staging tables.

Connecting to vegbiendev:
    ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
    cd /home/bien/svn # should happen automatically at login

Notes on system stability:
    **WARNING**: System upgrades can break key parts of the full-database
        import, causing errors such as disk space overruns. For this reason,
        it is recommended to maintain a snapshot copy of the VM as it was at
        the last successful import, for fallback use if a system upgrade
        breaks anything. System upgrades on the snapshot VM should be disabled
        completely, and because this will also disable security fixes, the
        snapshot VM should be disconnected from the internet and all
        networking interfaces. (This is an unfortunate consequence of modern
        OSes being written in non-memory-safe languages such as C and C++.)

Notes on running programs:
    **WARNING**: Always start with a clean shell, to avoid spurious bugs. The
        shell should not have any changes to its env vars. (There have been
        bugs that went away after closing and reopening the terminal window.)
        Note that running `exec bash` is not sufficient to *reset* the env
        vars.
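    For example, a minimal way to get a clean shell is to replace the current
        one with a fresh login shell that has an empty environment (the
        variables kept below are illustrative; adjust as needed):
        exec env -i HOME="$HOME" TERM="$TERM" USER="$USER" "$SHELL" -l
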
Notes on editing files:
    **WARNING**: Shell scripts should always be read-only, so that they
        cannot be edited while an import is in progress (modifying a running
        shell script crashes it; see
        http://vegpath.org/links/#**%20modifying%20a%20running%20shell%20script)
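    For example, one way to make a script read-only before a long-running
        import, and writable again afterwards (the path is illustrative):
        chmod a-w bin/import_all # make read-only
        chmod u+w bin/import_all # allow editing again later
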
Single datasource import:
    ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
    (Re)import and scrub: make inputs/<datasrc>/reimport_scrub by_col=1 &
    (Re)import only: make inputs/<datasrc>/reimport by_col=1 &
    Note that these commands also work if the datasource is not yet imported
    Remake analytical DB: see Full database import > To remake analytical DB
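    For example, a full reimport-and-scrub of one datasource, using ACAD as
        an illustrative datasource name:
        make inputs/ACAD/reimport_scrub by_col=1 &
        tail -f inputs/ACAD/*/logs/*.log.sql # view progress
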
Full database import:
    **WARNING**: You must perform *every single* step listed below, to avoid
        breaking column-based import
    **WARNING**: Always start with a clean shell, as described above under
        "Notes on running programs"
    **IMPORTANT**: The beginning of the import should be scheduled at a time
        when the DB will not be needed for other uses. This is necessary
        because vegbiendev will be slow for the first few hours of the
        import, due to the import using all the available cores.
    Do the steps under Maintenance > "to synchronize vegbiendev, jupiter, and
        your local machine"
    On local machine:
        make inputs/upload
        make inputs/upload live=1
        make test by_col=1 # runtime: 20 min ("4m46.108s" + ("21:50:43" - "21:37:09")) @starscream
            If you encounter errors, they are most likely related to the
                PostgreSQL error parsing in /lib/sql.py parse_exception()
            See the note under Testing below
    ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
    Ensure there are no local modifications: svn st
    up
    make inputs/download
    make inputs/download live=1
    For each newly-uploaded datasource above: make inputs/<datasrc>/reinstall
    Update the auxiliary schemas: make schemas/reinstall
        **WARNING**: requires sudo access!
        The public schema will be installed separately by the import process
    Delete imports before the last one so they won't bloat the full DB backup:
        make backups/vegbien.<version>.backup/remove
        To keep a previous import other than the public schema:
            export dump_opts='--exclude-schema=public --exclude-schema=<version>'
                # env var will be inherited by `screen` shell
    Restart Postgres to free up any disk space used by temp tables from the
        last import (this is apparently not automatically reclaimed):
        make postgres_restart
    Make sure there is at least 1 TB of disk space on /: df -h
        **WARNING**: Sometimes this amount of available space is insufficient
            and the entire disk space gets used up, crashing the import. If
            this occurs, the problem will often be fixed just by rerunning
            the import again. (The high-water mark varies by import.)
        Although the import schema itself is only 315 GB, Postgres uses
            significant temporary space at the beginning of the import. The
            total disk usage oscillates between 1.2 TB and the entire disk
            for the first day (for an import started @12:55:09, high-water
            marks of 1.7 TB @14:00:25, 1.8 TB @15:38:32; then the next day,
            with 2 datasources running: entire disk for 4 min @05:35:44,
            1.8 TB @11:15:05).
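        A simple way to monitor disk usage while the import runs (the
            interval is arbitrary):
            watch -n 60 df -h /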
    To free up space, remove backups that have been archived on jupiter:
        List backups/ to view older backups
        Check their MD5 sums using the steps under On jupiter below
        Remove these backups
    unset version # clear any version from last import, etc.
    If no commits have been made since the last import (e.g. if retrying an
        import), set a custom version that differs from the auto-assigned one
        (which would otherwise cause a collision with the last import):
        svn info
        Extract the svn revision after "Revision:"
        export version=r[revision]_2 # +suffix to distinguish from last import
            # env var will be inherited by `screen` shell
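        As a one-line sketch, the above can be automated (this assumes the
            standard "Revision: <n>" line in `svn info` output):
            export version=r$(svn info | sed -n 's/^Revision: //p')_2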
    screen
    Press ENTER
    unset TMOUT # TMOUT causes screen to exit even with background processes
    set -o ignoreeof # prevent Ctrl+D from exiting `screen` to keep attached jobs
    To import just a subset of the datasources:
        declare -ax inputs=(inputs/{src,...}/)
            # array vars *not* inherited by `screen` shell
        export version=custom_import_name
    Start column-based import: . bin/import_all
        To use row-based import: . bin/import_all by_col=
        To stop all running imports: . bin/stop_imports
        **WARNING**: Do NOT run import_all in the background, or the jobs it
            creates won't be owned by your shell.
    Note that import_all will take up to an hour to import the NCBI backbone
        and other metadata before returning control to the shell.
    To view progress:
        tail inputs/{.,}*/*/logs/$version.log.sql
    Note: At the beginning of the import, the system may send out CPU load
        warning e-mails. These can safely be ignored. (They happen because
        the parallel imports use all the available cores.)
    Wait (4 days) for the import to finish
    To recover from a closed terminal window: screen -r
    To restart an aborted import for a specific table:
        export version=<version>
        (set -o errexit; make inputs/<datasrc>/<table>/import_scrub by_col=1 continue=1; make inputs/<datasrc>/publish) &
        bin/after_import $! & # $! can also be obtained from `jobs -l`
    Get $version: echo $version
    Set $version in all vegbiendev terminals: export version=<version>
    When there are no more running jobs, exit `screen`: exit # not Ctrl+D
    Upload logs: make inputs/upload live=1
    On local machine: make inputs/download-logs live=1
    Check for disk space errors:
        grep --files-with-matches -F 'No space left on device' inputs/{.,}*/*/logs/$version.log.sql
        If there are any matches:
            Manually reimport these datasources using the steps under
                Single datasource import
            bin/after_import &
            Wait for the import to finish
    tail inputs/{.,}*/*/logs/$version.log.sql
        In the output, search for "Command exited with non-zero status"
        For inputs that have this, fix the associated bug(s)
        If many inputs have errors, discard the current (partial) import:
            make schemas/$version/uninstall
        Otherwise, continue
    In PostgreSQL:
        Go to wiki.vegpath.org/VegBIEN_contents
            Get the # observations
            Get the # datasources
            Get the # datasources with observations
        In the r# schema:
            Check that analytical_stem contains [# observations] rows
            Check that source contains [# datasources] rows up through XAL. If
                this is not the case, manually check the entries in source
                against the datasources list on the wiki page (some
                datasources may be near the end, depending on import order).
            Check that provider_count contains [# datasources with
                observations] rows with dataset="(total)" (at the top when
                the table is unsorted)
    Check that TNRS ran successfully:
        tail -100 inputs/.TNRS/tnrs/logs/tnrs.make.log.sql
        If the log ends in an AssertionError
            "assert sql.table_col_names(db, table) == header":
            Figure out which TNRS CSV columns have changed
            On local machine:
                Make the changes in the DB's TNRS and public schemas
                rm=1 inputs/.TNRS/schema.sql.run export_
                make schemas/remake
                inputs/test_taxonomic_names/test_scrub # re-run TNRS
                rm=1 inputs/.TNRS/data.sql.run export_
                Commit
            ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
                If dropping a column, save the dependent views
                Make the same changes in the live TNRS.tnrs table on
                    vegbiendev
                If dropping a column, recreate the dependent views
                Restart the TNRS client: make scrub by_col=1 &
    Publish the new import:
        **WARNING**: Before proceeding, be sure you have done *every single*
            verification step listed above. Otherwise, a previous valid
            import could incorrectly be overwritten with a broken one.
        make schemas/$version/publish # runtime: 1 min ("real 1m10.451s")
    unset version
    make backups/upload live=1
    On local machine:
        make backups/vegbien.$version.backup/download live=1
            # download the backup to the local machine
    ssh aaronmk@jupiter.nceas.ucsb.edu
        cd /data/dev/aaronmk/bien/backups
        For each newly-archived backup:
            make -s <backup>.md5/test
            Check that "OK" is printed next to the filename
    If desired, record the import times in inputs/import.stats.xls:
        On local machine:
            Open inputs/import.stats.xls
            If the rightmost import is within 5 columns of column IV:
                Copy the current tab to <leftmost-date>~<rightmost-date>
                Remove the previous imports from the current tab, because
                    they are now in the copied tab instead
                Insert a copy of the leftmost "By column" column group before
                    it
            export version=<version>
            bin/import_date inputs/{.,}*/*/logs/$version.log.sql
                Update the import date in the upper-right corner
            bin/import_times inputs/{.,}*/*/logs/$version.log.sql
                Paste the output over the # Rows/Time columns, making sure
                    that the row counts match up with the previous import's
                    row counts
                If the row counts do not match up, insert or reorder rows as
                    needed until they do. Get the datasource names from the
                    log file footers:
                    tail inputs/{.,}*/*/logs/$version.log.sql
            Commit: svn ci -m 'inputs/import.stats.xls: updated import times'
    Running individual steps separately:
        To run TNRS:
            To use an import other than public: export version=<version>
            make scrub &
            To view progress:
                tail -100 inputs/.TNRS/tnrs/logs/tnrs.make.log.sql
        To remake analytical DB:
            To use an import other than public: export version=<version>
            bin/make_analytical_db & # runtime: 13 h ("12:43:57elapsed")
            To view progress:
                tail -150 inputs/analytical_db/logs/make_analytical_db.log.sql
        To back up the DB (staging tables and last import):
            To use an import *other than public*: export version=<version>
            make backups/TNRS.backup-remake &
            dump_opts=--exclude-schema=public make backups/vegbien.$version.backup/test &
                If backing up after renaming to public, instead set
                    dump_opts='' and replace $version with the appropriate
                    revision
            make backups/upload live=1

Datasource setup:
    On local machine:
    Example steps for a datasource: wiki.vegpath.org/Import_process_for_Madidi
    umask ug=rwx,o= # prevent files from becoming web-accessible
    Add a new datasource: make inputs/<datasrc>/add
        <datasrc> may not contain spaces, and should be abbreviated.
        If the datasource is a herbarium, <datasrc> should be the herbarium
            code as defined by the Index Herbariorum
            <http://sweetgum.nybg.org/ih/>
    For a new-style datasource (one containing a ./run runscript):
        "cp" -f inputs/.NCBI/{Makefile,run,table.run} inputs/<datasrc>/
    For MySQL inputs (exports and live DB connections):
        For .sql exports:
            Place the original .sql file in _src/ (*not* in _MySQL/)
            Follow the steps starting with "Install the staging tables"
                below. This is for an initial sync to get the file onto
                vegbiendev.
            ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
                Create a database for the MySQL export in phpMyAdmin
                    Give the bien user all database-specific privileges
                        *except* UPDATE, DELETE, ALTER, DROP. This prevents
                        bugs in the import scripts from accidentally deleting
                        data.
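                    As a sketch, the equivalent grant from the MySQL command
                        line (the privilege list shown is illustrative; grant
                        whatever phpMyAdmin offers minus the four above):
                        sudo mysql -e "GRANT SELECT, INSERT, CREATE, INDEX, CREATE VIEW, SHOW VIEW, LOCK TABLES ON \`<datasrc>\`.* TO 'bien'@'localhost';"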
                bin/mysql_bien database <inputs/<datasrc>/_src/export.sql &
            mkdir inputs/<datasrc>/_MySQL/
            cp -p lib/MySQL.{data,schema}.sql.make inputs/<datasrc>/_MySQL/
            Edit _MySQL/*.make for the DB connection
                For a .sql export, use server=vegbiendev and --user=bien
            Skip the "Add input data for each table" section
        For MS Access databases:
            Place the .mdb or .accdb file in _src/
            Download and install Access To PostgreSQL from
                http://www.bullzip.com/download.php
            Use Access To PostgreSQL to export the database:
                Export just the tables/indexes to
                    inputs/<datasrc>/<file>.schema.sql
                Export just the data to inputs/<datasrc>/<file>.data.sql
            In <file>.schema.sql, make the following changes:
                Replace text "BOOLEAN" with "/*BOOLEAN*/INTEGER"
                Replace text "DOUBLE PRECISION NULL" with "DOUBLE PRECISION"
            Skip the "Add input data for each table" section
    Add input data for each table present in the datasource:
        For .sql exports, you must use the name of the table in the DB export
        For CSV files, you can use any name. It's recommended to use a table
            name from <https://projects.nceas.ucsb.edu/nceas/projects/bien/wiki/VegCSV#Suggested-table-names>
        Note that if this table will be joined together with another table,
            its name must end in ".src"
        make inputs/<datasrc>/<table>/add
            Important: DO NOT just create an empty directory named <table>!
                This command also creates necessary subdirs, such as logs/.
        If the table is in a .sql export: make inputs/<datasrc>/<table>/install
            Otherwise, place the CSV(s) for the table in
                inputs/<datasrc>/<table>/, OR place a query joining other
                tables together in inputs/<datasrc>/<table>/create.sql
        Important: When exporting relational databases to CSVs, you MUST
            ensure that embedded quotes are escaped by doubling them, *not*
            by preceding them with a "\" as is the default in phpMyAdmin
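            For example (field values illustrative):
                correct:   1,"5"" tall shrub",2013
                incorrect: 1,"5\" tall shrub",2013 # phpMyAdmin's default; will misparse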
        If there are multiple part files for a table, and the header is
            repeated in each part, make sure each header is EXACTLY the same.
            (If the headers are not the same, the CSV concatenation script
            assumes the part files don't have individual headers and treats
            the subsequent headers as data rows.)
        Add <table> to inputs/<datasrc>/import_order.txt before other tables
            that depend on it
        For a new-style datasource:
            "cp" -f inputs/.NCBI/nodes/run inputs/<datasrc>/<table>/
            inputs/<datasrc>/<table>/run
    Install the staging tables:
        make inputs/<datasrc>/reinstall quiet=1 &
            For a MySQL .sql export:
                At the prompt "[you]@vegbiendev's password:", enter your
                    password
                At the prompt "Enter password:", enter the value in
                    config/bien_password
        To view progress: tail -f inputs/<datasrc>/<table>/logs/install.log.sql
        View the logs: tail -n +1 inputs/<datasrc>/*/logs/install.log.sql
            tail provides a header line with the filename
            +1 starts at the first line, to show the whole file
        For every file with an error 'column "..." specified more than once':
            Add a header override file "+header.<ext>" in <table>/:
                Note: The leading "+" should sort it before the flat files.
                    "_" unfortunately sorts *after* capital letters in ASCII.
                Create a text file containing the header line of the flat
                    files
                Add an ! at the beginning of the line
                    This signals cat_csv that this is a header override.
                For empty names, use their 0-based column # (by convention)
                For duplicate names, add a distinguishing suffix
                For long names that collided, rename them to <= 63 chars long
                Do NOT make readability changes in this step; that is what
                    the map spreadsheets (below) are for.
                Save
            If you made any changes, re-run the install command above
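            A hypothetical +header.csv, for flat files whose header has a
                duplicate "name" column and an unnamed third column (0-based
                column # 2):
                !id,name,name_2,2,elevation_m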
    Auto-create the map spreadsheets: make inputs/<datasrc>/
    Map each table's columns:
        In each <table>/ subdir, for each "via map" map.csv:
            Open the map in a spreadsheet editor
            Open the "core map" /mappings/Veg+-VegBIEN.csv
            In each row of the via map, set the right column to a value from
                the left column of the core map
            Save
            Regenerate the derived maps: make inputs/<datasrc>/
    Accept the test cases:
        For a new-style datasource:
            inputs/<datasrc>/run
            svn di inputs/<datasrc>/*/test.xml.ref
                If you get errors, follow the steps for old-style datasources
                    below
        For an old-style datasource:
            make inputs/<datasrc>/test
            When prompted to "Accept new test output", enter y and press ENTER
            If you instead get errors, do one of the following for each one:
            -   If the error was due to a bug, fix it
            -   Add a SQL function that filters or transforms the invalid data
            -   Make an empty mapping for the columns that produced the error.
                Put something in the Comments column of the map spreadsheet
                to prevent the automatic mapper from auto-removing the
                mapping.
            When accepting tests, it's helpful to use WinMerge
                (see WinMerge setup below for configuration)
        make inputs/<datasrc>/test by_col=1
            If you get errors this time, this always indicates a bug, usually
                in the VegBIEN unique constraints or column-based import
                itself
    Add newly-created files: make inputs/<datasrc>/add
    Commit: svn ci -m "Added inputs/<datasrc>/" inputs/<datasrc>/
    Update vegbiendev:
        ssh aaronmk@jupiter.nceas.ucsb.edu
            up
        On local machine:
            ./fix_perms
            make inputs/upload
            make inputs/upload live=1
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
            up
            make inputs/download
            make inputs/download live=1
            Follow the steps under "Install the staging tables" above

Maintenance:
    On a live machine, you should put the following in your .profile:
    --
    # Make svn files web-accessible. This does not affect unversioned files,
    # because these get the right permissions on the local machine instead.
    umask ug=rwx,o=rx

    unset TMOUT # TMOUT causes screen to exit even with background processes
    --
    If http://vegbiendev.nceas.ucsb.edu/phppgadmin/ goes down:
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
        make phppgadmin-Linux
    Regularly re-run the full-database import so that bugs in it don't pile
        up. It needs to be kept in working order so that it works when it's
        needed.
    To synchronize vegbiendev, jupiter, and your local machine:
        **WARNING**: Pay careful attention to all files that will be deleted
            or overwritten!
        Install put if needed:
            Download https://uutils.googlecode.com/svn/trunk/bin/put to
                ~/bin/ and `chmod +x` it
        When changes are made on vegbiendev:
            Avoid extraneous diffs when rsyncing:
                On all machines:
                    up
                    ./fix_perms
            ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
                Upload:
                    overwrite=1 bin/sync_upload --size-only
                        then review the diff, and rerun with `l=1` prepended
            On your machine:
                Download:
                    overwrite=1 swap=1 src=. dest='aaronmk@jupiter.nceas.ucsb.edu:~/bien' put --exclude=.svn inputs/VegBIEN/TWiki
                        then review the diff, and rerun with `l=1` prepended
                    swap=1 bin/sync_upload backups/TNRS.backup
                        then review the diff, and rerun with `l=1` prepended
                    overwrite=1 swap=1 bin/sync_upload --size-only
                        then review the diff, and rerun with `l=1` prepended
                    overwrite=1 sync_remote_url=~/Dropbox/svn/ bin/sync_upload --existing --size-only # just update mtimes/perms
                        then review the diff, and rerun with `l=1` prepended
    To back up e-mails:
        On local machine:
            /Applications/gmvault-v1.8.1-beta/bin/gmvault sync --multiple-db-owner --type quick aaronmk.nceas@gmail.com
            /Applications/gmvault-v1.8.1-beta/bin/gmvault sync --multiple-db-owner --type quick aaronmk@nceas.ucsb.edu
            Open Thunderbird
            Click the All Mail folder for each account and wait for it to
                download the e-mails in it
    To back up the version history:
        # Back up first on the local machine, because often only the svnsync
        # command gets run, and that way it will get backed up immediately to
        # Dropbox (and hourly to Time Machine), while vegbiendev only gets
        # backed up daily to tape.
        On local machine:
            svnsync sync file://"$HOME"/Dropbox/docs/BIEN/svn_repo/ # initial runtime: 1.5 h ("08:21:38" - "06:45:26") @vegbiendev
            (cd ~/Dropbox/docs/BIEN/git/; git svn fetch)
            overwrite=1 src=~ dest='aaronmk@jupiter.nceas.ucsb.edu:/data/dev/aaronmk/' put Dropbox/docs/BIEN/svn_repo/ # runtime: 1 min ("1:05.08")
                then review the diff, and rerun with `l=1` prepended
            overwrite=1 src=~ dest='aaronmk@jupiter.nceas.ucsb.edu:/data/dev/aaronmk/' put Dropbox/docs/BIEN/git/
                then review the diff, and rerun with `l=1` prepended
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
            # Use an absolute path for vegbiendev commands, because the
            # Ubuntu 14.04 version of rsync doesn't expand ~ properly.
            overwrite=1 swap=1 src=~ dest='aaronmk@jupiter.nceas.ucsb.edu:/data/dev/aaronmk/' put Dropbox/docs/BIEN/svn_repo/ # runtime: 30 s ("36.19")
                then review the diff, and rerun with `l=1` prepended
            overwrite=1 swap=1 src=~ dest='aaronmk@jupiter.nceas.ucsb.edu:/data/dev/aaronmk/' put Dropbox/docs/BIEN/git/
                then review the diff, and rerun with `l=1` prepended
    To synchronize a Mac's settings with my testing machine's:
        Download:
            **WARNING**: This will overwrite all your user's settings!
            On your machine:
                overwrite=1 swap=1 sync_local_dir=~/Library/ sync_remote_subdir=Library/ bin/sync_upload --exclude="/Saved Application State"
                    then review the diff, and rerun with `l=1` prepended
        Upload:
            Do the step "when changes are made on vegbiendev" > "on your
                machine" > download, above
            ssh aaronmk@jupiter.nceas.ucsb.edu
                (cd ~/Dropbox/svn/; up)
            On your machine:
                rm ~/'Library/Thunderbird/Profiles/9oo8rcyn.default/ImapMail/imap.googlemail.com/[Gmail].sbd/Spam'
                    # remove the downloaded Spam folder, because spam e-mails
                    # often contain viruses that would trigger clamscan
                overwrite=1 del= sync_local_dir=~/Dropbox/svn/ sync_remote_subdir=Dropbox/svn/ bin/sync_upload --size-only # just update mtimes
                    then review the diff, and rerun with `l=1` prepended
                overwrite=1 inplace=1 sync_local_dir=~ sync_remote_subdir= bin/sync_upload ~/"VirtualBox VMs/**" # need inplace=1 because they are very large files
                    then review the diff, and rerun with `l=1` prepended
                overwrite=1 sync_local_dir=~ sync_remote_subdir= bin/sync_upload --exclude="/Library/Saved Application State" --exclude="/.Trash" --exclude="/bin" --exclude="/bin/pg_ctl" --exclude="/bin/unzip" --exclude="/Dropbox/home" --exclude="/.profile" --exclude="/.shrc" --exclude="/.bashrc"
                    then review the diff, and rerun with `l=1` prepended
                overwrite=1 sync_local_dir=~ sync_remote_url=~/Dropbox/home bin/sync_upload --exclude="/Library/Saved Application State" --exclude="/.Trash" --exclude="/.dropbox" --exclude="/Documents/BIEN" --exclude="/Dropbox" --exclude="/software" --exclude="/VirtualBox VMs/**.sav" --exclude="/VirtualBox VMs/**.vdi" --exclude="/VirtualBox VMs/**.vmdk"
                    then review the diff, and rerun with `l=1` prepended
    To back up files not in Time Machine:
        On local machine:
            overwrite=1 src=/ dest=/Volumes/Time\ Machine\ Backups/ sudo -E put Library/PostgreSQL/9.3/data/
                then review the diff, and rerun with `l=1` prepended
            pg_ctl. stop # stop the PostgreSQL server
            overwrite=1 src=/ dest=/Volumes/Time\ Machine\ Backups/ sudo -E put Library/PostgreSQL/9.3/data/
                then review the diff, and rerun with `l=1` prepended
            pg_ctl. start # start the PostgreSQL server
    VegCore data dictionary:
        Regularly, or whenever the VegCore data dictionary page
            (https://projects.nceas.ucsb.edu/nceas/projects/bien/wiki/VegCore)
            is changed, regenerate mappings/VegCore.csv:
            On local machine:
                make mappings/VegCore.htm-remake; make mappings/
                Apply the new data dict mappings to the datasource
                    mappings/staging tables:
                    inputs/run postprocess # runtime: see inputs/run
                    time yes|make inputs/{NVS,SALVIAS,TEAM}/test # old-style import; runtime: 1 min ("0m59.692s") @starscream
                svn di mappings/VegCore.tables.redmine
                    If there are changes, update the data dictionary's Tables
                        section
                    When moving terms, check that no terms were lost: svn di
                svn ci -m 'mappings/VegCore.htm: regenerated from wiki'
            ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
                Perform the steps under "apply new data dict mappings to
                    datasource mappings/staging tables" above
    Important: Whenever you install a system update that affects PostgreSQL
        or any of its dependencies, such as libc, you should restart the
        PostgreSQL server. Otherwise, you may get strange errors like "the
        database system is in recovery mode" which go away upon reimport, or
        you may not be able to access the database as the postgres superuser.
        This applies to both Linux and Mac OS X.

Backups:
    Archived imports:
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
        Back up: make backups/<version>.backup &
            Note: To back up the last import, you must archive it first:
                make schemas/rotate
        Test: make -s backups/<version>.backup/test &
        Restore: make backups/<version>.backup/restore &
        Remove: make backups/<version>.backup/remove
        Download: make backups/<version>.backup/download
    TNRS cache:
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
        Back up: make backups/TNRS.backup-remake &
            runtime: 3 min ("real 2m48.859s")
        Restore:
            yes|make inputs/.TNRS/uninstall
            make backups/TNRS.backup/restore &
                runtime: 5.5 min ("real 5m35.829s")
            yes|make schemas/public/reinstall
                This must come after the TNRS restore, to recreate the
                    tnrs_input_name view
    Full DB:
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
        Back up: make backups/vegbien.<version>.backup &
        Test: make -s backups/vegbien.<version>.backup/test &
        Restore: make backups/vegbien.<version>.backup/restore &
        Download: make backups/vegbien.<version>.backup/download
    Import logs:
        On local machine:
            Download: make inputs/download-logs live=1

Datasource refreshing:
    VegBank:
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
        make inputs/VegBank/vegbank.sql-remake
        make inputs/VegBank/reinstall quiet=1 &

Schema changes:
    On local machine:
        When changing the analytical views, run sync_analytical_..._to_view()
            to update the corresponding table
        Remember to update the following files with any renamings:
            schemas/filter_ERD.csv
            mappings/VegCore-VegBIEN.csv
            mappings/verify.*.sql
        Regenerate the schema from the installed DB: make schemas/remake
        Reinstall the DB from the schema: make schemas/public/reinstall schemas/reinstall
            **WARNING**: This will delete the public schema of your VegBIEN
                DB!
        If needed, reinstall the staging tables:
            On local machine:
                sudo -E -u postgres psql <<<'ALTER DATABASE vegbien RENAME TO vegbien_prev'
                make db
                . bin/reinstall_all
                Fix any bugs and retry until there are no errors
                make schemas/public/install
                    This must be run *after* the datasources are installed,
                        because views in public depend on some of the
                        datasources
                sudo -E -u postgres psql <<<'DROP DATABASE vegbien_prev'
            ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
                Repeat the above steps
                    **WARNING**: Do not run this until reinstall_all runs
                        successfully on the local machine, or the live DB may
                        be unrestorable!
    To update mappings and staging table column names:
        On local machine:
            inputs/run postprocess # runtime: see inputs/run
            time yes|make inputs/{NVS,SALVIAS,TEAM}/test # old-style import; runtime: 1 min ("0m59.692s") @starscream
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
            Manually apply the schema changes to the live public schema
            Do the steps under "On local machine" above
    Sync ERD with vegbien.sql schema:
        Run make schemas/vegbien.my.sql
        Open schemas/vegbien.ERD.mwb in MySQLWorkbench
        Go to File > Export > Synchronize With SQL CREATE Script...
            For Input File, select schemas/vegbien.my.sql
            Click Continue
            In the changes list, select each table with an arrow next to it
            Click Update Model
            Click Continue
                Note: The generated SQL script will be empty, because we are
                    syncing in the opposite direction
            Click Execute
        Reposition any lines that have been reset
        Add any new tables by dragging them from the Catalog in the left
            sidebar to the diagram
        Remove any deleted tables by right-clicking the table's diagram
            element, selecting Delete '<table name>', and clicking Delete
        Save
        If desired, update the graphical ERD exports (see below)
    Update graphical ERD exports:
        Go to File > Export > Export as PNG...
            Select schemas/vegbien.ERD.png and click Save
        Go to File > Export > Export as SVG...
            Select schemas/vegbien.ERD.svg and click Save
        Go to File > Export > Export as Single Page PDF...
            Select schemas/vegbien.ERD.1_pg.pdf and click Save
        Go to File > Print...
            In the lower left corner, click PDF > Save as PDF...
            Set the Title and Author to ""
            Select schemas/vegbien.ERD.pdf and click Save
        Commit: svn ci -m "schemas/vegbien.ERD.mwb: Regenerated exports"
    Refactoring tips:
        To rename a table:
            In vegbien.sql, do the following:
                Replace the regexp (?<=_|\b)<old>(?=_|\b) with <new>
                    This is necessary because the table name is *everywhere*
                Search for <new>
                Manually change back any replacements inside comments
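            As a command-line sketch of the same replacement (table names
                hypothetical; the lookbehind is split up because perl
                requires fixed-width lookbehind alternatives):
                perl -pi -e 's/(?:(?<=_)|\b)oldtable(?=_|\b)/newtable/g' schemas/vegbien.sql
                grep -n newtable schemas/vegbien.sql # then revert matches inside comments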
        To rename a column:
            Rename the column: ALTER TABLE <table> RENAME <old> TO <new>;
            Recreate any foreign key for the column, removing CONSTRAINT <name>
                This resets the foreign key name using the new column name
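            For example (table, column, and constraint names hypothetical):
                psql vegbien -c 'ALTER TABLE plot RENAME parent_id TO parent_plot_id'
                psql vegbien -c 'ALTER TABLE plot DROP CONSTRAINT plot_parent_id_fkey'
                psql vegbien -c 'ALTER TABLE plot ADD FOREIGN KEY (parent_plot_id) REFERENCES plot (plot_id)' # no CONSTRAINT <name>, so the name is regenerated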
    Creating a poster of the ERD:
        Determine the poster size:
            Measure the line height (from the bottom of one line to the
                bottom of the next): 16.3cm/24 lines = 0.679cm
            Measure the height of the ERD: 35.4cm*2 = 70.8cm
            Zoom in as far as possible
            Measure the height of a capital letter: 3.5mm
            Measure the line height: 8.5mm
            Calculate the text's fraction of the line height:
                3.5mm/8.5mm = 0.41
            Calculate the text height: 0.679cm*0.41 = 0.28cm
            Calculate the text height's fraction of the ERD height:
                0.28cm/70.8cm = 0.0040
            Measure the text height on the *VegBank* ERD poster: 5.5mm = 0.55cm
            Calculate the VegBIEN poster height needed to make the text the
                same size: 0.55cm/0.0040 = 137.5cm H; *1in/2.54cm = 54.1in H
            The ERD aspect ratio is 11in W x (2*8.5in H) = 11x17 portrait
            Calculate the VegBIEN poster width: 54.1in H*11W/17H = 35.0in W
            The minimum VegBIEN poster size is 35x54in portrait
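            In summary, the steps above compute (notation ours):
                \[ H_{poster} = \frac{h_{target\ text}}{h_{text}/H_{ERD}}
                    = \frac{0.55\,cm}{0.0040} = 137.5\,cm \]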
        Determine the cost:
            The FedEx Kinkos near NCEAS (1030 State St, Santa Barbara, CA
                93101) charges the following for posters:
                base: $7.25/sq ft
                lamination: $3/sq ft
                mounting on a board: $8/sq ft

Testing:
    On a development machine, you should put the following in your .profile:
        umask ug=rwx,o= # prevent files from becoming web-accessible
        export log= n=2
    For development machine specs, see /planning/resources/dev_machine.specs/
    On local machine:
        Mapping process: make test
            Including column-based import: make test by_col=1
                If the row-based and column-based imports produce different
                    inserted row counts, this usually means that a table is
                    underconstrained (the unique indexes don't cover all
                    possible rows). This can occur if you didn't use
                    COALESCE(field, null_value) around a nullable field in a
                    unique index. See sql_gen.null_sentinels for the
                    appropriate null value to use.
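                For example, a unique index that also covers rows where a
                    nullable column is NULL (table, column, and sentinel are
                    illustrative; use the value from sql_gen.null_sentinels):
                    psql vegbien -c 'CREATE UNIQUE INDEX plot_unique ON plot (project_id, (COALESCE(plot_name, $$\N$$)))'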
        Map spreadsheet generation: make remake
        Missing mappings: make missing_mappings
        Everything (for most complete coverage): make test-all

Debugging:
    "Binary chop" debugging:
        (This is primarily useful for regressions that occurred in a previous
            revision, which was committed without running all the tests)
        up -r <rev>; make inputs/.TNRS/reinstall; make schemas/public/reinstall; make <failed-test>.xml
    .htaccess:
        mod_rewrite:
            **IMPORTANT**: Whenever you change the DirectorySlash setting for
                a directory, you *must* clear your browser's cache to ensure
                that a cached redirect is not used. This is because
                RewriteRule redirects are (by default) temporary, but
                DirectorySlash redirects are permanent.
                For Firefox:
                    Press Cmd+Shift+Delete
                    Check only Cache
                    Press Enter or click Clear Now

WinMerge setup:
    In a Windows VM:
        Install WinMerge from <http://winmerge.org/>
        Open WinMerge
        Go to Edit > Options and click Compare in the left sidebar
        Enable "Moved block detection", as described at
            <http://manual.winmerge.org/Configuration.html#d0e5892>.
        Set Whitespace to Ignore change, as described at
            <http://manual.winmerge.org/Configuration.html#d0e5758>.

Documentation:
    To generate a Redmine-formatted list of steps for column-based import:
        On local machine:
            make schemas/public/reinstall
            make inputs/ACAD/Specimen/logs/steps.by_col.log.sql
    To import and scrub just the test taxonomic names:
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
        inputs/test_taxonomic_names/test_scrub

General:
    To see a program's description, read its top-of-file comment
    To see a program's usage, run it without arguments
    To remake a directory: make <dir>/remake
    To remake a file: make <file>-remake