Installation:
    Check out svn:
        svn co https://code.nceas.ucsb.edu/code/projects/bien/trunk bien
    cd bien/
    Install: make install
        **WARNING**: This will delete the public schema of your VegBIEN DB!
    Uninstall: make uninstall
        **WARNING**: This will delete your entire VegBIEN DB!
            This includes all archived imports and staging tables.

Connecting to vegbiendev:
    ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
    cd /home/bien # should happen automatically at login

Notes on system stability:
    **WARNING**: System upgrades can break key parts of the full-database
        import, causing errors such as disk space overruns. For this reason, it
        is recommended to maintain a snapshot copy of the VM as it was at the
        last successful import, for fallback use if a system upgrade breaks
        anything. System upgrades on the snapshot VM should be disabled
        completely, and because this will also disable security fixes, the
        snapshot VM should be disconnected from the internet and all networking
        interfaces. (This is an unfortunate consequence of modern OSes being
        written in non-memory-safe languages such as C and C++.)

Notes on running programs:
    **WARNING**: Always start with a clean shell, to avoid spurious bugs. The
        shell should not have changes to the env vars. (There have been bugs
        that went away after closing and reopening the terminal window.) Note
        that running `exec bash` is not sufficient to *reset* the env vars.

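One way to see why `exec bash` is not enough: exported env vars survive into child shells, but an env-cleared shell starts empty. A minimal sketch (the variable name FOO is hypothetical):

```shell
# exported vars survive a nested bash, so `exec bash` cannot reset them
export FOO=polluted
bash -c 'echo "nested bash sees: ${FOO:-<unset>}"'        # prints: nested bash sees: polluted
# an env-cleared shell starts without inherited env vars
env -i bash -c 'echo "clean shell sees: ${FOO:-<unset>}"' # prints: clean shell sees: <unset>
# for a usable clean interactive shell, open a new terminal window, or:
#   env -i HOME="$HOME" TERM="$TERM" bash -l
```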
Notes on editing files:
    **WARNING**: Shell scripts should always be read-only, so that editing them
        while an import is in progress will not crash the import (see
        http://vegpath.org/links/#**%20modifying%20a%20running%20shell%20script)

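To make a script read-only, clear its write bits; an editor must then save by writing a new file and renaming it into place, which leaves the inode the running shell is reading untouched. A sketch (bin/import_all stands in here for any shell script):

```shell
chmod a-w bin/import_all   # clear all write bits (user, group, other)
ls -l bin/import_all       # mode should now contain no "w", e.g. -r-xr-xr-x
```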
Single datasource import:
    ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
    (Re)import and scrub: make inputs/<datasrc>/reimport_scrub by_col=1 &
    (Re)import only: make inputs/<datasrc>/reimport by_col=1 &
    Note that these commands also work if the datasource is not yet imported
    Remake analytical DB: see Full database import > To remake analytical DB

Full database import:
    **WARNING**: You must perform *every single* step listed below, to avoid
        breaking column-based import
    **WARNING**: Always start with a clean shell, as described above under
        "Notes on running programs"
    **IMPORTANT**: The beginning of the import should be scheduled at a time
        when the DB will not be needed for other uses. This is necessary because
        vegbiendev will be slow for the first few hours of the import, due to
        the import using all the available cores.
    Do the steps under Maintenance > "to synchronize vegbiendev, jupiter, and
        your local machine"
    On local machine:
        make inputs/upload
        make inputs/upload live=1
        make test by_col=1 # runtime: 20 min ("4m46.108s" + ("21:50:43" - "21:37:09")) @starscream
            If you encounter errors, they are most likely related to the
                PostgreSQL error parsing in /lib/sql.py parse_exception()
            See note under Testing below
    ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
    Ensure there are no local modifications: svn st
    up
    make inputs/download
    make inputs/download live=1
    For each newly-uploaded datasource above: make inputs/<datasrc>/reinstall
    Update the auxiliary schemas: make schemas/reinstall
        **WARNING**: requires sudo access!
        The public schema will be installed separately by the import process
    Delete imports before the last, so they won't bloat the full DB backup:
        make backups/vegbien.<version>.backup/remove
        To keep a previous import other than the public schema:
            export dump_opts='--exclude-schema=public --exclude-schema=<version>'
                # env var will be inherited by `screen` shell
    Restart Postgres to free up any disk space used by temp tables from the
        last import (this is apparently not automatically reclaimed):
        make postgres_restart
    Make sure there is at least 1 TB of disk space on /: df -h
        Although the import schema itself is only 315 GB, Postgres uses
            significant temporary space at the beginning of the import.
            The total disk usage oscillates between 1.2 TB and the entire disk
            for the first day (for an import started @12:55:09, high-water
            marks of 1.7 TB @14:00:25 and 1.8 TB @15:38:32; then the next day,
            with 2 datasources running: the entire disk for 4 min @05:35:44,
            1.8 TB @11:15:05).
        To free up space, remove backups that have been archived on jupiter:
            List backups/ to view older backups
            Check their MD5 sums using the steps under On jupiter below
            Remove these backups
    For a full import:
        screen
        Press ENTER
    For a small import, use the above, or the following:
        $0 # nested shell to contain the env changes
    The following must happen within screen to avoid affecting the outer shell:
        unset TMOUT # TMOUT causes the shell to exit even with background processes
        set -o ignoreeof # prevent Ctrl+D from exiting the shell, to keep attached jobs
        On local machine:
            unset n # clear any limit set in .profile (unless desired)
            unset log # allow logging output to go to log files
        unset version # clear any version from the last import, etc.
        If no commits have been made since the last import (e.g. if retrying an
            import), set a custom version that differs from the auto-assigned
            one (which would otherwise cause a collision with the last import):
            svn info
            Extract the svn revision after "Revision:"
            export version=r[revision]_2 # +suffix to distinguish from last import
                # env var will be inherited by `screen` shell
        To import just a subset of the datasources:
            declare -ax inputs; inputs=(inputs/{src,...}/) # no () in declare on Mac
                # array vars are *not* inherited by the `screen` shell
            export version=custom_import_name
    Start column-based import: . bin/import_all
        To use row-based import: . bin/import_all by_col=
        To stop all running imports: . bin/stop_imports
        **WARNING**: Do NOT run import_all in the background, or the jobs it
            creates won't be owned by your shell.
        Note that import_all will take up to an hour to import the NCBI
            backbone and other metadata before returning control to the shell.
        To view progress:
            tail inputs/{.,}*/*/logs/$version.log.sql
        Note: At the beginning of the import, the system may send out CPU load
            warning e-mails. These can safely be ignored. (They happen because
            the parallel imports use all the available cores.)
        For a test import, turn off the DB backup (this also turns off
            analytical DB creation):
            kill % # cancel after_import()
    Wait (4 days) for the import to finish
    To recover from a closed terminal window: screen -r
    To restart an aborted import for a specific table:
        export version=<version>
        (set -o errexit; make inputs/<datasrc>/<table>/import_scrub by_col=1 continue=1; make inputs/<datasrc>/publish) &
        bin/after_import $! & # $! can also be obtained from `jobs -l`
    Get $version: echo $version
    Set $version in all vegbiendev terminals: export version=<version>
    When there are no more running jobs, exit `screen`: exit # not Ctrl+D
    Upload logs: make inputs/upload live=1
    On local machine: make inputs/download-logs live=1
    Check for disk space errors:
        grep --files-with-matches -F 'No space left on device' inputs/{.,}*/*/logs/$version.log.sql
        If there are any matches:
            Manually reimport these datasources using the steps under
                Single datasource import
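The matched datasources can also be collected in a loop. This sketch only prints the derived names so they can be reviewed before running the reimport command on each; it assumes the log path layout inputs/<datasrc>/<table>/logs/ used by the grep above:

```shell
for log in $(grep --files-with-matches -F 'No space left on device' \
        inputs/{.,}*/*/logs/$version.log.sql); do
    # logs live at inputs/<datasrc>/<table>/logs/<version>.log.sql,
    # so the datasource dir is three levels up from the log file
    datasrc=$(basename "$(dirname "$(dirname "$(dirname "$log")")")")
    echo "$datasrc"
    # then, for each: make inputs/"$datasrc"/reimport_scrub by_col=1 &
done
```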
            bin/after_import &
            Wait for the import to finish
    tail inputs/{.,}*/*/logs/$version.log.sql
    In the output, search for "Command exited with non-zero status"
    For inputs that have this, fix the associated bug(s)
    If many inputs have errors, discard the current (partial) import:
        make schemas/$version/uninstall
    Otherwise, continue
    In PostgreSQL:
        Go to wiki.vegpath.org/VegBIEN_contents
            Get the # observations
            Get the # datasources
            Get the # datasources with observations
        In the r# schema:
            Check that analytical_stem contains [# observations] rows
            Check that source contains [# datasources] rows up through XAL. If
                this is not the case, manually check the entries in source
                against the datasources list on the wiki page (some datasources
                may be near the end depending on import order).
            Check that provider_count contains [# datasources with
                observations] rows with dataset="(total)" (at the top when the
                table is unsorted)
    Check that TNRS ran successfully:
        tail -100 inputs/.TNRS/tnrs/logs/tnrs.make.log.sql
        If the log ends in an AssertionError
            "assert sql.table_col_names(db, table) == header":
            Figure out which TNRS CSV columns have changed
            On local machine:
                Make the changes in the DB's TNRS and public schemas
                rm=1 inputs/.TNRS/schema.sql.run export_
                make schemas/remake
                inputs/test_taxonomic_names/test_scrub # re-run TNRS
                rm=1 inputs/.TNRS/data.sql.run export_
                Commit
            ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
                If dropping a column, save the dependent views
                Make the same changes in the live TNRS.tnrs table on vegbiendev
                If dropping a column, recreate the dependent views
                Restart the TNRS client: make scrub by_col=1 &
    Publish the new import:
        **WARNING**: Before proceeding, be sure you have done *every single*
            verification step listed above. Otherwise, a previous valid import
            could incorrectly be overwritten with a broken one.
        make schemas/$version/publish # runtime: 1 min ("real 1m10.451s")
    unset version
    make backups/upload live=1
    On local machine:
        make backups/vegbien.$version.backup/download live=1
            # download the backup to the local machine
    ssh aaronmk@jupiter.nceas.ucsb.edu
    cd /data/dev/aaronmk/bien/backups
    For each newly-archived backup:
        make -s <backup>.md5/test
        Check that "OK" is printed next to the filename
    If desired, record the import times in inputs/import.stats.xls:
        On local machine:
            Open inputs/import.stats.xls
            If the rightmost import is within 5 columns of column IV:
                Copy the current tab to <leftmost-date>~<rightmost-date>
                Remove the previous imports from the current tab, because they
                    are now in the copied tab instead
                Insert a copy of the leftmost "By column" column group before it
            export version=<version>
            bin/import_date inputs/{.,}*/*/logs/$version.log.sql
                Update the import date in the upper-right corner
            bin/import_times inputs/{.,}*/*/logs/$version.log.sql
            Paste the output over the # Rows/Time columns, making sure that the
                row counts match up with the previous import's row counts
                If the row counts do not match up, insert or reorder rows as
                    needed until they do. Get the datasource names from the log
                    file footers:
                    tail inputs/{.,}*/*/logs/$version.log.sql
            Commit: svn ci -m 'inputs/import.stats.xls: updated import times'
    Running individual steps separately:
        To run TNRS:
            To use an import other than public: export version=<version>
            To rescrub all names:
                make inputs/.TNRS/reinstall
                Re-create the public-schema views that were cascadingly deleted
            make scrub &
            To view progress:
                tail -100 inputs/.TNRS/tnrs/logs/tnrs.make.log.sql
        To remake analytical DB:
            To use an import other than public: export version=<version>
            bin/make_analytical_db & # runtime: 13 h ("12:43:57elapsed")
            To view progress:
                tail -150 inputs/analytical_db/logs/make_analytical_db.log.sql
        To back up the DB (staging tables and last import):
            To use an import *other than public*: export version=<version>
            make backups/TNRS.backup-remake &
            dump_opts=--exclude-schema=public make backups/vegbien.$version.backup/test &
                If run after renaming the schema to public, instead set
                    dump_opts='' and replace $version with the appropriate
                    revision
            make backups/upload live=1

Datasource setup:
    On local machine:
    Example steps for a datasource: wiki.vegpath.org/Import_process_for_Madidi
    umask ug=rwx,o= # prevent files from becoming web-accessible
    Add a new datasource: make inputs/<datasrc>/add
        <datasrc> may not contain spaces, and should be abbreviated.
        If the datasource is a herbarium, <datasrc> should be the herbarium
            code as defined by the Index Herbariorum
            <http://sweetgum.nybg.org/ih/>
    For a new-style datasource (one containing a ./run runscript):
        "cp" -f inputs/.NCBI/{Makefile,run,table.run} inputs/<datasrc>/
    For MySQL inputs (exports and live DB connections):
        For .sql exports:
            Place the original .sql file in _src/ (*not* in _MySQL/)
            Follow the steps starting with Install the staging tables below.
                This is for an initial sync to get the file onto vegbiendev.
            ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
                Create a database for the MySQL export in phpMyAdmin
                Give the bien user all database-specific privileges *except*
                    UPDATE, DELETE, ALTER, DROP. This prevents bugs in the
                    import scripts from accidentally deleting data.
                bin/mysql_bien database <inputs/<datasrc>/_src/export.sql &
        mkdir inputs/<datasrc>/_MySQL/
        cp -p lib/MySQL.{data,schema}.sql.make inputs/<datasrc>/_MySQL/
        Edit _MySQL/*.make for the DB connection
            For a .sql export, use server=vegbiendev and --user=bien
        Skip the Add input data for each table section
    For MS Access databases:
        Place the .mdb or .accdb file in _src/
        Download and install Access To PostgreSQL from
            http://www.bullzip.com/download.php
        Use Access To PostgreSQL to export the database:
            Export just the tables/indexes to inputs/<datasrc>/<file>.schema.sql
            Export just the data to inputs/<datasrc>/<file>.data.sql
        In <file>.schema.sql, make the following changes:
            Replace the text "BOOLEAN" with "/*BOOLEAN*/INTEGER"
            Replace the text "DOUBLE PRECISION NULL" with "DOUBLE PRECISION"
        Skip the Add input data for each table section
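The two replacements can also be scripted. A sketch with sed (the path is a placeholder; `-i.orig` keeps the original file as <file>.schema.sql.orig for review, and works with both GNU and BSD sed):

```shell
sed -i.orig \
    -e 's|BOOLEAN|/*BOOLEAN*/INTEGER|g' \
    -e 's|DOUBLE PRECISION NULL|DOUBLE PRECISION|g' \
    inputs/<datasrc>/<file>.schema.sql
```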
    Add input data for each table present in the datasource:
        For .sql exports, you must use the name of the table in the DB export
        For CSV files, you can use any name. It's recommended to use a table
            name from
            <https://projects.nceas.ucsb.edu/nceas/projects/bien/wiki/VegCSV#Suggested-table-names>
        Note that if this table will be joined together with another table, its
            name must end in ".src"
        make inputs/<datasrc>/<table>/add
            Important: DO NOT just create an empty directory named <table>!
                This command also creates necessary subdirs, such as logs/.
        If the table is in a .sql export: make inputs/<datasrc>/<table>/install
        Otherwise, place the CSV(s) for the table in
            inputs/<datasrc>/<table>/ OR place a query joining other tables
            together in inputs/<datasrc>/<table>/create.sql
        Important: When exporting relational databases to CSVs, you MUST ensure
            that embedded quotes are escaped by doubling them, *not* by
            preceding them with a "\" as is the default in phpMyAdmin
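If a CSV was already exported with backslash-escaped quotes, a naive conversion looks like the following; note that it mishandles literal backslashes (\\), so re-exporting with proper doubled-quote escaping is safer. The filenames are placeholders:

```shell
# convert \" (backslash-escaped quotes) to "" (doubled quotes)
# **WARNING**: does not handle literal \\ sequences; prefer re-exporting
sed 's/\\"/""/g' export.csv > export.fixed.csv
```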
        If there are multiple part files for a table, and the header is
            repeated in each part, make sure each header is EXACTLY the same.
            (If the headers are not the same, the CSV concatenation script
            assumes the part files don't have individual headers and treats the
            subsequent headers as data rows.)
        Add <table> to inputs/<datasrc>/import_order.txt before other tables
            that depend on it
        For a new-style datasource:
            "cp" -f inputs/.NCBI/nodes/run inputs/<datasrc>/<table>/
            inputs/<datasrc>/<table>/run
    Install the staging tables:
        make inputs/<datasrc>/reinstall quiet=1 &
            For a MySQL .sql export:
                At the prompt "[you]@vegbiendev's password:", enter your
                    password
                At the prompt "Enter password:", enter the value in
                    config/bien_password
        To view progress: tail -f inputs/<datasrc>/<table>/logs/install.log.sql
        View the logs: tail -n +1 inputs/<datasrc>/*/logs/install.log.sql
            tail provides a header line with the filename
            +1 starts at the first line, to show the whole file
        For every file with an error 'column "..." specified more than once':
            Add a header override file "+header.<ext>" in <table>/:
                Note: The leading "+" should sort it before the flat files.
                    "_" unfortunately sorts *after* capital letters in ASCII.
                Create a text file containing the header line of the flat files
                Add an ! at the beginning of the line
                    This signals cat_csv that this is a header override.
                For empty names, use their 0-based column # (by convention)
                For duplicate names, add a distinguishing suffix
                For long names that collided, rename them to <= 63 chars long
                Do NOT make readability changes in this step; that is what the
                    map spreadsheets (below) are for.
                Save
        If you made any changes, re-run the install command above
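A hypothetical override for .csv flat files whose header has an unnamed 3rd column (0-based # 2) and a duplicate "name" column might be created like this (column names and path are made up for illustration):

```shell
# the leading ! marks this line as a header override for cat_csv
printf '!id,name,2,name_2\n' > inputs/<datasrc>/<table>/+header.csv
```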
    Auto-create the map spreadsheets: make inputs/<datasrc>/
    Map each table's columns:
        In each <table>/ subdir, for each "via map" map.csv:
            Open the map in a spreadsheet editor
            Open the "core map" /mappings/Veg+-VegBIEN.csv
            In each row of the via map, set the right column to a value from
                the left column of the core map
            Save
            Regenerate the derived maps: make inputs/<datasrc>/
    Accept the test cases:
        For a new-style datasource:
            inputs/<datasrc>/run
            svn di inputs/<datasrc>/*/test.xml.ref
            If you get errors, follow the steps for old-style datasources below
        For an old-style datasource:
            make inputs/<datasrc>/test
                When prompted to "Accept new test output", enter y and press
                    ENTER
                If you instead get errors, do one of the following for each
                    one:
                    - If the error was due to a bug, fix it
                    - Add a SQL function that filters or transforms the invalid
                        data
                    - Make an empty mapping for the columns that produced the
                        error. Put something in the Comments column of the map
                        spreadsheet to prevent the automatic mapper from
                        auto-removing the mapping.
                When accepting tests, it's helpful to use WinMerge
                    (see WinMerge setup below for configuration)
            make inputs/<datasrc>/test by_col=1
                If you get errors this time, this always indicates a bug,
                    usually in the VegBIEN unique constraints or in
                    column-based import itself
    Add newly-created files: make inputs/<datasrc>/add
    Commit: svn ci -m "Added inputs/<datasrc>/" inputs/<datasrc>/
    Update vegbiendev:
        ssh aaronmk@jupiter.nceas.ucsb.edu
            up
        On local machine:
            ./fix_perms
            make inputs/upload
            make inputs/upload live=1
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
            up
            make inputs/download
            make inputs/download live=1
            Follow the steps under Install the staging tables above

Maintenance:
    On a live machine, you should put the following in your .profile:
--
# make svn files web-accessible. this does not affect unversioned files, because
# these get the right permissions on the local machine instead.
umask ug=rwx,o=rx

unset TMOUT # TMOUT causes screen to exit even with background processes
--
    If http://vegbiendev.nceas.ucsb.edu/phppgadmin/ goes down:
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
        make phppgadmin-Linux
    Regularly re-run the full-database import, so that bugs in it don't pile
        up. It needs to be kept in working order so that it works when it's
        needed.
    To back up the vegbiendev databases:
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
        Back up MySQL: # usually few changes, so do this first
            live= backups/mysql_snapshot
                Then review the diff, and rerun without `live=`
            l=1 overwrite=1 inplace=1 local_dir=/ remote_url="$USER@jupiter:/data/dev/aaronmk/Documents/BIEN/" subpath=/var/lib/mysql.bak/ sudo -E env PATH="$PATH" bin/sync_upload
            On local machine:
                l=1 swap=1 overwrite=1 inplace=1 local_dir=~ sync_remote_subdir= subpath=~/Documents/BIEN/var/lib/mysql.bak/ bin/sync_upload
        Back up Postgres:
            live= backups/pg_snapshot
                Then review the diff, and rerun without `live=`
    To synchronize vegbiendev, jupiter, and your local machine:
        **WARNING**: Pay careful attention to all files that will be deleted or
            overwritten!
        Install put if needed:
            Download https://uutils.googlecode.com/svn/trunk/bin/put to ~/bin/ and `chmod +x` it
        When changes are made on vegbiendev:
            Avoid extraneous diffs when rsyncing:
                On all machines:
                    up
                    ./fix_perms
            ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
                Upload:
                    overwrite=1 bin/sync_upload --size-only
                        Then review the diff, and rerun with `l=1` prepended
            On your machine:
                Download:
                    overwrite=1 swap=1 src=. dest='aaronmk@jupiter.nceas.ucsb.edu:~/bien' put --exclude=.svn web/BIEN3/TWiki
                        Then review the diff, and rerun with `l=1` prepended
                    swap=1 bin/sync_upload backups/TNRS.backup
                        Then review the diff, and rerun with `l=1` prepended
                    overwrite=1 swap=1 bin/sync_upload --size-only
                        Then review the diff, and rerun with `l=1` prepended
                    overwrite=1 sync_remote_url=~/Dropbox/svn/ bin/sync_upload --existing --size-only # just update mtimes/perms
                        Then review the diff, and rerun with `l=1` prepended
    To back up e-mails:
        On local machine:
            /Applications/gmvault-v1.8.1-beta/bin/gmvault sync --multiple-db-owner --type quick aaronmk.nceas@gmail.com
            /Applications/gmvault-v1.8.1-beta/bin/gmvault sync --multiple-db-owner --type quick aaronmk@nceas.ucsb.edu
            Open Thunderbird
            Click the All Mail folder for each account and wait for it to
                download the e-mails in it
    To back up the version history:
        # back up first on the local machine, because often only the svnsync
        # command gets run, and that way it will get backed up immediately to
        # Dropbox (and hourly to Time Machine), while vegbiendev only gets
        # backed up daily to tape
        On local machine:
            svnsync sync file://"$HOME"/Dropbox/docs/BIEN/svn_repo/ # initial runtime: 1.5 h ("08:21:38" - "06:45:26") @vegbiendev
            (cd ~/Dropbox/docs/BIEN/git/; git svn fetch)
            overwrite=1 src=~ dest='aaronmk@jupiter.nceas.ucsb.edu:/data/dev/aaronmk/' put Dropbox/docs/BIEN/svn_repo/ # runtime: 1 min ("1:05.08")
                Then review the diff, and rerun with `l=1` prepended
            overwrite=1 src=~ dest='aaronmk@jupiter.nceas.ucsb.edu:/data/dev/aaronmk/' put Dropbox/docs/BIEN/git/
                Then review the diff, and rerun with `l=1` prepended
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
            # use an absolute path for vegbiendev commands, because the Ubuntu
            # 14.04 version of rsync doesn't expand ~ properly
            overwrite=1 swap=1 src=~ dest='aaronmk@jupiter.nceas.ucsb.edu:/data/dev/aaronmk/' put Dropbox/docs/BIEN/svn_repo/ # runtime: 30 s ("36.19")
                Then review the diff, and rerun with `l=1` prepended
            overwrite=1 swap=1 src=~ dest='aaronmk@jupiter.nceas.ucsb.edu:/data/dev/aaronmk/' put Dropbox/docs/BIEN/git/
                Then review the diff, and rerun with `l=1` prepended
    To synchronize a Mac's settings with my testing machine's:
        Download:
            **WARNING**: This will overwrite all your user's settings!
            On your machine:
                overwrite=1 swap=1 sync_local_dir=~/Library/ sync_remote_subdir=Library/ bin/sync_upload --exclude="/Saved Application State"
                    Then review the diff, and rerun with `l=1` prepended
        Upload:
            Do the step under When changes are made on vegbiendev > On your
                machine > Download
            ssh aaronmk@jupiter.nceas.ucsb.edu
                (cd ~/Dropbox/svn/; up)
            On your machine:
                rm ~/'Library/Thunderbird/Profiles/9oo8rcyn.default/ImapMail/imap.googlemail.com/[Gmail].sbd/Spam'
                    # remove the downloaded Spam folder, because spam e-mails often contain viruses that would trigger clamscan
                overwrite=1 del= sync_local_dir=~/Dropbox/svn/ sync_remote_subdir=Dropbox/svn/ bin/sync_upload --size-only # just update mtimes
                    Then review the diff, and rerun with `l=1` prepended
                overwrite=1 inplace=1 sync_local_dir=~ sync_remote_subdir= bin/sync_upload ~/"VirtualBox VMs/**" # need inplace=1 because they are very large files
                    Then review the diff, and rerun with `l=1` prepended
                overwrite=1 sync_local_dir=~ sync_remote_subdir= bin/sync_upload --exclude="/Library/Saved Application State" --exclude="/Library/Thunderbird/Profiles/9oo8rcyn.default/global-messages-db.sqlite" --exclude="/.Trash" --exclude="/bin" --exclude="/bin/pg_ctl" --exclude="/bin/unzip" --exclude="/Dropbox/home" --exclude="/.profile" --exclude="/.shrc" --exclude="/.bashrc" --exclude="/software/**/.svn"
                    Then review the diff, and rerun with `l=1` prepended
                Stop Dropbox: system tray > Dropbox icon > gear icon > Quit Dropbox
                    This prevents Dropbox from trying to capture filesystem
                        events while syncing
                overwrite=1 sync_local_dir=~ sync_remote_url=~/Dropbox/home bin/sync_upload --exclude="/Library/Saved Application State" --exclude="/Library/Thunderbird/Profiles/9oo8rcyn.default/global-messages-db.sqlite" --exclude="/.Trash" --exclude="/.dropbox" --exclude="/Documents/BIEN" --exclude="/Dropbox" --exclude="/software" --exclude="/VirtualBox VMs/**.sav" --exclude="/VirtualBox VMs/**.vdi" --exclude="/VirtualBox VMs/**.vmdk"
                    Then review the diff, and rerun with `l=1` prepended
                Start Dropbox: /Applications > double-click Dropbox.app
    To back up files not in Time Machine:
        On local machine:
            overwrite=1 src=/ dest=/Volumes/Time\ Machine\ Backups/ sudo -E put usr/local/var/postgres
                Then review the diff, and rerun with `l=1` prepended
            launchctl unload ~/Library/LaunchAgents/homebrew.mxcl.postgresql.plist # stop the PostgreSQL server
            overwrite=1 src=/ dest=/Volumes/Time\ Machine\ Backups/ sudo -E put usr/local/var/postgres
                Then review the diff, and rerun with `l=1` prepended
            launchctl load ~/Library/LaunchAgents/homebrew.mxcl.postgresql.plist # start the PostgreSQL server
    VegCore data dictionary:
        Regularly, or whenever the VegCore data dictionary page
            (https://projects.nceas.ucsb.edu/nceas/projects/bien/wiki/VegCore)
            is changed, regenerate mappings/VegCore.csv:
            On local machine:
                make mappings/VegCore.htm-remake; make mappings/
                Apply new data dict mappings to datasource mappings/staging
                    tables:
                    inputs/run postprocess # runtime: see inputs/run
                    time yes|make inputs/{NVS,SALVIAS,TEAM}/test # old-style import; runtime: 1 min ("0m59.692s") @starscream
                svn di mappings/VegCore.tables.redmine
                    If there are changes, update the data dictionary's Tables
                        section
                When moving terms, check that no terms were lost: svn di
                svn ci -m 'mappings/VegCore.htm: regenerated from wiki'
            ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
                Perform the steps under "apply new data dict mappings to
                    datasource mappings/staging tables" above
    Important: Whenever you install a system update that affects PostgreSQL or
        any of its dependencies, such as libc, you should restart the
        PostgreSQL server. Otherwise, you may get strange errors like "the
        database system is in recovery mode" which go away upon reimport, or
        you may not be able to access the database as the postgres superuser.
        This applies to both Linux and Mac OS X.

Backups:
    Archived imports:
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
        Back up: make backups/<version>.backup &
            Note: To back up the last import, you must archive it first:
                make schemas/rotate
        Test: make -s backups/<version>.backup/test &
        Restore: make backups/<version>.backup/restore &
        Remove: make backups/<version>.backup/remove
        Download: make backups/<version>.backup/download
    TNRS cache:
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
        Back up: make backups/TNRS.backup-remake &
            runtime: 3 min ("real 2m48.859s")
        Restore:
            yes|make inputs/.TNRS/uninstall
            make backups/TNRS.backup/restore &
                runtime: 5.5 min ("real 5m35.829s")
            yes|make schemas/public/reinstall
                Must come after the TNRS restore, to recreate the
                    tnrs_input_name view
    Full DB:
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
        Back up: make backups/vegbien.<version>.backup &
        Test: make -s backups/vegbien.<version>.backup/test &
        Restore: make backups/vegbien.<version>.backup/restore &
        Download: make backups/vegbien.<version>.backup/download
    Import logs:
        On local machine:
            Download: make inputs/download-logs live=1

Datasource refreshing:
    VegBank:
        ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
        make inputs/VegBank/vegbank.sql-remake
        make inputs/VegBank/reinstall quiet=1 &

531
|
Schema changes:
|
532
|
On local machine:
|
533
|
When changing the analytical views, run sync_analytical_..._to_view()
|
534
|
to update the corresponding table
|
535
|
Remember to update the following files with any renamings:
|
536
|
schemas/filter_ERD.csv
|
537
|
mappings/VegCore-VegBIEN.csv
|
538
|
mappings/verify.*.sql
|
539
|
Regenerate schema from installed DB: make schemas/remake
|
540
|
Reinstall DB from schema: make schemas/public/reinstall schemas/reinstall
|
541
|
**WARNING**: This will delete the public schema of your VegBIEN DB!
|
542
|
If needed, reinstall staging tables:
|
543
|
On local machine:
|
544
|
sudo -E -u postgres psql <<<'ALTER DATABASE vegbien RENAME TO vegbien_prev'
|
545
|
make db
|
546
|
. bin/reinstall_all
|
547
|
Fix any bugs and retry until no errors
|
548
|
make schemas/public/install
|
549
|
This must be run *after* the datasources are installed, because
|
550
|
views in public depend on some of the datasources
|
551
|
sudo -E -u postgres psql <<<'DROP DATABASE vegbien_prev'
|
552
|
ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
|
553
|
repeat the above steps
|
554
|
**WARNING**: Do not run this until reinstall_all runs successfully
|
555
|
on the local machine, or the live DB may be unrestorable!
	update mappings and staging table column names:
		on local machine:
			inputs/run postprocess # runtime: see inputs/run
			time yes|make inputs/{NVS,SALVIAS,TEAM}/test # old-style import; runtime: 1 min ("0m59.692s") @starscream
		ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
			manually apply schema changes to the live public schema
			do steps under "on local machine" above
	Sync ERD with vegbien.sql schema:
		Run make schemas/vegbien.my.sql
		Open schemas/vegbien.ERD.mwb in MySQLWorkbench
		Go to File > Export > Synchronize With SQL CREATE Script...
		For Input File, select schemas/vegbien.my.sql
		Click Continue
		In the changes list, select each table with an arrow next to it
		Click Update Model
		Click Continue
		Note: The generated SQL script will be empty because we are syncing in
			the opposite direction
		Click Execute
		Reposition any lines that have been reset
		Add any new tables by dragging them from the Catalog in the left sidebar
			to the diagram
		Remove any deleted tables by right-clicking the table's diagram element,
			selecting Delete '<table name>', and clicking Delete
		Save
		If desired, update the graphical ERD exports (see below)
	Update graphical ERD exports:
		Go to File > Export > Export as PNG...
			Select schemas/vegbien.ERD.png and click Save
		Go to File > Export > Export as SVG...
			Select schemas/vegbien.ERD.svg and click Save
		Go to File > Export > Export as Single Page PDF...
			Select schemas/vegbien.ERD.1_pg.pdf and click Save
		Go to File > Print...
			In the lower left corner, click PDF > Save as PDF...
			Set the Title and Author to ""
			Select schemas/vegbien.ERD.pdf and click Save
		Commit: svn ci -m "schemas/vegbien.ERD.mwb: Regenerated exports"
	Refactoring tips:
		To rename a table:
			In vegbien.sql, do the following:
				Replace regexp (?<=_|\b)<old>(?=_|\b) with <new>
					This is necessary because the table name is *everywhere*
				Search for <new>
				Manually change back any replacements inside comments
		To rename a column:
			Rename the column: ALTER TABLE <table> RENAME <old> TO <new>;
			Recreate any foreign key for the column, removing CONSTRAINT <name>
				This resets the foreign key name using the new column name
	Creating a poster of the ERD:
		Determine the poster size:
			Measure the line height (from the bottom of one line to the bottom
				of another): 16.3cm/24 lines = 0.679cm
			Measure the height of the ERD: 35.4cm*2 = 70.8cm
			Zoom in as far as possible
			Measure the height of a capital letter: 3.5mm
			Measure the line height: 8.5mm
			Calculate the text's fraction of the line height: 3.5mm/8.5mm = 0.41
			Calculate the text height: 0.679cm*0.41 = 0.28cm
			Calculate the text height's fraction of the ERD height:
				0.28cm/70.8cm = 0.0040
			Measure the text height on the *VegBank* ERD poster: 5.5mm = 0.55cm
			Calculate the VegBIEN poster height to make the text the same size:
				0.55cm/0.0040 = 137.5cm H; *1in/2.54cm = 54.1in H
			The ERD aspect ratio is 11in W x (2*8.5in H) = 11x17 portrait
			Calculate the VegBIEN poster width: 54.1in H * 11W/17H = 35.0in W
			The minimum VegBIEN poster size is 35x54in portrait
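The size derivation above can be double-checked with a few lines of awk, using the same measurements. Carrying full precision through (instead of the rounded 0.0040 fraction) gives about 54.8in x 35.5in, consistent with the 35x54in minimum above:

```shell
# recompute the poster dimensions from the measurements above
awk 'BEGIN {
    line_h    = 16.3/24             # cm per line of the ERD
    text_frac = 3.5/8.5             # capital height / line height
    text_h    = line_h * text_frac  # cm
    erd_frac  = text_h/70.8         # text height as a fraction of ERD height
    poster_h  = 0.55/erd_frac       # cm needed to match the VegBank poster
    h_in      = poster_h/2.54
    w_in      = h_in*11/17          # 11x17 portrait aspect ratio
    printf "%.1fin H x %.1fin W\n", h_in, w_in
}'
# -> 54.8in H x 35.5in W
```
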
		Determine the cost:
			The FedEx Kinkos near NCEAS (1030 State St, Santa Barbara, CA 93101)
				charges the following for posters:
				base: $7.25/sq ft
				lamination: $3/sq ft
				mounting on a board: $8/sq ft

Testing:
	On a development machine, you should put the following in your .profile:
		umask ug=rwx,o= # prevent files from becoming web-accessible
		export log= n=2
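The umask line can be verified in a throwaway shell: ug=rwx,o= keeps all user/group permission bits and masks out all world (other) bits, i.e. octal mask 0007, so newly created files are not web-accessible:

```shell
# verify what the .profile umask line does
umask ug=rwx,o=
umask   # prints 0007: new files and dirs get no world permissions
```
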
	For development machine specs, see /planning/resources/dev_machine.specs/
	On local machine:
		Mapping process: make test
			Including column-based import: make test by_col=1
			If the row-based and column-based imports produce different
				inserted row counts, this usually means that a table is
				underconstrained (the unique indexes don't cover all possible
				rows). This can occur if you didn't use
				COALESCE(field, null_value) around a nullable field in a
				unique index. See sql_gen.null_sentinels for the appropriate
				null value to use.
		Map spreadsheet generation: make remake
		Missing mappings: make missing_mappings
		Everything (for most complete coverage): make test-all

Debugging:
	"Binary chop" debugging:
		(This is primarily useful for regressions that occurred in a previous
			revision, which was committed without running all the tests)
		svn up -r <rev>; make inputs/.TNRS/reinstall; make schemas/public/reinstall; make <failed-test>.xml
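The chop itself can be automated. A self-contained sketch of the search loop, with a stub works function standing in for the svn up/make test cycle above (the revision numbers are made up for illustration; replace the stub with the real commands):

```shell
# binary-search for the first broken revision in (good, bad]
# `works` is a stub; in practice it would run something like:
#   svn up -r "$1" && make inputs/.TNRS/reinstall && \
#   make schemas/public/reinstall && make <failed-test>.xml
works() { test "$1" -lt 1234; }  # stub: pretend revisions >= 1234 are broken

good=1000 bad=2000  # last known-good and first known-bad revisions
while [ $((bad - good)) -gt 1 ]; do
    mid=$(( (good + bad) / 2 ))
    if works "$mid"; then good=$mid; else bad=$mid; fi
done
echo "first broken revision: $bad"
```
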
	.htaccess:
		mod_rewrite:
			**IMPORTANT**: whenever you change the DirectorySlash setting for a
				directory, you *must* clear your browser's cache to ensure that
				a cached redirect is not used. this is because RewriteRule
				redirects are (by default) temporary, but DirectorySlash
				redirects are permanent.
				for Firefox:
					press Cmd+Shift+Delete
					check only Cache
					press Enter or click Clear Now
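For reference, the difference can be seen in a minimal hypothetical .htaccess: mod_dir's DirectorySlash trailing-slash redirect is issued as 301 (permanent, so browsers cache it), while a RewriteRule redirect defaults to 302 (temporary) unless R=301 is given explicitly:

```apache
# hypothetical .htaccess illustrating the two redirect kinds
DirectorySlash On   # /dir -> /dir/ is a 301 (cached by browsers)
RewriteEngine On
RewriteRule ^old$ /new [R,L]          # R alone = 302, temporary
RewriteRule ^gone$ /moved [R=301,L]   # explicit permanent redirect
```
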

WinMerge setup:
	In a Windows VM:
		Install WinMerge from <http://winmerge.org/>
		Open WinMerge
		Go to Edit > Options and click Compare in the left sidebar
		Enable "Moved block detection", as described at
			<http://manual.winmerge.org/Configuration.html#d0e5892>.
		Set Whitespace to Ignore change, as described at
			<http://manual.winmerge.org/Configuration.html#d0e5758>.

Documentation:
	To generate a Redmine-formatted list of steps for column-based import:
		On local machine:
			make schemas/public/reinstall
			make inputs/ACAD/Specimen/logs/steps.by_col.log.sql
	To import and scrub just the test taxonomic names:
		ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
		inputs/test_taxonomic_names/test_scrub

General:
	To see a program's description, read its top-of-file comment
	To see a program's usage, run it without arguments
	To remake a directory: make <dir>/remake
	To remake a file: make <file>-remake