Installation:
	Check out svn: svn co https://code.nceas.ucsb.edu/code/projects/bien
	cd bien/
	Install: make install
		**WARNING**: This will delete the public schema of your VegBIEN DB!
	Uninstall: make uninstall
		**WARNING**: This will delete your entire VegBIEN DB!
		This includes all archived imports and staging tables.

Connecting to vegbiendev:
	ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
	cd /home/bien/svn # should happen automatically at login

Notes on system stability:
	**WARNING**: system upgrades can break key parts of the full-database
		import, causing errors such as disk space overruns. for this reason, it
		is recommended to maintain a snapshot copy of the VM as it was at the
		last successful import, for fallback use if a system upgrade breaks
		anything. system upgrades on the snapshot VM should be disabled
		completely, and because this will also disable security fixes, the
		snapshot VM should be disconnected from the internet and all networking
		interfaces. (this is an unfortunate consequence of modern OSes being
		written in non-memory-safe languages such as C and C++.)

Notes on running programs:
	**WARNING**: always start with a clean shell, to avoid spurious bugs. the
		shell should not have changes to the env vars. (there have been bugs
		that went away after closing and reopening the terminal window.) note
		that running `exec bash` is not sufficient to *reset* the env vars.
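	one way to get a clean shell (a sketch, not a project script; `env -i`
		clears the environment, and which vars to preserve is a judgment call):
--
exec env -i HOME="$HOME" TERM="$TERM" USER="$USER" bash -l
--
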
Notes on editing files:
	**WARNING**: shell scripts should always be read-only, so that editing them
		while an import is in progress will not crash the import (see
		http://vegpath.org/links/#**%20modifying%20a%20running%20shell%20script)
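	to make a script read-only before an import (a sketch; bin/import_all is
		just an example target):
--
chmod a-w bin/import_all # remove write permission; chmod u+w to edit again
--
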
Single datasource import:
	ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
	(Re)import and scrub: make inputs/<datasrc>/reimport_scrub by_col=1 &
	(Re)import only: make inputs/<datasrc>/reimport by_col=1 &
	Note that these commands also work if the datasource is not yet imported
	Remake analytical DB: see Full database import > To remake analytical DB
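	for example, with the ACAD datasource (any installed <datasrc> works the
		same way):
--
ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
make inputs/ACAD/reimport_scrub by_col=1 &
tail inputs/ACAD/*/logs/*.log.sql # check progress; the log name varies by run
--
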
Full database import:
	**WARNING**: You must perform *every single* step listed below, to avoid
		breaking column-based import
	**WARNING**: always start with a clean shell, as described above under
		"Notes on running programs"
	**IMPORTANT**: the beginning of the import should be scheduled at a time
		when the DB will not be needed for other uses. this is necessary because
		vegbiendev will be slow for the first few hours of the import, due to
		the import using all the available cores.
	do steps under Maintenance > "to synchronize vegbiendev, jupiter, and
		your local machine"
	On local machine:
		make inputs/upload
		make inputs/upload live=1
		make test by_col=1 # runtime: 20 min ("4m46.108s" + ("21:50:43" - "21:37:09")) @starscream
			if you encounter errors, they are most likely related to the
				PostgreSQL error parsing in /lib/sql.py parse_exception()
			See note under Testing below
	ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
	Ensure there are no local modifications: svn st
	up
	make inputs/download
	make inputs/download live=1
	For each newly-uploaded datasource above: make inputs/<datasrc>/reinstall
	Update the auxiliary schemas: make schemas/reinstall
		**WARNING**: requires sudo access!
		The public schema will be installed separately by the import process
	Delete imports before the last so they won't bloat the full DB backup:
		make backups/vegbien.<version>.backup/remove
		To keep a previous import other than the public schema:
			export dump_opts='--exclude-schema=public --exclude-schema=<version>'
			# env var will be inherited by `screen` shell
	restart Postgres to free up any disk space used by temp tables from the last
		import (this is apparently not automatically reclaimed):
		make postgres_restart
	Make sure there is at least 1 TB of disk space on /: df -h
		**WARNING**: sometimes, this amount of available space is insufficient
			and the entire disk space gets used up, crashing the import. if this
			occurs, the problem will often be fixed just by rerunning the import
			again. (the high-water mark varies by import.)
		although the import schema itself is only 315 GB, Postgres uses
			significant temporary space at the beginning of the import.
			the total disk usage oscillates between 1.2 TB and the entire disk
			for the first day (for an import started @12:55:09, high-water marks
			of 1.7 TB @14:00:25, 1.8 TB @15:38:32; then the next day w/ 2
			datasources running: entire disk for 4 min @05:35:44, 1.8 TB @11:15:05).
		To free up space, remove backups that have been archived on jupiter:
			List backups/ to view older backups
			Check their MD5 sums using the steps under On jupiter below
			Remove these backups
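		to watch disk usage during the import (a sketch):
--
while sleep 600; do df -h / | tail -n1; done # print root-disk usage every 10 min
--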
	for a full import:
		screen
		Press ENTER
	for a small import, use the above, or the following:
		$0 # nested shell to contain the env changes
	the following must happen within screen to avoid affecting the outer shell:
	unset TMOUT # TMOUT causes screen to exit even with background processes
	set -o ignoreeof # prevent Ctrl+D from exiting `screen`, to keep attached jobs
	unset n # on local machine, clear any limit set in .profile (unless desired)
	unset version # clear any version from the last import, etc.
	if no commits have been made since the last import (e.g. if retrying an
		import), set a custom version that differs from the auto-assigned one
		(which would otherwise collide with the last import):
		svn info
		extract the svn revision after "Revision:"
		export version=r[revision]_2 # +suffix to distinguish from last import
			# env var will be inherited by `screen` shell
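		the extraction can be scripted (a sketch; assumes `svn info` prints a
			"Revision: <number>" line):
--
revision="$(svn info | sed -n 's/^Revision: //p')"
export version="r${revision}_2" # env var will be inherited by `screen` shell
--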
	to import just a subset of the datasources:
		declare -ax inputs=(inputs/{src,...}/)
			# array vars *not* inherited by `screen` shell
		export version=custom_import_name
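		for example, to import just two datasources (names are illustrative):
--
declare -ax inputs=(inputs/{ACAD,NVS}/)
export version=custom_import_name
. bin/import_all
--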
	Start column-based import: . bin/import_all
		To use row-based import: . bin/import_all by_col=
		To stop all running imports: . bin/stop_imports
		**WARNING**: Do NOT run import_all in the background, or the jobs it
			creates won't be owned by your shell.
		Note that import_all will take up to an hour to import the NCBI backbone
			and other metadata before returning control to the shell.
		To view progress:
			tail inputs/{.,}*/*/logs/$version.log.sql
	note: at the beginning of the import, the system may send out CPU load
		warning e-mails. these can safely be ignored. (they happen because the
		parallel imports use all the available cores.)
	Wait (4 days) for the import to finish
	To recover from a closed terminal window: screen -r
	To restart an aborted import for a specific table:
		export version=<version>
		(set -o errexit; make inputs/<datasrc>/<table>/import_scrub by_col=1 continue=1; make inputs/<datasrc>/publish) &
		bin/after_import $! & # $! can also be obtained from `jobs -l`
	Get $version: echo $version
	Set $version in all vegbiendev terminals: export version=<version>
	When there are no more running jobs, exit `screen`: exit # not Ctrl+D
	upload logs: make inputs/upload live=1
	On local machine: make inputs/download-logs live=1
	check for disk space errors:
		grep --files-with-matches -F 'No space left on device' inputs/{.,}*/*/logs/$version.log.sql
		if there are any matches:
			manually reimport these datasources using the steps under
				Single datasource import
			bin/after_import &
			wait for the import to finish
	tail inputs/{.,}*/*/logs/$version.log.sql
	In the output, search for "Command exited with non-zero status"
	For inputs that have this, fix the associated bug(s)
	If many inputs have errors, discard the current (partial) import:
		make schemas/$version/uninstall
	Otherwise, continue
	In PostgreSQL:
		Go to wiki.vegpath.org/VegBIEN_contents
		Get the # observations
		Get the # datasources
		Get the # datasources with observations
		in the r# schema:
		Check that analytical_stem contains [# observations] rows
		Check that source contains [# datasources] rows up through XAL. If this
			is not the case, manually check the entries in source against the
			datasources list on the wiki page (some datasources may be near the
			end depending on import order).
		Check that provider_count contains [# datasources with observations]
			rows with dataset="(total)" (at the top when the table is unsorted)
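		These counts can also be checked from the shell (a sketch; the r#
			schema name and connection settings are illustrative):
--
psql vegbien -c 'SELECT count(*) FROM "r12345".analytical_stem'
psql vegbien -c 'SELECT count(*) FROM "r12345".source'
psql vegbien -c "SELECT count(*) FROM \"r12345\".provider_count WHERE dataset = '(total)'"
--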
	Check that TNRS ran successfully:
		tail -100 inputs/.TNRS/tnrs/logs/tnrs.make.log.sql
		If the log ends in an AssertionError
			"assert sql.table_col_names(db, table) == header":
			Figure out which TNRS CSV columns have changed
			On local machine:
				Make the changes in the DB's TNRS and public schemas
				rm=1 inputs/.TNRS/schema.sql.run export_
				make schemas/remake
				inputs/test_taxonomic_names/test_scrub # re-run TNRS
				rm=1 inputs/.TNRS/data.sql.run export_
				Commit
			ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
				If dropping a column, save the dependent views
				Make the same changes in the live TNRS.tnrs table on vegbiendev
				If dropping a column, recreate the dependent views
				Restart the TNRS client: make scrub by_col=1 &
	Publish the new import:
		**WARNING**: Before proceeding, be sure you have done *every single*
			verification step listed above. Otherwise, a previous valid import
			could incorrectly be overwritten with a broken one.
		make schemas/$version/publish # runtime: 1 min ("real 1m10.451s")
	unset version
	make backups/upload live=1
	on local machine:
		make backups/vegbien.$version.backup/download live=1
			# download backup to local machine
	ssh aaronmk@jupiter.nceas.ucsb.edu
		cd /data/dev/aaronmk/bien/backups
		For each newly-archived backup:
			make -s <backup>.md5/test
			Check that "OK" is printed next to the filename
	If desired, record the import times in inputs/import.stats.xls:
		On local machine:
		Open inputs/import.stats.xls
		If the rightmost import is within 5 columns of column IV:
			Copy the current tab to <leftmost-date>~<rightmost-date>
			Remove the previous imports from the current tab because they are
				now in the copied tab instead
		Insert a copy of the leftmost "By column" column group before it
		export version=<version>
		bin/import_date inputs/{.,}*/*/logs/$version.log.sql
		Update the import date in the upper-right corner
		bin/import_times inputs/{.,}*/*/logs/$version.log.sql
		Paste the output over the # Rows/Time columns, making sure that the
			row counts match up with the previous import's row counts
		If the row counts do not match up, insert or reorder rows as needed
			until they do. Get the datasource names from the log file footers:
			tail inputs/{.,}*/*/logs/$version.log.sql
		Commit: svn ci -m 'inputs/import.stats.xls: updated import times'
	Running individual steps separately:
	To run TNRS:
		To use an import other than public: export version=<version>
		make scrub &
		To view progress:
			tail -100 inputs/.TNRS/tnrs/logs/tnrs.make.log.sql
	To remake analytical DB:
		To use an import other than public: export version=<version>
		bin/make_analytical_db & # runtime: 13 h ("12:43:57elapsed")
		To view progress:
			tail -150 inputs/analytical_db/logs/make_analytical_db.log.sql
	To back up DB (staging tables and last import):
		To use an import *other than public*: export version=<version>
		make backups/TNRS.backup-remake &
		dump_opts=--exclude-schema=public make backups/vegbien.$version.backup/test &
			If the import has already been renamed to public, instead set
			dump_opts='' and replace $version with the appropriate revision
		make backups/upload live=1

Datasource setup:
	On local machine:
	Example steps for a datasource: wiki.vegpath.org/Import_process_for_Madidi
	umask ug=rwx,o= # prevent files from becoming web-accessible
	Add a new datasource: make inputs/<datasrc>/add
		<datasrc> may not contain spaces, and should be abbreviated.
		If the datasource is a herbarium, <datasrc> should be the herbarium code
			as defined by the Index Herbariorum <http://sweetgum.nybg.org/ih/>
	For a new-style datasource (one containing a ./run runscript):
		"cp" -f inputs/.NCBI/{Makefile,run,table.run} inputs/<datasrc>/
	For MySQL inputs (exports and live DB connections):
		For .sql exports:
			Place the original .sql file in _src/ (*not* in _MySQL/)
			Follow the steps starting with Install the staging tables below.
				This is for an initial sync to get the file onto vegbiendev.
			ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
				Create a database for the MySQL export in phpMyAdmin
				Give the bien user all database-specific privileges *except*
					UPDATE, DELETE, ALTER, DROP. This prevents bugs in the
					import scripts from accidentally deleting data.
				bin/mysql_bien database <inputs/<datasrc>/_src/export.sql &
		mkdir inputs/<datasrc>/_MySQL/
		cp -p lib/MySQL.{data,schema}.sql.make inputs/<datasrc>/_MySQL/
		Edit _MySQL/*.make for the DB connection
			For a .sql export, use server=vegbiendev and --user=bien
		Skip the Add input data for each table section
	For MS Access databases:
		Place the .mdb or .accdb file in _src/
		Download and install Access To PostgreSQL from
			http://www.bullzip.com/download.php
		Use Access To PostgreSQL to export the database:
			Export just the tables/indexes to inputs/<datasrc>/<file>.schema.sql
			Export just the data to inputs/<datasrc>/<file>.data.sql
		In <file>.schema.sql, make the following changes:
			Replace text "BOOLEAN" with "/*BOOLEAN*/INTEGER"
			Replace text "DOUBLE PRECISION NULL" with "DOUBLE PRECISION"
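			these replacements can be scripted (a sketch; GNU sed assumed):
--
sed -i -e 's|\bBOOLEAN\b|/*BOOLEAN*/INTEGER|g' \
	-e 's|DOUBLE PRECISION NULL|DOUBLE PRECISION|g' inputs/<datasrc>/<file>.schema.sql
--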
		Skip the Add input data for each table section
	Add input data for each table present in the datasource:
		For .sql exports, you must use the name of the table in the DB export
		For CSV files, you can use any name. It's recommended to use a table
			name from <https://projects.nceas.ucsb.edu/nceas/projects/bien/wiki/VegCSV#Suggested-table-names>
		Note that if this table will be joined together with another table, its
			name must end in ".src"
		make inputs/<datasrc>/<table>/add
			Important: DO NOT just create an empty directory named <table>!
				This command also creates necessary subdirs, such as logs/.
		If the table is in a .sql export: make inputs/<datasrc>/<table>/install
			Otherwise, place the CSV(s) for the table in
			inputs/<datasrc>/<table>/ OR place a query joining other tables
			together in inputs/<datasrc>/<table>/create.sql
		Important: When exporting relational databases to CSVs, you MUST ensure
			that embedded quotes are escaped by doubling them, *not* by
			preceding them with a "\" as is the default in phpMyAdmin
		If there are multiple part files for a table, and the header is repeated
			in each part, make sure each header is EXACTLY the same.
			(If the headers are not the same, the CSV concatenation script
			assumes the part files don't have individual headers and treats the
			subsequent headers as data rows.)
		Add <table> to inputs/<datasrc>/import_order.txt before other tables
			that depend on it
		For a new-style datasource:
			"cp" -f inputs/.NCBI/nodes/run inputs/<datasrc>/<table>/
			inputs/<datasrc>/<table>/run
	Install the staging tables:
		make inputs/<datasrc>/reinstall quiet=1 &
		For a MySQL .sql export:
			At the prompt "[you]@vegbiendev's password:", enter your password
			At the prompt "Enter password:", enter the value in config/bien_password
		To view progress: tail -f inputs/<datasrc>/<table>/logs/install.log.sql
		View the logs: tail -n +1 inputs/<datasrc>/*/logs/install.log.sql
			tail provides a header line with the filename
			+1 starts at the first line, to show the whole file
		For every file with an error 'column "..." specified more than once':
			Add a header override file "+header.<ext>" in <table>/:
				Note: The leading "+" should sort it before the flat files.
					"_" unfortunately sorts *after* capital letters in ASCII.
				Create a text file containing the header line of the flat files
				Add an ! at the beginning of the line
					This signals cat_csv that this is a header override.
				For empty names, use their 0-based column # (by convention)
				For duplicate names, add a distinguishing suffix
				For long names that collided, rename them to <= 63 chars
				Do NOT make readability changes in this step; that is what the
					map spreadsheets (below) are for.
				Save
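				For example, if the header has an unnamed 3rd column and a
					duplicate "latitude" (hypothetical column names):
--
echo '!id,collector,2,latitude,latitude_2' >inputs/<datasrc>/<table>/+header.csv
--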
		If you made any changes, re-run the install command above
	Auto-create the map spreadsheets: make inputs/<datasrc>/
	Map each table's columns:
		In each <table>/ subdir, for each "via map" map.csv:
			Open the map in a spreadsheet editor
			Open the "core map" /mappings/Veg+-VegBIEN.csv
			In each row of the via map, set the right column to a value from the
				left column of the core map
			Save
		Regenerate the derived maps: make inputs/<datasrc>/
	Accept the test cases:
		For a new-style datasource:
			inputs/<datasrc>/run
			svn di inputs/<datasrc>/*/test.xml.ref
			If you get errors, follow the steps for old-style datasources below
		For an old-style datasource:
			make inputs/<datasrc>/test
			When prompted to "Accept new test output", enter y and press ENTER
			If you instead get errors, do one of the following for each one:
			-	If the error was due to a bug, fix it
			-	Add a SQL function that filters or transforms the invalid data
			-	Make an empty mapping for the columns that produced the error.
				Put something in the Comments column of the map spreadsheet to
				prevent the automatic mapper from auto-removing the mapping.
			When accepting tests, it's helpful to use WinMerge
				(see WinMerge setup below for configuration)
		make inputs/<datasrc>/test by_col=1
			If you get errors this time, this always indicates a bug, usually in
				the VegBIEN unique constraints or column-based import itself
	Add newly-created files: make inputs/<datasrc>/add
	Commit: svn ci -m "Added inputs/<datasrc>/" inputs/<datasrc>/
	Update vegbiendev:
		ssh aaronmk@jupiter.nceas.ucsb.edu
			up
		On local machine:
			./fix_perms
			make inputs/upload
			make inputs/upload live=1
		ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
			up
			make inputs/download
			make inputs/download live=1
			Follow the steps under Install the staging tables above

Maintenance:
	on a live machine, you should put the following in your .profile:
--
# make svn files web-accessible. this does not affect unversioned files, because
# these get the right permissions on the local machine instead.
umask ug=rwx,o=rx

unset TMOUT # TMOUT causes screen to exit even with background processes
--
	if http://vegbiendev.nceas.ucsb.edu/phppgadmin/ goes down:
		ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
			make phppgadmin-Linux
	regularly, re-run the full-database import so that bugs in it don't pile up.
		it needs to be kept in working order so that it is ready when needed.
	to synchronize vegbiendev, jupiter, and your local machine:
		**WARNING**: pay careful attention to all files that will be deleted or
			overwritten!
		install put if needed:
			download https://uutils.googlecode.com/svn/trunk/bin/put to ~/bin/ and `chmod +x` it
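			for example (a sketch; assumes curl is installed):
--
mkdir -p ~/bin
curl -o ~/bin/put https://uutils.googlecode.com/svn/trunk/bin/put
chmod +x ~/bin/put
--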
		when changes are made on vegbiendev:
			avoid extraneous diffs when rsyncing:
				on all machines:
				up
				./fix_perms
			ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
				upload:
				overwrite=1 bin/sync_upload --size-only
					then review diff, and rerun with `l=1` prepended
			on your machine:
				download:
				overwrite=1 swap=1 src=. dest='aaronmk@jupiter.nceas.ucsb.edu:~/bien' put --exclude=.svn inputs/VegBIEN/TWiki
					then review diff, and rerun with `l=1` prepended
				swap=1 bin/sync_upload backups/TNRS.backup
					then review diff, and rerun with `l=1` prepended
				overwrite=1 swap=1 bin/sync_upload --size-only
					then review diff, and rerun with `l=1` prepended
				overwrite=1 sync_remote_url=~/Dropbox/svn/ bin/sync_upload --existing --size-only # just update mtimes/perms
					then review diff, and rerun with `l=1` prepended
	to back up e-mails:
		on local machine:
		/Applications/gmvault-v1.8.1-beta/bin/gmvault sync --multiple-db-owner --type quick aaronmk.nceas@gmail.com
		/Applications/gmvault-v1.8.1-beta/bin/gmvault sync --multiple-db-owner --type quick aaronmk@nceas.ucsb.edu
		open Thunderbird
		click the All Mail folder for each account and wait for it to download the e-mails in it
	to back up the version history:
		# back up first on the local machine, because often only the svnsync
			command gets run, and that way it will get backed up immediately to
			Dropbox (and hourly to Time Machine), while vegbiendev only gets
			backed up daily to tape
		on local machine:
		svnsync sync file://"$HOME"/Dropbox/docs/BIEN/svn_repo/ # initial runtime: 1.5 h ("08:21:38" - "06:45:26") @vegbiendev
		(cd ~/Dropbox/docs/BIEN/git/; git svn fetch)
		overwrite=1        src=~ dest='aaronmk@jupiter.nceas.ucsb.edu:/data/dev/aaronmk/' put Dropbox/docs/BIEN/svn_repo/ # runtime: 1 min ("1:05.08")
			then review diff, and rerun with `l=1` prepended
		overwrite=1        src=~ dest='aaronmk@jupiter.nceas.ucsb.edu:/data/dev/aaronmk/' put Dropbox/docs/BIEN/git/
			then review diff, and rerun with `l=1` prepended
		ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
		# use absolute path for vegbiendev commands because the Ubuntu 14.04
			version of rsync doesn't expand ~ properly
		overwrite=1 swap=1 src=~ dest='aaronmk@jupiter.nceas.ucsb.edu:/data/dev/aaronmk/' put Dropbox/docs/BIEN/svn_repo/ # runtime: 30 s ("36.19")
			then review diff, and rerun with `l=1` prepended
		overwrite=1 swap=1 src=~ dest='aaronmk@jupiter.nceas.ucsb.edu:/data/dev/aaronmk/' put Dropbox/docs/BIEN/git/
			then review diff, and rerun with `l=1` prepended
	to synchronize a Mac's settings with my testing machine's:
		download:
			**WARNING**: this will overwrite all your user's settings!
			on your machine:
			overwrite=1 swap=1 sync_local_dir=~/Library/ sync_remote_subdir=Library/ bin/sync_upload --exclude="/Saved Application State"
				then review diff, and rerun with `l=1` prepended
		upload:
			do the step "when changes are made on vegbiendev" > "on your
				machine" > download, above
			ssh aaronmk@jupiter.nceas.ucsb.edu
				(cd ~/Dropbox/svn/; up)
			on your machine:
				rm ~/'Library/Thunderbird/Profiles/9oo8rcyn.default/ImapMail/imap.googlemail.com/[Gmail].sbd/Spam'
					# remove the downloaded Spam folder, because spam e-mails often contain viruses that would trigger clamscan
				overwrite=1 del=      sync_local_dir=~/Dropbox/svn/ sync_remote_subdir=Dropbox/svn/ bin/sync_upload --size-only # just update mtimes
					then review diff, and rerun with `l=1` prepended
				overwrite=1 inplace=1 sync_local_dir=~              sync_remote_subdir=             bin/sync_upload ~/"VirtualBox VMs/**" # need inplace=1 because they are very large files
					then review diff, and rerun with `l=1` prepended
				overwrite=1           sync_local_dir=~              sync_remote_subdir=             bin/sync_upload --exclude="/Library/Saved Application State" --exclude="/.Trash" --exclude="/bin" --exclude="/bin/pg_ctl" --exclude="/bin/unzip" --exclude="/Dropbox/home" --exclude="/.profile" --exclude="/.shrc" --exclude="/.bashrc" --exclude="/VirtualBox VMs/Ubuntu/Ubuntu.vdi"
					then review diff, and rerun with `l=1` prepended
				overwrite=1           sync_local_dir=~              sync_remote_url=~/Dropbox/home  bin/sync_upload --exclude="/Library/Saved Application State" --exclude="/.Trash" --exclude="/.dropbox" --exclude="/Documents/BIEN" --exclude="/Dropbox" --exclude="/software" --exclude="/VirtualBox VMs/**.sav" --exclude="/VirtualBox VMs/**.vdi" --exclude="/VirtualBox VMs/**.vmdk"
					then review diff, and rerun with `l=1` prepended
	to back up files not in Time Machine:
		On local machine:
		overwrite=1 src=/ dest=/Volumes/Time\ Machine\ Backups/ sudo -E put Library/PostgreSQL/9.3/data/
			then review diff, and rerun with `l=1` prepended
		pg_ctl stop # stop the PostgreSQL server
		overwrite=1 src=/ dest=/Volumes/Time\ Machine\ Backups/ sudo -E put Library/PostgreSQL/9.3/data/
			then review diff, and rerun with `l=1` prepended
		pg_ctl start # start the PostgreSQL server
	VegCore data dictionary:
		Regularly, or whenever the VegCore data dictionary page
			(https://projects.nceas.ucsb.edu/nceas/projects/bien/wiki/VegCore)
			is changed, regenerate mappings/VegCore.csv:
			On local machine:
			make mappings/VegCore.htm-remake; make mappings/
			apply new data dict mappings to datasource mappings/staging tables:
				inputs/run postprocess # runtime: see inputs/run
				time yes|make inputs/{NVS,SALVIAS,TEAM}/test # old-style import; runtime: 1 min ("0m59.692s") @starscream
			svn di mappings/VegCore.tables.redmine
			If there are changes, update the data dictionary's Tables section
			When moving terms, check that no terms were lost: svn di
			svn ci -m 'mappings/VegCore.htm: regenerated from wiki'
			ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
				perform the steps under "apply new data dict mappings to
					datasource mappings/staging tables" above
	Important: Whenever you install a system update that affects PostgreSQL or
		any of its dependencies, such as libc, you should restart the PostgreSQL
		server. Otherwise, you may get strange errors like "the database system
		is in recovery mode" which go away upon reimport, or you may not be able
		to access the database as the postgres superuser. This applies to both
		Linux and Mac OS X.

Backups:
	Archived imports:
		ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
		Back up: make backups/<version>.backup &
			Note: To back up the last import, you must archive it first:
				make schemas/rotate
		Test: make -s backups/<version>.backup/test &
		Restore: make backups/<version>.backup/restore &
		Remove: make backups/<version>.backup/remove
		Download: make backups/<version>.backup/download
	TNRS cache:
		ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
		Back up: make backups/TNRS.backup-remake &
			runtime: 3 min ("real 2m48.859s")
		Restore:
			yes|make inputs/.TNRS/uninstall
			make backups/TNRS.backup/restore &
				runtime: 5.5 min ("real 5m35.829s")
			yes|make schemas/public/reinstall
				Must come after TNRS restore to recreate tnrs_input_name view
	Full DB:
		ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
		Back up: make backups/vegbien.<version>.backup &
		Test: make -s backups/vegbien.<version>.backup/test &
		Restore: make backups/vegbien.<version>.backup/restore &
		Download: make backups/vegbien.<version>.backup/download
	Import logs:
		On local machine:
		Download: make inputs/download-logs live=1

Datasource refreshing:
	VegBank:
		ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
		make inputs/VegBank/vegbank.sql-remake
		make inputs/VegBank/reinstall quiet=1 &

Schema changes:
	On local machine:
	When changing the analytical views, run sync_analytical_..._to_view()
		to update the corresponding table
	Remember to update the following files with any renamings:
		schemas/filter_ERD.csv
		mappings/VegCore-VegBIEN.csv
		mappings/verify.*.sql
	Regenerate schema from installed DB: make schemas/remake
	Reinstall DB from schema: make schemas/public/reinstall schemas/reinstall
		**WARNING**: This will delete the public schema of your VegBIEN DB!
	If needed, reinstall staging tables:
		On local machine:
			sudo -E -u postgres psql <<<'ALTER DATABASE vegbien RENAME TO vegbien_prev'
			make db
			. bin/reinstall_all
			Fix any bugs and retry until no errors
			make schemas/public/install
				This must be run *after* the datasources are installed, because
				views in public depend on some of the datasources
			sudo -E -u postgres psql <<<'DROP DATABASE vegbien_prev'
		ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
			repeat the above steps
			**WARNING**: Do not run this until reinstall_all runs successfully
				on the local machine, or the live DB may be unrestorable!
	update mappings and staging table column names:
		on local machine:
			inputs/run postprocess # runtime: see inputs/run
			time yes|make inputs/{NVS,SALVIAS,TEAM}/test # old-style import; runtime: 1 min ("0m59.692s") @starscream
		ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
			manually apply schema changes to the live public schema
			do the steps under "on local machine" above
	Sync ERD with vegbien.sql schema:
		Run make schemas/vegbien.my.sql
		Open schemas/vegbien.ERD.mwb in MySQLWorkbench
		Go to File > Export > Synchronize With SQL CREATE Script...
		For Input File, select schemas/vegbien.my.sql
		Click Continue
		In the changes list, select each table with an arrow next to it
		Click Update Model
		Click Continue
		Note: The generated SQL script will be empty because we are syncing in
			the opposite direction
		Click Execute
		Reposition any lines that have been reset
		Add any new tables by dragging them from the Catalog in the left sidebar
			to the diagram
		Remove any deleted tables by right-clicking the table's diagram element,
			selecting Delete '<table name>', and clicking Delete
		Save
		If desired, update the graphical ERD exports (see below)
	Update graphical ERD exports:
		Go to File > Export > Export as PNG...
		Select schemas/vegbien.ERD.png and click Save
		Go to File > Export > Export as SVG...
		Select schemas/vegbien.ERD.svg and click Save
		Go to File > Export > Export as Single Page PDF...
		Select schemas/vegbien.ERD.1_pg.pdf and click Save
		Go to File > Print...
		In the lower left corner, click PDF > Save as PDF...
		Set the Title and Author to ""
		Select schemas/vegbien.ERD.pdf and click Save
		Commit: svn ci -m "schemas/vegbien.ERD.mwb: Regenerated exports"
	Refactoring tips:
		To rename a table:
			In vegbien.sql, do the following:
				Replace regexp (?<=_|\b)<old>(?=_|\b) with <new>
					This is necessary because the table name is *everywhere*
				Search for <new>
				Manually change back any replacements inside comments
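				For example, with perl (hypothetical table names; review the
					diff before committing):
--
perl -pi -e 's/(_|\b)oldtable(?=_|\b)/${1}newtable/g' schemas/vegbien.sql
--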
		To rename a column:
			Rename the column: ALTER TABLE <table> RENAME <old> TO <new>;
			Recreate any foreign key for the column, removing CONSTRAINT <name>
				This resets the foreign key name using the new column name
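			For example (hypothetical table/column/constraint names):
--
psql vegbien <<'EOF'
ALTER TABLE child RENAME old_id TO new_id;
ALTER TABLE child DROP CONSTRAINT child_old_id_fkey;
-- omitting CONSTRAINT <name> regenerates the default name from the new column
ALTER TABLE child ADD FOREIGN KEY (new_id) REFERENCES parent (parent_id);
EOF
--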
	Creating a poster of the ERD:
		Determine the poster size:
			Measure the line height (from the bottom of one line to the bottom
				of another): 16.3cm/24 lines = 0.679cm
			Measure the height of the ERD: 35.4cm*2 = 70.8cm
			Zoom in as far as possible
			Measure the height of a capital letter: 3.5mm
			Measure the line height: 8.5mm
			Calculate the text's fraction of the line height: 3.5mm/8.5mm = 0.41
			Calculate the text height: 0.679cm*0.41 = 0.28cm
			Calculate the text height's fraction of the ERD height:
				0.28cm/70.8cm = 0.0040
			Measure the text height on the *VegBank* ERD poster: 5.5mm = 0.55cm
			Calculate the VegBIEN poster height to make the text the same size:
				0.55cm/0.0040 = 137.5cm H; *1in/2.54cm = 54.1in H
			The ERD aspect ratio is 11in W x (2*8.5in H) = 11x17 portrait
			Calculate the VegBIEN poster width: 54.1in H*11W/17H = 35.0in W
			The minimum VegBIEN poster size is 35x54in portrait
		Determine the cost:
			The FedEx Kinkos near NCEAS (1030 State St, Santa Barbara, CA 93101)
				charges the following for posters:
				base: $7.25/sq ft
				lamination: $3/sq ft
				mounting on a board: $8/sq ft

Testing:
	On a development machine, you should put the following in your .profile:
		umask ug=rwx,o= # prevent files from becoming web-accessible
		export log= n=2
	For development machine specs, see /planning/resources/dev_machine.specs/
	On local machine:
	Mapping process: make test
		Including column-based import: make test by_col=1
			If the row-based and column-based imports produce different inserted
			row counts, this usually means that a table is underconstrained
			(the unique indexes don't cover all possible rows).
			This can occur if you didn't use COALESCE(field, null_value) around
			a nullable field in a unique index. See sql_gen.null_sentinels for
			the appropriate null value to use.
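			For example, a unique index covering a nullable column (hypothetical
				table/columns; see sql_gen.null_sentinels for the real sentinels):
--
psql vegbien <<'EOF'
CREATE UNIQUE INDEX place_unique ON place (country, COALESCE(state_province, ''));
EOF
--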
	Map spreadsheet generation: make remake
	Missing mappings: make missing_mappings
	Everything (for most complete coverage): make test-all

Debugging:
	"Binary chop" debugging:
		(This is primarily useful for regressions that occurred in a previous
		revision, which was committed without running all the tests)
		up -r <rev>; make inputs/.TNRS/reinstall; make schemas/public/reinstall; make <failed-test>.xml
	.htaccess:
		mod_rewrite:
			**IMPORTANT**: whenever you change the DirectorySlash setting for a
				directory, you *must* clear your browser's cache to ensure that
				a cached redirect is not used. this is because RewriteRule
				redirects are (by default) temporary, but DirectorySlash
				redirects are permanent.
				for Firefox:
					press Cmd+Shift+Delete
					check only Cache
					press Enter or click Clear Now

WinMerge setup:
	In a Windows VM:
	Install WinMerge from <http://winmerge.org/>
	Open WinMerge
	Go to Edit > Options and click Compare in the left sidebar
	Enable "Moved block detection", as described at
		<http://manual.winmerge.org/Configuration.html#d0e5892>.
	Set Whitespace to Ignore change, as described at
		<http://manual.winmerge.org/Configuration.html#d0e5758>.

Documentation:
	To generate a Redmine-formatted list of steps for column-based import:
		On local machine:
		make schemas/public/reinstall
		make inputs/ACAD/Specimen/logs/steps.by_col.log.sql
	To import and scrub just the test taxonomic names:
		ssh -t vegbiendev.nceas.ucsb.edu exec sudo -u aaronmk -i
		inputs/test_taxonomic_names/test_scrub

General:
	To see a program's description, read its top-of-file comment
	To see a program's usage, run it without arguments
	To remake a directory: make <dir>/remake
	To remake a file: make <file>-remake