Installation:
	Check out svn: svn co https://code.nceas.ucsb.edu/code/projects/bien
	cd bien/
	Install: make install
		WARNING: This will delete the current public schema of your VegBIEN DB!
	Uninstall: make uninstall
		WARNING: This will delete your entire VegBIEN DB!
		This includes all archived imports and staging tables.

Maintenance:
	on a live machine, you should put the following in your .profile:
		umask ug=rwx,o= # prevent files from becoming web-accessible
		unset TMOUT # TMOUT causes screen to exit even with background processes
	if http://vegbiendev.nceas.ucsb.edu/phppgadmin/ goes down:
		on vegbiendev: make postgres-Linux
	to synchronize vegbiendev, jupiter, and your local machine:
		WARNING: pay careful attention to all files that will be deleted or
			overwritten!
		install put if needed:
			download https://uutils.googlecode.com/svn/trunk/bin/put to ~/bin/ and `chmod +x` it
		when changes are made on vegbiendev:
			on vegbiendev, upload:
				overwrite=1 bin/sync_upload
					then rerun with l=1 ...
			on your machine, download:
				overwrite=1 swap=1 src=. dest='aaronmk@jupiter:~/bien' put --exclude=.svn inputs/VegBIEN/TWiki
					then rerun with l=1 ...
				overwrite=1 swap=1 bin/sync_upload
					then rerun with l=1 ...
				overwrite=1 sync_remote_url=~/Dropbox/svn/ bin/sync_upload --existing --size-only # just update mtimes/perms
					then rerun with l=1 ...
	to synchronize a Mac's settings with my testing machine's:
		download:
			WARNING: this will overwrite all your user's settings!
			overwrite=1 swap=1 sync_local_dir=~/Library/ sync_remote_subdir=Library/ bin/sync_upload --exclude="/Saved Application State"
				then rerun with l=1 ...
		upload:
			do the steps under "when changes are made on vegbiendev" >
				"on your machine, download" above
			on jupiter: (cd ~/Dropbox/svn/; svn up)
			on your machine:
				overwrite=1 del= sync_local_dir=~/Dropbox/svn/ sync_remote_subdir=Dropbox/svn/ bin/sync_upload --size-only # just update mtimes
					then rerun with l=1 ...
				overwrite=1      sync_local_dir=~              sync_remote_subdir=             bin/sync_upload --exclude="/Library/Saved Application State" --exclude="/.Trash" --exclude="/bin" --exclude="/bin/pg_ctl" --exclude="/bin/unzip" --exclude="/Dropbox/home"
					then rerun with l=1 ...
				overwrite=1      sync_local_dir=~              sync_remote_url=~/Dropbox/home  bin/sync_upload --exclude="/Library/Saved Application State" --exclude="/.Trash" --exclude="/.dropbox" --exclude="/Documents/BIEN" --exclude="/Dropbox" --exclude="/software" --exclude="/VirtualBox VMs/**.sav" --exclude="/VirtualBox VMs/**.vdi" --exclude="/VirtualBox VMs/**.vmdk"
					then rerun with l=1 ...
		upload just the VirtualBox VMs:
			on your machine:
				overwrite=1      sync_local_dir=~              sync_remote_subdir=             bin/sync_upload ~/"VirtualBox VMs/**"
					then rerun with l=1 ...
	to back up files not in Time Machine:
		stop the PostgreSQL server
		src=/ dest=/Volumes/Time\ Machine\ Backups/ sudo put Library/PostgreSQL/9.1/data/
			then rerun with l=1 ...
		start the PostgreSQL server
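		a sketch of stopping/starting the server (assumed paths, based on the
			Mac install layout implied by the data dir above; adjust to your
			install):
			sudo -u postgres /Library/PostgreSQL/9.1/bin/pg_ctl stop -D /Library/PostgreSQL/9.1/data
			sudo -u postgres /Library/PostgreSQL/9.1/bin/pg_ctl start -D /Library/PostgreSQL/9.1/data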
	VegCore data dictionary:
		Regularly, or whenever the VegCore data dictionary page
			(https://projects.nceas.ucsb.edu/nceas/projects/bien/wiki/VegCore)
			is changed, regenerate mappings/VegCore.csv:
			make mappings/VegCore.htm-remake; make mappings/
			svn di mappings/VegCore.tables.redmine
			If there are changes, update the data dictionary's Tables section
			When moving terms, check that no terms were lost: svn di
			svn ci -m 'mappings/VegCore.htm: regenerated from wiki'
	Important: Whenever you install a system update that affects PostgreSQL or
		any of its dependencies, such as libc, you should restart the PostgreSQL
		server. Otherwise, you may get strange errors like "the database system
		is in recovery mode" which go away upon reimport, or you may not be able
		to access the database as the postgres superuser. This applies to both
		Linux and Mac OS X.
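		For example (assumed commands; the exact service name and command vary
			by install):
			on Linux: sudo service postgresql restart
			on Mac OS X: use the pg_ctl stop/start commands under "to back up
				files not in Time Machine" above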

Single datasource import:
	(Re)import and scrub: make inputs/<datasrc>/reimport_scrub by_col=1
	(Re)import only: make inputs/<datasrc>/reimport by_col=1
	(Re)scrub: make inputs/<datasrc>/rescrub by_col=1
	Note that these commands also work if the datasource is not yet imported

Full database import:
	WARNING: You must perform *every single* step listed below, to avoid
		breaking column-based import
	On jupiter: svn up --force
	On local machine:
		./fix_perms
		do steps under Maintenance > "to synchronize vegbiendev, jupiter, and
			your local machine" above
		make inputs/upload
		make inputs/upload live=1
		make test by_col=1
			See note under Testing below
	On vegbiendev:
	Ensure there are no local modifications: svn st
	svn up
	make inputs/download
	make inputs/download live=1
	For each newly-uploaded datasource above: make inputs/<datasrc>/reinstall
	Update the auxiliary schemas: make schemas/reinstall
		The public schema will be installed separately by the import process
	Delete imports before the last so they won't bloat the full DB backup:
		make backups/vegbien.<version>.backup/remove
		To keep a previous import other than the public schema:
			export dump_opts='--exclude-schema=public --exclude-schema=<version>'
	Make sure there is at least 300GB of disk space on /: df -h
		The import schema is 255GB, and may use additional space for temp tables
		To free up space, remove backups that have been archived on jupiter:
			List backups/ to view older backups
			Check their MD5 sums using the steps under On jupiter below
			Remove these backups
	Remove any leftover TNRS lockfile: rm inputs/.TNRS/tnrs/tnrs.make.lock
		Usually, the PID in it no longer exists, but sometimes it refers to a
			different, active process, which blocks tnrs.make
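		To check whether the PID is still alive before removing the lockfile
			(a sketch; assumes the lockfile contains just a PID):
			ps -p "$(cat inputs/.TNRS/tnrs/tnrs.make.lock)"
			# if no process is listed, the lock is stale and safe to remove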
	screen
	Press ENTER
	set -o ignoreeof # prevent Ctrl+D from exiting `screen` to keep attached jobs
	unset TMOUT # TMOUT causes screen to exit even with background processes
	unset version
	Start column-based import: . bin/import_all by_col=1
		To use row-based import: . bin/import_all
		To stop all running imports: . bin/stop_imports
		WARNING: Do NOT run import_all in the background, or the jobs it creates
			won't be owned by your shell.
		Note that import_all will take up to an hour to import the NCBI backbone
			and other metadata before returning control to the shell.
	Wait (overnight) for the import to finish
	To recover from a closed terminal window: screen -r
	When there are no more running jobs, exit the screen
	Get $version: echo $version
	Set $version in all vegbiendev terminals: export version=<version>
	Upload logs (run on vegbiendev): make inputs/upload live=1
	On local machine: make inputs/download-logs live=1
	In PostgreSQL:
		Check that the provider_count and source tables contain entries for all
			inputs
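		For example (a sketch, run inside psql; assumes both tables live in the
			current import's <version> schema):
			SELECT * FROM "<version>".provider_count;
			SELECT * FROM "<version>".source;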
	Check that TNRS ran successfully:
		tail -100 inputs/.TNRS/tnrs/logs/tnrs.make.log.sql
		If the log ends in an AssertionError
			"assert sql.table_col_names(db, table) == header":
			Figure out which TNRS CSV columns have changed
			On local machine:
				Make the changes in inputs/.TNRS/schema.sql
				rm inputs/.TNRS/tnrs/header.csv
				make inputs/.TNRS/reinstall schema_only=1
				make schemas/public/reinstall
				If there are errors "column ... does not exist", etc.:
					Make the necessary changes in schemas/vegbien.sql
					make schemas/public/reinstall
				make schemas/remake
				inputs/test_taxonomic_names/test_scrub
				In inputs/test_taxonomic_names/_scrub/TNRS.sql, copy the
					"COPY tnrs ..." statement to inputs/.TNRS/data.sql
				Commit
			On vegbiendev:
				If dropping a column, save the dependent views (see the sketch
					below)
				Make the same changes in the live TNRS.tnrs table on vegbiendev
				If dropping a column, recreate the dependent views
				Restart the TNRS client: make scrub by_col=1 &
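				For the dependent-views steps above, a sketch (hedged; <view>
					and <col> are placeholders for whatever the DROP COLUMN
					reports as dependent):
					SELECT pg_get_viewdef('<view>'::regclass); -- save output
					DROP VIEW <view>;
					ALTER TABLE "TNRS".tnrs DROP COLUMN <col>;
					CREATE VIEW <view> AS ...; -- saved definition, minus <col>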
	tail inputs/{.,}*/*/logs/$version.log.sql
	In the output, search for "Command exited with non-zero status"
	For inputs that have this, fix the associated bug(s)
	If many inputs have errors, discard the current (partial) import:
		make schemas/$version/uninstall
	Otherwise, continue
	Publish the new import:
		WARNING: Before proceeding, be sure you have done *every single*
			verification step listed above. Otherwise, a previous valid import
			could incorrectly be overwritten with a broken one.
		make schemas/$version/publish
	unset version
	backups/fix_perms
	make backups/upload live=1
	On jupiter:
		cd /data/dev/aaronmk/bien/backups
		For each newly-archived backup:
			make -s <backup>.md5/test
			Check that "OK" is printed next to the filename
	On nimoy:
		cd /home/bien/svn/
		svn up
		export version=<version>
		make backups/analytical_stem.$version.csv/download
		In the bien_web DB:
			Create the analytical_stem_<version> table using its schema
				in schemas/vegbien.my.sql
		make -s backups/analytical_stem.$version.csv.md5/test
		Check that "OK" is printed next to the filename
		table=analytical_stem_$version bin/publish_analytical_db \
			backups/analytical_stem.$version.csv
	If desired, record the import times in inputs/import.stats.xls:
		Open inputs/import.stats.xls
		If the rightmost import is within 5 columns of column IV:
			Copy the current tab to <leftmost-date>~<rightmost-date>
			Remove the previous imports from the current tab because they are
				now in the copied tab instead
		Insert a copy of the leftmost "By column" column group before it
		export version=<version>
		bin/import_date inputs/{.,}*/*/logs/$version.log.sql
		Update the import date in the upper-right corner
		bin/import_times inputs/{.,}*/*/logs/$version.log.sql
		Paste the output over the # Rows/Time columns, making sure that the
			row counts match up with the previous import's row counts
		If the row counts do not match up, insert or reorder rows as needed
			until they do. Get the datasource names from the log file footers:
			tail inputs/{.,}*/*/logs/$version.log.sql
		Commit: svn ci -m "inputs/import.stats.xls: Updated import times"
	To run TNRS:
		To use an import other than public: export version=<version>
		make scrub &
		To view progress:
			tail -100 inputs/.TNRS/tnrs/logs/tnrs.make.log.sql
	To remake analytical DB:
		To use an import other than public: export version=<version>
		bin/make_analytical_db &
		To view progress:
			tail -100 inputs/analytical_db/logs/make_analytical_db.log.sql
	To back up DB (staging tables and last import):
		To use an import other than public: export version=<version>
		If before renaming to public: export dump_opts=--exclude-schema=public
		make backups/vegbien.$version.backup/test &

Backups:
	Archived imports:
		Back up: make backups/<version>.backup &
			Note: To back up the last import, you must archive it first:
				make schemas/rotate
		Test: make -s backups/<version>.backup/test &
		Restore: make backups/<version>.backup/restore &
		Remove: make backups/<version>.backup/remove
		Download: make backups/<version>.backup/download
	TNRS cache:
		Back up: make backups/TNRS.backup-remake &
			runtime: 3 min ("real 2m48.859s")
		Restore:
			yes|make inputs/.TNRS/uninstall
			make backups/TNRS.backup/restore &
				runtime: 5.5 min ("real 5m35.829s")
			yes|make schemas/public/reinstall
				Must come after TNRS restore to recreate tnrs_input_name view
	Full DB:
		Back up: make backups/vegbien.<version>.backup &
		Test: make -s backups/vegbien.<version>.backup/test &
		Restore: make backups/vegbien.<version>.backup/restore &
		Download: make backups/vegbien.<version>.backup/download
	Import logs:
		Download: make inputs/download-logs live=1

Datasource setup:
	umask ug=rwx,o= # prevent files from becoming web-accessible
	Add a new datasource: make inputs/<datasrc>/add
		<datasrc> may not contain spaces, and should be abbreviated.
		If the datasource is a herbarium, <datasrc> should be the herbarium code
			as defined by the Index Herbariorum <http://sweetgum.nybg.org/ih/>
	For MySQL inputs (exports and live DB connections):
		For .sql exports:
			Place the original .sql file in _src/ (*not* in _MySQL/)
			Follow the steps starting with Install the staging tables below.
				This is for an initial sync to get the file onto vegbiendev.
			On vegbiendev:
				Create a database for the MySQL export in phpMyAdmin
				Give the bien user all database-specific privileges *except*
					UPDATE, DELETE, ALTER, DROP (see the GRANT sketch below).
					This prevents bugs in the import scripts from accidentally
					deleting data.
				bin/mysql_bien database <inputs/<datasrc>/_src/export.sql &
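				a sketch of that privileges step as a GRANT (hedged; <db> is
					the new database, and the exact privilege list is your
					choice, as long as it omits UPDATE, DELETE, ALTER, DROP):
					GRANT SELECT, INSERT, CREATE, INDEX, LOCK TABLES
						ON <db>.* TO 'bien'@'localhost';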
	mkdir inputs/<datasrc>/_MySQL/
	cp -p lib/MySQL.{data,schema}.sql.make inputs/<datasrc>/_MySQL/
	Edit _MySQL/*.make for the DB connection
		For a .sql export, use server=vegbiendev and --user=bien
	Skip the Add input data for each table section
	For MS Access databases:
		Place the .mdb or .accdb file in _src/
		Download and install Access To PostgreSQL from
			http://www.bullzip.com/download.php
		Use Access To PostgreSQL to export the database:
			Export just the tables/indexes to inputs/<datasrc>/<file>.schema.sql
			Export just the data to inputs/<datasrc>/<file>.data.sql
		In <file>.schema.sql, make the following changes:
			Replace text "BOOLEAN" with "/*BOOLEAN*/INTEGER"
			Replace text "DOUBLE PRECISION NULL" with "DOUBLE PRECISION"
		Skip the Add input data for each table section
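		a sketch of the two replacements above with sed (assumed; -i.bak works
			with both GNU and BSD sed):
			sed -i.bak -e 's|BOOLEAN|/*BOOLEAN*/INTEGER|g' \
				-e 's|DOUBLE PRECISION NULL|DOUBLE PRECISION|g' \
				inputs/<datasrc>/<file>.schema.sql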
	Add input data for each table present in the datasource:
		For .sql exports, you must use the name of the table in the DB export
		For CSV files, you can use any name. It's recommended to use a table
			name from <https://projects.nceas.ucsb.edu/nceas/projects/bien/wiki/VegCSV#Suggested-table-names>
		Note that if this table will be joined together with another table, its
			name must end in ".src"
		make inputs/<datasrc>/<table>/add
			Important: DO NOT just create an empty directory named <table>!
				This command also creates necessary subdirs, such as logs/.
		If the table is in a .sql export: make inputs/<datasrc>/<table>/install
			Otherwise, place the CSV(s) for the table in
			inputs/<datasrc>/<table>/ OR place a query joining other tables
			together in inputs/<datasrc>/<table>/create.sql
		Important: When exporting relational databases to CSVs, you MUST ensure
			that embedded quotes are escaped by doubling them, *not* by
			preceding them with a "\" as is the default in phpMyAdmin
			(see the example after this list)
		If there are multiple part files for a table, and the header is repeated
			in each part, make sure each header is EXACTLY the same.
			(If the headers are not the same, the CSV concatenation script
			assumes the part files don't have individual headers and treats the
			subsequent headers as data rows.)
		Add <table> to inputs/<datasrc>/import_order.txt before other tables
			that depend on it
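		An example of the quote escaping required above: a field containing
			8" pot must appear in the CSV as "8"" pot", not "8\" pot"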
	Install the staging tables:
		make inputs/<datasrc>/reinstall quiet=1 &
		For a MySQL .sql export:
			At prompt "[you]@vegbiendev's password:", enter your password
			At prompt "Enter password:", enter the value in config/bien_password
		To view progress: tail -f inputs/<datasrc>/<table>/logs/install.log.sql
		View the logs: tail -n +1 inputs/<datasrc>/*/logs/install.log.sql
			tail provides a header line with the filename
			+1 starts at the first line, to show the whole file
		For every file with an error 'column "..." specified more than once':
			Add a header override file "+header.<ext>" in <table>/
				(example below):
				Note: The leading "+" should sort it before the flat files.
					"_" unfortunately sorts *after* capital letters in ASCII.
				Create a text file containing the header line of the flat files
				Add a `!` at the beginning of the line
					This signals cat_csv that this is a header override.
				For empty names, use their 0-based column # (by convention)
				For duplicate names, add a distinguishing suffix
				For long names that collided, rename them to <= 63 chars long
				Do NOT make readability changes in this step; that is what the
					map spreadsheets (below) are for.
				Save
		If you made any changes, re-run the install command above
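		For reference, a hypothetical +header.csv for files whose header has an
			unnamed first column and a duplicate "name" column:
			!0,name,name_2,height_m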
	Auto-create the map spreadsheets: make inputs/<datasrc>/
	Map each table's columns:
		In each <table>/ subdir, for each "via map" map.csv:
			Open the map in a spreadsheet editor
			Open the "core map" /mappings/Veg+-VegBIEN.csv
			In each row of the via map, set the right column to a value from the
				left column of the core map
			Save
		Regenerate the derived maps: make inputs/<datasrc>/
	Accept the test cases:
		make inputs/<datasrc>/test
			When prompted to "Accept new test output", enter y and press ENTER
			If you instead get errors, do one of the following for each one:
			-	If the error was due to a bug, fix it
			-	Add a SQL function that filters or transforms the invalid data
			-	Make an empty mapping for the columns that produced the error.
				Put something in the Comments column of the map spreadsheet to
				prevent the automatic mapper from auto-removing the mapping.
			When accepting tests, it's helpful to use WinMerge
				(see WinMerge setup below for configuration)
		make inputs/<datasrc>/test by_col=1
			If you get errors this time, this always indicates a bug, usually in
				the VegBIEN unique constraints or column-based import itself
	Add newly-created files: make inputs/<datasrc>/add
	Commit: svn ci -m "Added inputs/<datasrc>/" inputs/<datasrc>/
	Update vegbiendev:
		On jupiter: svn up
		On local machine:
			./fix_perms
			make inputs/upload
			make inputs/upload live=1
		On vegbiendev:
			svn up
			make inputs/download
			make inputs/download live=1
			Follow the steps under Install the staging tables above

Datasource refreshing:
	VegBank:
		make inputs/VegBank/vegbank.sql-remake
		make inputs/VegBank/reinstall quiet=1 &

Schema changes:
	When changing the analytical views, run sync_analytical_..._to_view()
		to update the corresponding table
	Remember to update the following files with any renamings:
		schemas/filter_ERD.csv
		mappings/VegCore-VegBIEN.csv
		mappings/verify.*.sql
	Regenerate schema from installed DB: make schemas/remake
	Reinstall DB from schema: make schemas/public/reinstall schemas/reinstall
		WARNING: This will delete the current public schema of your VegBIEN DB!
	Reinstall staging tables:
		On local machine:
			sudo -E -u postgres psql <<<'ALTER DATABASE vegbien RENAME TO vegbien_prev'
			make db
			. bin/reinstall_all
			Fix any bugs and retry until no errors
			make schemas/public/install
				This must be run *after* the datasources are installed, because
				views in public depend on some of the datasources
			sudo -E -u postgres psql <<<'DROP DATABASE vegbien_prev'
		On vegbiendev: repeat the above steps
			WARNING: Do not run this until reinstall_all runs successfully on
			the local machine, or the live DB may be unrestorable!
	Sync ERD with vegbien.sql schema:
		Run make schemas/vegbien.my.sql
		Open schemas/vegbien.ERD.mwb in MySQLWorkbench
		Go to File > Export > Synchronize With SQL CREATE Script...
		For Input File, select schemas/vegbien.my.sql
		Click Continue
		In the changes list, select each table with an arrow next to it
		Click Update Model
		Click Continue
		Note: The generated SQL script will be empty because we are syncing in
			the opposite direction
		Click Execute
		Reposition any lines that have been reset
		Add any new tables by dragging them from the Catalog in the left sidebar
			to the diagram
		Remove any deleted tables by right-clicking the table's diagram element,
			selecting Delete '<table name>', and clicking Delete
		Save
		If desired, update the graphical ERD exports (see below)
	Update graphical ERD exports:
		Go to File > Export > Export as PNG...
		Select schemas/vegbien.ERD.png and click Save
		Go to File > Export > Export as SVG...
		Select schemas/vegbien.ERD.svg and click Save
		Go to File > Export > Export as Single Page PDF...
		Select schemas/vegbien.ERD.1_pg.pdf and click Save
		Go to File > Print...
		In the lower left corner, click PDF > Save as PDF...
		Set the Title and Author to ""
		Select schemas/vegbien.ERD.pdf and click Save
		Commit: svn ci -m "schemas/vegbien.ERD.mwb: Regenerated exports"
	Refactoring tips:
		To rename a table:
			In vegbien.sql, do the following:
				Replace regexp (?<=_|\b)<old>(?=_|\b) with <new>
					This is necessary because the table name is *everywhere*
				Search for <new>
				Manually change back any replacements inside comments
		To rename a column:
			Rename the column: ALTER TABLE <table> RENAME <old> TO <new>;
			Recreate any foreign key for the column, removing CONSTRAINT <name>
				This resets the foreign key name using the new column name
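			For example (a sketch; table/column names are hypothetical):
				ALTER TABLE plot DROP CONSTRAINT plot_parent_id_fkey;
				ALTER TABLE plot ADD FOREIGN KEY (parent_id)
					REFERENCES plot (plot_id);
				-- the new constraint name is auto-generated from the column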
	Creating a poster of the ERD:
		Determine the poster size:
			Measure the line height (from the bottom of one line to the bottom
				of another): 16.3cm/24 lines = 0.679cm
			Measure the height of the ERD: 35.4cm*2 = 70.8cm
			Zoom in as far as possible
			Measure the height of a capital letter: 3.5mm
			Measure the line height: 8.5mm
			Calculate the text's fraction of the line height: 3.5mm/8.5mm = 0.41
			Calculate the text height: 0.679cm*0.41 = 0.28cm
			Calculate the text height's fraction of the ERD height:
				0.28cm/70.8cm = 0.0040
			Measure the text height on the *VegBank* ERD poster: 5.5mm = 0.55cm
			Calculate the VegBIEN poster height to make the text the same size:
				0.55cm/0.0040 = 137.5cm H; *1in/2.54cm = 54.1in H
			The ERD aspect ratio is 11 in W x (2*8.5in H) = 11x17 portrait
			Calculate the VegBIEN poster width: 54.1in H*11W/17H = 35.0in W
			The minimum VegBIEN poster size is 35x54in portrait
		Determine the cost:
			The FedEx Kinkos near NCEAS (1030 State St, Santa Barbara, CA 93101)
				charges the following for posters:
				base: $7.25/sq ft
				lamination: $3/sq ft
				mounting on a board: $8/sq ft

Testing:
	On a development machine, you should put the following in your .profile:
		umask ug=rwx,o= # prevent files from becoming web-accessible
		export log= n=2
	Mapping process: make test
		Including column-based import: make test by_col=1
			If the row-based and column-based imports produce different inserted
			row counts, this usually means that a table is underconstrained
			(the unique indexes don't cover all possible rows).
			This can occur if you didn't use COALESCE(field, null_value) around
			a nullable field in a unique index. See sql_gen.null_sentinels for
			the appropriate null value to use.
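			For example, a unique index covering a nullable field (a sketch;
				names are hypothetical, and the null value should be the one
				sql_gen.null_sentinels gives for the column's type):
				CREATE UNIQUE INDEX plot_unique ON plot
					(project_id, COALESCE(plot_name, '\N'));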
	Map spreadsheet generation: make remake
	Missing mappings: make missing_mappings
	Everything (for most complete coverage): make test-all

Debugging:
	"Binary chop" debugging:
		(This is primarily useful for regressions that occurred in a previous
		revision, which was committed without running all the tests)
		svn up -r <rev>; make inputs/.TNRS/reinstall; make schemas/public/reinstall; make <failed-test>.xml
	.htaccess:
		mod_rewrite:
			IMPORTANT: whenever you change the DirectorySlash setting for a
				directory, you *must* clear your browser's cache to ensure that
				a cached redirect is not used. This is because RewriteRule
				redirects are (by default) temporary, but DirectorySlash
				redirects are permanent.
				for Firefox:
					press Cmd+Shift+Delete
					check only Cache
					press Enter or click Clear Now

WinMerge setup:
	Install WinMerge from <http://winmerge.org/>
	Open WinMerge
	Go to Edit > Options and click Compare in the left sidebar
	Enable "Moved block detection", as described at
		<http://manual.winmerge.org/Configuration.html#d0e5892>.
	Set Whitespace to Ignore change, as described at
		<http://manual.winmerge.org/Configuration.html#d0e5758>.

Documentation:
	To generate a Redmine-formatted list of steps for column-based import:
		make schemas/public/reinstall
		make inputs/ACAD/Specimen/logs/steps.by_col.log.sql
	To import and scrub just the test taxonomic names:
		inputs/test_taxonomic_names/test_scrub

General:
	To see a program's description, read its top-of-file comment
	To see a program's usage, run it without arguments
	To remake a directory: make <dir>/remake
	To remake a file: make <file>-remake