added inputs/.TNRS/grants.sql, with statements to provide SELECT access to bien_read. these statements must be in grants.sql to avoid them being filtered out by pg_dump_limit.
inputs/input.Makefile: added support for separate grants.sql file, which may contain GRANT statements that would normally be filtered out by pg_dump_limit
inputs/input.Makefile: sql/install: added $debug option to run the *.sql import verbosely, to display which statements are being run. this should only be used for SQL files that use COPY FROM to import data, to avoid echoing pages of insert statements.
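a hedged sketch of what the $debug option amounts to (the real recipe in input.Makefile is a make rule, not this script, and the psql invocation is an assumption): pass psql's --echo-all flag only when $debug is set, so a COPY FROM file echoes a handful of statements rather than pages of INSERTs.
```sh
# minimal sketch, not the actual input.Makefile recipe: run an import SQL file,
# echoing each statement only when debug=1 ($debug comes from the entry above;
# the psql flags used here are an assumption)
sql_file="$1"
psql_opts=(--set ON_ERROR_STOP=1 --quiet)
if test -n "$debug"; then psql_opts+=(--echo-all); fi
psql "${psql_opts[@]}" --file "$sql_file"
```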
inputs/input.Makefile: keep $(sortFile) up-to-date: use a sort_file_updated=1 flag to indicate that import_order.txt has already been checked, so that recursive invocations of make don't need to recheck it. also use this flag instead of an explicit $(MAKECMDGOALS) list to prevent the $(sortFile) check from being reinvoked in infinite recursion when input.Makefile is read as part of the $(sortFile) check itself.
inputs/input.Makefile: keep import_order.txt up-to-date by running `make $(sortFile)` each time make is run. this ensures that new datasources always have import_order.txt populated when make is first run. eventually, $(tables) can always be set to $(allTables), so that this auto-updating can also be used to ensure that new subdirs added by the user always make it into import_order.txt (so that they will be included in the subdirs that get remade, etc.). import_order.txt is primarily for specifying the order of the subdirs, but some datasources also use it to filter out subdirs, so it can't yet always be updated to include the full list of subdirs. however, the filter-out usage should no longer be necessary after the switch to new-style import.
inputs/input.Makefile: added $(filter_make), used to filter the output of embedded $(shell make ...) invocations
inputs/input.Makefile: $(sortFile): use $(filter-out)->then instead of $(filter)->else for clarity
inputs/input.Makefile: added $(sortFile) (import_order.txt) target which adds any missing tables to import_order.txt
inputs/input.Makefile: added list_tables to print $(tables) for use in populating import_order.txt
web/links/index.htm: updated to the current Firefox bookmarks. grouped version control systems into a new version control folder.
inputs/.NCBI/: added new-style import runscripts, which renamed the staging table columns to VegCore
bugfix: lib/runscripts/datasrc_dir.run, subdir.run: need to remove leading . from dir name to get installed schema name, using new dir2schema()
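a minimal sketch of what dir2schema() presumably does (the real implementation in lib/sh/datasrc.sh is not reproduced here): strip the leading . that marks hidden datasource dirs such as inputs/.TNRS/ to get the installed schema name.
```sh
# sketch: hidden datasource dirs like inputs/.TNRS/ install into schema "TNRS"
dir2schema() {
	local dir; dir="$(basename -- "$1")"
	echo "${dir#.}"   # drop a single leading "." if present
}

dir2schema inputs/.TNRS   # -> TNRS
dir2schema inputs/FIA     # -> FIA
```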
lib/runscripts/datasrc_dir.run, subdir.run: use new lib/sh/datasrc.sh, which contains the code common to both datasrc-related dir runscripts
added lib/sh/datasrc.sh
inputs/.TNRS/schema.sql: AcceptedTaxon: removed Annotations entry because the accepted name only contains name elements, not additional text (vegpath.org/cf_aff)
bugfix: /README.TXT: Maintenance: syncing ~/bien to ~/Dropbox/svn: added overwrite=1 so that perms transfer from the authoritative ~/bien regardless of relative mtimes
removed no longer used lib/import.sh. use lib/runscripts/table.run instead.
added inputs/*/*/header.csv for CSV inputs, which are now generated by inputs/input.Makefile %/install
added inputs/FIA/*/{VegBIEN.csv,test.xml.ref}, which are now generated by the mapping process for the joined-together tables (even though they are not used by the import, because only occurrence_all is imported)
added inputs/GBIF/_archive/
removed inputs/GBIF/Specimen/, which has been replaced by the refresh in raw_occurrence_record_plants/
added inputs/GBIF/map.csv, used to regenerate inputs/GBIF/raw_occurrence_record_plants/map.csv when raw_occurrence_record_plants is resubset
inputs/FIA/*/postprocess.sql: removed svn:executable attribute using `svn pdel svn:executable ...` now that these are not shell scripts
removed no longer needed inputs/FIA/import. use inputs/FIA/run instead.
inputs/FIA/*/import: changed to postprocess.sql for use by the runscripts
added inputs/FIA/run
added inputs/FIA/*/run. these do not yet use the postprocessing operations in */import.
added inputs/FIA/table.run (for use by table subdirs) and helper Makefile
added lib/runscripts/view.run, for use with table subdirs for views, such as inputs/FIA/occurrence_all/
planning/timeline/timeline.2013.xls: added a "Reload analytical database" checkmark for every "Rebuild core database" checkmark, because these are always done together as part of the import process
bugfix: inputs/FIA/occurrence_all/import: don't re-prepend * to terms because this is a view, and the underlying columns have already been mapped
bin/src_map: support custom (or no) new_term_prefix. no new_term_prefix is useful for views whose columns have already been renamed in the underlying tables and should not have * re-prepended.
planning/timeline/timeline.2013.xls: moved longer-term goals to new August column, leaving near-term goals in July
planning/timeline/timeline.2013.xls: erased cells where a task was planned but not worked on, so that all shaded cells in the past have check marks to indicate completion of a portion of the task, and empty shaded cells in the future indicate work left to do
planning/timeline/timeline.2013.xls: updated for current progress. renamed "Rerun species range models" to "Prepare to rerun species range models" because the range modeling itself is not part of the BIEN DB development. added a column for July with the tasks that are not yet complete.
bugfix: inputs/FIA/REF_SPECIES/import: PLANT_SYMBOL_TYPE: prepended * since it's a datasource column, and needs to match up with *PLANT_SYMBOL_TYPE in the other tables for joins
schemas/util.sql: try_create(): also ignore wrong_object_type exceptions thrown when trying to alter a view's columns
added inputs/FIA/_src/run, which runs ./download
lib/sh/make.sh: make(): run sys_cmd_path at a higher log_level since the make() steps should not be displayed by default
/README.TXT: to synchronize vegbiendev, jupiter, and your local machine: added step to update mtimes/perms on ~/Dropbox/svn/ so that copying files back to ~/bien does not overwrite the permissions from what is on vegbiendev
inputs/: don't upload test*.xml to jupiter on vegbiendev, because these files are also generated by the full database import but should only be backed up from one source machine, starscream (the Mac)
bin/make: moved $make_filter_active test to lib/sh/make.sh make() so that it's also used when make() is run directly (e.g. in a runscript) rather than via the bin/make wrapper in the PATH
bugfix: lib/sh/make.sh: make(): need to match absolute `make` paths such as /usr/bin/make
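the shape of the fix, as a hedged sketch (the real make() in lib/sh/make.sh is more involved): the test for "is this the make command?" has to accept absolute paths as well as the bare name.
```sh
# sketch: recognize both "make" and absolute paths such as /usr/bin/make
is_make() {
	case "$1" in
		make|*/make) return 0 ;;   # bare name, or any path whose last component is "make"
		*)           return 1 ;;
	esac
}
is_make /usr/bin/make   # succeeds
is_make cmake           # fails (no "/" before "make")
```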
lib/sh/util.sh: added self_name alias and use it in self/self_sys
lib/sh/util.sh: added sys_cmd_path() and use it in cmd2sys
bugfix: bin/make: use separate $make_filter_active flag instead of $is_outermost for avoiding duplicate output filtering, so that an outer runscript, which sets $is_outermost but does not activate the make filter, will not prevent the make filter from being activated when make is invoked
bugfix: bin/make: need to use sys_cmd instead of command so that the system make command is invoked instead of the wrapper (which would cause infinite mutual recursion for the ~/bien working copy, although not for the ~/Dropbox/svn working copy because nonrecursive=1 was able to remove the single recursion)
bin/make: use .rel to do relative includes
bugfix: lib/sh/util.sh: .rel(): first use realpath() on ${BASH_SOURCE[1]} in case it's a symlink (as it is for bin/make)
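roughly, .rel() has to resolve the including script's real location before building the relative path; a sketch under the assumption that, inside the function, the caller is ${BASH_SOURCE[1]}:
```sh
# sketch of .rel(): source a file relative to the *real* location of the calling script,
# so symlinked entry points (such as a bin/make symlink in ~/bin/) still find their includes
.rel() {
	local caller; caller="$(realpath -- "${BASH_SOURCE[1]}")"   # resolve symlinks first
	source "$(dirname -- "$caller")/$1"
}
```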
inputs/FIA/_src/Makefile: Extraction: $(zips): use $(allZips), which contains a zip for each state, so that states that have not yet been downloaded and extracted (or had an empty dir created for them) will be downloaded. previously, the extract target only expanded existing zips, but did not download new zips unless no zips had yet been downloaded. (this had been necessary because some states do not have a download, and downloading them would otherwise be retried every time the Makefile was run.)
bugfix: inputs/FIA/_src/Makefile: `%: %.zip`: if unzip fails because the download does not exist, create an empty dir for the state instead of aborting make
inputs/FIA/_src/Makefile: use curl instead of wget because that is also available on Mac
bugfix: lib/sh/web.sh: curl(): use --fail so that curl returns a nonzero exit status on error (e.g. file not found) instead of appearing to exit successfully but outputting an error HTML document instead of the file
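the behavior being relied on here: without --fail, curl exits 0 even on an HTTP 404 and saves the server's error page as the output file. a sketch of the download-or-empty-dir logic from the FIA/_src entries above (the URL and layout are placeholders, not FIA's real ones):
```sh
# sketch: fetch one state's zip; if the server has no file for it, fall back to an empty dir
state=CA
url="http://example.com/FIADB/$state.zip"   # placeholder URL
if curl --fail --remote-name "$url"; then   # --fail: nonzero exit status on HTTP errors
	unzip "$state.zip" -d "$state"
else
	mkdir -p "$state"   # some states have no download; don't abort make
fi
```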
inputs/FIA/SUBPLOT/map.csv, import: prepended * to all FIA terms to clearly distinguish them from the VegCore terms. this is the standard convention for all datasources, to indicate which terms have not yet been mapped, but was not yet implemented at the beginning of new-style import (the FIA refresh was the first new-style datasource)....
inputs/FIA/import_order.txt: added remaining src tables, whose runscripts will be invoked in the order listed by lib/runscripts/datasrc_dir.run
added inputs/FIA/*/_no_import to src tables that are joined together in occurrence_all and should not also be imported separately once they are in import_order.txt
inputs/GBIF/run: inherit from lib/runscripts/datasrc_dir.run, which uses import_order.txt to forward calls to the subdirs
added blank runscripts inputs/GBIF/Source/run, Specimen/run because they are in import_order.txt (used by lib/runscripts/datasrc_dir.run)
bugfix: bin/make: do not alter the PATH passed to the invoked make command, since this is a general-purpose wrapper and is not linked to a specific working copy (it could be used to wrap any make invocation, not just for commands in the svn dir). this uses lib/sh/local.sh's new PATH_add= flag.
lib/sh/local.sh: added PATH_add= flag to allow turning off the addition of $bin_dir_abs to the PATH. this is useful for wrapper scripts that should not alter the PATH passed to their invoked command.
bugfix: lib/sh/make.sh: make(): invoke only the system make command instead of any wrapper for it in the PATH (by using self_sys instead of self), to prevent infinite recursion. single recursion is resolved by nonrecursive=1, but there are cases where mutual recursion occurs due to the presence of two different bin/makes in the PATH (e.g. if you have two working copies with bin/make, and one is symlinked in your ~/bin/ folder), and these cases can only be resolved by clearing out the PATH completely (since the bin/makes do not know of each other's existence, and so cannot remove each other's parent dirs from the PATH).
lib/sh/util.sh: self_sys alias: use new sys_cmd() instead of `command -p` so that only the command path resolution is performed with a limited PATH, and the invoked command itself inherits the full PATH
lib/sh/util.sh: added sys_cmd(), which runs a system command and allows running a system command of the same name as the script
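a hedged sketch of the idea behind sys_cmd() (not the actual lib/sh/util.sh code): resolve the command's path using only the system PATH, so a same-named wrapper earlier in the caller's PATH is skipped, then run the resolved binary normally so it still inherits the full PATH.
```sh
# sketch: run the *system* command of a given name, bypassing same-named wrappers in the PATH
sys_cmd() {
	local cmd="$1"; shift
	local path
	# resolve with a minimal system PATH so ~/bin/ and bin/ wrappers are not found...
	path="$(PATH="$(getconf PATH)" type -P "$cmd")" || return
	# ...but run the resolved binary normally, so it inherits the caller's full PATH
	"$path" "$@"
}

sys_cmd make --version   # always the real make, even with a make wrapper in the PATH
```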
lib/sh/util.sh: added echo_builtin()
inputs/.rsync_ignore: test*.xml: turn on syncing again, but always treat the local side of the sync (starscream or vegbiendev) as the authoritative copy since they are the machines the tests can be run on
/.rsync_ignore: temp files: hide them on upload so that they are never synced to jupiter. hiding is different from unidirectionally excluding them, because it also causes them to be deleted on the destination if they were uploaded in previous syncs.
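for reference, the standard rsync rule semantics this relies on: an exclude rule both skips the transfer and protects matching files on the destination from --delete, while a hide rule only hides them from the sender, so previously-uploaded copies do get deleted. a sketch (paths are placeholders):
```sh
# sketch of the rule difference (standard rsync semantics; paths are placeholders)
rsync --archive --delete \
	--filter='hide /some_temp_dir/' \
	--filter='exclude .DS_Store' \
	./ user@jupiter:backup/
# "hide": the sender omits /some_temp_dir/, so --delete removes it on the destination
# "exclude": .DS_Store is neither transferred nor deleted on the destination
```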
inputs/VegBIEN/TWiki/.rsync_ignore: /**: turn syncing back on, but only allow it unidirectionally from vegbiendev->jupiter->starscream to avoid clobbering the live site or the jupiter backup. this is probably the only dir whose authoritative copy is always on vegbiendev. for all other dirs, edits can be made wherever convenient, so no copy is authoritative and no sync directions need to be restricted.
/README.TXT: Maintenance: synchronization: fixed whitespace
inputs/.rsync_ignore: install.log.sql: only exclude this on starscream (the local machine), using new machine-specific .rsync_filters, so that vegbiendev's copies of this will be backed up
lib/sh/sync.sh: upload(): .rsync_filter: also support machine-specific filters, for cases when different machines produce the same file (e.g. a log file) but only one machine's copy should be backed up
/README.TXT: Maintenance: to synchronize a Mac's settings with my testing machine's: removed filters that are now handled by .rsync_ignores
added inputs/GBIF/_src/.rsync_filter.upload,download to prevent old versions of GBIFPortalDB-*.dump.gz from being downloaded to the local machine, while keeping them on jupiter. this avoids the need to store these files in ~/Documents/BIEN/large_files/ with symlinks from inputs/GBIF/_src/ to exclude them from the sync.
bugfix: /README.TXT: Maintenance: to synchronize a Mac's settings with my testing machine's: sync ~/Dropbox/svn/ (the no-unversioned-files working copy) separately from the rest of the files, because .svn/ is now excluded by /.rsync_ignore, so `svn up` must be used to keep the .svn/ dirs in sync. note that .svn/ should generally not be synced between machines, because they may use incompatible versions of the svn working copy format.
/README.TXT: Maintenance: to synchronize a Mac's settings with my testing machine's: use new bin/sync_upload (with $sync_remote_subdir) so that per-dir .rsync_ignores are processed, and to use the default $sync_remote_url
lib/sh/local.sh: $sync_remote_url: allow user to override just the sync subdir (not the whole URL) in $sync_remote_subdir. this is useful e.g. for backing up the Mac's files to jupiter.
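a minimal sketch of how such an override could be wired up (only the variable names come from the entry above; the default values and host used below are assumptions):
```sh
# sketch: let the caller override just the remote subdir, while the URL keeps its default form
: "${sync_remote_subdir=bien}"                       # placeholder default subdir
: "${sync_remote_url=jupiter:$sync_remote_subdir}"   # default URL derived from the subdir
```
usage would then be along the lines of `sync_remote_subdir=Mac_backup bin/sync_upload` (hypothetical value), without having to spell out the whole URL.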
/README.TXT: Maintenance: to synchronize vegbiendev, jupiter, and your local machine: use new bin/sync_upload instead of specifying all the filter patterns manually. this replaces several `put` commands with various filters with just a bin/sync_upload each on vegbiendev and your machine (in overwrite=1 mode to force a complete sync).
bugfix: backups/.rsync_filter.download: need to prevent existing backups from being deleted on the local side, too, by changing hide patterns to exclude
lib/sh/sync.sh: upload(): make put's $subpath option relative to the currdir instead of the root dir of the upload ($local_dir), like the --include paths. note that $subpath unfortunately can't be used in subdirs at this point because it will cause rsync to ignore the .rsync_ignores and .rsync_filters in parent dirs, including the essential .rsync_ignore in the sync root dir.
/README.TXT: removed unnecessary `env` before kw params, which are treated as such whenever they appear before a command name
bugfix: /README.TXT: updated `make backups/download` to `make backups/<file>/download`
backups/Makefile: upload: use bin/sync_upload
inputs/Makefile: download-logs: use bin/sync_upload like upload/download
bugfix: /README.TXT: `make inputs/upload`, `make inputs/download`: added live=1 so that the sync operation runs rather than previewing what will be synced. removed test=1 because this flag is not used by put.
bugfix: inputs/Makefile: upload, download: need to exclude the files listed in .rsync_ignore, so that large local-only files, such as inputs/GBIF/raw_occurrence_record_plants/table*.tsv, do not have to be synced before `make inputs/upload` can complete (the corresponding .gz gets extracted instead), and so that deleted temp files in inputs/VegBIEN/TWiki/, such as active sessions, are not added back to the live copy on vegbiendev. previously, fixing this required extracting the rsync command run by `make inputs/upload`, etc. and manually editing it to exclude the files in the applicable .rsync_ignore files, each time `make inputs/upload`, etc. was run (including before every column-based import).
bugfix: bin/make: need to leave bin/, ~/bin/ in the PATH when running make nonrecursively, so that commands invoked by it which are located in these dirs (e.g. put, which will be used by `make inputs/upload`) can still be found. this requires using command()'s new nonrecursive=1 flag instead of running no_PATH_recursion, so that no_PATH_recursion() only affects the resolution of the command path, but does not propagate the filtered PATH to the invoked command itself.
lib/sh/util.sh: command(): added nonrecursive=1 flag, which uses cmd2abs_path to run an external command nonrecursively
lib/sh/util.sh: added cmd2abs_path, which makes the command in $1 nonrecursive
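a hedged sketch of cmd2abs_path (the real helper in lib/sh/util.sh is not reproduced here): resolve the command to an absolute path while the wrapper's own dir is hidden from the lookup, but leave the PATH itself untouched so the invoked command still sees bin/ and ~/bin/. PATH_rm is the project's function; a sketch of it follows below.
```sh
# sketch: make the command in $1 nonrecursive by resolving it to an absolute path,
# with the running script's own dir removed from the lookup (but not from the final PATH)
cmd2abs_path() {
	local self_dir; self_dir="$(dirname -- "$(realpath -- "$0")")"
	( PATH_rm "$self_dir"; type -P "$1" )   # subshell: the trimmed PATH does not leak out
}

abs_make="$(cmd2abs_path make)"   # e.g. /usr/bin/make rather than the bin/make wrapper
```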
bugfix: lib/sh/util.sh: PATH_rm(): also need to remove adjacent occurrences of the same path (or occurrences which become adjacent when other paths are removed), which :...: matching wasn't doing because the trailing : is consumed, preventing it from being matched at the beginning of the next path. since unlike filesystem paths with /, it is not necessary for a match to span multiple :-separated sections, we can just use new split() to split apart the PATH into an array of paths, filter each path, and join() them back together.
lib/sh/util.sh: added split()
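a hedged sketch of the approach (the real split()/join()/PATH_rm() in lib/sh/util.sh differ in their details): break the PATH apart on ":" into an array, drop the matching entries, and join what's left, which also handles the adjacent duplicates that :...: substring matching misses.
```sh
# sketch: remove every occurrence of the given dirs from the PATH, including adjacent duplicates
PATH_rm() {
	local IFS=':'
	local -a parts kept=()
	read -r -a parts <<< "$PATH"   # split(): break the PATH apart on ":"
	local part dir keep
	for part in "${parts[@]}"; do
		keep=1
		for dir in "$@"; do test "$part" = "$dir" && keep=; done
		test -n "$keep" && kept+=("$part")
	done
	PATH="${kept[*]}"   # join(): "${array[*]}" with IFS=":" rejoins the survivors
}

# adjacent duplicates are removed too:
( PATH=/usr/bin:/home/me/bin:/home/me/bin:/usr/bin
  PATH_rm /home/me/bin
  echo "$PATH" )   # -> /usr/bin:/usr/bin
```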
lib/sh/util.sh: auto-echo common external commands: added `which`
lib/sh/util.sh: auto-echo common external commands: use simpler echo_run instead of command since logging handling is not needed
added backups/vegbien.r9897.backup.md5
lib/sh/sync.sh: upload(): documented that each --include path is relative to the currdir, not the root dir of the upload ($local_dir). this feature, although previously unintended, is actually better because the user can change to a subdir of the root dir and specify upload paths relative to the dir they are in. however, when invoking upload() from a script with --include paths specified, this means you need to use an absolute path (e.g. "$(dirname "${BASH_SOURCE[0]}")"/...; or the value that will become $local_dir, which for sync_upload() is $root_dir).
backups/.rsync_ignore: replaced with .rsync_filter.upload to allow uploading new backups but not deleting existing backups if they don't exist on the local (rsync-invoking) side; and .rsync_filter.download to avoid downloading backups to the local side. this allows storing older backups just on jupiter, where there is much more disk space. note that this change must be made on the remote side (jupiter) for it to be effective, because these are remote-side rules and are only processed by the remote-side rsync instance.
lib/sh/sync.sh: upload(): use directional .rsync_filter to supplement .rsync_ignore with all kinds of --filter rules. separate .rsync_filters are needed for the upload (swap=) and download (swap=1) directions because the sender and the receiver are reversed, causing asymmetric rules like protect/hide to change meaning.
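a sketch of how upload() presumably wires this up (the swap=/swap=1 names come from the entries above; the exact rsync invocation is an assumption, and $src/$dest are placeholders): always dir-merge the per-dir .rsync_ignore excludes, plus the direction-specific .rsync_filter file, since protect/hide rules mean different things once sender and receiver are swapped.
```sh
# sketch: pick the directional per-dir filter file based on the sync direction (swap=1 downloads)
if test -n "$swap"; then direction=download; else direction=upload; fi
rsync --archive --delete \
	--filter='dir-merge,- .rsync_ignore' \
	--filter="dir-merge .rsync_filter.$direction" \
	"$src/" "$dest/"
# dir-merge,- : each dir's .rsync_ignore is read as plain exclude patterns
# dir-merge   : each dir's .rsync_filter.* may contain any filter rule (protect, hide, ...)
```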
updated backups/TNRS.backup.md5
added backups/TNRS.2013-6-17.backup.md5, TNRS.2013-6-22.backup.md5
/README.TXT: Backups: TNRS cache: Back up/Restore: added runtimes (3 min/5.5 min)
lib/sh/sync.sh: upload(): usage: documented put's swap=1 flag, which downloads instead of uploads