Activity
From 08/21/2014 to 09/19/2014
09/19/2014
- 03:51 PM Revision 14702: bugfix: /Makefile: install: also need to run config/download to get the necessary passwords
- 03:43 PM Revision 14701: config/Makefile: added download target
- 03:11 PM Revision 14700: /Makefile: config: renamed to bin/install because config is the name of a directory
- 02:53 PM Revision 14699: /README.TXT: Installation: `make install`: added instructions for what to do at each prompt
- 02:25 PM Revision 14698: /README.TXT: Installation: Check out svn: need to install svn first as this does not come preinstalled on Ubuntu
09/17/2014
- 09:37 AM Bug #953 (New): CVS and VegBank data
- Martha,
Here is a rehash of issues with CVS. Several of these are likely to apply to Vegbank as well. I view th...
09/16/2014
- 05:35 PM Revision 14696: /README.TXT: to back up the local machine's hard drive: just do a single pass, to avoid the numerous different steps
- 05:25 PM Revision 14695: backups/*retention_policy*: on jupiter: changed to avoid retaining backups further back, as these consume significant disk space on jupiter and are not useful for anything
- 05:12 PM Task #952 (New): refresh GBIF so it can be published
- see [[conditions of use#GBIF|GBIF conditions of use]] > new portal
- 04:25 PM Revision 14694: fix: lib/sh/local.sh: $sync_remote_url: need $USER so user can be overridden when running as root
- 04:07 PM Revision 14693: fix: /README.TXT: to back up the local machine's settings: to jupiter: need `sudo -E` for new Documents/BIEN/vegbiendev*/
- 03:47 PM Revision 14692: /README.TXT: backups of ~/.dropbox/: just pause and resume Dropbox instead of quitting and restarting it
- 03:40 PM Revision 14691: /README.TXT: to back up the local machine's settings: to Dropbox: exclude gmvault-db/ to save space in Dropbox
- 03:36 PM Revision 14690: /README.TXT: to back up the local machine's settings: don't need to exclude ~/Library/Thunderbird/Profiles/9oo8rcyn.default/global-messages-db.sqlite because the disk that was corrupting this file has been replaced (#907)
- 03:33 PM Revision 14689: /README.TXT: to back up e-mails: don't need to also sync aaronmk@nceas.ucsb.edu because these e-mails are also in aaronmk.nceas@gmail.com (auto-forwarded)
- 03:16 PM Revision 14688: /README.TXT: Notes on system stability: removed no longer applicable warning about system upgrades, which is now incorrect because the disk space overrun bug (#887) was found *not* to have been caused by a system upgrade
- 02:57 PM Revision 14687: web/links/index.htm: updated to Firefox bookmarks: Roundtable: added links for Roundtable on "obstacles faced by researchers who reuse, share and manage data, and strategies for overcoming them"
- 02:46 PM Revision 14686: lib/Firefox_bookmarks.reformat.csv: label page's self-description as such: also support quotations enclosed in '
09/10/2014
- 11:07 PM Revision 14685: fix: exports/native_status_resolver.csv.run: added `users_by_name=1` to preserve the file group
- 11:00 PM Revision 14684: added inputs/VegBank/run.call_graph.log
- 05:45 PM Task #907 (Resolved): troubleshoot rsync verification errors
- this problem no longer occurs after the Time Machine restore, so it was most likely a disk corruption issue with the ...
- 05:03 PM Revision 14683: inputs/VegBank/run.log: updated for echo_vars() changes. the PG* vars, which contain important information, will now not need to be filtered out.
- 04:48 PM Revision 14682: lib/sh/util.sh: echo_vars(): merge repeated flags so there aren't flags in between the vars (which is also not valid declare syntax)
- 04:15 PM Revision 14681: lib/sh/db.sh: pg_cmd(): log vars on same line to avoid clutter
- 04:10 PM Task #887: fix disk space leak that fills the disk and crashes the import
- not fixing this because we fixed the bug in our code that was triggering this Postgres bug
- 03:58 PM Task #476 (New): develop map spreadsheet -> header override file translation utility
- multiple output locations for the same input column are now handled by source-specific derived columns, so this is no...
- 03:56 PM Task #483 (Resolved): rename staging table columns according to map.csv
- this was later implemented in a way that avoids needing to reinstall the staging tables
- 03:49 PM Task #951 (Resolved): test that `make install` is able to fully re-create vegbiendev
- h3. steps
# update vegbiendev:...
- 03:17 PM Revision 14680: lib/sh/util.sh: echo_vars(): put all the vars on the same line so they don't clutter up the call graph generated at the default verbosity
- 01:56 PM Revision 14679: web/links/index.htm: updated to Firefox bookmarks: Mac: added link for VNC client. extended attributes: added link for chflags.
- 12:52 PM Revision 14678: added planning/workflow/staging_tables_installation_for_SQL_datasource.odg.src.log
- 12:51 PM Revision 14677: added inputs/VegBank/run.log
- 12:49 PM Revision 14676: fix: inputs/input.Makefile: $(svnFilesGlob): *.log should be in both the subdirs and the main dir
- 12:48 PM Revision 14675: inputs/input.Makefile: $(svnFilesGlob): *.log
09/08/2014
- 04:09 PM Revision 14673: schemas/VegBIEN/data_dictionary/VegBIEN data dictionary.xlsx: updated
- 04:01 PM Revision 14672: bugfix: schemas/public_.sql: view_full_occurrence_individual_view and related views: synced to data dictionary spreadsheet, which adds back the links to the definitions (which used to be part of the column name itself)
- 03:50 PM Revision 14671: fix: schemas/public_.sql: analytical_plot, analytical_specimen: updated column names to be the same as analytical_stem, which these are a subset of
09/05/2014
- 10:51 PM Revision 14670: /README.TXT: to synchronize vegbiendev, jupiter, and your local machine: avoid extraneous diffs when rsyncing: clarified the machines that the command should be run on
- 10:47 PM Revision 14669: web/links/index.htm: updated to Firefox bookmarks: removed broken favicons
- 10:45 PM Revision 14668: web/links/index.htm: updated to Firefox bookmarks: updated favicons
- 10:43 PM Revision 14667: fix: /README.TXT: to backup files not in Time Machine: need to use 2 TB external hard drive instead of Time Machine drive because Time Machine drive does not have ~/Documents/BIEN/ in a location where it can be hardlinked against
- 10:02 PM Revision 14666: web/links/index.htm: updated to Firefox bookmarks: categorized uncategorized bookmarks
- 10:00 PM Revision 14665: web/links/index.htm: updated to Firefox bookmarks: updated favicons
- 09:55 PM Revision 14664: web/links/index.htm: updated to Firefox bookmarks: local machine phpPgAdmin: removed this so the Mac won't get woken up on network access whenever someone opens the links page, which attempts to load the favicon from the local machine. the previous solution of manually deleting the favicon (r13406) doesn't work because the favicon will just get re-added whenever this bookmark is visited.
- 09:37 PM Revision 14663: web/links/index.htm: updated to Firefox bookmarks: find: added instructions for searching by <>, not just =
- 08:48 PM Revision 14662: /README.TXT: Datasource setup: For MS Access databases: added that one should use the settings in the associated .ini file where available
- 08:46 PM Revision 14661: /README.TXT: Datasource setup: For MS Access databases: program link: added page subsections
- 07:44 PM Revision 14660: /README.TXT: to backup files not in Time Machine: note that Time Machine dereferences hard links: added commands documenting that this is the case
- 05:12 PM Revision 14659: fix: /README.TXT: to backup files not in Time Machine: on first run, create parent dirs: added mkdir for Postgres
- 05:11 PM Revision 14658: bugfix: /README.TXT: to backup files not in Time Machine: on first run, create parent dirs: mkdir: need sudo
- 05:07 PM Revision 14657: /README.TXT: to backup files not in Time Machine: moved to root/ subdir to group the multiple top-level dirs together
- 04:53 PM Revision 14656: /README.TXT: to backup files not in Time Machine: added the vegbiendev archival backups, which cannot be backed up by Time Machine because it dereferences hard links
- 04:51 PM Bug #950 (Resolved): fix view_full_occurrence_individual_view rows with is_geovalid NULL
- h3. test case
eg. this happens for CVS rows:
|datasource|country|state_province|county|latitude|longitude|is_ge...
- 04:12 PM Revision 14655: /README.TXT: to backup files not in Time Machine: documented why Postgres cannot be backed up by Time Machine
09/04/2014
- 11:52 AM Revision 14654: /README.TXT: Single datasource refresh: added steps to place the updated extract and extracted flat file(s)
- 11:48 AM Revision 14653: /README.TXT: Single datasource refresh: connect to vegbiendev first, even though steps before it have their own step to do this
- 11:47 AM Revision 14652: /README.TXT: Single datasource refresh: reimport_scrub: added step to view progress
- 11:45 AM Revision 14651: /README.TXT: Single datasource refresh: moved to top since these steps are performed more often
- 10:23 AM Revision 14650: added planning/workflow/BIEN data workflow-2_bje.png export
- 10:05 AM Revision 14649: planning/meetings/BIEN conference call availability.xlsx: updated
- 10:04 AM Bug #948 (Resolved): fix duplicated rows in view_full_occurrence_individual
- 10:03 AM Revision 14648: planning/workflow/BIEN data workflow-2_bje.pptx: updated with Martha's changes and changes during conference call
- 08:10 AM Revision 14647: bugfix: web/BIEN3/Redmine/.htaccess: subpath redirect: also redirect dirs, so that empty-subdir main-page redirects (eg. wiki.vegpath.org) work properly
- 07:55 AM Revision 14646: bugfix: web/BIEN3/Redmine/.htaccess: main page should continue to redirect to wiki, not Redmine project page
- 07:44 AM Revision 14645: schemas/public_.sql: *_view: re-ran *_view_modify(), which use the new non-blocking rematerialize_view()
- 07:41 AM Revision 14644: schemas/public_.sql: viewFullOccurrence_*: renamed to view_full_occurrence_* at Brian M's and Martha's request (e-mails from Martha on 2014-8-12 at 17:37PT, and from Brian M on 2014-8-13 at 16:21PT). note that this change has already been made on vegbiendev.
- 07:21 AM Revision 14643: schemas/public_.sql: view_full_occurrence_individual: re-ran view_full_occurrence_individual_view_modify(), which uses the new non-blocking rematerialize_view()
- 07:20 AM Revision 14642: schemas/util.sql: rematerialize_view(): made it non-blocking, so that it would allow full access to the original materialized table during the operation
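The non-blocking rematerialize_view() in r14642 relies on building the replacement table alongside the original and swapping names only at the end, so readers keep full access during the slow rebuild. A sketch of that rename-swap pattern, using SQLite purely for illustration (the real implementation is Postgres SQL in schemas/util.sql; table names here are hypothetical):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE mv (x INTEGER)")
con.execute("INSERT INTO mv VALUES (1)")

# slow part: rebuild into a scratch table while mv stays fully readable
con.execute("CREATE TABLE mv__new (x INTEGER)")
con.executemany("INSERT INTO mv__new VALUES (?)", [(1,), (2,)])

# fast part: swap the names, so readers are only briefly interrupted
con.execute("ALTER TABLE mv RENAME TO mv__old")
con.execute("ALTER TABLE mv__new RENAME TO mv")
con.execute("DROP TABLE mv__old")

print(sorted(r[0] for r in con.execute("SELECT x FROM mv")))  # [1, 2]
```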
- 07:11 AM Revision 14641: schemas/util.sql: added identifier_replace()
- 07:08 AM Revision 14640: schemas/util.sql: added relation_replace()
- 01:50 AM Revision 14639: /README.TXT: Single datasource import: renamed to Single datasource refresh since it works on existing datasources
- 01:45 AM Revision 14638: /README.TXT: Single datasource import: also need to reload staging tables
- 01:42 AM Revision 14637: /README.TXT: Single datasource import: added steps to re-run geoscrubbing and back up the vegbiendev database
09/03/2014
- 02:43 AM Revision 14636: planning/workflow/BIEN data workflow-2_bje.pptx: fixed text alignment
- 02:35 AM Revision 14635: planning/workflow/BIEN data workflow-2_bje.pptx: answered questions asked in the diagram
- 02:19 AM Revision 14634: added planning/workflow/BIEN data workflow-2_bje.pptx from Martha/Brian E (in Asana)
08/29/2014
- 03:55 PM Revision 14633: added inputs/CVS/verify/Review of CVS data in BIEN3.docx
- 12:40 AM Revision 14632: backups/*retention_policy*: added explanations
- 12:39 AM Revision 14631: backups/*retention_policy*: on jupiter: backups further back: removed "if disk space permits" because this is already labeled "optionally"
- 12:38 AM Revision 14630: backups/*retention_policy*: changed to require retaining *.backup of the last 2 successful imports on all machines
- 12:25 AM Revision 14629: backups/*retention_policy*: allow keeping *.backup of the last 2 successful imports on all machines, not just jupiter
- 12:17 AM Revision 14628: **: renamed 2TB drive's BIEN3 partition to BIEN3.**SAVE** since one might not see the **SAVE** file in it
- 12:13 AM Revision 14627: **: renamed 2TB drive's BIEN3 partition to BIEN3.**SAVE** since one might not see the **SAVE** file in it
- 12:09 AM Revision 14626: **/"**DO_NOT_DELETE**": renamed to shorter **SAVE**
- 12:04 AM Revision 14625: added backups/*retention_policies*/ with retention policy files for each partition
08/28/2014
- 11:58 PM Revision 14624: backups/README.TXT: renamed to *retention_policy* to match the naming convention of the retention policy files in the various partitions
- 11:42 PM Revision 14623: /README.TXT: to back up the local machine's hard drive: also exclude *-files indicating the (differing) retention statuses of the partitions involved
- 08:13 PM Revision 14622: lib/tnrs.py single_tnrs_request(), bin/tnrs_client: use_tnrs_export: default to False because this mode uses incorrect selected matches (vegpath.org/issues/943), and the JSON mode that fixes this is now available
- 08:05 PM Revision 14621: bin/tnrs_db: tnrs.tnrs_request() call: explicitly set use_tnrs_export=True so that this continues to work if the default value is changed
- 07:57 PM Revision 14620: bugfix: lib/csvs.py: JsonReader: need to pass col_order to row_dict_to_list_reader
- 07:43 PM Revision 14619: config/VirtualBox_VMs/vegbiendev/README.TXT: ~/Documents/BIEN/vegbiendev.2014-2-2_1-07-32PT.+VirtualBox_changes/: renamed to vegbiendev.2014-2-2_1-07-32PT.VirtualBox/ to make clear that this is the VirtualBox version of vegbiendev
- 07:12 PM Revision 14618: bugfix: lib/tnrs.py: JSON output: need to stringify arrays so they match what is output in TSV-export mode
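The stringification in r14618 is needed because a TSV export delivers every value as flat text, while parsed JSON can contain arrays. A hedged sketch of the idea (the separator and field names are assumptions for illustration, not necessarily what lib/tnrs.py uses):

```python
def stringify(value, sep=", "):
    """Flatten a list value into one string so JSON-mode rows match a
    flat text export; scalar values pass through unchanged."""
    if isinstance(value, list):
        return sep.join(str(v) for v in value)
    return value

# hypothetical row with one array-valued field
row = {"Name_matched": "Poa annua", "Authors": ["L.", "Hornem."]}
flat = {k: stringify(v) for k, v in row.items()}
print(flat["Authors"])  # prints: L., Hornem.
```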
- 07:10 PM Revision 14617: lib/csvs.py: JsonReader: added support for values that are arrays
- 07:05 PM Revision 14616: lib/csvs.py: MultiFilter: inherit from WrapReader instead of Filter to avoid needing to define a no-op filter_() function
- 06:49 PM Revision 14615: bugfix: lib/csvs.py: row_dict_to_list_reader: need to override next() directly instead of just using Filter, because Filter doesn't support returning multiple rows for one input row (in this case, prepending a header row). this caused the 1st data row to be missing.
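The bugfix in r14615 hinges on a reader that can return more rows than it consumes: prepending a header row means the first call yields a row that never came from the wrapped reader, which a one-row-in/one-row-out filter cannot express. A simplified sketch of the override-next() approach (class and method names are illustrative, not the actual lib/csvs.py code):

```python
class RowDictToListReader:
    """Wrap an iterator of dict rows; yield a header row first, then
    each row as a list ordered by col_order."""
    def __init__(self, dict_rows, col_order):
        self.rows = iter(dict_rows)
        self.col_order = col_order
        self._header_sent = False

    def __iter__(self):
        return self

    def __next__(self):
        if not self._header_sent:
            # extra row not taken from the underlying reader: this is
            # what a plain one-in/one-out filter can't do
            self._header_sent = True
            return list(self.col_order)
        row = next(self.rows)
        return [row.get(col, '') for col in self.col_order]

reader = RowDictToListReader([{'a': 1, 'b': 2}], ['a', 'b'])
print(list(reader))  # [['a', 'b'], [1, 2]]
```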
- 06:47 PM Revision 14614: lib/csvs.py: Filter: inherit from WrapReader, which separates out the CSV-reader API code
- 06:43 PM Revision 14613: lib/csvs.py: added WrapReader
- 06:43 PM Revision 14612: lib/csvs.py: added Reader
- 06:00 PM Revision 14611: schemas/public_.sql: views that use view_full_occurrence_individual_view: use the view_full_occurrence_individual table instead, now that this is materialized.
- 05:58 PM Revision 14610: planning/meetings/BIEN conference call availability.xlsx: updated
- 08:57 AM Revision 14609: /README.TXT: to back up the local machine's hard drive: renamed backup partition to BIEN3 to make clear what the backup drive contains
- 08:54 AM Revision 14608: fix: /README.TXT: to back up the local machine's hard drive: updated location of `screen` for added commands
- 08:53 AM Revision 14607: /README.TXT: added trailing / on dirs to make clear that they're dirs
- 08:40 AM Revision 14606: config/VirtualBox_VMs/vegbiendev/README.TXT: added instructions to configure the VM to support VirtualBox
- 08:22 AM Revision 14605: config/VirtualBox_VMs/vegbiendev/README.TXT: added instructions to retrieve the contents of the VM, with the VirtualBox changes added
- 07:47 AM Revision 14604: config/VirtualBox_VMs/vegbiendev/README.TXT: to retrieve the original contents of the backup from the VM: added steps to restore the correct VM snapshot
- 07:40 AM Revision 14603: config/VirtualBox_VMs/vegbiendev/README.TXT: also generate list of all the files whose permissions were changed since the backup, but which are extracted with their changed permissions instead of their original ones in the backup
- 07:05 AM Revision 14602: config/VirtualBox_VMs/vegbiendev/README.TXT: added instructions to retrieve the original contents of the backup from the VM
- 05:47 AM Revision 14601: fix: /README.TXT: to back up vegbiendev: also back up /home/aaronmk/bien/ (instead of just symlinking to the local copy), since this can be done space-efficiently with hardlinks. this ensures that the vegbiendev backup will not be modified when the local copy of bien/ is.
- 03:10 AM Revision 14600: lib/csvs.py: JsonReader: factored out row-dict-to-list into new row_dict_to_list_reader so that JSON-specific preprocessing is kept separate from the row format translation
08/27/2014
- 03:17 PM Revision 14599: lib/csvs.py: added MultiFilter, which enables applying multiple filters by nesting
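MultiFilter (r14599) applies multiple row filters by nesting wrapped readers. A minimal sketch of that nesting idea (names and signatures are hypothetical, simplified from the lib/csvs.py design):

```python
class FilterReader:
    """Apply one row-transforming function to each row of a reader."""
    def __init__(self, filter_, reader):
        self.filter_ = filter_
        self.reader = iter(reader)

    def __iter__(self):
        return self

    def __next__(self):
        return self.filter_(next(self.reader))

def multi_filter(reader, *filters):
    """Nest one FilterReader per filter; the first filter listed is
    the innermost wrapper, so it is applied first."""
    for f in filters:
        reader = FilterReader(f, reader)
    return reader

rows = [[1, 2], [3, 4]]
out = multi_filter(rows, lambda r: [x * 2 for x in r], lambda r: [sum(r)])
print(list(out))  # [[6], [14]]
```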
08/26/2014
- 07:57 PM Revision 14598: lib/tnrs.py: single_tnrs_request(): JSON mode: implemented output of JSON data
- 07:53 PM Revision 14597: lib/tnrs.py: single_tnrs_request(): factored out wrapping in TnrsOutputStream, since this is done for both modes
- 07:47 PM Revision 14596: fix: lib/tnrs.py: JSON mode: TSV export columns: need to translate these to JSON column names before they can be used with the JSON data
- 07:44 PM Revision 14595: lib/csvs.py: added JsonReader, which reads parsed JSON data as row tuples
- 07:43 PM Revision 14594: lib/csvs.py: added row_dict_to_list(), which translates a CSV dict-based row to a list-based one
- 07:43 PM Revision 14593: lib/csvs.py: RowNumFilter: added support for filtering the header row as well
- 07:42 PM Revision 14592: lib/csvs.py: ColInsertFilter: added support for filtering the header row as well
- 05:12 PM Revision 14591: lib/csvs.py: InputRewriter: documented that this is also a stream (in addition to inheriting from StreamFilter)
- 05:11 PM Revision 14590: bugfix: lib/csvs.py: InputRewriter: accept a reader, as would be expected, instead of a custom stream whose lines are tuples
- 05:08 PM Revision 14589: fix: lib/sql_io.py: append_csv(): use new csvs.ProgressInputFilter instead of streams.ProgressInputStream(csvs.StreamFilter(__)), so that the input to csvs.InputRewriter is a reader, not a stream. this avoids the need for csvs.InputRewriter to accept a stream whose lines are tuples, instead of the expected reader.
- 05:02 PM Revision 14588: bugfix: inputs/input.Makefile: %/install: $(exportHeader) must come before postprocess because postprocess renames columns
- 04:50 PM Revision 14587: exports/: svn:ignore: added *.gz
- 04:49 PM Revision 14586: lib/csvs.py: added ProgressInputFilter, analogous to streams.ProgressInputStream
- 04:46 PM Revision 14585: lib/sql_io.py: added commented-out debug statement used to troubleshoot copy_expert() errors
- 04:45 PM Revision 14584: lib/dicts.py: added pair_keys(), pair_values()
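One plausible reading of the pair helpers added in r14584: given a sequence of (key, value) pairs, pull out just the keys or just the values. A sketch under that assumption (the real lib/dicts.py signatures and behavior may differ):

```python
def pair_keys(pairs):
    """Keys of a (key, value) pair sequence, in order."""
    return [k for k, _ in pairs]

def pair_values(pairs):
    """Values of a (key, value) pair sequence, in order."""
    return [v for _, v in pairs]

pairs = [('a', 1), ('b', 2)]
print(pair_keys(pairs), pair_values(pairs))  # ['a', 'b'] [1, 2]
```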
- 04:15 PM Revision 14583: bugfix: lib/streams.py: CaptureStream: end_idx must also be > start_idx
- 04:07 PM Revision 14582: bugfix: inputs/input.Makefile: $(import_install_): need `set -o pipefail` to enable errexit
- 03:47 AM Revision 14581: /README.TXT: to backup files not in Time Machine: don't need to review diff because command is unidirectional
- 02:59 AM Revision 14580: fix: /README.TXT: to back up the local machine's hard drive: "repeat until only minimal changes" should refer to the first sync command
- 02:52 AM Revision 14579: inputs/.geoscrub/geoscrub_output/run: documented postprocess() rm=1 runtime (6 min)
08/25/2014
- 10:17 PM Revision 14578: lib/tnrs.py: single_tnrs_request(): use_tnrs_export=False: need to obtain export columns
- 10:16 PM Revision 14577: lib/csvs.py: added header(stream)
- 10:16 PM Revision 14576: fix: lib/tnrs.py: single_tnrs_request(): need to `assert name_ct >= 1`, because with no names, TNRS hangs indefinitely
- 09:13 PM Revision 14575: bin/tnrs_client: added env var to configure use_tnrs_export
- 08:18 PM Revision 14574: /README.TXT: to back up vegbiendev: use inplace=1 to speed stopping and resuming transfer
- 07:54 PM Revision 14573: fix: /README.TXT: to back up the local machine's hard drive: removed --extended-attributes (after initial sync) because rsync apparently has to visit every file for this
- 07:35 PM Revision 14572: fix: /README.TXT: to back up the local machine's hard drive: also need --extended-attributes
- 07:34 PM Revision 14571: /README.TXT: to back up the local machine's hard drive: removed --delete-before now that that partition has been expanded
- 07:16 PM Revision 14570: fix: /README.TXT: to back up vegbiendev: exclude /var/lib/mysql.bak,postgresql.bak because the local machine doesn't need 2 copies of this information
- 07:05 PM Revision 14569: /README.TXT: to back up vegbiendev: removed no longer needed exclude of Dropbox subdir backup
- 06:58 PM Revision 14568: fix: /README.TXT: to back up vegbiendev: also need to do steps under Maintenance > "to synchronize vegbiendev, jupiter, and your local machine" because /home/aaronmk/bien is not synced here
- 06:52 PM Revision 14567: bugfix: /README.TXT: to back up vegbiendev: need `overwrite=1`
- 06:47 PM Revision 14566: /README.TXT: to back up vegbiendev: removed no longer needed exclude of Dropbox subdir backup
- 06:46 PM Revision 14565: /README.TXT: to back up the version history: don't also need this on vegbiendev because it's already on jupiter and the local machine
- 06:43 PM Revision 14564: bugfix: /README.TXT: to back up vegbiendev: need to include Postgres config files
- 06:24 PM Revision 14563: /README.TXT: to back up the local machine's hard drive: don't back up temp files: added /.fseventsd/
- 05:54 PM Revision 14562: fix: /README.TXT: to back up the local machine's hard drive: initial runtime: use range instead because some of the later runtime might have been from the same files
- 05:52 PM Revision 14561: /README.TXT: to back up the local machine's hard drive: updated initial runtime to include additional transferred files (17 h)
- 05:36 PM Revision 14560: fix: /README.TXT: to back up the local machine's hard drive: need to use --delete-before because the backup partition is near capacity
- 05:34 PM Revision 14559: /README.TXT: to back up the local machine's hard drive: don't back up temp files such as /private/var/vm/*
- 05:30 PM Revision 14558: fix: /README.TXT: to back up the local machine's hard drive: back up most Dropbox/Postgres files before stopping processes, to minimize downtime
- 05:17 PM Bug #949 (New): fix shortened view_full_occurrence_individual column names that are now ambiguous
- * some, like @taxon_observation_id@/@taxonobservation_id@, are now ambiguous as a result of the removal of the scopin...
- 04:56 PM Bug #948 (Resolved): fix duplicated rows in view_full_occurrence_individual
- h3. issue
"from Brody":mailto:brody.sandelATbiology.au.dk?Brody_Sandel.2014-8-22-9:43PT:
> SELECT * FROM view_f...
08/21/2014
- 07:35 PM Revision 14557: bugfix: /README.TXT: to back up the local machine's hard drive: can't use ~ with --exclude
- 07:31 PM Revision 14556: fix: inputs/.geoscrub/geoscrub_output/postprocess.sql: map_geovalidity(): unscrubbable names should actually be geo*in*valid, not geovalid=NULL, according to Brad
- 07:24 PM Revision 14555: /README.TXT: to back up the local machine's hard drive: back up the non-Dropbox, non-Postgres files separately to minimize the Dropbox and Postgres downtime
- 06:03 PM Revision 14554: /README.TXT: to back up the vegbiendev databases: don't need to review diff for these as it's always unidirectional
- 05:55 PM Revision 14553: /README.TXT: added instructions to back up vegbiendev
- 05:12 PM Revision 14552: fix: /README.TXT: to back up the local machine's hard drive: also need to repeat backup command until only minimal changes
- 05:11 PM Revision 14551: /README.TXT: to back up the local machine's hard drive: added step to stop Postgres
- 05:10 PM Revision 14550: bugfix: /README.TXT: to back up the local machine's hard drive: also need to stop Dropbox
- 05:06 PM Revision 14549: /README.TXT: to back up the local machine's settings: added step to remove .DS_Store
- 04:47 PM Revision 14548: fix: /README.TXT: to back up the local machine's settings: Dropbox: should not run with `del=`, because the backup should be an exact replica
- 04:25 PM Revision 14547: backups/TNRS.*: removed no longer needed old TNRS backups, which are part of the respective full-database backups in any case
- 02:57 PM Revision 14546: added config/phpMyAdmin/ symlink to schemas/VegCore/phpMyAdmin/
- 12:40 PM Revision 14545: bugfix: lib/sh/archives.sh: compress(): don't include dir prefix in zip archive
- 12:40 PM Revision 14544: lib/sh/util.sh: cd(): use echo_run instead of a manual echo_cmd call
- 12:35 PM Revision 14543: fix: lib/sh/util.sh: cd(): indent after running cd rather than before
- 12:32 PM Revision 14542: lib/sh/util.sh: cd(): support rebasing path vars for the new dir
- 11:51 AM Revision 14541: bugfix: lib/sh/archives.sh: compress(): need to use zip's path syntax to avoid the file in the archive being named "-"
- 08:56 AM Revision 14540: lib/tnrs.py: added option to avoid using TNRS's TSV export feature, which currently returns incorrect selected matches (vegpath.org/issues/943). this has been implemented up through the GWT/JSON decoding.
- 08:50 AM Revision 14539: lib/tnrs.py: added gwt_decode()
- 08:49 AM Revision 14538: lib/strings.py: added unesc_quotes() and helper functions
- 08:49 AM Revision 14537: lib/strings.py: added json_decode()
- 08:47 AM Task #947 (New): have all scripts that replace DB items rename the existing item out of the way rather than dropping it
- * entire DB _done in README.TXT but not scripts_
* -full-database import schema-
* -analytical DB-
* -individual d...
- 08:38 AM Revision 14536: /README.TXT: To re-run geoscrubbing: updated runtimes
- 08:25 AM Revision 14535: exports/*_GBIF.csv.run: documented compress_() runtime (20 min-1 h)