bugfix: web/index.php: full directory index: appending query string: need to use $_SERVER["QUERY_STRING"], not $_SERVER["HTTP_AUTHORIZATION"] for this
schemas/public_.sql: sync_*(): renamed to *_modify() to facilitate finding these functions when modifying the corresponding view (using the new naming convention for a view's on-modify function)
bugfix: inputs/.TNRS/schema.sql: MatchedTaxon_modify(): updated to include taxon_scrub derived fields
bugfix: schemas/util.sql: mk_drop_from_create(): need to match first rather than last CREATE
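Roughly the issue, as an illustration only (not the actual util.mk_drop_from_create() regexp): a greedy leading `.*` skips ahead to the last CREATE in a multi-statement string, while a non-greedy `.*?` stops at the first one:
    SELECT regexp_replace(
        'SET search_path TO util; CREATE VIEW a AS SELECT 1; CREATE VIEW b AS SELECT 2;'
        , '^.*?CREATE (VIEW \S+).*$', 'DROP \1;');
    -- -> 'DROP VIEW a;' (with a greedy '^.*CREATE' it would be 'DROP VIEW b;')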
inputs/.TNRS/schema.sql: *_modify(): allow running without a view_query, as recreate_view() now supports this
schemas/util.sql: recreate_view(): support omitting the view_query if the view has already been modified (e.g. for public.*_view, which allow changing the view as a separate step)
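A minimal sketch of the *_modify()/recreate_view() convention described above (assumed, simplified signatures; the real definitions are in schemas/util.sql and inputs/.TNRS/schema.sql):
    -- simplified stand-in for util.recreate_view(); view_query = NULL means
    -- the view has already been modified as a separate step
    CREATE FUNCTION recreate_view_sketch(view_ text, view_query text = NULL)
    RETURNS void AS $$
    BEGIN
        IF view_query IS NOT NULL THEN
            EXECUTE 'DROP VIEW IF EXISTS '||view_||' CASCADE';
            EXECUTE view_query; -- the caller's full CREATE VIEW statement
        END IF;
        -- ...recreate whatever dependent objects the CASCADE dropped...
    END
    $$ LANGUAGE plpgsql;

    -- a view's on-modify function then just wraps this:
    CREATE FUNCTION "MatchedTaxon_modify_sketch"(view_query text = NULL)
    RETURNS void AS $$
        SELECT recreate_view_sketch('"MatchedTaxon"', view_query)
    $$ LANGUAGE sql;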
fix: schemas/public_.sql: sync_*(): use util.copy() instead of CREATE TABLE AS so that table and column comments are also copied. this avoids the need to separately add the same comments to the view and its materialized table.
bugfix: schemas/util.sql: recreate(): need to handle case where util.mk_drop_from_create() is NULL
bugfix: schemas/util.sql: mk_drop_from_create(): only match CREATE if no custom DROP came before it
bugfix: schemas/public_.sql: sync_geoscrub_input_to_view(): `CREATE TABLE geoscrub_input AS __`: needs `LIMIT 0`
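For context (illustrative only; the source view name is omitted here): `LIMIT 0` makes the statement create an empty table with the view's column structure instead of also copying its rows, e.g.:
    CREATE TABLE geoscrub_input AS
    SELECT * FROM some_view LIMIT 0; -- some_view: placeholder for the actual source view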
fix: schemas/util.sql: explain2notice_msg_if_can(): also need to catch invalid_cursor_definition ("cannot open multi-query plan as cursor")
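A simplified sketch of the pattern (hypothetical function, not the actual util.explain2notice_msg_if_can() body): run EXPLAIN through a cursor-backed loop and return NULL for conditions it can't handle, now including invalid_cursor_definition:
    CREATE FUNCTION explain_if_can_sketch(sql text)
    RETURNS text AS $$
    DECLARE
        line text;
        msg  text := '';
    BEGIN
        FOR line IN EXECUTE 'EXPLAIN '||sql LOOP
            msg := msg||line||E'\n';
        END LOOP;
        RETURN msg;
    EXCEPTION
        WHEN invalid_cursor_definition THEN
            -- "cannot open multi-query plan as cursor": sql contains multiple statements
            RETURN NULL;
    END
    $$ LANGUAGE plpgsql;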
schemas/public_.sql: sync_analytical_stem_to_view(): removed DROP TABLE IF EXISTS because this is now done automatically by util.recreate()
schemas/util.sql: added copy()
schemas/util.sql: added copy_data()
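A rough sketch of what a comment-preserving copy can look like (hypothetical helper; the real util.copy()/util.copy_data() definitions are in schemas/util.sql):
    CREATE FUNCTION copy_sketch(src regclass, dest text)
    RETURNS void AS $$
    BEGIN
        -- LIKE ... INCLUDING ALL carries over column comments (and defaults,
        -- constraints, indexes), which plain CREATE TABLE AS would drop
        EXECUTE format('CREATE TABLE %I (LIKE %s INCLUDING ALL)', dest, src);
        -- the table comment has to be copied separately
        EXECUTE format('COMMENT ON TABLE %I IS %L', dest, obj_description(src, 'pg_class'));
        -- copy_data() part: populate the new table
        EXECUTE format('INSERT INTO %I SELECT * FROM %s', dest, src);
    END
    $$ LANGUAGE plpgsql;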
fix: lib/tnrs.py: Constrain by Source: turn it on so that, while this option is broken, the download settings reflect what TNRS actually used
fix: lib/tnrs.py: max_names: reduced back to 500 because even 5000 crashes the dev TNRS server
lib/tnrs.py: max_names: reduced to 5000 because 100,000 causes an internal server error
bugfix: /README.TXT: Full database import: To run TNRS: to rescrub all names: also need to re-create the public-schema views that were dropped by the cascade
/README.TXT: Full database import: To run TNRS: added steps to rescrub all names
backups/TNRS.backup.md5: updated
lib/tnrs.py: switched to downloading all matches per name, as is needed to implement #917. note that this will break the parts of the schema that use the tnrs table, until Brad's match-picking algorithm can be implemented, but this tradeoff is necessary to be able to begin scrubbing sooner (Martha; wiki.vegpath.org/2014-05-29_conference_call#TNRS)
schemas/vegbien.sql: tnrs_input_name: don't scrub accepted names, as using multiple matches per name no longer provides a single accepted name to scrub. instead, the Accepted_* fields can be whitespace-split to generate the same columns that would have been generated by the scrubbing (and without the overhead of the extra TNRS call).
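For example (illustrative; the exact column names may differ), the genus and specific epithet can be derived by whitespace-splitting the accepted name instead of making a second TNRS call:
    SELECT split_part("Accepted_name", ' ', 1) AS accepted_genus
         , split_part("Accepted_name", ' ', 2) AS accepted_specific_epithet
    FROM tnrs;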
fix: inputs/.TNRS/schema.sql: added back index on Name_submitted, which is needed for tnrs_input_name to work properly (now that there is no automatic index created by a unique constraint)
fix: inputs/.TNRS/schema.sql: tnrs: removed unique constraint on Name_submitted, Name_matched because there can be more than one match with the same Name_matched (but different accepted names, etc.)
fix: inputs/.TNRS/schema.sql: tnrs.tnrs__valid_match index: made it non-unique to allow multiple matches per name, as is needed to implement #917
bugfix: inputs/.TNRS/schema.sql: tnrs__match_num__fill(): only fill if not set, to support case where tnrs is being restored from a .sql file (where match_num is already set)
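A simplified sketch of the only-fill-if-unset guard (the actual trigger function is in inputs/.TNRS/schema.sql; that tnrs__match_num__next() supplies the new values is an assumption here):
    CREATE FUNCTION tnrs__match_num__fill_sketch()
    RETURNS trigger AS $$
    BEGIN
        IF new.match_num IS NULL THEN -- don't clobber values restored from a .sql dump
            new.match_num := tnrs__match_num__next(); -- assumed source of new match_nums
        END IF;
        RETURN new;
    END
    $$ LANGUAGE plpgsql;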
inputs/.TNRS/schema.sql: tnrs: documented runtime to add a constraint (3 min)
inputs/.TNRS/schema.sql: unique constraint on Name_submitted: added Name_matched to allow multiple matches per name, as is needed to implement #917
inputs/.TNRS/schema.sql: tnrs: documented how to populate a new column
inputs/.TNRS/schema.sql: tnrs: pkey: use match_num instead of Name_number to allow multiple matches per name, as is needed to implement #917
inputs/.TNRS/schema.sql: tnrs.match_num: made it NOT NULL now that it's populated
inputs/.TNRS/schema.sql: tnrs: populate match_num
inputs/.TNRS/schema.sql: tnrs: documented how to add and remove columns
inputs/.TNRS/schema.sql: made COMMENTs start on their own line, using the steps at wiki.vegpath.org/Postgres_queries#make-COMMENTs-start-on-their-own-line
inputs/test_taxonomic_names/_scrub/*: updated to TNRS schema
inputs/.TNRS/schema.sql: tnrs: added match_num
inputs/.TNRS/data.sql.run: refresh(): documented runtime (1 min)
schemas/Makefile: added back rename/%, which is used by `inputs/.TNRS/data.sql.run refresh`. updated it to use schema bundles.
inputs/.TNRS/schema.sql: added tnrs__match_num__next()
inputs/.TNRS/schema.sql: added tnrs__batch_begin() trigger to populate the match_num (match sort order)
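One plausible shape for this mechanism (names suffixed _sketch and all details assumed; the real definitions are in inputs/.TNRS/schema.sql): a sequence-backed counter that the batch-begin trigger resets, so match_num reflects the match sort order:
    CREATE SEQUENCE tnrs__match_num__seq MINVALUE 0; -- hypothetical name; starts at 0

    CREATE FUNCTION tnrs__match_num__next_sketch()
    RETURNS bigint AS $$ SELECT nextval('tnrs__match_num__seq') $$ LANGUAGE sql;

    CREATE FUNCTION tnrs__batch_begin_sketch()
    RETURNS trigger AS $$
    BEGIN
        PERFORM setval('tnrs__match_num__seq', 0, false); -- restart numbering for this batch
        RETURN NULL; -- statement-level trigger: return value is ignored
    END
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER tnrs__batch_begin_sketch BEFORE INSERT ON tnrs
    FOR EACH STATEMENT EXECUTE PROCEDURE tnrs__batch_begin_sketch();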
exports/2014-3-11.Jeff_Ott.climatic_range_determinants.csv.run: documented export_() runtime (11 min) and rowcount (11 million matching the filter criteria)
schemas/util.sql: added seq__reset()
schemas/util.sql: added seq__create()
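Rough sketches with assumed signatures (not the actual util.sql definitions) of what these helpers can look like:
    CREATE FUNCTION seq__create_sketch(seq text, start integer = 0)
    RETURNS void AS $$
    BEGIN
        EXECUTE format('CREATE SEQUENCE %I MINVALUE %s START %s', seq, start, start);
    END
    $$ LANGUAGE plpgsql;

    CREATE FUNCTION seq__reset_sketch(seq text, start integer = 0)
    RETURNS void AS $$
    BEGIN
        PERFORM setval(seq::regclass, start, false); -- false: next nextval() returns start
    END
    $$ LANGUAGE plpgsql;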
fix: schemas/util.sql: try_cast(), is_castable(): also catch invalid_schema_name, thrown by `'pg_temp.__'::regclass`
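A simplified, single-type sketch of the catch-and-return-NULL pattern (the real util.try_cast()/util.is_castable() are generic); the relevant part is the added invalid_schema_name branch:
    CREATE FUNCTION try_cast_to_regclass_sketch(value text)
    RETURNS regclass AS $$
    BEGIN
        RETURN value::regclass;
    EXCEPTION
        WHEN undefined_table OR invalid_schema_name THEN
            -- invalid_schema_name: what 'pg_temp.__'::regclass throws (see above)
            RETURN NULL;
    END
    $$ LANGUAGE plpgsql;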
lib/tnrs.py: max_names: increased to 100000 because the dev server can handle more names (no simultaneous users), as decided in the conference call (wiki.vegpath.org/2014-05-29_conference_call#TNRS)
schemas/vegbien.ERD.mwb: regenerated exports
schemas/vegbien.ERD.mwb: re-updated to schemas/vegbien.my.sql, which now recognizes the broken tables. fixed sync issues. vegbien.ERD.mwb is now fully in sync with vegbien.my.sql.
fix: lib/PostgreSQL-MySQL.csv: need to replace "double precision" with "double" to work with MySQL Workbench 5.2.47
schemas/vegbien.ERD.mwb: updated to schemas/vegbien.my.sql. some tables weren't recognized (likely due to bugs in MySQL Workbench 5.2.47), and have been left as-is (unsynced). note that downgrading to 5.2.35 is not an option, because that is fatally broken by a system upgrade.
fix: schemas/vegbien.ERD.mwb: use schemas/vegbien.my.sql instead of schemas/vegbien.my.sql.changes.sql as the sync source
schemas/vegbien.ERD.mwb: switched back to MySQL Workbench 6.1.6 version, which also works with MySQL Workbench 5.2.47
schemas/vegbien.ERD.mwb: restored version for MySQL Workbench 5.2.35 (undid r13549), as 6.1.6 has bugs in the DDL file sync
schemas/vegbien.ERD.mwb: renamed to vegbien.ERD.MySQL_Workbench_6.1.6.mwb to differentiate the files for the different versions of MySQL Workbench
removed no longer used config/VirtualBox_VMs/VegCore ERD/. use the Ubuntu 14.04 VM instead, which now has the VegCore ERD setup.
config/VirtualBox_VMs/Ubuntu 14.04/Ubuntu 14.04.vbox: added VegCore ERD setup
schemas/: svn:ignore *.changes.sql, needed for MySQL Workbench 6.1.6
schemas/vegbien.ERD.mwb: updated
schemas/vegbien.ERD.mwb: updated layout for MySQL Workbench 6.1.6, which uses a different line spacing
lib/tnrs.py: commented out the value of max_names that is not active, for clarity
config/VirtualBox_VMs/vegbiendev/vegbiendev.vbox: updated: merged post-bootloader installation steps into one snapshot now that eth1 addition is successful
config/VirtualBox_VMs/vegbiendev/vegbiendev.vbox: updated; this adds eth1
config/VirtualBox_VMs/Ubuntu 12.04/Ubuntu 12.04.vbox: updated
lib/tnrs.py: sources: updated to match the list/sort order in issue #917
added exports/2014-3-11.Jeff_Ott.climatic_range_determinants.csv.run
schemas/public_.sql: added 2014-3-11.Jeff_Ott.climatic_range_determinants
schemas/public_.sql: analytical_stem_view: added scrubbed_taxon_name_with_author, needed by Jeff Ott's analysis (wiki.vegpath.org/Data_requests)
inputs/.TNRS/schema.sql: taxon_scrub.scrubbed_unique_taxon_name.*: added scrubbed_taxon_name_with_author, needed by Jeff Ott's analysis (wiki.vegpath.org/Data_requests)
schemas/public_.sql: added scrubbed_specific_epithet, scrubbed_species_binomial, which are needed by Jeff Ott's analysis (wiki.vegpath.org/Data_requests)
fix: schemas/public_.sql: sync_analytical_stem_to_view(): removed fkey to source.shortname because this prevents reloading individual datasources
added downloads/
fix: schemas/util.sql: mk_drop_from_create(): also support CREATE queries that include the SELECT statement on the same line as the CREATE
schemas/public_.sql: analytical_stem_view: scrubbed_morphospecies_binomial: use new taxon_scrub.scrubbed_morphospecies_binomial
inputs/.TNRS/schema.sql: taxon_scrub: added scrubbed_morphospecies_binomial, analogous to accepted_morphospecies_binomial for scrubbed_*
inputs/.TNRS/schema.sql: taxon_scrub: documented how to modify it
inputs/.TNRS/schema.sql: added taxon_scrub_modify()
schemas/util.sql: create_if_not_exists(): print a message if the object already exists, so the function doesn't inexplicably appear not to have run at all
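A minimal sketch of the pattern (simplified relative to the real util.create_if_not_exists(); the caught conditions here are just examples):
    CREATE FUNCTION create_if_not_exists_sketch(sql text)
    RETURNS void AS $$
    BEGIN
        EXECUTE sql;
    EXCEPTION
        WHEN duplicate_table OR duplicate_object OR duplicate_column THEN
            RAISE NOTICE 'already exists, skipping: %', sql; -- otherwise the call looks like a silent no-op
    END
    $$ LANGUAGE plpgsql;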
inputs/.TNRS/schema.sql: MatchedTaxon_modify(): use simpler util.recreate_view()
inputs/.TNRS/schema.sql: MatchedTaxon_modify(): documented usage
schemas/util.sql: added recreate_view(), a special case of util.recreate()
fix: schemas/util.sql: recreate(): usage: use `schema` instead of `schemas`
config/VirtualBox_VMs/vegbiendev/vegbiendev.vbox: updated
bugfix: schemas/public_.sql: _plots_20_tnrs_names: verbatim_name_with_author: use taxonverbatim.taxonomicname rather than taxonlabel.taxonomicname
config/VirtualBox_VMs/Ubuntu */*.vbox: updated
bugfix: config/VirtualBox_VMs/: switched from symlinks to hard links, because svn does not follow symlinks
added config/VirtualBox_VMs/, containing the .vbox settings and password.txt (a non-empty password is needed for some system commands)
inputs/.TNRS/schema.sql: MatchedTaxon_modify(): removed no longer needed DROP VIEW statement
schemas/util.sql: recreate(): perform the correct DROP VIEW in the function itself so that the caller does not have to worry about forming it properly
bugfix: schemas/util.sql: mk_drop_from_create(): added `DROP`
schemas/util.sql: added mk_drop_from_create()
schemas/util.sql: added regexp_match()
planning/meetings/BIEN conference call availability.xlsx: updated
fix: schemas/util.sql: force_recreate(): renamed to just recreate(), because "force" normally implies that things will be deleted, which this function does not do