2013-01-11 conference call

Upcoming

  • Next meeting on Friday 1/18

To do

Validation

  1. add status column to validation table
    • set to "Complete" if done
  2. send SALVIAS revalidation info to Brad: see SALVIAS validation
  3. for specimens: esp. validate ARIZ, MO, NY, ACAD (Brad), REMIB (Brad), UNCC (Bob), U (Brad): see Brad's comments
    crossed-out items now have an extract sent
  4. for plots: esp. validate SALVIAS, Madidi
  5. Brad will talk to Bob about getting a validation contact for FIA
  6. validate by end of January/early February

Database

  1. geovalidation
    • talk to Jim about using his scripts: they are now in Redmine under the *biengeo* project repository
    • next week, put together implementation plan
  2. ability to refresh individual datasources: see the README under Single datasource import
  3. complete analytical DB by middle of February
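The single-datasource refresh in item 2 could be driven by per-datasource make targets, in the style of the unscrub target mentioned under TNRS below. A dry-run sketch that only prints the commands; the reimport target name and the datasource list are assumptions, and the actual targets are documented in the README under Single datasource import:

```shell
#!/bin/sh
# Dry-run sketch: print the refresh command for each datasource instead of
# running it. The "reimport" target name is hypothetical; see the README
# under "Single datasource import" for the actual target.
for src in MO GBIF SALVIAS; do
    echo "make inputs/${src}/reimport"
done
```

Printing the commands first makes it easy to review which datasources would be touched before kicking off a long-running refresh.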

Attribution

  1. track subprovider metadata, access conditions
  2. for specimens: bien3_adb.ih data
    • acronym, AddWeb, AddEmail, NamOrganisation
  3. for plots: put subprovider metadata in projectcontributor/locationeventcontributor
  4. integrate projectcontributor/locationeventcontributor into analytical DB
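For item 2, the institution-level attribution fields could be pulled directly from bien3_adb.ih. A sketch that prints (rather than runs) such a query; the database name is a placeholder and the exact column capitalization in the DB is an assumption, but the column names are the ones listed above:

```shell
#!/bin/sh
# Print (rather than run) a query for the per-herbarium attribution fields
# listed above. The database name "bien3" is a placeholder; mixed-case
# columns are double-quoted in case they are case-sensitive in Postgres.
query='SELECT acronym, "AddWeb", "AddEmail", "NamOrganisation" FROM bien3_adb.ih;'
echo "psql -d bien3 -c '$query'"
```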

Data refresh

  • GBIF (need data provider rules for attribution)
  • MO (Peter Jorgensen): refresh request sent

TNRS

  1. add taxondetermination for TNRS matched name to make TNRS errors easier to spot
  2. ability to delete just the TNRS taxondeterminations for a datasource: use make inputs/<datasource>/unscrub
  3. separate TNRS operations from DB import operations: see the README under Single datasource import
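The unscrub target in item 2 applies per datasource, so re-scrubbing several sources is just a loop. A dry-run sketch that prints the command for each datasource; the datasource names are examples taken from the validation list above:

```shell
#!/bin/sh
# Dry-run: print the unscrub command (which deletes just the TNRS
# taxondeterminations) for each datasource, rather than running it.
for src in ARIZ SALVIAS; do
    echo "make inputs/${src}/unscrub"
done
```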

Long-term

E-mail from Brad on 2013-01-11:

Basically, we divided the remaining DB development time into "By February" and "After February". By sometime in February, ideally the first half, the new BIEN3 database should be completely ready to go so we can begin re-running the range models. That means:

  • Obtaining all planned refreshes of BIEN2 data
  • Obtaining any planned new data
  • Rebuilding the BIEN3 core database
  • Completing all provider validation of the original data
  • Making any final changes to the schema and scripts required as a result of validation
  • Completing all secondary validations and enhancements for the BIEN3 analytical database
  • Rebuilding the BIEN3 analytical database

After February, there are a number of things we will need to wrap up:

  • Increasing automation of import/export so that others can take over loading and exporting data when you move on to other things
  • Revising the import scripts to allow adding new data and refreshing existing datasources without having to rebuild the entire database
  • Finalizing the exchange schema
  • Publishing the schema
  • Preparing complete documentation of the exchange schema to assist new data providers in sharing data with BIEN
  • Improving documentation and support of data-provider metadata, so we can track data owners, access conditions, and authorship requirements. Basically, I need to ensure that we have enough information in BIEN3 to implement some data access controls within the BIEN website. Also, whenever someone extracts a dataset, we need to be able to pull from the database a complete list of data providers, their contact information, and their conditions of access and authorship requirements
  • Filling in any missing documentation of scripts, etc. (although you've done a great job so far)

Availability

  • Brad's availability uncertain after June