Revision 7395
Added by Aaron Marcuse-Kubitza about 12 years ago
lib/sql_io.py

@@ -342,10 +342,17 @@
     '''Recovers from errors.
     Only works under PostgreSQL (uses INSERT RETURNING).
     
-    Warning: This function's insert/on duplicate select algorithm does not
-    support database triggers that populate fields covered by a unique
-    constraint. Such fields must be populated by the mappings instead.
+    Warning: This function's normalizing algorithm does not support database
+    triggers that populate fields covered by a unique constraint. Such fields
+    must be populated by the mappings instead.
     (Other fields are not affected by this restriction on triggers.)
+    
+    Note that much of the complexity of the normalizing algorithm is due to
+    PostgreSQL (and other DB systems) not having a native command for
+    insert/on duplicate select. This operation is a cross between MySQL's
+    INSERT ON DUPLICATE KEY UPDATE (which does not provide SELECT
+    functionality), and PostgreSQL's INSERT RETURNING (which throws an error
+    instead of returning the existing row).
     @param in_tables The main input table to select from, followed by a list of
         tables to join with it using the main input table's pkey
     @param mapping dict(out_table_col=in_table_col, ...)
sql_io.py: put_table(): Documented that much of the complexity of the normalizing algorithm is due to PostgreSQL not having a native command for insert/on duplicate select
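
The docstring describes insert/on duplicate select as a cross between MySQL's INSERT ON DUPLICATE KEY UPDATE and PostgreSQL's INSERT RETURNING. As a rough illustration of that idea (not put_table()'s actual implementation), the sketch below tries INSERT ... RETURNING inside a savepoint and, on a unique-constraint violation, rolls back to the savepoint and selects the existing row instead. The taxon table, its name column and id pkey, and the insert_or_select() helper are hypothetical; only psycopg2 is assumed.

import psycopg2
import psycopg2.errorcodes

def insert_or_select(conn, name):
    """Return the pkey of the taxon row with this name, inserting the row
    first if it does not already exist. (Hypothetical table/columns.)"""
    with conn.cursor() as cur:
        cur.execute("SAVEPOINT insert_or_select_")
        try:
            # INSERT RETURNING hands back the new row's pkey, but raises an
            # error instead of returning the existing row on a duplicate.
            cur.execute(
                "INSERT INTO taxon (name) VALUES (%s) RETURNING id", (name,))
            (pkey,) = cur.fetchone()
            cur.execute("RELEASE SAVEPOINT insert_or_select_")
            return pkey
        except psycopg2.IntegrityError as e:
            if e.pgcode != psycopg2.errorcodes.UNIQUE_VIOLATION:
                raise
            # Duplicate key: undo the failed INSERT (the transaction is
            # otherwise aborted) and select the row that already exists,
            # which is the "select" half that MySQL's INSERT ON DUPLICATE
            # KEY UPDATE does not provide.
            cur.execute("ROLLBACK TO SAVEPOINT insert_or_select_")
            cur.execute("SELECT id FROM taxon WHERE name = %s", (name,))
            return cur.fetchone()[0]

At the time of this revision PostgreSQL had no native command for this; later releases (9.5+) added INSERT ... ON CONFLICT, which covers part of the pattern but still only returns the existing row when combined with DO UPDATE ... RETURNING.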