Even if it hasn't, the file is still imported
(into the `files` table), but we exit early to avoid
an error when trying to determine the shooting direction.
This check is not necessary for final P1/11 or gun data.
Fixes #104.
We now check that a file is at least a few seconds old
before attempting to import it.
The actual minimum age can be configured in etc/config.yaml;
otherwise it defaults to 10 seconds.
The idea is that this should give the OS enough time to fully
write the file before we import it.
The age is measured against the file's modification time.
Fixes #92.
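For illustration, a minimal sketch of such an age check in Python; the plumbing around it is an assumption, only the 10-second default and the use of the modification time come from this change:

```python
import os
import time

DEFAULT_MIN_AGE = 10  # seconds; used when etc/config.yaml does not set a value

def old_enough(path, min_age=DEFAULT_MIN_AGE):
    """Return True if `path` was last modified at least `min_age` seconds
    ago, giving the OS time to finish writing the file."""
    age = time.time() - os.path.getmtime(path)
    return age >= min_age

# Usage: defer files that may still be in flight.
# if not old_enough(candidate):
#     continue  # pick the file up again on the next import run
```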
We do this so that we can look for the "saillineOffset"
parameter, which we expect to be present in source
preplot imports and which lets us correlate source
and sail lines.
The change to bin/sps.py is necessary to let the JSON
serialisation take place.
Doing otherwise would result in the gun data file
appearing as having been read, even though no data
was saved as there was nowhere to save it to.
Fixes #29.
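Purely as an illustration of what such a correlation could look like (the "saillineOffset" name comes from this commit, but the arithmetic below is an assumption, not the codebase's actual logic):

```python
import json

def sail_line_for(source_line, preplot_meta_json):
    """Map a source line number to a sail line using the "saillineOffset"
    parameter, assumed here to be a simple numeric offset."""
    meta = json.loads(preplot_meta_json)
    offset = meta.get("saillineOffset")
    if offset is None:
        return None  # no offset recorded; the lines cannot be correlated
    return source_line + int(offset)
```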
Each of the save_* operations starts a transaction
(which is automatically committed if all goes well).
The main reason for this is to ensure that by the
time raw_lines and final_lines events fire, the
corresponding entries in raw_shots and final_shots
have already been populated.
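A minimal sketch of this transaction pattern, assuming PostgreSQL via psycopg2; the column names below are placeholders, not the real schema:

```python
import psycopg2

def save_raw_shots(conn, shots):
    """Insert raw shots in one transaction, so that when the raw_lines
    event fires the raw_shots rows are already visible. Columns are
    placeholders."""
    with conn:  # opens a transaction; commits on success, rolls back on error
        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO raw_shots (sequence, point) VALUES (%s, %s)",
                shots,
            )
```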
Unlike system_imports.py and system_exports.py, which
deal with whole tables via COPY, this allows us to
export / import *either* whole tables or specific
columns only.
The data will be exported to text files containing
the selected columns plus the primary key columns for
the table.
When importing, those tables for which a selection
of columns was exported must already be populated.
The import process will overwrite the data in the
non-primary-key columns it knows about. If whole
tables are exported, on the other hand, rows will be
appended rather than updated on re-import. It is the
user's responsibility to make sure that this does not
cause any conflicts.
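A rough sketch of the column-selective import described above, assuming psycopg2; the table and column handling is generic and the real schema differs:

```python
import psycopg2

def import_columns(conn, table, key_cols, data_cols, rows):
    """Overwrite only the selected non-primary-key columns on rows matched
    by their primary key. Each row holds data values followed by key values."""
    set_clause = ", ".join(f"{c} = %s" for c in data_cols)
    where_clause = " AND ".join(f"{k} = %s" for k in key_cols)
    sql = f"UPDATE {table} SET {set_clause} WHERE {where_clause}"
    with conn, conn.cursor() as cur:
        for row in rows:
            cur.execute(sql, row)
```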
Script meant to be run by runner.sh.
It will not overwrite existing files. If a
sequence is modified after the first export,
the resulting file needs to be removed by the
user before a re-export will occur.
The idea is to eventually export on demand
when a new raw is added to final_lines.
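The no-overwrite behaviour amounts to a skip-if-exists check; a sketch with hypothetical file naming:

```python
import os

def export_sequence(seq_id, out_dir, render):
    """Write the export file only if it does not already exist; the user
    must delete it to force a re-export. Naming here is hypothetical."""
    path = os.path.join(out_dir, f"{seq_id}.txt")
    if os.path.exists(path):
        return False  # already exported; leave the existing file alone
    with open(path, "w") as fh:
        fh.write(render(seq_id))
    return True
```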
When the database is recreated, the sequences
used in the events_timed and events_seq tables
will be at their initial values, which will
almost certainly conflict with existing data
when it is imported via COPY.
With this commit, we set the current value for
those sequences to something usable.
Fixes #33.
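In PostgreSQL terms this is the usual setval idiom; a sketch assuming an `id` serial column on both tables (the column name is an assumption):

```python
import psycopg2

def reset_sequences(conn, tables=("events_timed", "events_seq")):
    """Advance each table's serial sequence past the highest imported id,
    so rows restored via COPY do not collide with new inserts."""
    with conn, conn.cursor() as cur:
        for table in tables:
            cur.execute(
                f"SELECT setval(pg_get_serial_sequence('{table}', 'id'), "
                f"COALESCE((SELECT MAX(id) FROM {table}), 1))"
            )
```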
Provided that the SmartSource headers are being
saved to file, and that the path to those files
is present in the survey configuration, we now
import SmartSource information as metadata in
raw_shots.meta->'smsrc'.
Closes #19.
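A sketch of the kind of write this implies, assuming raw_shots.meta is a PostgreSQL JSONB column and that shots are matched by an id column (both assumptions):

```python
import json
import psycopg2

def save_smsrc_meta(conn, shot_id, headers):
    """Store SmartSource header fields under raw_shots.meta->'smsrc'."""
    with conn, conn.cursor() as cur:
        cur.execute(
            "UPDATE raw_shots "
            "SET meta = jsonb_set(COALESCE(meta, '{}'), '{smsrc}', %s::jsonb) "
            "WHERE id = %s",
            (json.dumps(headers), shot_id),
        )
```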
It exports the data that is entered directly into
Dougal as opposed to being read from an external
source.
As of this commit, not all direct data is exported.
Specifically, sequence comments (raw and final),
sequence and shot NTBA statuses, and shot NTBP
statuses are not exported.
If a file matches the capture globs but not the
file name regexp patterns, these routines will emit
a message to stderr and skip the non-matching file.
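A minimal sketch of that behaviour; the glob and the file name pattern below are placeholders:

```python
import glob
import os
import re
import sys

NAME_PATTERN = re.compile(r"^seq(\d+)\.sps$")  # placeholder pattern

def matching_files(capture_glob):
    """Yield files matching both the glob and the name pattern; warn on
    stderr and skip anything that matches the glob only."""
    for path in glob.glob(capture_glob):
        if not NAME_PATTERN.match(os.path.basename(path)):
            print(f"skipping {path}: file name does not match pattern",
                  file=sys.stderr)
            continue
        yield path
```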
It will not delete any labels that have been removed
from the configuration, as those may still be in use,
but it will add new labels and modify existing ones
if they have changed.
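The add-or-modify (never delete) behaviour maps naturally onto an upsert; a sketch assuming a hypothetical labels(name, colour) table with a unique name:

```python
import psycopg2

def sync_labels(conn, labels):
    """Insert new labels and update changed ones; never delete.
    The labels(name, colour) schema is hypothetical."""
    with conn, conn.cursor() as cur:
        for name, colour in labels:
            cur.execute(
                "INSERT INTO labels (name, colour) VALUES (%s, %s) "
                "ON CONFLICT (name) DO UPDATE SET colour = EXCLUDED.colour",
                (name, colour),
            )
```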
This script runs the deferred imports. It is meant to
be called from a cronjob at regular intervals – every
one or two minutes is probably a good setting.
It checks whether another instance is already running
before doing its thing.
If anything goes wrong (any of the called processes exits
with a non-zero status), it will send an alert to GitLab,
provided that the authorisation key is known.
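A minimal sketch of the single-instance guard; the lock file path, the command being run and the alert helper are all assumptions:

```python
import fcntl
import subprocess
import sys

LOCK_PATH = "/tmp/deferred-imports.lock"  # hypothetical lock file location

def notify_gitlab(returncode):
    """Placeholder for the GitLab alert; the real call would need the
    authorisation key from the configuration."""
    print(f"deferred imports failed with status {returncode}", file=sys.stderr)

def main():
    lock = open(LOCK_PATH, "w")
    try:
        # Non-blocking exclusive lock: raises if another instance holds it.
        fcntl.flock(lock, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except BlockingIOError:
        return  # another instance is already running; do nothing
    result = subprocess.run(["bin/deferred-imports"])  # placeholder command
    if result.returncode != 0:
        notify_gitlab(result.returncode)

if __name__ == "__main__":
    main()
```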