NOTE: this endpoint is redundant with db.project.configuration.get()
except for its ability to return a partial tree.
TODO: merge this with db.project.configuration.get().
Instead of being stored in a YAML file under `etc/surveys/`, the
project configuration now lives in public.projects.meta.
NOTE: as of this commit, the runner scripts (`bin/*.py`) are not
aware of this change and they will keep looking for project info
under `etc/surveys`. This means that projects created directly
in the database will be invisible to Dougal until the runner
scripts are changed accordingly.
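For illustration, the stored configuration can now be read straight
from the database with something along these lines (only
public.projects.meta is given above; the column used to identify a
project is a guess):

  SELECT meta
    FROM public.projects
   WHERE name = 'my-survey';  -- 'name' and 'my-survey' are illustrative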
The old db.project.list() function is now db.project.get()
and the old db.project.get() is now db.project.summary.get().
If a project does not exist, db.project.summary.get() now
throws a 404 rather than a database error.
A configuration item `imports.mounts` is added to
`etc/config.yaml`. This should be a list of paths, each of which
must be non-empty. If any path in the list is empty, runner.sh
will abort.
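For example, `etc/config.yaml` could contain something like this
(assuming `imports.mounts` is a nested key; the paths are made up):

  imports:
    mounts:
      - /data/import_a   # example path; must not be empty
      - /data/import_b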
Closes #200.
When running on bare metal, 127.0.0.1 is a sensible address to
bind to, but that is not the case when running inside a
container, so we add the ability to choose which IP to listen on.
This can be given via the environment variable HTTP_HOST when
starting the server or, if used as a module, as the second
argument of the start(port, host, path) function.
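For example, to listen on all interfaces when running in a container
(assuming the server is launched via npm; the address is illustrative):

  HTTP_HOST=0.0.0.0 npm start

When used as a module, the same effect is obtained by passing the
address as the second argument, e.g. start(port, "0.0.0.0", path).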
The postinstall script will (rightly) return non-zero if the API
docs cannot be built, but this creates a problem when building a
container (Docker) image. In that case, we expect the postinstall
to fail, as the required files (spec/*) will not have been copied
into the image when `npm install` is run.
By adding an explicit OR clause we allow postinstall to end
gracefully whether or not the API docs have been built.
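Schematically, the postinstall command now ends in something like
this (the docs build command is a placeholder, not the actual one):

  build-api-docs || true   # the OR clause makes a docs failure non-fatal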
This event handler checks whether there is a UTC date jump between
consecutive shots. If a jump is detected, it sends two new entries
to the event log: one for the last shot of the previous date and
one for the first shot of the current date.
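A minimal sketch of the check, with made-up names (the real handler's
interface may differ):

  from datetime import timezone

  def check_date_jump(prev_shot, shot, event_log):
      prev_day = prev_shot.time.astimezone(timezone.utc).date()
      curr_day = shot.time.astimezone(timezone.utc).date()
      if curr_day != prev_day:
          # One entry for the last shot of the previous UTC date...
          event_log.add(prev_shot.time, f"Last shot of {prev_day}")
          # ...and one for the first shot of the current UTC date.
          event_log.add(shot.time, f"First shot of {curr_day}")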
Fixes #223.
The idea is to capture incoming real-time data to be able to
replay it later on development systems, e.g., for new development
or troubleshooting.
Issue #230.
The script bin/daily_tasks.py is intended to be run shortly after
midnight every day (e.g., via cron).
At the moment it inserts any missing LDSP / FDSP events. It can
be extended with other tasks as needed either by expanding
Datastore.run_daily_tasks() or by adding to bin/daily_tasks.py.
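A possible crontab entry (the time and path are examples only):

  5 0 * * * $DOUGAL_ROOT/bin/daily_tasks.py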
Fixes #223.
This defines a midnight_shots view and a log_midnight_shots() procedure
(with some overloads). The view returns all points straddling midnight
UTC and belonging to the same sequence (i.e., the last shot of one
day and the first shot of the next).
The procedure inserts the corresponding events into the event log
(optionally constrained by an earliest and a latest date), unless
the events already exist.
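For illustration, the procedure might be invoked like this (the
argument types and order are assumptions based on the description
above):

  -- Insert any missing midnight events:
  CALL log_midnight_shots();
  -- Same, but constrained to a date range:
  CALL log_midnight_shots('2021-06-01', '2021-06-30');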
Related to #223.
This script now works with the new event log.
Fixes #234. Midnight positions can be added via a cron job such
as:
$DOUGAL_ROOT/bin/insert_event.py -t "$(date -I) 00:00:00Z" \
-l Daily -l Prod \
"Midnight position: @DMS@ (@POS@)"
Creating or editing events via the API now calls
replace_placeholders(), making it possible to use
shortcuts to enter some event-related information.
See #229 for details.
This redefines augment_event_data() to use interpolation rather than
nearest neighbour. It now takes an argument indicating the maximum
allowed interpolation timespan. An overload with a default of ten
minutes is also provided as a drop-in replacement for the previous
version.
The ten-minute default is based on Triggerfish headers behaviour seen
on crew 248 during soft starts.
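As a concept sketch only (not the actual implementation), the
time-bounded interpolation rule amounts to:

  from datetime import timedelta

  MAX_SPAN = timedelta(minutes=10)  # default introduced by this commit

  def interpolate(t, before, after, max_span=MAX_SPAN):
      # before/after are (time, value) samples bracketing t.
      (t0, v0), (t1, v1) = before, after
      if t1 - t0 > max_span:
          return None          # gap too wide: refuse to interpolate
      if t1 == t0:
          return v0
      frac = (t - t0) / (t1 - t0)
      return v0 + frac * (v1 - v0)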