Compare commits

592 Commits

Author SHA1 Message Date
D. Berge
37de5ab223 Implement UI for flagging QCs as accepted or unaccepted 2022-05-04 18:21:42 +02:00
D. Berge
d69c6c4150 Add DougalQcAcceptance Vue.js component.
Widget for use in the QC view to show controls for accepting or
unaccepting QCs.
2022-05-04 18:20:28 +02:00
D. Berge
d80f44547b Update API description 2022-05-04 18:13:14 +02:00
D. Berge
6c8515a879 Add QC results accept/unaccept API endpoints 2022-05-04 18:11:05 +02:00
D. Berge
bb9340a0af Add QC results accept/unaccept middleware.
This middleware can only deal with shot QCs, not sequence-wide QCs.
2022-05-04 17:22:18 +02:00
D. Berge
672c14fb67 Add functions to accept/unaccept QCs.
These are only able to deal with shot QCs. At this point, sequence-wide
QCs cannot be marked as accepted.
2022-05-04 17:19:20 +02:00
D. Berge
f4ee798bf0 Implement endpoint for QC deletion.
Closes #217.
2022-05-04 17:15:28 +02:00
D. Berge
c8ef089b28 Log speed value on Hydronav error.
Related to #206.
2022-05-03 23:58:42 +02:00
D. Berge
1f6d560d7e Style log events according to online/offline status.
Strictly speaking, it doesn't consider (or know) what the shooting
status is (but see #214). All it does is colour events differently
if they have all three of: sequence, point and timestamp.

This is probably good enough for the time being to close #134.
2022-05-03 23:42:58 +02:00
D. Berge
f37e07796c Change description of QC test.
It's not an error but only a warning.
2022-05-03 17:27:34 +02:00
D. Berge
349c052db0 Use all sequences to build QC tree.
Fixes #213.
2022-05-03 17:23:50 +02:00
D. Berge
1c291db6c6 Add database upgrade file 18.
* Adds label_in_sequence() function

NOTE: This function is already defined in schema-template.sql but
seemingly never got pushed into production.

Fixes #211.
2022-05-02 13:40:33 +02:00
D. Berge
f46fd4b6bc Cope with non-existing configuration paths.
Fixes #212.
2022-05-02 13:15:41 +02:00
D. Berge
10883eb1a6 Check for invalid speed values in Hydronav header.
Related to #206. If this is indeed what is causing the alerts,
we will change the logic so that it simply logs (or ignores)
invalid speeds rather than throwing.
2022-05-02 13:09:43 +02:00
D. Berge
af6e419aab Run QCs from runner.
When importing an old project, the first QC run could take a while
and cause a bit of backlog, but during normal shooting it is expected
that it will finish quite quickly (and this is monitored anyway).
2022-05-01 21:26:10 +02:00
D. Berge
6516896bae Disable system imports in runner.
They're not really used. Will probably remove at a later date.
2022-05-01 21:24:56 +02:00
D. Berge
c495dce27d Don't show event history widget for guests.
NOTE: guests still do have access to the relevant API endpoint.
In theory, a persistent and computer-literate guest user could
visit the API endpoint directly and retrieve the edit history.
As the edit history may need to be given to users who otherwise
do not have write access, it is considered quite acceptable to
allow guest users to access the endpoint.

Closes #194.
2022-05-01 21:20:52 +02:00
D. Berge
40d96230d2 Adjust planner times from runner.
Fixes #167.
2022-05-01 20:27:19 +02:00
D. Berge
d607b4618a Merge branch '182-periodically-scan-the-events-table-for-missing-information' into 'devel'
Resolve "Periodically scan the events table for missing information"

Closes #182

See merge request wgp/dougal/software!26
2022-05-01 18:24:35 +00:00
D. Berge
fd41d2a6fa Launch database housekeeping tasks from runner 2022-05-01 20:10:27 +02:00
D. Berge
39690c991b Update database templates.
* Add index on public.real_time_inputs.meta->>'tstamp'
* Add public.geometry_from_tstamp()
* Add augment_event_data()
2022-05-01 19:47:16 +02:00
D. Berge
09ead4878f Add database upgrade file 17 2022-05-01 19:46:04 +02:00
D. Berge
588d210f24 Fix reporting for “gun pressures” QC test.
Fixes #205.
2022-04-30 17:37:38 +02:00
D. Berge
28be86e7ff Graphs view: delay “no sequences” message until loaded.
Related to #196.
2022-04-30 16:14:32 +02:00
D. Berge
1eac97cbd0 Change “No fire” QC definition 2022-04-30 16:13:12 +02:00
D. Berge
e3a3bdb153 Clean up whitespace.
Commands used:

find . -type f -name '*.js'| while read FILE; do if echo $FILE |grep -qv node_modules; then sed -ri 's/^\s+$//' "$FILE"; fi; done
find . -type f -name '*.vue'| while read FILE; do if echo $FILE |grep -qv node_modules; then sed -ri 's/^\s+$//' "$FILE"; fi; done
find . -type f -name '*.py'| while read FILE; do if echo $FILE |grep -qv node_modules; then sed -ri 's/^\s+$//' "$FILE"; fi; done
2022-04-29 14:48:21 +02:00
D. Berge
0e534b583c Do not assume that lines have remarks.
Fixes #202.
2022-04-29 14:32:46 +02:00
D. Berge
51480e52ef Recognise "dark", "light" label view attributes.
In a label definition (in etc/surveys/*.yaml) we can now have
"dark" or "light" attributes under "view" to force the label
text to always use either the dark or light theme. This is
useful when a label's colour causes a bad contrast in either
theme.

Example:

  labels:
      Daily:
          view:
              colour: "#EFEBE9"
              description: "Of interest in the daily report"
              light: true # Text always displayed in a dark colour
          model:
              user: true
              multiple: true
2022-04-29 12:18:09 +02:00
D. Berge
187807cfb1 Enable Save button as soon as the remarks are changed.
Closes #199.
2022-04-27 19:45:26 +02:00
D. Berge
d386b97e42 Database upgrade 16: fix event edits.
Fixes #198.
2022-04-27 17:41:53 +02:00
D. Berge
da578d2e50 Fix project_summary view returning unwanted rows.
Fixes #197.
2022-04-27 10:49:46 +02:00
D. Berge
7cf89d48dd Fix whitespace 2022-04-26 17:41:48 +02:00
D. Berge
c0ec8298fa Don't try to show QC graphs on a new project.
If there are no sequences, just show a message to that effect.

Fixes #196.
2022-04-26 17:39:59 +02:00
D. Berge
68322ef562 Fix misleading comment.
Use an EPSG code that is actually in the work area of the Dougal boats.
2022-04-26 17:36:48 +02:00
D. Berge
888228c9a2 Do not crash if a project doesn't have QCs defined.
Fixes #195.
2022-04-26 14:50:34 +02:00
D. Berge
74d6f0b9a0 Accept mime query parameter 2022-04-16 17:18:04 +02:00
D. Berge
cf475ce2df Adapt middleware to new database schema.
As introduced by commit 0c6567d8f8.
2022-04-16 17:18:04 +02:00
D. Berge
26033b2a37 Fix syntax error.
Introduced by commit ead938b40f.
2022-04-13 09:04:52 +02:00
D. Berge
fafd4928d9 Fix Marked call (adapt to new Marked version) 2022-04-13 08:18:21 +02:00
D. Berge
ec38fdb290 Pin package sass version to avoid annoying warning 2022-03-18 20:07:50 +01:00
D. Berge
086172c5e7 Upgrade dependencies.
This is a conservative upgrade.

The upgraded version of leaflet-arrowheads uses optional chaining which
seems to cause webpack to choke, so added to "transpileDependencies" in
vue.config.js.

Closes #189.
2022-03-18 16:29:50 +01:00
D. Berge
3db453a271 Add keys to v-for loops 2022-03-18 16:15:06 +01:00
D. Berge
a5db9c984b Show sequence comments in log page 2022-03-18 15:05:08 +01:00
D. Berge
ead938b40f Inhibit exports.
They don't seem to be used, and for backups it's better to
just back up the whole database instead, which is being done
remotely.
2022-03-18 13:32:43 +01:00
D. Berge
634a7be3f1 Merge branch '184-refactor-qcs' into devel 2022-03-17 20:12:15 +01:00
D. Berge
913606e7f1 Allow forcing QCs.
QCs may be re-run for specific sequences or for a whole
project by defining an environment variable, as follows:

For an entire project:

* DOUGAL_FORCE_QC="project-id"

For specific sequences:

* DOUGAL_FORCE_QC="project-id sequence1 sequence2 … sequenceN"
2022-03-17 20:10:26 +01:00
D. Berge
49b7747ded Remove *all* QC events when saving sequence results.
When saving shot-by-shot results for a sequence,
*all* existing QC events for that sequence will be
removed first.

We do this because otherwise we may end up with QC
data for shots that no longer exist. Also, in the
case that we have QCed based on raw data, QC results
for shots which are not in the final data would stay
around even though those shots are no longer valid.
2022-03-17 20:07:11 +01:00
D. Berge
1fd265cc74 Update dependencies 2022-03-17 20:05:07 +01:00
D. Berge
13389706a9 Merge branch '184-refactor-qcs' into devel 2022-03-17 18:43:38 +01:00
D. Berge
818cd8b070 Add pg-cursor dependency, needed by QCs 2022-03-17 18:43:12 +01:00
D. Berge
a3d3c7aea7 Merge branch '184-refactor-qcs' into devel 2022-03-17 18:37:14 +01:00
D. Berge
a592ab5f6c Use digests rather than timestamps for QC execution.
Using timestamps does not work as we might be
importing files with timestamps older than the
last QC run. Those would not be detected by a
timestamp-based method but would be by this
digest-based approach.

There is a project-wide digest and per-sequence
digests. The former takes the paths and hashes of
all files known to Dougal for this project (the
`files` table), concatenates them and computes
the MD5 checksum. Sequence digests do the same
but include only the files related to that
sequence.
2022-03-17 18:32:09 +01:00
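
A minimal sketch of the project-wide digest described above, assuming a
pg-style client and `path` and `hash` columns on the `files` table
(names are illustrative, not necessarily the actual implementation):

  // Hypothetical: concatenate the path and hash of every known file,
  // then take the MD5 of the result.
  const crypto = require('crypto');

  async function projectDigest (db) {
    const { rows } = await db.query(
      'SELECT path, hash FROM files ORDER BY path');
    const concatenated = rows.map(r => r.path + r.hash).join('');
    return crypto.createHash('md5').update(concatenated).digest('hex');
  }
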
D. Berge
9b571ce34d Merge branch '138-keep-edit-history-of-event-log-entries' into devel 2022-03-16 21:31:38 +01:00
D. Berge
aa2b158088 Remove spurious actions from DB template 2022-03-16 21:30:32 +01:00
D. Berge
0d1f2b207c Apply changes from 38e4e705a4 to DB schema template 2022-03-16 21:29:53 +01:00
D. Berge
38e4e705a4 Modify database upgrade file 12.
Two functions that depended on the `events` view were
changed to work with `event_log` instead.
2022-03-16 21:08:42 +01:00
D. Berge
82d7036860 Merge branch '138-keep-edit-history-of-event-log-entries' into 'devel'
Resolve "Keep edit history of event log entries"

Closes #78, #101, #138, #141, #170, #172, and #181

See merge request wgp/dougal/software!20
2022-03-15 13:25:43 +00:00
D. Berge
0727e7db69 Update database templates to schema v0.3.1 2022-03-15 14:17:28 +01:00
D. Berge
2484b1c473 Merge branch '188-adapt-qc-results-view-to-new-api-endpoints' into 138-keep-edit-history-of-event-log-entries 2022-03-09 21:37:27 +01:00
D. Berge
750beb5c02 Add explicit indication of all tests passed 2022-03-09 21:36:49 +01:00
D. Berge
cd2e7bbd0f Merge branch '184-refactor-qcs' into 138-keep-edit-history-of-event-log-entries 2022-03-09 21:26:40 +01:00
D. Berge
21d5383882 Update QC check definitions 2022-03-09 21:25:47 +01:00
D. Berge
2ec484da41 Fix detection of sequence modification time 2022-03-09 21:25:04 +01:00
D. Berge
648ce9970f Interpolate timestamps for non-existing shotpoints 2022-03-09 21:22:33 +01:00
D. Berge
fd278a5ee6 Add database function: tstamp_interpolate 2022-03-09 21:21:48 +01:00
D. Berge
4f5cce33fc Add comments to database functions 2022-03-09 21:21:01 +01:00
D. Berge
53bb75a2c1 Add new database upgrade file 11.
Some of the things in new upgrade file 12 depend
on the functions defined here.
2022-03-09 19:07:58 +01:00
D. Berge
45595bd64f Rename database upgrades 11‒13 → 12‒14 2022-03-09 19:07:58 +01:00
D. Berge
af4d141c6a Merge branch '184-refactor-qcs' into '138-keep-edit-history-of-event-log-entries'
Resolve "Refactor QCs"

See merge request wgp/dougal/software!22
2022-03-09 17:46:20 +00:00
D. Berge
bef2be10d2 Merge branch '188-adapt-qc-results-view-to-new-api-endpoints' into '184-refactor-qcs'
Resolve "Adapt QC results view to new API endpoints"

See merge request wgp/dougal/software!24
2022-03-09 16:56:35 +00:00
D. Berge
803a08a736 Merge branch '187-create-qc-results-api-endpoints' into '184-refactor-qcs'
Resolve "Create QC results API endpoints"

See merge request wgp/dougal/software!23
2022-03-09 16:55:57 +00:00
D. Berge
c86cbdc493 Refactor QC view to use new API endpoint.
This provides essentially the same user experience as the old
endpoint, with one exception as of this commit:

* The user is not able to “accept” or “unaccept” QC events.
2022-03-09 17:50:55 +01:00
D. Berge
186615d988 Add comments for ease of browsing 2022-03-09 17:43:51 +01:00
D. Berge
666f91de18 Add QC results API endpoint 2022-03-09 17:43:10 +01:00
D. Berge
c8ce786e39 Add API middleware for returning QC results 2022-03-09 17:41:27 +01:00
D. Berge
73cb26551b Add library functions for getting QC results from DB.
We return the QC definitions tree structure, augmented with
a `sequences` attribute which contains `raw_lines` tuples
which are in turn augmented with a `shots` attribute
containing `event_log` tuples. The whole structure looks
something like:

qc_test:
  qc_test:
    sequences:
      - sequence0:
          shots: [sp0, sp1, …]
      - sequence1:
          shots: [sp0, sp1, …]
  qc_test:
    sequences:
      - sequence0:
          shots: [sp0, sp1, …]
  …
2022-03-09 17:35:12 +01:00
D. Berge
d90acb1aeb Add utility to convert QC definitions tree into a flat list 2022-03-09 17:32:23 +01:00
D. Berge
14a2f57c8d Refactor QC execution and results saving.
The results are now saved as follows:

For shot QCs, failing tests result in an event being created in
the event_log table. The text of the event is the QC result message,
while the labels are as set in the QC definition. It is conventionally
expected that these include a `QC` label. The event `meta` contains a
`qc_id` attribute with the ID of the failing QC.

For sequences, failing tests result in a `meta` entry under `qc`, with
the QC ID as the key and the result message as the value.

Finally, the project's `info` table still has a `qc` key, but unlike
with the old code, which stored all the QC results in a huge object
under this key, now only the timestamp of the last time a QC was run on
this project is stored, as `{ "updatedOn": timestamp }`.

The QCs are launched by calling the main() function in /lib/qc/index.js.
This function will first check the timestamp of the files imported into
the project and only run QCs if any of the file timestamps are later
than `info.qc.updatedOn`. Likewise, for each sequence, the timestamps of
the files making up that sequence are checked against
`info.qc.updatedOn` and only those which are newer are actually
processed. This cuts down the running time very considerably.

The logic is now much easier on memory too, as it doesn't load the
whole project into memory at once. Instead, shotpoint QCs are processed
first, and for this a cursor is used, fetching one shotpoint at a
time. Then the sequence QCs are run, also one sequence at a time
(fetched via an individual query touching the `sequences_summary` view,
rather than via a cursor; we reuse some of the lib/db functions here).
For each sequence, all its shotpoints and a list of missing shots are
also fetched (again reusing lib/db functions) and passed to the QC
functions as predefined variables.

The logic of the QC functions is also changed. Now they can return:

* If a QC passes, the function MUST return boolean `true`.

* If a QC fails, the function MAY return a string describing the nature
  of the failure, or in the case of an `iterate: sequence` type test,
  it may return an object with these attributes:

  - `remarks`: a string describing the nature of the failure;
  - `labels`: a set of labels to associate with this failure;
  - `shots`: an object in which each attribute denotes a shotpoint number
    and the value consists of either a string or an object with
    `remarks` (string) and `labels` (array of strings) attributes. This
    allows us to add detail about exactly which shotpoints contribute to
    a sequence-wide test failure (this may not be applicable to every
    sequence-wide QC) and it is also a handy way to detect and insert
    events for missing shots.

* For QCs which may give false positives, such as missing gun data, a
  new QC definition attribute is introduced: if `ignoreAllFailed` is
  boolean `true` and all shots fail the test for a sequence, or all
  sequences fail the test for a prospect, the results of the QC will be
  ignored, as if the test had passed. This is mostly to deal with gun or
  any other data that may be temporarily missing.
2022-03-07 21:41:10 +01:00
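
As an illustration of the return contract above, a minimal sketch of an
`iterate: sequence` style test (the function and variable names are
assumptions, not the actual QC definitions):

  // Hypothetical sequence-wide QC following the contract above.
  function missingShotsQc ({ missingShots }) {
    if (missingShots.length === 0) return true;  // pass: MUST be boolean true

    const shots = {};
    for (const sp of missingShots) {
      shots[sp] = { remarks: `Shotpoint ${sp} missing`,
                    labels: ['QC', 'Missed shot'] };
    }
    return {
      remarks: `${missingShots.length} missing shots in sequence`,
      labels: ['QC'],
      shots  // per-shotpoint detail, turned into event_log entries
    };
  }
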
D. Berge
67f8b9c6dd Bypass permissions check on info.put() if role is null.
The comparison is strict non-equality so a null role cannot
be forced via the API.

We need this so that we can reuse this function to
save QC results, which does not take place over the
API.
2022-03-07 21:20:21 +01:00
D. Berge
d3336c6cf7 Add fetchRow DB function.
Helper function to fetch a row at a time using a cursor.
2022-03-07 21:16:43 +01:00
D. Berge
17bb88faf4 Cope with P1/11s with no S records 2022-03-07 21:08:22 +01:00
D. Berge
a52c7e91f5 Document in runner.sh how to run ASAQC in test mode 2022-03-07 21:07:20 +01:00
D. Berge
8debe60d5c Cope with undefined labels 2022-03-02 19:39:29 +01:00
D. Berge
ee9a33513a Update database README 2022-02-28 21:27:20 +01:00
D. Berge
723c9cc166 Make it possible to repeatedly apply DB upgrade 11.
Even though this makes PostgreSQL 14 a hard dependency.
2022-02-28 21:26:19 +01:00
D. Berge
cb952d37f7 Fix: do not require file that no longer exists 2022-02-28 21:25:00 +01:00
D. Berge
d5fc04795d Make rows dense.
This should probably be turned into an option controlled by the
user.
2022-02-27 19:59:06 +01:00
D. Berge
4e0737335f Add row context menu.
It replaces the `Actions` column in the old table and provides
more actions.

The user can now edit not just the comments and labels but also
the timestamp / shotpoint as requested in #78 (closes #78).

Because events are grouped by timestamp / shotpoint (each row
represents a unique timestamp or shotpoint), the behaviour is
slightly different depending on whether the user clicks on a
row containing a single (editable) event, or on one of multiple
editable events in the same row. Also, rows containing only
read-only events are recognised and no edit actions are
provided for those.
2022-02-27 19:59:06 +01:00
D. Berge
d47c8a9e10 Add (disabled) active row highlighter.
It implements the same functionality as in other tabs
such as sequences, lines, etc., but it is disabled here
because in my opinion it doesn't look too nice.

It will probably be a matter of enabling it at some point
and asking for feedback on user preference.
2022-02-27 19:56:21 +01:00
D. Berge
7ea0105d9f Add popularLabels computed property.
Returns a list of labels used in the current view,
in order of popularity (most used first).

NOTE: this property is not actually used. It's
technically dead code.
2022-02-27 19:56:21 +01:00
D. Berge
8f4bda011b Add dialogue to edit event labels.
This assumes that adding or removing labels is a relatively
common action to perform on an event and provides a quicker
and simpler mechanism than bringing up the full event
dialogue.

This is meant to be invoked from a context menu action or
similar.
2022-02-27 19:56:21 +01:00
D. Berge
48505dbaeb View event history.
When an event has been modified, this control opens a dialogue
where the previous version of the event may be reviewed and if
necessary restored.

Technically, this was the crux of #138. Closes #138.
2022-02-27 19:56:21 +01:00
D. Berge
278c46f975 Adapt events view to new schema 2022-02-27 19:56:21 +01:00
D. Berge
180343754a Remove old event edit dialogue 2022-02-27 19:56:21 +01:00
D. Berge
9aa9ce979b Replace event edit dialogue.
The old <dougal-event-edit-dialog/> gets replaced by
<dougal-event-edit/> which handles the new events schema.
2022-02-27 19:56:21 +01:00
D. Berge
1e5be9c655 Add new event edit dialogue.
Replaces <dougal-event-edit-dialog/>.
2022-02-27 19:56:21 +01:00
D. Berge
0be5dba2b9 Return also labels from <dougal-context-menu/>.
Keeping in mind that the input model is a tree and labels
may be at any level in the tree, not just in the leaves.
2022-02-27 19:56:21 +01:00
D. Berge
0c91e40817 Fix <dougal-context-menu/> default prop value 2022-02-27 19:56:21 +01:00
D. Berge
c1440c7ac8 Simplify <dougal-context-menu/> model 2022-02-27 19:56:21 +01:00
D. Berge
606f18c016 Add Vuex position and timestamp getters for real-time event 2022-02-27 19:56:21 +01:00
D. Berge
febf109cce Update API description 2022-02-27 19:56:21 +01:00
D. Berge
9b700ffb46 Update required database schema 2022-02-27 19:56:21 +01:00
D. Berge
9aca927e49 Update version checking mechanism.
Checks both database schema and API versions.
2022-02-27 19:56:21 +01:00
D. Berge
adaa1a6b8a Add version number to API 2022-02-27 19:56:21 +01:00
D. Berge
8790a797d9 Allow restricting by timestamp or position.
Closes #181.
2022-02-27 19:56:21 +01:00
D. Berge
d7d75f34cd Remove event caching.
That was a horrible kludge and should not be necessary with the
new schema, which is simpler and much faster.
2022-02-27 19:56:21 +01:00
D. Berge
950582a5c6 Refactor event middleware and db code to use new tables 2022-02-27 19:56:21 +01:00
D. Berge
d0da1b005b Add replaceMarkers utility function 2022-02-27 19:56:21 +01:00
D. Berge
1e2c816ef3 Add database upgrade file 13.
Drops the old event tables.

NOTE: consider not applying this patch until confident that
the migration has proceeded smoothly. Dougal can operate just
fine without it.
2022-02-27 19:56:21 +01:00
D. Berge
54b457b4ea Add database upgrade file 12.
Migrates data from old event tables to new.
2022-02-27 19:56:21 +01:00
D. Berge
4d2efd1e04 Move sequence events middleware to a different path.
This is to make room for a new endpoint to retrieve
data for individual events.
2022-02-27 19:56:21 +01:00
D. Berge
920ea83ece Add API endpoint to retrieve a single shotpoint.
This will be used by the new event dialogue in the
frontend to get shotpoint information when creating
or editing events.
2022-02-27 19:56:21 +01:00
D. Berge
d33fe4e936 Add database utilities file.
Intended to contain reusable functions.
2022-02-27 19:56:21 +01:00
D. Berge
c347b873c5 Update database README.
Add information on restoring from backup and troubleshooting
details when migrating PostgreSQL versions.
2022-02-27 19:56:21 +01:00
D. Berge
0c6567d8f8 Add database upgrade file 11 2022-02-27 19:56:12 +01:00
D. Berge
195741a768 Merge branch '173-do-not-use-inodes-as-part-of-a-file-s-fingerprint' into 'devel'
Resolve "Do not use inodes as part of a file's fingerprint"

Closes #173

See merge request wgp/dougal/software!19
2022-02-07 16:08:04 +00:00
D. Berge
0ca44c3861 Add database upgrade file 10.
NOTE: this is the first time we modify the actual data
in the database, as opposed to adding to the schema.
2022-02-07 17:05:19 +01:00
D. Berge
53ed096e1b Modify file hashing function.
We remove the inode from the hash as it is unstable when the
files are on an SMB filesystem, and replace it with an MD5
of the absolute file path.
2022-02-07 17:03:10 +01:00
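
A minimal sketch of the revised fingerprint; the other stat-derived
inputs are an assumption for illustration:

  // Hypothetical: an MD5 of the absolute path replaces the inode,
  // which is unstable on SMB filesystems.
  const crypto = require('crypto');
  const fs = require('fs');
  const path = require('path');

  function fileFingerprint (file) {
    const abs = path.resolve(file);
    const pathHash = crypto.createHash('md5').update(abs).digest('hex');
    const { size, mtimeMs } = fs.statSync(abs);  // assumed other inputs
    return crypto.createHash('md5')
      .update(`${pathHash}:${size}:${mtimeMs}`)
      .digest('hex');
  }
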
D. Berge
75f91a9553 Increment schema wanted version 2022-02-07 17:02:59 +01:00
D. Berge
40b07c9169 Merge branch '175-add-database-versioning-and-migration-mechanism' into 'devel'
Resolve "Add database versioning and migration mechanism"

Closes #175

See merge request wgp/dougal/software!18
2022-02-07 14:43:50 +00:00
D. Berge
36e7b1fe21 Add database upgrade file 09 2022-02-06 23:26:57 +01:00
D. Berge
e7fa74326d Add README to database upgrades directory 2022-02-06 23:24:24 +01:00
D. Berge
83be83e4bd Check database schema compatibility.
The server will not start unless it satisfies itself that we're
running against a compatible database schema.
2022-02-06 22:52:45 +01:00
D. Berge
81ce6346b9 Add database schema information to package.json.
Used to determine if the actual schema on the database
is compatible with the version of the server we're
attempting to run.
2022-02-06 22:51:25 +01:00
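
A minimal sketch of how such a check might look; the package.json field
name and the version table are assumptions, not the actual schema:

  // Hypothetical startup check against the database schema version.
  const { dougal } = require('./package.json');  // e.g. { schemaVersion: '0.3.1' }

  async function checkSchemaCompatible (db) {
    const { rows } = await db.query('SELECT version FROM public.version');
    if (rows[0].version !== dougal.schemaVersion) {
      throw new Error(`Database schema ${rows[0].version} incompatible; ` +
                      `server expects ${dougal.schemaVersion}`);
    }
  }
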
D. Berge
923ff1acea Add more details to package.json 2022-02-06 22:50:44 +01:00
D. Berge
8ec479805a Add version reporting library.
This reports the current server version, from Git by
default.

Also, and of more interest, it reports whether the
current database schema is compatible with the
server code.
2022-02-06 22:48:20 +01:00
D. Berge
f10103d396 Enforce info key access restrictions on the API.
Obviously, those keys can be edited freely at the database
level. This is intentional.
2022-02-06 22:40:53 +01:00
D. Berge
774bde7c00 Reserve certain keys on info tables 2022-02-06 22:39:11 +01:00
D. Berge
b4569c14df Update database README.
Document how to create a Dougal database from scratch
and how to update PostgreSQL.
2022-02-06 22:28:21 +01:00
D. Berge
54eea62e4a Fix require path 2022-02-06 14:24:25 +01:00
D. Berge
69c4f2dd9e Merge branch '161-transfer-files-to-asaqc' into 'devel'
Resolve "Transfer files to ASAQC"

Closes #161

See merge request wgp/dougal/software!16
2021-10-09 09:23:54 +00:00
D. Berge
acc829b978 Switch to production URL in ASAQC configuration 2021-10-06 04:16:17 +02:00
D. Berge
ff4913c0a5 Instrument getLineName to monitor probable cause of #165 2021-10-06 02:12:05 +02:00
D. Berge
51452c978a Add ASAQC task to runner 2021-10-04 21:26:13 +02:00
D. Berge
927ef71ecc Send Ocp-Apim-Subscription-Key with ASAQC requests 2021-10-04 21:00:41 +02:00
D. Berge
14541bcb95 Make code compatible with NodeJS 14 2021-10-04 16:52:04 +02:00
D. Berge
5c190e5554 Add ASAQC queue processor.
This code implements the backend processing side
of the ASAQC queue, i.e., the bit that communicates
with the remote API.

It is expected to run at regular
intervals, e.g., via cron. The entry point is:

lib/www/server/queues/asaqc/index.js

That file is executable and can be run directly
from the shell or within a script. Read the comments
in that file for further instructions.
2021-10-04 02:21:00 +02:00
D. Berge
0f447fc27d Add ASAQC API mock-up.
To be used for testing and debugging. See
index.js for instructions.
2021-10-04 02:21:00 +02:00
D. Berge
dfbccf3bc6 Add ASAQC (test) server details to configuration.
The URL corresponds to that of a built-in test server.

Note that the /etc/ssl directory is protected against
accidental inclusion into the repository by commit
458b6837. The TLS private key should *never* be
committed.
2021-10-04 02:21:00 +02:00
D. Berge
a491530018 Add ASAQC transfer support to client (sequence list) 2021-10-04 02:21:00 +02:00
D. Berge
c7784aa52f Add ASAQC queue endpoints to API 2021-10-04 02:21:00 +02:00
D. Berge
0533314b01 Add DOUGAL_ROOT property to configuration object 2021-10-04 02:21:00 +02:00
D. Berge
8da664a025 Add directory for TLS certificates.
And add it to .gitignore so its contents do not get committed
by accident.
2021-10-04 02:21:00 +02:00
D. Berge
6debf5c355 Add queue-related functions to the database interface.
These functions, in general following the same HTTP-verb
approach as the rest of the database interface, are for
use with both the HTTP API and the queue processor.
2021-10-04 02:21:00 +02:00
D. Berge
db8efce346 Remove dead code 2021-10-04 02:21:00 +02:00
D. Berge
b107c71c6f Add option to get only summary info for a sequence.
Which is faster when we don't need the shotpoint data.
2021-10-04 02:21:00 +02:00
D. Berge
ef12168811 Make it possible to list one specific sequence 2021-10-04 02:21:00 +02:00
D. Berge
e1dc970db4 Add export functions for SeisJSON data.
These functions abstract the creation of SeisJSON payloads
and their various representations as GeoJSON, HTML or PDF.
2021-10-04 02:21:00 +02:00
D. Berge
f2de8509cc Make Babel support logical assignment operators.
That's ||=, &&=, ??=, and the like.
2021-10-04 02:21:00 +02:00
D. Berge
1e6c6ef961 Add throttle() helper.
Useful to avoid repeated updates triggered by
incoming row-level database events.
2021-10-04 02:21:00 +02:00
D. Berge
38e56394d4 Add queue_items to the list of DB events to listen for 2021-10-04 02:21:00 +02:00
D. Berge
374fb7de67 Add database upgrade file 08 2021-10-04 02:21:00 +02:00
D. Berge
978256ceab Describe ASAQC-related API endpoints 2021-10-04 02:21:00 +02:00
D. Berge
5a7fe9b38a Update API version description 2021-10-04 02:21:00 +02:00
D. Berge
83c992c0d9 Fix description of endpoints authorisation 2021-10-04 02:21:00 +02:00
D. Berge
18ee28d72e Describe HTTP 401 responses explicitly 2021-10-04 02:21:00 +02:00
D. Berge
6bc3aff587 Change server names in API description 2021-10-04 02:21:00 +02:00
D. Berge
74b3de5c26 Merge branch '75-quality-control-dashboard' into 'devel'
Resolve "Quality control dashboard" – sequence visualisations

Closes #143, #142, and #150

See merge request wgp/dougal/software!14
2021-10-01 21:17:17 +00:00
D. Berge
57a08c93bc Add link to graphics tab from sequence list 2021-09-28 22:16:12 +02:00
D. Berge
fabc9fe757 Do not make graphs editable 2021-09-28 18:30:26 +02:00
D. Berge
6f32f24481 Add configuration dialog to Graphs.
Lets the user choose which aspects (graphs) should
be visible.
2021-09-28 18:17:38 +02:00
D. Berge
dffe7defbb Add tooltips to Graphs toolbar 2021-09-28 18:16:57 +02:00
D. Berge
b9844528f1 Add graphBar to resizeObserver.
This ensures that it is always the right size when it first
gets displayed.
2021-09-28 18:15:19 +02:00
D. Berge
cd78dbd0d8 Fix typos in resizeObserver 2021-09-28 18:14:39 +02:00
D. Berge
798203be9f Add preferences support to DougalGraphGunsPressure 2021-09-28 18:12:37 +02:00
D. Berge
5bfd7dc835 Add preferences support to DougalGraphGunsDepth 2021-09-28 18:11:43 +02:00
D. Berge
c17862fbbb Add preferences support to DougalGraphGunsTiming 2021-09-28 18:11:04 +02:00
D. Berge
04c0369923 Add preferences support to DougalGraphArraysIJScatter 2021-09-28 18:10:08 +02:00
D. Berge
026cfb6f98 Rename GraphArraysIJScatter to DougalGraphArraysIJScatter 2021-09-28 18:08:48 +02:00
D. Berge
a4e6ec0712 Add support for personalising QC graph settings.
Preferences are read from the store and passed to graph components
via the `settings` prop. Components may change their own settings
by emitting the `update:settings` signal.
2021-09-28 17:59:32 +02:00
D. Berge
b3e052cb12 Add utility function to filter preferences by a prefix 2021-09-28 17:53:07 +02:00
D. Berge
cf88ecf172 Save user preferences to Vuex store.
The user preferences are saved in the browser's localStorage and
read by setCredentials() whenever that function is called. From
that point they are cached in the Vuex store.

Provided that preferences are only modified through the store,
via the saveUserPreference() call, the preferences should always
be in sync between the store and the browser.

The preferences object is a key/value store. Each key is
expected to be in the form of a series of dot-separated prefixes,
e.g., `UserX.RoleY.Graphs.GraphType1.setting0`.

For user preferences, the first two prefix elements should be the
username and role of the user that the setting applies to. These will
be automatically added and stripped by saveUserPreference() and
loadUserPreferences() respectively.
2021-09-28 17:42:49 +02:00
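
A minimal sketch of the prefix handling described above (the store
shape and storage key are assumptions):

  // Hypothetical: prepend username and role before persisting.
  function saveUserPreference (state, key, value) {
    // "Graphs.GraphType1.setting0" → "UserX.RoleY.Graphs.GraphType1.setting0"
    const fullKey = `${state.username}.${state.role}.${key}`;
    const prefs = JSON.parse(localStorage.getItem('preferences') || '{}');
    prefs[fullKey] = value;
    localStorage.setItem('preferences', JSON.stringify(prefs));
    state.preferences[key] = value;  // keep the Vuex copy in sync
  }
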
D. Berge
e267440711 Move comment to right place 2021-09-28 17:30:48 +02:00
D. Berge
454094b187 Refactor gun heatmaps component.
Fixes #150.

Contributes towards the goal of #149, as irrelevant data (such
as for non-firing guns) is no longer shown at all. This affects:

* Firetime (only active array data shown)
* Gun deltas (only active array shown)
* Fill time (only non-active array shown)
2021-09-21 00:32:00 +02:00
D. Berge
862e754a6f Fix labelling of gun mode and detect heatmaps.
Fixes #142.
2021-09-20 00:18:31 +02:00
D. Berge
894877750e Make heatmap hover box more informative.
Closes #143.
2021-09-20 00:17:35 +02:00
D. Berge
09b45d5d65 Swap outlier colours 2021-09-11 21:30:12 +02:00
D. Berge
1352c3b312 Make graph colours consistent for port / starboard elements 2021-09-11 19:19:58 +02:00
D. Berge
30aa2c302e Add graphic aesthetics 2021-09-11 12:38:12 +02:00
D. Berge
3eaa2757b9 Add Graphs tab to navigation bar 2021-09-11 12:19:06 +02:00
D. Berge
6f6af1bbc7 Add graphs/ route to client 2021-09-11 12:19:06 +02:00
D. Berge
019561229c Add Graph component.
It displays a series of data plots.
2021-09-11 12:19:06 +02:00
D. Berge
e212dc8b92 Add unpack helper function to frontend.
Convenience function to extract a key from an
array of objects.
2021-09-11 12:19:06 +02:00
D. Berge
5c00013892 Add graphic library dependencies 2021-09-11 12:19:06 +02:00
D. Berge
1e5bdcc068 Add Vuex functions to load / save user preferences 2021-09-11 12:19:06 +02:00
D. Berge
a280a910f5 Add database upgrade file 07 2021-09-11 12:19:06 +02:00
D. Berge
45fe467a21 Implement sequence/get API endpoint.
It returns data for all individual points in a sequence.
2021-09-11 12:19:06 +02:00
D. Berge
8d3b7adc78 Show azimuths to two decimals in SeisJSON exports 2021-09-04 23:34:53 +02:00
D. Berge
079d3a18b0 Merge branch '131-show-missing-shots-in-sequence-reports' into 'devel'
Resolve "Show missing shots in sequence reports"

Closes #131

See merge request wgp/dougal/software!15
2021-09-04 21:32:44 +00:00
D. Berge
f0b1fc2fe6 Show missed shot events in HTML, PDF exports 2021-09-04 23:29:58 +02:00
D. Berge
987bdf6e21 Add option to export missing shots as SeisJSON events 2021-09-04 23:28:43 +02:00
D. Berge
1d3507b3a4 Export missing shots by default.
Unless explicitly requested by the user by setting the
option `missing` to `false`, a list of missing shotpoints
will be included in the SeisJSON file.
2021-09-04 23:19:25 +02:00
D. Berge
a82fc7bc8a Recover from feed XML parsing error 2021-09-04 02:43:58 +02:00
D. Berge
29b3c9a250 Show azimuth to two decimals elsewhere too.
Related to #126, might as well use two decimals throughout.
2021-09-02 01:18:47 +02:00
D. Berge
040c1ead96 Show azimuth to two decimal places.
In planner report template.

Closes #126.
2021-09-02 01:17:40 +02:00
D. Berge
1c7bed0c15 Fix returning next planned sequence number.
If no sequences have been shot, return 1 instead of null as the
next available sequence number.

Fixes #125.
2021-09-02 01:04:38 +02:00
D. Berge
dfcda1b2d9 Merge branch '103-24-hour-lookahead-planning-report' into 'devel'
Resolve "24-hour lookahead planning report"

Closes #103

See merge request wgp/dougal/software!13
2021-06-21 14:53:35 +00:00
D. Berge
b3aadfc33c Merge branch '60-update-planner-as-sequences-are-shot' into 'devel'
Resolve "Update planner as sequences are shot"

Closes #60

See merge request wgp/dougal/software!12
2021-06-21 14:52:11 +00:00
D. Berge
d5980d9154 Add CSV planner output option 2021-06-19 19:04:05 +02:00
D. Berge
b5f2945c8b Fix end time in plan HTML template 2021-06-19 15:43:04 +02:00
D. Berge
9bbffe2ae0 React to changes in planner remarks 2021-06-19 12:27:36 +02:00
D. Berge
09f60d6c18 Add database upgrade file 06 2021-06-19 12:23:25 +02:00
D. Berge
81d9ea19cc Add adjust_planner() function to DB schema.
It updates the planned line details according to production and the
current time.
2021-06-19 12:18:28 +02:00
D. Berge
497d4d68f9 Call notify on changes to schema's info table 2021-06-19 12:17:26 +02:00
D. Berge
853deca3c3 Rename misnamed trigger 2021-06-19 12:16:37 +02:00
D. Berge
99f1530db3 Replace phone icon in template.
Strangely enough, the emoji icon seems to work reliably across
platforms.
2021-05-31 02:54:38 +02:00
D. Berge
b325ae3452 Let the user know when there are no planner comments 2021-05-31 02:47:20 +02:00
D. Berge
f97d334fe5 Improve the aesthetics of the planner remarks section 2021-05-31 02:41:58 +02:00
D. Berge
cb114f01cd Add GUI support for downloading planner data.
Including HTML and PDF formats, which constitute the lookahead report.
2021-05-31 02:29:50 +02:00
D. Berge
707df76b70 Add GUI support for saving planner remarks.
They get saved to `/project/:project/info/plan/remarks`.
2021-05-31 02:29:50 +02:00
D. Berge
bba050032f Add POST, PUT, DELETE support to /project/:project/info.
It reuses the same backend functions as for the global `/info/` path.
2021-05-31 02:29:50 +02:00
D. Berge
594233c965 Add HTML & PDF planner output options.
Coupled with a suitable Nunjucks template, this is effectively the
24-hour (or whatever period of time) lookahead.
2021-05-31 02:29:50 +02:00
D. Berge
5795c1f87d Add server-side map rendering component.
Based on our own fork of leaflet-headless.
2021-05-31 02:29:50 +02:00
D. Berge
ccd1852f65 Add Nunjucks renderer get filter.
Given an argument consisting of an array of objects and an attribute
name `attr`, it returns an array of all `attr` attributes.
2021-05-31 02:29:50 +02:00
D. Berge
17947df168 Modify Nunjucks renderer timestamp function.
* It accepts a `precision` parameter which truncates the timestamp to a
  given precision. Can be `seconds`, `minutes`, `hours` or `days` / `date`.

* It tries to be more flexible in what it accepts as input.

* It accepts an input of "now" which returns the current timestamp. Can
  be used along with `precision`.
2021-05-31 02:29:50 +02:00
D. Berge
041878096d Accept a mime query parameter to force MIME type 2021-05-31 02:29:50 +02:00
D. Berge
ea3e31058f Refactor the planned lines editing logic.
We move most of the logic from the client (as it was until now) to the
server.

The PATCH command maintains the same format but it should provide only
one of the following keys per request:

* ts0
* ts1
* speed
* fsp
* lsp
* lagAfter
* sequence

Earlier keys in the list above take priority over later ones.

The following keys may be provided by themselves or in combination with
each other (but not with any of the above):

* name
* remarks
* meta

As a special case, an empty string as the `name` value causes the name
to be auto-generated.

See comments in the code `patch.js` for details on the update logic.
2021-05-28 20:30:59 +02:00
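
A hypothetical request following these rules (the URL shape is
illustrative): one timing key per request, never combined:

  // ts1, speed, fsp, … would each need their own request;
  // name / remarks / meta may be combined freely.
  async function setLineStart (projectId, lineId, ts0) {
    await fetch(`/api/project/${projectId}/plan/${lineId}`, {
      method: 'PATCH',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ ts0 })
    });
  }
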
D. Berge
534a54ef75 Add database upgrade file 05 2021-05-28 20:30:59 +02:00
D. Berge
f314536daf Change planned_lines trigger from statement to row.
Because a) it tells us what has changed and b) doesn't fire if we
didn't actually change anything.
2021-05-28 20:30:59 +02:00
D. Berge
de4aa52417 Make planned_lines primary key deferrable.
Helps when we need to renumber sequences.
2021-05-28 20:30:59 +02:00
D. Berge
758b13b189 Add saillines layer to map 2021-05-28 20:30:29 +02:00
D. Berge
967db1dec6 Include NTBA status in preplot GIS output 2021-05-28 20:29:57 +02:00
D. Berge
91fd5e4559 Ensure that timestamp is always a Date object 2021-05-27 17:50:01 +02:00
D. Berge
cf171628cd Fix error in editing of planned line start time 2021-05-27 17:49:32 +02:00
D. Berge
94c29f4723 Change the sunset / sunrise times reported via the tooltip.
The icon still uses the lower edge of the sun to calculate day / night,
but the tooltip shows actual sunrise and sunset times.
2021-05-27 02:08:30 +02:00
D. Berge
14b2e55a2e Remove edit controls from planner for read-only users.
Left over from #108.
2021-05-27 01:32:03 +02:00
D. Berge
c30e54a515 Round vessel speeds to 0.1 kt 2021-05-27 01:09:28 +02:00
D. Berge
7ead826677 Show sunrise / sunset times in the planner.
* A ‘sun’ icon is shown when a line starts and ends in daytime
* A ‘moon’ icon is shown when a line starts and ends in nighttime
* A ‘sun/moon’ icon is shown in other cases

Sunrise and sunset times are provided as a tooltip when hovering over
the icon.

Closes #72.
2021-05-27 01:02:42 +02:00
D. Berge
7aecb514db Clear QC metadata when importing gun data.
Fixes #118.
2021-05-26 00:30:58 +02:00
D. Berge
ad395aa6e4 Include the planned lines table in system dumps 2021-05-26 00:15:09 +02:00
D. Berge
523ec937dd Always merge metadata on import.
The INSERT INTO raw_lines / final_lines will not always be executed as
the lines may already exist (particularly in raw_lines because of
*online*), so whether it worked or not we merge the metadata immediately
afterwards (this may cause an extra notification to be fired).
2021-05-25 03:19:42 +02:00
D. Berge
9d2ccd75dd Do not try to use line name if there isn't one 2021-05-25 03:19:00 +02:00
D. Berge
3985a6226b Suggest ${lineName}-NavLog.${extension} as file name.
This is for the usual case where only one sequence is requested.

When more than one sequence is requested, the suggested name comes out
as ${projectId}-${sequenceList}.${extension}, where `sequenceList` is
the list of sequence numbers separated by semicolons, e.g.:
eq21203-37;38;39.html.

Closes #116.
2021-05-25 02:23:41 +02:00
D. Berge
7d354ffdb6 Add database upgrade file 2021-05-25 02:21:11 +02:00
D. Berge
3d70a460ac Output raw and final lines metadata in summary views 2021-05-25 02:13:50 +02:00
D. Berge
caae656aae Fix event detection failure.
There was a typo in the channel detection logic, resulting
in bogus events full of `undefined` data values.

Fixes #115.
2021-05-24 18:30:53 +02:00
D. Berge
5708ed1a11 Merge branch '57-make-event-log-entries-for-start-and-end-of-line-upon-import-of-final-sequence-if-the-entries-do' into 'devel'
Resolve "Make event log entries for start and end of line upon import of final sequence, if the entries do not already exist"

Closes #57

See merge request wgp/dougal/software!11
2021-05-24 15:44:58 +00:00
D. Berge
ad3998d4c6 Add database upgrade file 2021-05-24 17:41:11 +02:00
D. Berge
8638f42e6d Add database upgrade files.
These files contain the sequence of SQL commands needed to bring
a database or project schema up to date with the latest template
database or project schema.

These files must be applied manually. Check the comments at the top of
each file for instructions.
2021-05-24 17:39:01 +02:00
D. Berge
bc5aef5144 Run post-import functions after final lines.
The reason we need to do it like this instead of relying on a trigger
is that the entry in final_lines is created first and then the
final_shots are populated. If we fire the trigger on final_lines it is
not going to find any shots; if we fire it as a row trigger on
final_shots it would try to label every point in the sequence as it is
imported; finally, if we fire it as a statement trigger on final_shots
we have no idea which sequence was imported.
2021-05-24 16:59:56 +02:00
D. Berge
2b798c3ea3 Ignore attempts to put the same label twice on the same event 2021-05-24 16:59:20 +02:00
D. Berge
4d97784829 Upgrade database project schema template.
Adds:

* label_in_sequence (_sequence integer, _label text):
  Returns events containing the specified label.

* handle_final_line_events (_seq integer, _label text, _column text):
  - If _label does not exist in the events for sequence _seq:
    it adds a new _label label at the shotpoint obtained from
    final_lines_summary[_column].
  - If _label does exist (and hasn't been auto-added by this function
    in a previous run), it will add information about it to the final
    line's metadata.

* final_line_post_import (_seq integer):
  Calls handle_final_line_events() on the given sequence to check
  for FSP, FGSP, LGSP and LSP labels.

* events_seq_labels_single ():
  Trigger function to ensure that labels that have the attribute
  `model.multiple` set to `false` occur at most only once per
  sequence. If a new instance is added to a sequence, the previous
  instance is deleted.

* Trigger on events_seq_labels that calls events_seq_labels_single().

* Trigger on events_timed_labels that calls events_seq_labels_single().
2021-05-24 16:49:39 +02:00
D. Berge
13da38b4cd Make websocket notifications await.
Not sure if this helps much. It might help with avoiding
out of order notifications and reducing the rate at which
the clients get spammed when importing database dumps and
such, but that hasn't been tested.
2021-05-24 15:52:29 +02:00
D. Berge
5af89050fb Refactor SOL/EOL real-time detection handler.
This also implements a generic handler mechanism that can be
reused for other purposes, such as sending email / XMPP notifications,
doing real-time QC checks and so on.

Fixes #113.
2021-05-24 13:48:53 +02:00
D. Berge
d40ceb8343 Refactor list of notification channels into its own file 2021-05-24 13:38:19 +02:00
D. Berge
56d1279584 Allow api action to make arbitrary HTTP(S) requests.
If the URL is an absolute HTTP(S) one, we use it as-is.
2021-05-24 13:35:36 +02:00
D. Berge
d02edb4e76 Force the argument into String prior to splitting 2021-05-24 13:32:03 +02:00
D. Berge
9875ae86f3 Record P1/11 line name in database on import 2021-05-24 13:30:25 +02:00
D. Berge
53f71f7005 Set primary key on events_seq_labels in schema template 2021-05-23 22:27:00 +02:00
D. Berge
5de64e6b45 Add meta column to events view in schema template 2021-05-23 22:26:00 +02:00
D. Berge
67af85eca9 Recognise PENDING status in sequence imports.
If a final sequence file or directory name matches a pattern
which is recognised to indicate a ‘pending acceptance’ status,
the final data (if any exists) for that sequence will be deleted
and a comment added to the effect that the sequence has been
marked as ‘pending’.

To accept the sequence, rename its final file or directory
accordingly.

Note: it is the *final* data that is searched for a matching
pattern, not the raw.

Closes #91.
2021-05-21 15:15:15 +02:00
D. Berge
779b28a331 Add info table to system dumps 2021-05-21 12:18:36 +02:00
D. Berge
b9a4d18ed9 Do not fail if no equipment has been defined.
Fixes #112.
2021-05-20 21:16:39 +02:00
D. Berge
0dc9ac2b3c Merge branch '71-add-equipment-info-to-the-logs' into 'devel'
Resolve "Add equipment info to the logs"

Closes #71

See merge request wgp/dougal/software!10
2021-05-20 19:05:35 +00:00
D. Berge
39d85a692b Use default Nunjucks template if necessary.
If the survey configuration does not itself have a template
we will use the one in etc/defaults/templates/sequence.html.njk.

The template is not likely to be changed all that often, and this
avoids issues when people forget to copy it across to a new
survey, etc.
2021-05-20 20:38:39 +02:00
D. Berge
e7661bfd1c Do not fail if requested object does not exist 2021-05-20 20:38:08 +02:00
D. Berge
1649de6c68 Update default sequence HTML template 2021-05-20 20:37:37 +02:00
D. Berge
1089d1fe75 Add equipment configuration frontend user interface 2021-05-20 18:35:56 +02:00
D. Berge
fc58a4d435 Implement equipment frontend component 2021-05-20 18:35:56 +02:00
D. Berge
c832d8b107 Commit default template for sequences 2021-05-20 18:35:56 +02:00
D. Berge
4a9e61be78 Add unique filter to Nunjucks renderer 2021-05-20 18:35:56 +02:00
D. Berge
8cfd1a7fc9 Export equipment info to Seis+JSON files 2021-05-20 18:35:56 +02:00
D. Berge
315733eec0 Refactor events export middleware.
Uses the `prepare` method for better reusability.
2021-05-20 18:35:56 +02:00
D. Berge
ad422abe94 Add prepare method for Seis+JSON and related exports.
It retrieves the data necessary for a complete Seis+JSON
export, including equipment info.
2021-05-20 18:35:56 +02:00
D. Berge
92210378e1 Listen for and broadcast info notifications 2021-05-20 18:21:01 +02:00
D. Berge
8d3e665206 Expose new API endpoint: /info/:path(*).
Provides CRUD access to values (which may be deeply nested) from the
global `info` table.
2021-05-20 18:19:29 +02:00
D. Berge
4ee65ef284 Implement info/delete middleware 2021-05-20 18:18:26 +02:00
D. Berge
d048a19066 Implement info/put middleware 2021-05-20 18:18:13 +02:00
D. Berge
97ed9bcce4 Implement info/post middleware 2021-05-20 18:17:52 +02:00
D. Berge
316117cb83 Implement info.delete() database method.
It deletes a (possibly deeply nested) element in the
`info` table.
2021-05-20 18:16:26 +02:00
D. Berge
1d38f6526b Implement info.put() database method.
Replaces an existing element with a new one, or inserts it
if there is nothing to replace. The element may be deeply
nested inside a JSON object or array in the `info` table.

Works for both public.info and survey_?.info.
2021-05-20 18:14:43 +02:00
D. Berge
6feb7d49ee Implement info.post() database method.
It adds an element to a JSON array corresponding to a
key in the info table. Errors out if the value is not
an array.
2021-05-20 18:13:15 +02:00
D. Berge
ac51f72180 Ignore empty path parts in info.get() 2021-05-20 18:10:51 +02:00
D. Berge
86d3323869 Remove logging statement 2021-05-20 18:10:27 +02:00
D. Berge
b181e4f424 Let the user set the search path to no survey.
This is so that we can access tables in the `public`
schema which are overloaded by survey tables, as is
the case with `info`.
2021-05-20 18:08:03 +02:00
D. Berge
7917eeeb0b Add table info to schema.
This one is independent of any projects so it goes
into `public`.
2021-05-20 18:07:05 +02:00
D. Berge
b18907fb05 Merge branch '53-mark-points-as-not-to-be-acquired-ntba' into 'devel'
Resolve "Mark points as ‘not to be acquired’ (NTBA)"

Closes #53

See merge request wgp/dougal/software!9
2021-05-17 18:34:46 +00:00
D. Berge
3e1861fcf6 Update API description 2021-05-17 20:30:59 +02:00
D. Berge
820b0c2b91 Add set line complete / incomplete actions.
The following options are shown:

* Set line complete:

If a line has been partially shot and still has points
to be acquired.

This option marks remaining virgin points as NTBA=true.

* Set line incomplete:

If a line has been partially shot and remaining virgin
points have been marked as NTBA.

This option marks all points in the line as NTBA=false.

* Set line NTBA:

If a line has not been (successfully) shot at all, i.e.,
all points on the line are virgin.

This option marks the line itself as NTBA=true.

* Unset line NTBA:

If a line has been marked as NTBA.

This option clears the NTBA flag from the line.
2021-05-17 20:19:53 +02:00
D. Berge
57f4834da8 Add information about virgin and remaining points 2021-05-17 20:19:16 +02:00
D. Berge
08d33e293a React also on preplot point changes, not just lines 2021-05-17 20:18:33 +02:00
D. Berge
8e71b18225 Add complete to line PATCH options.
`complete` is a boolean.

If true, any virgin points remaining on the line
will be marked as `ntba=true`.

If false, *all* points on the line will be marked
as `ntba=false`.
2021-05-17 20:15:34 +02:00
D. Berge
f297458954 Report on virgin points and points to be acquired.
Virgin points are those that have not been acquired
(and processed) at least once.

Points to be acquired are virgin points that do not
have the `ntba` flag set.
2021-05-17 20:13:53 +02:00
D. Berge
eb28648e57 Remove bogus dependency 2021-05-17 17:18:35 +02:00
D. Berge
0c352512b0 Enable the ‘view on map’ log action item. 2021-05-17 17:14:58 +02:00
D. Berge
4d87506720 Show a map marker if position given in URL hash.
If the location URL contains a hash of either:

* #z/x/y
* #x/y

In the first case it will zoom and pan to the location;
in the second case it will only pan while maintaining the
current (or last used) zoom level.

If the location URL does not contain a hash in one of those
formats, the marker will be removed from the map.
2021-05-17 17:14:35 +02:00
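
A minimal sketch of the hash handling (the coordinate order is an
assumption):

  // Hypothetical parser for the #z/x/y and #x/y forms described above.
  function parseMapHash (hash) {
    const parts = hash.replace(/^#/, '').split('/').map(Number);
    if (parts.some(n => !Number.isFinite(n))) return null;  // remove marker
    if (parts.length === 3) {
      const [zoom, x, y] = parts;
      return { zoom, x, y };  // zoom and pan
    }
    if (parts.length === 2) {
      const [x, y] = parts;
      return { x, y };        // pan only, keep current zoom
    }
    return null;
  }
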
D. Berge
20bce40dac Upgrade Vue components 2021-05-17 14:22:26 +02:00
D. Berge
cf79cf86ae Fix ‘this is undefined’ error 2021-05-16 21:38:31 +02:00
D. Berge
8e4f62e5be Reset snack message when hiding.
This is so that the same message will cause the snack
to be shown again.
2021-05-16 19:58:36 +02:00
D. Berge
a8850e5d0c Protect the /project/:project/meta route 2021-05-16 19:58:03 +02:00
D. Berge
b5a762b5e3 Merge branch '108-remove-edit-controls-for-read-only-users' into 'devel'
Resolve "Remove edit controls for read-only users"

Closes #108

See merge request wgp/dougal/software!8
2021-05-16 17:56:35 +00:00
D. Berge
418f1a00b8 Hide edit controls from read-only users 2021-05-16 19:55:31 +02:00
D. Berge
0d9f7ac4ec Add privilege level getters to Vuex.
* writeaccess: true if user can change data.
* adminaccess: true if user is an administrator.
2021-05-16 19:53:24 +02:00
D. Berge
76c9c3ef2a Assign (some) offline navdata to a survey.
There is no concept of ‘current survey’ in Dougal, and
assigning navigation data to a particular survey is full
of edge cases but sometimes it is necessary or at least
convenient to do so.

This commit implements one such strategy, which consists
of checking the distance to the preplots of all active
surveys (well, those that do have preplots anyway) and
picking the nearest one.

To reduce load, we only do this every once in a while as
governed by the `offline_survey_detect_interval` option
in the configuration.

This strategy is only active if the configuration option
`offline_survey_heuristics == "nearest_preplot"` for the
corresponding navigation header.
2021-05-16 03:16:19 +02:00
D. Berge
ef798860cd Add collect filter to template renderer.
This filter can collect attributes from items having the
same key into a single item.

Can be used in templates like this:

{% for Entry in Sequence.Entries |
   collect("ShotPointId", ["EntryType", "Comment"]) %}

to avoid duplicating shotpoint numbers.
2021-05-15 20:07:02 +02:00
D. Berge
e57c362d94 Fix error with timestamp filter (again) 2021-05-15 20:06:36 +02:00
D. Berge
7605b11fdb Fix error with timestamp Nunjucks filter 2021-05-15 18:59:47 +02:00
D. Berge
84e791fc66 Add more sequence information to SeisJSON file 2021-05-15 18:37:32 +02:00
D. Berge
3e2126cc32 Add option to download reports from sequence list.
The context menu includes options to download the sequence
report in different formats.
2021-05-15 17:12:41 +02:00
D. Berge
b0f4559b83 Allow direct downloading of sequence reports.
If the `download` or `d` query parameter is supplied (even
without any value), the response will include a
`Content-Disposition: attachment` header. A filename will
also be suggested.
2021-05-15 17:10:28 +02:00
D. Berge
c7e2e18cc8 Merge branch '84-produce-human-readable-versions-of-json-structured-sequence-data-exports-sse' into 'devel'
Resolve "Produce human-readable versions of JSON structured sequence data exports (SSE)"

Closes #84

See merge request wgp/dougal/software!7
2021-05-15 13:07:07 +00:00
D. Berge
42697fe91d Provide a default replacement for @POS@ markers 2021-05-15 01:57:46 +02:00
D. Berge
900d7f7a3e Ensure that a geometry exists 2021-05-15 01:57:46 +02:00
D. Berge
f1953807db Add position filters to Vue.
Given some text and an item containing a Point geometry,
the `position` filter replaces occurrences of @POS@ or
@POSITION@ with the item's geometry (it has to be lat/lon).

Occurrences of @DMS@ are replaced with the position in
sexagesimal degrees.

This can be used anywhere a Vue filter can. However, we
have used it in the event comments edit dialogue. The positions
are replaced before saving the comment to the database.
2021-05-15 01:57:46 +02:00
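
A minimal sketch of the replacement (the geometry shape and DMS
formatting are assumptions):

  // Hypothetical marker replacement as described above.
  function toDms (deg, pos, neg) {
    const a = Math.abs(deg);
    const d = Math.trunc(a);
    const m = Math.trunc((a - d) * 60);
    const s = ((a - d) * 60 - m) * 60;
    return `${d}°${m}′${s.toFixed(2)}″${deg >= 0 ? pos : neg}`;
  }

  function positionFilter (text, item) {
    const [lon, lat] = item.geometry.coordinates;  // GeoJSON Point assumed
    return text
      .replace(/@POS(ITION)?@/g, `${lat.toFixed(6)}, ${lon.toFixed(6)}`)
      .replace(/@DMS@/g, `${toDms(lat, 'N', 'S')} ${toDms(lon, 'E', 'W')}`);
  }
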
D. Berge
814e071698 Add Markdown support to map tooltips 2021-05-15 01:57:46 +02:00
D. Berge
2aba132220 Add Markdown support to preplot lines comments 2021-05-15 01:57:46 +02:00
D. Berge
15a802227d Add Markdown support to planned lines comments 2021-05-15 01:57:46 +02:00
D. Berge
6745757712 Add Markdown support to log comments 2021-05-15 01:57:45 +02:00
D. Berge
9ff76867c9 Add Markdown support to sequence list comments 2021-05-15 01:57:45 +02:00
D. Berge
e8811560de Add global .markdown class.
It changes textareas to be monospaced.
2021-05-15 01:57:45 +02:00
D. Berge
65b33a6b0f Add Vue Markdown filters.
{{ '**strong** _em_' |markdown }} gives:
<p><strong>strong</strong> <em>em</em></p>

{{ '**strong** _em_' |markdownInline }} gives:
<strong>strong</strong> <em>em</em>
2021-05-15 01:57:45 +02:00
D. Berge
b8b5765b46 Split markdown Nunjucks filter into two new ones.
{{ '**strong** _em_' |markdown }} gives:
<p><strong>strong</strong> <em>em</em></p>

{{ '**strong** _em_' |markdownInline }} gives:
<strong>strong</strong> <em>em</em>
2021-05-15 01:57:45 +02:00
D. Berge
53f4e167f8 Update ‘marked’ version on server 2021-05-15 01:57:45 +02:00
D. Berge
3d8f524d4a Expose PDF output option in user interface 2021-05-15 01:57:45 +02:00
D. Berge
1e68676ac6 Add PDF output option for events log 2021-05-15 01:57:45 +02:00
D. Berge
2c2d594877 Add Selenium webdriver to backend.
Used for generating PDFs via a Firefox instance.
2021-05-15 01:57:45 +02:00
D. Berge
fae849aeab Send specific error message if HTML template not found 2021-05-15 01:57:45 +02:00
D. Berge
1d47495799 Adapt log view controls to small viewports 2021-05-15 01:57:45 +02:00
D. Berge
592632d669 Add timestamp filter to renderer 2021-05-15 01:57:45 +02:00
D. Berge
26c05b9e3c Add Markdown support to template renderer 2021-05-15 01:57:45 +02:00
D. Berge
3f9a40724d Add download menu to sequence logs.
The menu lets the user retrieve a sequence's events
in a variety of formats:

* JSON
* Seis+JSON
* GeoJSON
* HTML
2021-05-15 01:57:45 +02:00
D. Berge
a652a08815 Add GET endpoint for sequence events.
Provides a variety of formats:

* JSON
* Seis+JSON
* GeoJSON
* HTML
2021-05-15 01:57:45 +02:00
D. Berge
61ffd1b766 Refactor the function producing Seis+JSON into its own file.
For reuse.
2021-05-15 01:57:45 +02:00
D. Berge
d9f4583224 Implement GET middleware for events.
Produces a choice of outputs: JSON, GeoJSON, Seis+JSON and HTML.
2021-05-15 01:57:45 +02:00
D. Berge
95647337aa Add Nunjucks renderer.
The render function takes a JSON file and a Nunjucks
template and outputs a rendered version of the JSON
data according to the template.
2021-05-15 01:57:45 +02:00
D. Berge
b1e152179e Add new command: insert_event.py
Used to insert a timed event in the log.
2021-05-15 01:56:49 +02:00
D. Berge
142a820ed7 Process comment markers server-side.
Replace @POS@, @POSITION@ and @DMS@ in the remarks
with the event's position (sexagesimal degrees for
the last one).
2021-05-15 01:54:07 +02:00
D. Berge
838b45ef26 Do not fail if some data files are missing 2021-05-15 01:51:55 +02:00
D. Berge
30914b267a Set the right Content-Type for error outputs 2021-05-13 21:48:46 +02:00
D. Berge
f1cbbdb56b Check if raw P1/11 has records.
Even if it hasn't, the file gets imported anyway
(into the `files` table) but we exit early to avoid
an error when trying to determine shooting direction.

This check is not necessary for final P1/11 or gun data.

Fixes #104.
2021-05-12 20:35:51 +02:00
D. Berge
9973e8f132 Merge branch '94-let-users-assign-a-colour-to-preplot-lines' into 'devel'
Resolve "Let users assign a colour to preplot lines"

Closes #94

See merge request wgp/dougal/software!6
2021-05-09 22:41:51 +00:00
D. Berge
f53c479262 Add option to assign colours to preplot lines.
A ‘Set colour…’ option is available from the context menu;
it presents a dialogue allowing the user to choose a colour
that will be assigned to that preplot line and used as the
background colour for the corresponding row on the table
(may also be used for other things).

Because there is a good chance that the user may decide to
colour a large number of lines and it is cumbersome to do
it one at a time, a multiple selection option has also been
added. The context menu then shows options which will apply
to all selected rows. At this time only the change colour
option is available, but it can be extended easily.
2021-05-10 00:22:57 +02:00
D. Berge
73a415a038 Return preplot metadata via the API 2021-05-10 00:21:56 +02:00
D. Berge
0b24e3224f Let calendar toolbar follow theme.
Fixes #80.
2021-05-09 21:23:34 +02:00
D. Berge
c271256015 Remember map view settings.
Save the layer and overlay selection + map zoom and
position per user per project in the browser's
localStorage.

Closes #96.
2021-05-09 15:29:17 +02:00
D. Berge
4887ddaa26 Do not update page location on map change.
Fixes #77.
See also #96.
2021-05-09 03:52:25 +02:00
D. Berge
788c582f98 Show planned sequences in status field of preplots table.
Closes #86.
2021-05-09 00:20:02 +02:00
D. Berge
df9f7f33cf Retrieve user data for LineList.
Fixes a bug in commit fd2e0399f8.
2021-05-09 00:16:08 +02:00
D. Berge
fd2e0399f8 Remember last applied number of table rows.
A hopefully sensible default is applied, but if the
user changes it, the last selected value is saved
in the browser's localStorage.

Preferences are saved per user, project and table. And
per browser, of course, as those are only saved locally.

Closes #41.
2021-05-08 21:54:55 +02:00
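A sketch of the persistence scheme described; the key layout and the
default value are assumptions.

// Per user, project and table -- and per browser, since localStorage
// is local by definition. The key layout is illustrative.
const key = `rows:${user}:${project}:${table}`;
const rowsPerPage = Number(localStorage.getItem(key)) || 25; // assumed default

function onRowsPerPageChange (n) {
  localStorage.setItem(key, String(n));
}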
D. Berge
db733ceef8 Add key to feed items 2021-05-08 21:54:06 +02:00
D. Berge
f905eb3fdf Upgrade Vuetify to latest version 2021-05-08 21:53:31 +02:00
D. Berge
e707887702 Change colour of planned lines on map.
As magenta is already used for the real-time track.
2021-05-08 20:36:19 +02:00
D. Berge
c0ace1fe07 Make check mark green if non-leaf QC item has no children.
If a test passes for all items, show the (single) check mark
and colour it green.

Leaf nodes always have their check mark in the default colour.

Related to #90.
2021-05-08 04:08:37 +02:00
D. Berge
7bb3a3910b Show development activity log.
A button in the help dialogue takes the user to the
/feed/… frontend URL, where the latest development
activity is shown, taken from the GitLab RSS feed
for the project.
2021-05-08 00:46:31 +02:00
D. Berge
983113b6cc Add flag to api action to fetch non-JSON data.
If {text:true} or another truthy value is passed as the
`text` option, the api action will use Response.text()
instead of Response.json().
2021-05-08 00:44:05 +02:00
D. Berge
ff66c9a88d Handle planner sequence value for first line in prospect.
The next sequence to shoot is normally retrieved from the
database via getSequence(), but it returns false if no
sequences have been shot yet.

In that case we use a default value of `1` to build the
name of the planned line.

Fixes #81
Fixes #82
2021-05-08 00:20:15 +02:00
D. Berge
56d30d48c5 Adapt help dialogue to small viewports 2021-05-07 23:52:36 +02:00
D. Berge
df3a0b4c50 Be explicit about what type of data is being QC'ed.
The source deviation QCs now tell the user whether raw
or final data is being QC'ed.
2021-05-07 21:29:39 +02:00
D. Berge
f87aa08246 Check if gun data missing for entire line.
The `sequences` object now carries the attribute
`has_smsrc_data`, a boolean which is true iff
there is at least one `smsrc` record in the raw
shots metadata.

This is used by:

1. A new sequence-wise test which reports if gun
   data is missing for the entire sequence.

2. The individual `missing_gun_data` test which
   is inhibited if `has_smsrc_data` for the
   corresponding sequence is false.

Closes #93.
2021-05-07 14:04:48 +02:00
D. Berge
ea499a645b Update package dependencies 2021-05-07 14:04:12 +02:00
D. Berge
0fdb42c593 Do not import files that have just been modified.
We now check that a file is at least a few seconds old
before attempting to import it.

The actual minimum age can be configured in etc/config.yaml or
else it defaults to 10 seconds.

The idea is that this should give the OS enough time to fully
write the file before we import it.

The timestamp being looked at is the modification time.

Fixes #92.
2021-05-07 13:50:32 +02:00
D. Berge
6e5584a433 Make the QC double-tick green if all items accepted.
Closes #90.
2021-05-07 13:38:26 +02:00
D. Berge
0a4df0793d Update package dependencies 2021-05-07 13:37:44 +02:00
D. Berge
1e6cc67b05 Merge branch '63-serve-api-specification' into 'devel'
Resolve "Serve API specification"

Closes #63

See merge request wgp/dougal/software!5
2020-12-30 08:45:55 +00:00
D. Berge
3c4a558e02 Serve OpenAPI document on API root.
When a client makes a request for `/` (the root of
the API), the OpenAPI description is served in an
appropriate format according to the `Accept` request
header, as follows:

Accept: text/html => HTML version
Accept: application/json => JSON version
Accept: * => YAML version
2020-12-29 16:20:57 +01:00
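A sketch of that negotiation in Express terms; the file and variable
names are assumptions.

// Serve the OpenAPI description in the format the client asked for.
router.get('/', (req, res) => {
  const accept = req.headers.accept || '';
  if (accept.includes('text/html')) {
    return res.sendFile('openapi.html', { root: specDir });
  }
  if (accept.includes('application/json')) {
    return res.json(specAsJson);
  }
  // Accept: * (or anything else) -> YAML version
  res.type('text/yaml').send(specAsYaml);
});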
D. Berge
76001cffe1 Create HTML version of OpenAPI doc on install.
When running `npm install`, a self-contained HTML document
with the contents of the OpenAPI specification is saved as
openapi.html in the same directory as openapi.yaml.
2020-12-29 16:18:53 +01:00
D. Berge
45a9c5aa07 Document login and logout endpoints 2020-10-23 17:28:41 +02:00
D. Berge
f926184471 Add label descriptions to API spec 2020-10-23 15:14:52 +02:00
D. Berge
5ffd3712cf Merge branch '61-user-authentication' into devel 2020-10-23 15:14:09 +02:00
D. Berge
80451796e1 Convert expiry time to milliseconds for set-cookie 2020-10-23 14:59:45 +02:00
D. Berge
141d5805ae Reissue user login tokens when close to expiring 2020-10-23 14:50:35 +02:00
D. Berge
250ffe243d Fix JWT token time to live.
Now half an hour.
2020-10-23 14:49:52 +02:00
D. Berge
b4decd018a Add API documentation 2020-10-23 11:09:08 +02:00
D. Berge
46d489c91f Fix metadata retrieval from preplots 2020-10-23 11:01:38 +02:00
D. Berge
8a0bcc5cb4 Change HTTP response status from 201 to 204 2020-10-23 11:00:56 +02:00
D. Berge
77258b12e9 Merge branch '62-service-desk-from-ss-om-magseisfairfield-com-bug-report' into 'devel'
Resolve "Service Desk (from ss.om@magseisfairfield.com): Bug report"

Closes #62

See merge request wgp/dougal/software!4
2020-10-15 17:20:30 +00:00
D. Berge
6896d8bc87 Change sequence renumbering behaviour.
By default, change just the number of the sequence
being edited. It is checked for conflict with other
planned sequences but not with anything already acquired.

If the user ticks the ‘shift all’ checkbox, then all
planned sequences are shifted by the same amount.
2020-10-15 19:07:27 +02:00
D. Berge
80b463fbb7 Change default sequence assignment for planned lines.
If there are other lines in the planner, we increment the
highest numbered sequence in the planner by one.

If there are no planned lines, we take the highest numbered
raw sequence and increment by one.
2020-10-15 19:05:14 +02:00
D. Berge
59aaacbeee Apply access restrictions to writable routes 2020-10-12 19:43:07 +02:00
D. Berge
3c86981dc6 Add authorisation middleware.
Defines three levels of access:
* read: anyone who is logged in
* write: `user` and `admin` roles
* admin: `admin` roles
2020-10-12 19:42:02 +02:00
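A sketch of guards implementing those three levels; req.user is assumed
to have been set by the authentication middleware, and the role names
follow the list above.

const roles = {
  write: ['user', 'admin'],
  admin: ['admin']
};

function requires (level) {
  return (req, res, next) => {
    if (!req.user) return res.status(401).end(); // not logged in
    if (level === 'read') return next();         // read: anyone logged in
    if (roles[level].includes(req.user.role)) return next();
    res.status(403).end();
  };
}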
D. Berge
5594b6863c Do not run authentication if headers already sent 2020-10-12 19:41:00 +02:00
D. Berge
7201c29df5 Inject auth middleware after login routes.
Routes not requiring authentication must,
self-evidently, go before the authentication
middleware.
2020-10-11 22:11:36 +02:00
D. Berge
947736e8c1 Check code rather than errno.
Different versions of that library work
differently.
2020-10-11 22:10:21 +02:00
D. Berge
d782a30e90 Avoid decoding empty cookies 2020-10-11 19:59:28 +02:00
D. Berge
987dbb7700 Handle null/invalid cookies 2020-10-11 19:36:11 +02:00
D. Berge
cdd007ce88 Fix authentication middleware 2020-10-11 19:08:36 +02:00
D. Berge
a38066ec82 Set cookie / user to null if failing to decode JWT 2020-10-11 19:06:57 +02:00
D. Berge
2aca34e488 Read user login info from discrete file.
`$DOUGAL_ROOT/etc/users.yaml` to be exact.
2020-10-11 18:21:19 +02:00
D. Berge
324306a77d Remove logging statement 2020-10-11 18:20:41 +02:00
D. Berge
ab8a66bdcf Set JWT default options 2020-10-11 17:58:41 +02:00
D. Berge
b3f393a6f1 Make navigation bar user control functional.
Shows whether the user is logged in and presents
appropriate options according to whether this is
a manual or automatic login (a manual login is
when the user explicitly logs in with a user name
and password).
2020-10-11 17:57:00 +02:00
D. Berge
1ee886db63 Add login/logout views to frontend 2020-10-11 17:56:32 +02:00
D. Berge
fc9450434c Read credentials from cookie store when loading app 2020-10-11 17:55:17 +02:00
D. Berge
00f4fcf292 Read credentials from API responses 2020-10-11 17:54:34 +02:00
D. Berge
0512ac2c3c Add user module to Vuex store 2020-10-11 17:53:39 +02:00
D. Berge
dd32982cbe Add login/logout middleware 2020-10-11 17:52:13 +02:00
D. Berge
a3bfb73937 Add authentication middleware.
The user is authenticated by one of the following
methods, in order of priority:

* The presence of a valid JWT.
* Its IP.
* Its hostname.

In the case of the latter two methods, if authentication
is successful a JWT valid for 15 minutes will be generated
and passed back to the user in a cookie.
2020-10-11 13:11:43 +02:00
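A sketch of that priority chain; verifyJwt(), ipUsers and hostnameUser()
are illustrative stand-ins for the actual lookups.

async function authenticate (req, res, next) {
  let user = verifyJwt(req.cookies.token);       // 1. a valid JWT wins
  if (!user) user = ipUsers[req.ip];             // 2. else try the client IP
  if (!user) user = await hostnameUser(req.ip);  // 3. else its hostname
  if (user && !req.cookies.token) {
    // IP / hostname logins get a 15-minute JWT back in a cookie.
    const token = jwt.sign({ sub: user.name }, SECRET, { expiresIn: '15m' });
    res.cookie('token', token);
  }
  req.user = user || null;
  next();
}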
D. Berge
e0cd52f21a Replace favicon 2020-10-11 12:17:40 +02:00
D. Berge
d902806c32 Replace logo 2020-10-11 12:08:00 +02:00
D. Berge
f3e171264c Fix development websocket URL 2020-10-09 18:10:14 +02:00
D. Berge
6e7016e2ac Merge branch '59-planner' into 'devel'
Resolve "Planner"

Closes #59

See merge request wgp/dougal/software!2
2020-10-09 15:58:06 +00:00
D. Berge
c0e25ac36f Allow editing fsp/lsp in planner.
This is a very basic implementation and doesn't
check that the points are indeed valid.

A proper solution is to request the list of preplots
for that line from the server and validate against those.
2020-10-09 15:09:43 +02:00
D. Berge
2031922d68 Update line names when renumbering sequences 2020-10-09 15:04:45 +02:00
D. Berge
aeae758744 Let patch regenerate line name.
If the user sends a patch to a line with
a `name` which is exactly `null` or the
empty string `""`, the server will regenerate
the name based on the defaults script.
2020-10-09 15:02:31 +02:00
D. Berge
b7f65c4f78 Fix transaction handling 2020-10-09 15:01:42 +02:00
D. Berge
eb582863bb Move auxiliary functions to a separate file.
So they can be reused.
2020-10-09 15:00:01 +02:00
D. Berge
5415e81334 Avoid conflict when decrementing sequence numbers 2020-10-09 14:06:38 +02:00
D. Berge
9fd48c6a5a Show planned lines on map 2020-10-09 13:59:59 +02:00
D. Berge
851a076c06 Implement editing of most planned sequence details 2020-10-09 13:59:11 +02:00
D. Berge
72922560d2 Add GeoJSON output of planned lines 2020-10-09 13:58:11 +02:00
D. Berge
60ff4f57b1 Allow patching planned sequence name 2020-10-09 13:57:12 +02:00
D. Berge
22b7aa5112 Use view instead of ad-hoc query 2020-10-09 13:56:29 +02:00
D. Berge
ba8eeb82d3 Listen for planner events 2020-10-09 13:55:34 +02:00
D. Berge
6e7ba82ed3 Add planner elements to DB schema 2020-10-09 13:54:45 +02:00
D. Berge
6f521d1968 Add explanatory text to planner's no-data message 2020-10-08 17:06:04 +02:00
D. Berge
bc54d4ad59 Add option to add sequence to planner as reshoot 2020-10-08 16:40:52 +02:00
D. Berge
c4915e43d7 Add option to append line to planner 2020-10-08 16:40:52 +02:00
D. Berge
a8fa238e68 Add planner component to site 2020-10-08 16:40:52 +02:00
D. Berge
d86a5a2feb Add planner frontend component 2020-10-08 16:40:52 +02:00
D. Berge
63254a6bf7 Add planner endpoints 2020-10-08 16:40:52 +02:00
D. Berge
2a19caf219 Fix SQL error in QC shots selection query 2020-10-06 19:58:55 +02:00
D. Berge
6d427c4b1a Update version in package-lock 2020-10-06 19:48:59 +02:00
D. Berge
eb6329e6f7 Catch DB connection errors.
If we can't connect straight away (either first time
or after a disconnection), keep retrying until we
manage.
2020-10-06 19:44:07 +02:00
D. Berge
2486cb3944 Monitor for disconnection from DB.
The events listener now listens to the 'end' event from
the PostgreSQL driver and will attempt to reconnect if
we get disconnected.
2020-10-06 19:22:56 +02:00
D. Berge
739cf4b9ec Add missing triggers in preplot, raw and final tables 2020-10-06 19:19:35 +02:00
D. Berge
dd9be0ea82 Do not QC online data.
It's kind of pointless at the moment (we will probably
want a separate QC for online data) and it may cause shots
to be, at least temporarily, flagged as having missing
gun data.

Closes #29.
2020-10-06 18:34:33 +02:00
D. Berge
83d966c4b7 Do not import gun data into online shots.
These shots will be deleted when the raw P1 is
imported, deleting the gun data with them, but the
gun data file will still show as having been imported.

(#29)
2020-10-06 18:33:59 +02:00
D. Berge
efc1711158 Fix previous fix to preplot azimuth calculation.
Closes #58.
2020-10-06 18:29:59 +02:00
D. Berge
4b7f544e28 Order preplot points according to incr flag.
Fixes #58.
2020-10-06 16:39:07 +02:00
D. Berge
0c1fde09c6 Update dependencies 2020-10-04 20:17:49 +02:00
D. Berge
d17a2ce463 Make EPSG selection subquery more specific.
More of a kludge than a fix. See #56 for
a cleaner solution.

Closes #55.
2020-10-04 19:27:16 +02:00
D. Berge
afa34867f5 Modify functions accessing file_data to return jsonb.
Following schema change in 2d270cdef9.
2020-10-04 05:13:17 +02:00
D. Berge
3ed5558490 Fix query to account for new column type.
The type of the file_data.data column was
changed from JSON to JSONB by commit
2d270cdef9.

Fixes #54.
2020-10-04 04:29:14 +02:00
D. Berge
3de7d5d334 Re-export schema template.
No changes, it merely shuffles the position of raw_lines
in the export.
2020-10-04 04:07:50 +02:00
D. Berge
3c215e9973 Fix error with definition of missing_final_points view 2020-10-04 04:06:15 +02:00
D. Berge
35a6a9188a Add missing shots to structured sequence export files.
Closes #8, #9, #12.
2020-10-04 03:51:30 +02:00
D. Berge
963f75fd51 Add exporting of missing shots.
It is not enabled by default; the user should pass
`missing=t` (or some other truthy value) in the query
part of the request.
2020-10-04 03:49:04 +02:00
D. Berge
d3d535a8be Add missing option to sequence.list DB method.
If present and truthy, it will cause the output
to contain two extra fields: missing_raw and
missing_final, each consisting of a JSON array
containing missing raw and final shots for the
corresponding sequence.

In the event that the option was passed but
there are no missing shots, the two aforementioned
fields will still be present and consist of
empty arrays.

Note that this makes the query significantly
slower.
2020-10-04 03:48:15 +02:00
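An illustrative request against the corresponding endpoint; the URL shape
follows the other API examples in this history, and any truthy value
works for `missing`.

const res = await fetch(`/api/project/${projectId}/sequence?missing=t`);
const sequences = await res.json();
// Each element now also carries missing_raw and missing_final arrays,
// empty if the sequence has no missing shots.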
D. Berge
700e683022 Import preplot file configuration into database.
We do this so that we can look for the "saillineOffset"
parameter, which we expect to be present in source
preplot imports and allows us to correlate source
and sail lines.

The change to bin/sps.py is necessary to let the JSON
serialisation take place.
2020-10-04 03:42:19 +02:00
D. Berge
0a684cd02a Add option to import file data for preplot files 2020-10-04 03:41:20 +02:00
D. Berge
947bf72260 Rename variable.
To be consistent with the other methods.
2020-10-04 03:39:58 +02:00
D. Berge
362d9dc1a5 Re-export schema template.
No changes, it merely shuffles the position of raw_lines
in the export.
2020-10-04 03:37:20 +02:00
D. Berge
1249c976ef Add views related to missing shots and outstanding production 2020-10-04 03:33:23 +02:00
D. Berge
2d270cdef9 Change file_data column type from JSON to JSONB 2020-10-04 01:22:42 +02:00
D. Berge
a101542bc2 Do not import gun data if sequence has no shots.
Doing otherwise will result in the gun data file
appearing as having been read, but no data will
have been saved as there was nowhere to save to.

Fixes #29.
2020-10-03 00:44:55 +02:00
D. Berge
39256a4917 Fix copy/paste error in Log view 2020-10-03 00:36:53 +02:00
D. Berge
2ca83f9a60 Remove debugging statements 2020-10-02 21:21:58 +02:00
D. Berge
2f7315f133 Mark or unmark entire lines as NTBA.
Closes #52.
2020-10-02 20:40:13 +02:00
D. Berge
632a056f98 Include NTBA status in line endpoint response 2020-10-02 20:39:48 +02:00
D. Berge
c013073104 Save deferred import data as a single transaction.
Each of the save_* operations starts a transaction
(which is automatically committed if all goes well).

The main reason for this is to ensure that by the
time raw_lines and final_lines events fire, the
corresponding entries in raw_shots and final_shots
have already been populated.
2020-10-02 19:31:14 +02:00
D. Berge
de2deedfd2 Do not refresh sequence list on raw_shots event.
Otherwise, it will get refreshed continuously while
online, which is not great.
2020-10-02 19:30:24 +02:00
D. Berge
eb37a6b6c6 Listen also to shot events.
We need this because the *_lines event will
always fire before any shots have been imported.
2020-10-02 18:35:05 +02:00
D. Berge
198d0072d4 Show sequence remarks + NTBP status on map.
Closes #50.
2020-10-02 17:58:44 +02:00
D. Berge
11a5020004 Remove old QCs when importing final shots.
When final shots are inserted, updated or
deleted, the corresponding QC info (which
is always held in raw_shots.meta->'qc')
is deleted.

If applicable, it will be recreated on the
next QC run.
2020-10-02 17:30:42 +02:00
D. Berge
e8c230ccc2 React to sequence change notifications in SequenceList 2020-10-02 16:33:06 +02:00
D. Berge
cf1678ed25 Increase body-parser limits.
Fixes #51.
2020-10-02 15:28:43 +02:00
D. Berge
940aea8c7b Run events manager in a separate process.
Not sure if this is going to do anything in
terms of improving the handling of and reacting
to events, but it doesn't seem to hurt terribly.

Eventually, all this will probably need to be
refactored to use EventEmitter.
2020-10-02 01:34:49 +02:00
D. Berge
c404edc4b3 Improve reliability of reaction to events.
Hopefully.
2020-10-02 01:33:58 +02:00
D. Berge
7451433aa4 Remove debugging statements 2020-10-02 01:33:33 +02:00
D. Berge
cf9cb393a9 Fix table labelling 2020-10-02 00:47:39 +02:00
D. Berge
b77ffa5d0f Ensure line statuses are visible on mobile 2020-10-02 00:42:19 +02:00
D. Berge
f30f108e08 React to preplot_lines notifications 2020-10-02 00:41:16 +02:00
D. Berge
746e3405fb Let websocket listen to all DB notification channels 2020-10-02 00:39:53 +02:00
D. Berge
902338f835 Restrict patching actions to vessel lines only.
These are the only type of lines returned by list().
2020-10-01 18:37:49 +02:00
D. Berge
381e3773c6 Allow editing of remarks in preplot lines list 2020-10-01 18:28:59 +02:00
D. Berge
f9ef971802 Add preplot line patching endpoint.
Allows us to change remarks, meta and ntba
fields in preplot lines.
2020-10-01 18:28:02 +02:00
D. Berge
3a87f8959a Add slot for empty line status 2020-10-01 15:31:24 +02:00
D. Berge
c12c2a3861 Update events view.
Searches for timed events geometry within
the event's own metadata.
2020-09-30 22:49:01 +02:00
D. Berge
21439fdd3e Associate a position with timed events if possible.
When inserting or updating a timed event, a trigger
searches the real time events table for a position
close (within one minute) of the event time and
adds it to the event's metadata.
2020-09-30 22:45:50 +02:00
D. Berge
79a751393c Highlight active row in preplot lines list 2020-09-30 20:12:15 +02:00
D. Berge
60000eeaf1 Make preplot lines list searchable 2020-09-30 20:04:35 +02:00
D. Berge
d2c65b480b Add status information to preplot lines list 2020-09-30 19:44:15 +02:00
D. Berge
589fe07ad6 Add <dougal-line-status/> component.
It shows a graphical representation of the
acquisition status of a preplot line, as
stacked bars where each bar represents the
acquisition extent of a sequence. This is
complemented by colour denoting the raw/final/
ntbp status of the sequence and by a tooltip
with the same information. Sequence bars may be
made clickable by providing a function to the
"sequence-href" property; this function should
take a sequence object and return a URL.
2020-09-30 19:41:09 +02:00
D. Berge
bdf573d4a6 Add Dougal-specific data to structured sequence exports 2020-09-30 15:54:16 +02:00
D. Berge
873c29ad00 Remove logging statement 2020-09-30 15:40:31 +02:00
D. Berge
3c1a5da1a8 Suspend trigger during system data import 2020-09-30 15:39:37 +02:00
D. Berge
19db65c999 Change configuration paths for SSE export 2020-09-30 15:39:10 +02:00
D. Berge
32d97a4856 Fix errors in Multiseis export 2020-09-29 20:28:56 +02:00
D. Berge
e7099643f5 Try harder to produce Multiseis export files.
We try to ensure that for each sequence we have
at least FSP and LSP entries. We do this by:

* If there exists a FGSP / LGSP, we clone those as
FSP / LSP respectively.

* Otherwise, we take the first and last shots
found in the final P1 which have a preplot.

We also include log comments whenever possible and
format the azimuth a bit better.
2020-09-29 18:05:46 +02:00
D. Berge
65c26f56c7 Import proper UTF-8.
What's the point of Python defaulting to ASCII
when JSON is explicitly defined as a binary
format with a default character encoding of
UTF-8? 🙄
2020-09-29 17:46:56 +02:00
D. Berge
2733223037 Refresh the log more aggressively.
On the backend, the events endpoint caches responses
by ETag in order to reduce the load on the database
and response times. That cache is supposed to be
invalidated when the middleware receives a notification
that the underlying data has changed. However, that
change might arrive just *after* the HTTP request,
meaning that if we let the browser do its thing we
will probably be returning stale data.

So what we do is we explicitly request a non-cached
response when we know that something has changed
because we changed it ourselves.

What we do not do is bypass the cache if we receive
a change event notification, as we assume that the
HTTP roundtrip will be more than sufficient for
the server to have invalidated (and perhaps even
refreshed) its response.
2020-09-29 00:30:40 +02:00
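A sketch of the ‘explicitly request a non-cached response’ part; using
fetch's cache mode here is an assumption about the mechanism.

// After saving our own edit, bypass the HTTP cache for the reload so we
// don't read back a stale ETagged response.
const res = await fetch(eventsUrl, { cache: 'reload' });
const events = await res.json();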
D. Berge
77ff9a047c Pass a different copy of the data to each listener 2020-09-28 23:07:27 +02:00
D. Berge
2886bd4943 Do not react to events until we have at least two of them.
This is to avoid false positives when restarting the API
mid-line.
2020-09-28 21:43:24 +02:00
D. Berge
03563bdaf2 Auto-create FSP/LSP events from nav header.
If the nav header is being received, this will
try to detect the start and end of line and create
an entry in the log.

It doesn't check whether FSP/FGSP LSP/LGSP entries
already do exist for that sequence.

Closes #28.
2020-09-28 21:36:32 +02:00
D. Berge
7758f08a79 Use array instead of set for storing callbacks 2020-09-28 21:33:16 +02:00
D. Berge
b73dd3fe1e Add function to return project ID from schema name 2020-09-28 21:32:28 +02:00
D. Berge
0d72ea3c88 Purge QCs that are no longer failing.
If a QC has failed in the past there will be a
record of it in raw_lines, raw_shots or preplot_points.

If that QC then stopped failing, e.g., because of a
change of parameters, then the QC results would correctly
reflect the change but not the line/shot tables and
hence, the event log.

This commit hopefully takes care of that.
2020-09-28 14:58:48 +02:00
D. Berge
e75e866285 Tidy up formatting 2020-09-28 13:12:18 +02:00
D. Berge
ca41bd8132 Do not lose data during database upgrades.
The database upgrade script is updated to also export
user-entered data stored in columns of tables that also
contain derived data, and to re-import everything after
the upgrade.
2020-09-27 19:36:28 +02:00
D. Berge
a05ecfd41c Add functions to export/import specific columns from DB.
Unlike system_imports.py and system_exports.py, which
deal with whole tables via COPY, this allows us to
export / import *either* whole tables or specific
columns only.

The data will be exported to text files containing
the selected columns + the primary key columns for
the table.

When importing, those tables for which a selection
of columns was exported must already be populated.
The import process will overwrite the data of the
non-primary-key columns it knows about. If, on the
other hand, whole tables are exported, rows will be
appended rather than updated on re-import. It is the
user's responsibility to make sure that this will
not cause any conflicts.
2020-09-27 19:29:48 +02:00
D. Berge
bf313dd8e5 Allow editing of remarks in sequence list 2020-09-27 19:25:45 +02:00
D. Berge
371030e61e Add optional callback to Vuex API action.
The callback has the signature (err, res) where
res is the result object from the fetch() request
and err is non-null if an error occurred and fetch()
threw.

The callback is called before res.json() has had a
chance to run, so it is really not recommended to
consume the body from this callback, as doing so will
cause an error in the API action itself.
2020-09-27 19:23:04 +02:00
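An illustrative use of that callback; the action name and payload fields
are assumptions.

this.$store.dispatch('api', {
  url: `/project/${projectId}/sequence`,
  callback: (err, res) => {
    if (err) return console.error('request failed', err);
    // res is the fetch() Response: status and headers are safe to read,
    // but do not consume the body here -- the action itself calls res.json().
    console.log('HTTP status', res.status);
  }
});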
D. Berge
fd1f1a2c1a Implement sequence patching endpoint.
Allows us to change remarks and meta fields in sequences.
2020-09-27 19:21:59 +02:00
D. Berge
bfcc02a140 Comment out logging statement 2020-09-27 19:19:47 +02:00
D. Berge
a7b4b70d59 Remove spurious semicolon 2020-09-27 19:19:19 +02:00
D. Berge
3bd8cbe860 Handle projects for which there is (yet) no QC data 2020-09-27 19:18:19 +02:00
D. Berge
5a74953739 Add vars() method to configuration.py.
Returns some shell variables that are used
by various deferred import processes.
2020-09-27 19:16:44 +02:00
D. Berge
303befef3b Add human exports to runner 2020-09-26 23:41:09 +02:00
D. Berge
4e70090b40 Export structured sequence data to JSON files.
Script meant to be run by runner.sh.

It will not overwrite existing files. If a
sequence is modified after the first export,
the resulting file needs to be removed by the
user before a re-export will occur.

The idea is to eventually export on demand
when a new raw is added to final_lines.
2020-09-26 22:57:36 +02:00
D. Berge
acf58df59f Return events in structured sequence export format.
The events endpoint will return data in the format
agreed with Multiseis if the request has an
Accept: application/vnd.seis+json
header.

Related to #12.
2020-09-26 22:55:11 +02:00
D. Berge
78adb2bef7 Make set_survey argument case insensitive 2020-09-26 22:53:48 +02:00
D. Berge
be242e109a Request map events from server as GeoJSON 2020-09-26 18:06:22 +02:00
D. Berge
b76f1f166b Refactor events middleware.
The reason for refactoring was to accommodate
Multiseis / client sequence exports, which will be
served by this endpoint via a specific Content-Type.

In the process, the cache has been fixed and redesigned.

Related to #12.
2020-09-26 17:41:47 +02:00
D. Berge
ae8a25f240 Avoid making unnecessary map requests 2020-09-26 01:34:14 +02:00
D. Berge
39b425b392 Add comment 2020-09-26 01:24:12 +02:00
D. Berge
f2024a7b99 Abort pending requests on successive map refreshes 2020-09-26 01:18:12 +02:00
D. Berge
b383f4e4c0 Slightly increase size of event markers 2020-09-26 01:17:34 +02:00
D. Berge
1d8036b429 Protect against events without payload.
These will eventually arrive when we stop ignoring
events > 8 KiB.
2020-09-26 01:16:25 +02:00
D. Berge
358eb44de3 Don't report error if a fetch was aborted 2020-09-26 01:15:32 +02:00
D. Berge
6badff2f76 Switch event map layers to ‘push’ style update.
Still no updating on new / deleted events though.
2020-09-25 22:42:33 +02:00
D. Berge
949f42c1dc Show events on map.
Partial implementation. Notably, it does not yet update
when events are added/modified/deleted.

Related to #48.
2020-09-25 18:33:55 +02:00
D. Berge
42d453f714 Cache event responses at the middleware level.
This is not really a substitute for proper database
design but if deemed useful it might be refactored
into a more generic caching middleware and applied
to other requests as a low-cost alternative to
database refactoring while we gain usage and
performance information.
2020-09-25 18:29:40 +02:00
D. Berge
77aae68603 Include project ID in DB notifications 2020-09-25 18:27:59 +02:00
D. Berge
ab2cf81327 Broadcast an already parsed JSON payload 2020-09-25 18:26:04 +02:00
D. Berge
55cb3856c3 Remove logging statement 2020-09-25 18:23:03 +02:00
D. Berge
9470f41f4b Fix lookup of timed event labels.
Fixes #44.
2020-09-24 15:32:25 +02:00
D. Berge
4dc1c7df8e Re-export schema template.
No changes to the actual schema.
2020-09-24 15:31:53 +02:00
D. Berge
80324130f9 Return event geometries as GeoJSON 2020-09-24 15:24:47 +02:00
D. Berge
72e06c9f2a Give more info to the user in disconnection icon 2020-09-23 22:23:42 +02:00
D. Berge
c25f350c7a Show online indicator.
Closes #37.
2020-09-23 22:18:51 +02:00
D. Berge
46b9978d3f Allow searching sequence list by line.
Closes #40.
2020-09-23 21:13:55 +02:00
D. Berge
808fa71c5f Change colour of highlighted row in sequence list.
The default pink colour in night mode wasn't overly popular.

Related to #36.
2020-09-23 21:05:08 +02:00
D. Berge
a4ed7f7b62 Return all QC events, not just those with labels 2020-09-23 20:19:13 +02:00
D. Berge
60ffff15bf Disconnect DB sessions before starting upgrade 2020-09-23 18:18:39 +02:00
D. Berge
33c23c1239 Return event labels directly from events view.
This speeds up the query by orders of magnitude.
2020-09-23 16:31:55 +02:00
D. Berge
0e5e54b680 Ensure views are synchronised after re-import 2020-09-23 15:48:42 +02:00
D. Berge
6b52383056 Reset sequences after re-import 2020-09-23 15:48:21 +02:00
D. Berge
97104556b7 Do not hard fail if imports fail for one project.
It may be the case that we have already re-imported
some of the data, so we just move on to the next
project.
2020-09-23 15:46:17 +02:00
D. Berge
6bab21bce4 Fix QC test definition 2020-09-23 15:45:41 +02:00
D. Berge
7898cc907d Set, retrieve and process QC labels on frontend 2020-09-20 18:12:48 +02:00
D. Berge
80e8ccef9c Add endpoints for setting and retrieving metadata 2020-09-20 18:11:33 +02:00
D. Berge
7e36305472 Remove logging statements 2020-09-20 18:10:25 +02:00
D. Berge
fe3a825bf7 Do not overwrite other qc info 2020-09-20 18:09:43 +02:00
D. Berge
bdb2fb9c3f Cache QC results associated with each shot / sequence.
This is to enable the user to associate information, in
particular “QCAccepted” override labels, to individual
results. The information stays associated with tests
unless the data is removed or the results change (e.g.,
because the data was reprocessed or the test parameters
or algorithm were changed).
2020-09-20 17:10:47 +02:00
D. Berge
cd392a33df When QCs called without projectId, associate pid with results.
Otherwise the return value would be ambiguous as we wouldn't
know to which project the values belong.
2020-09-20 17:09:31 +02:00
D. Berge
2107a5087a Skip disabled items 2020-09-20 17:08:47 +02:00
D. Berge
2bd4b895b7 Save item id in results 2020-09-20 17:08:15 +02:00
D. Berge
1f837b12df Save item type in results.
Not sure if this is actually used though.
2020-09-20 17:05:32 +02:00
D. Berge
5324a71523 Fix assignment of _id values to tests 2020-09-20 17:04:13 +02:00
D. Berge
ae0052de0c Add id to QC tests.
Each test has to have a unique id, in order that
we can associate results and labels with them and
cope with reorganisation of the QC tree.
2020-09-20 17:00:52 +02:00
D. Berge
92f15d00de Check for Notifications in the window object 2020-09-19 14:48:03 +02:00
D. Berge
b58dfc847a Adapt footer to small viewports 2020-09-15 03:06:31 +02:00
D. Berge
6c582e6b4b Notifications control: cancel waiting state if switched off 2020-09-15 03:02:12 +02:00
D. Berge
47166b65e0 Adapt footer to small viewports 2020-09-15 02:54:48 +02:00
D. Berge
f32066695e Add notifications control.
Experimental. Pushes updates of interest (currently
new / changed log entries) via the Notifications API.
Can be enabled or disabled by the user at any time.
2020-09-15 02:41:05 +02:00
D. Berge
3d091eec53 Adapt to small viewports 2020-09-15 02:40:19 +02:00
D. Berge
00e00a2ae6 Change page title 2020-09-15 00:39:46 +02:00
D. Berge
0bbe29febc Make QC tab more mobile friendly.
Closes #38.
2020-09-14 23:58:26 +02:00
D. Berge
d93c70b9eb Make log reactive.
Automatically reloads when events have been changed.
2020-09-14 23:56:30 +02:00
D. Berge
946e05c283 Add project schema to Vuex state 2020-09-14 23:55:46 +02:00
D. Berge
94c3ed1584 Update schema.
* Report also schema in projects_summary
* Add notify triggers to events_timed, events_seq
2020-09-14 23:54:48 +02:00
D. Berge
5980b7d231 Implementation of QC qualification UI (non-functional).
This commit implements a possible interface for QC
qualifications, with a mechanism to override QC results.

As of this commit, no changes are saved to the backend.
2020-09-14 20:48:27 +02:00
D. Berge
ffcc3ef8cb Remove dead code 2020-09-14 20:46:56 +02:00
D. Berge
1348eb9a8a Rename QCFail label to QC 2020-09-14 20:45:34 +02:00
D. Berge
4b3a254119 Ensure labels are unique 2020-09-14 14:06:49 +02:00
D. Berge
5a96701e46 Use labels from preset remarks 2020-09-14 13:46:46 +02:00
D. Berge
165640599c Highlight clicked row in sequence list.
Clicking again unhighlights.

Closes #36.
2020-09-14 11:36:37 +02:00
D. Berge
c9b9a009af Speed up the events view.
We do this with the help of some denormalisation
and a handy trigger.

Closes #35.
2020-09-14 03:01:55 +02:00
D. Berge
b5b91d41c9 Reset event serial ids after re-import.
When the database is recreated, the sequences
used in the events_timed and events_seq tables
will be at their initial values, which will
almost certainly conflict with existing data
when it is imported via COPY.

With this commit, we set the current value for
those sequences to something usable.

Fixes #33.
2020-09-14 01:36:40 +02:00
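The fix amounts to a setval() per table; a sketch using node-postgres,
with the table names taken from the message above (the id column name is
an assumption).

for (const table of ['events_timed', 'events_seq']) {
  // Point the serial's sequence just past the highest imported id, so
  // the next INSERT cannot collide with rows restored via COPY.
  await client.query(
    `SELECT setval(pg_get_serial_sequence('${table}', 'id'),
                   (SELECT coalesce(max(id), 0) + 1 FROM ${table}),
                   false);`
  );
}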
D. Berge
53077f0baf Remove logging statement 2020-09-13 17:30:25 +02:00
D. Berge
d45e17fce3 Add scripts to launch Dougal services in production 2020-09-13 15:35:36 +02:00
D. Berge
225c710142 Allow taking JWT secret from environment in production 2020-09-13 15:02:58 +02:00
D. Berge
da7a977c59 Change definition of Preplot QC objects 2020-09-12 22:25:45 +02:00
D. Berge
13d4771589 Add description of QC tests 2020-09-12 22:07:08 +02:00
D. Berge
52cdc8904b QC: Add gun deltas warning check 2020-09-12 21:22:34 +02:00
D. Berge
69f43c129b Remove spurious line from QC definition 2020-09-12 21:22:13 +02:00
D. Berge
8a5d103754 Fix typo 2020-09-12 20:59:20 +02:00
D. Berge
1d3d202d1f QC: change threshold of inline moving average check 2020-09-12 19:56:34 +02:00
D. Berge
cd76df9329 QC: check inline position shot by shot 2020-09-12 19:56:00 +02:00
D. Berge
5e130c3e42 QC: measure gun not manifold pressure 2020-09-12 19:54:01 +02:00
D. Berge
7b0bcb5256 Report timestamp when QCs were last run 2020-09-12 19:53:20 +02:00
D. Berge
6d5167c052 Add more QCs 2020-09-12 19:17:14 +02:00
D. Berge
ea34bbc7bb Let the user know if there are no QCs 2020-09-12 19:16:41 +02:00
D. Berge
b28224475e Make style scoped 2020-09-12 19:16:18 +02:00
D. Berge
17d8041945 Update standard QC definitions + parameters 2020-09-10 23:36:13 +02:00
D. Berge
c82caa1d1f Add QC view to web interface.
This commit implements most of what is required
by #23, with the exception of labels processing
as stated in https://gitlab.com/wgp/dougal/software/-/issues/23/designs/20200908_101516.jpg#note_408968681
2020-09-10 23:27:53 +02:00
D. Berge
606f1b8125 Export QC results to text files.
Script meant to be run by runner.sh, either directly
or via an intermediate script that consolidates all
the user outputs in one place.
2020-09-10 23:24:11 +02:00
D. Berge
8841ffc10b Add QC definitions and parameters for Siddis surveys.
These are incomplete but meant to serve as a starting
point for other projects.
2020-09-10 23:20:58 +02:00
D. Berge
a596a3be48 Implement QC routine.
Even though for practical reasons it's inside lib/www
this is meant to be run by bin/runner.sh at regular
intervals and refresh the QC data. This is a work
in progress. This version does not process or produce
labels, it doesn't create events in the event log and
it doesn't do online data, amongst other things.
2020-09-10 23:18:28 +02:00
D. Berge
bda11cc22f Add raw_shots_preplots view.
Used by the QC routines.
2020-09-10 20:50:08 +02:00
D. Berge
48f2931a13 Change type of info table value to JSONB 2020-09-10 20:49:39 +02:00
D. Berge
d192eb3668 Add CRUD operations for project info table 2020-09-10 20:39:06 +02:00
D. Berge
3b5b200f08 Change address of system exports path 2020-09-10 20:37:52 +02:00
D. Berge
08656a0b5e Fix reversed actions in calendar 2020-09-10 02:19:46 +02:00
D. Berge
5fdd84fadf Change style of raw data columns in sequence list 2020-09-10 02:19:07 +02:00
D. Berge
db25878fdd Add info API endpoint.
It queries a project's `info` table.
2020-09-09 15:55:04 +02:00
D. Berge
1a612f74d6 Refactor configuration methods in DB 2020-09-09 15:53:01 +02:00
D. Berge
c40a859efa Deal with Hydronav header fields overflowing.
Fixes #21.
2020-09-09 12:43:37 +02:00
D. Berge
351d2a474b Fix NTBP detection.
Fixes #25.
2020-09-08 18:36:30 +02:00
D. Berge
3c4ed6665c Reorder columns in sequence list 2020-09-08 17:46:25 +02:00
D. Berge
c0592cb72f Show correct sequence status.
Fixes #24.
2020-09-08 16:11:06 +02:00
D. Berge
0a547666e3 Fix filtering of events log.
Closes #22.
2020-09-07 17:23:00 +02:00
D. Berge
2db376e1cc Bubble errors in db.event.post 2020-09-06 23:26:12 +02:00
D. Berge
d285a63746 Remove debug statements 2020-09-06 21:00:06 +02:00
D. Berge
8411eea29d Stop race condition between multiple WebSockets.
Only one websocket should be active or initialising
at any one time.
2020-09-06 18:09:04 +02:00
D. Berge
513e6d6bc5 Fix Webpack development proxy configuration 2020-09-06 14:54:03 +02:00
D. Berge
906dcc6a7e Make event id serial rather than plain integer 2020-09-06 14:49:42 +02:00
D. Berge
4eb0a643c7 Merge branch '18-implement-server-push' into 'devel'
Resolve "Implement server push"

Closes #18

See merge request wgp/dougal/software!1
2020-09-06 12:30:33 +00:00
D. Berge
19ce158329 Import SmartSource header data.
Provided that the SmartSource headers are being
saved to file, and that the path to those files
is present in the survey configuration, we now
import SmartSource information as metadata in
raw_shots.meta->'smsrc'.

Closes #19.
2020-09-06 13:45:56 +02:00
D. Berge
c6285e881e Add function to delete data by file hash 2020-09-06 13:38:37 +02:00
D. Berge
3e949f6185 Add meta fields to schema.
Many tables now have a JSON (jsonb) column to
hold unstructured data.
2020-09-06 13:37:11 +02:00
D. Berge
1124a48e8c Add function to convert regex flags string to Python 2020-09-06 13:33:32 +02:00
332 changed files with 52263 additions and 4845 deletions

1 .gitignore vendored

@@ -10,3 +10,4 @@ lib/www/client/source/dist/
lib/www/client/dist/
etc/surveys/*.yaml
!etc/surveys/_*.yaml
etc/ssl/*

View File

@@ -13,6 +13,17 @@ $HOME is the home directory of the user running this script.
prefix = os.environ.get("DOUGAL_ROOT", os.environ.get("HOME", ".")+"/software")
DOUGAL_ROOT = os.environ.get("DOUGAL_ROOT", os.environ.get("HOME", ".")+"/software")
VARDIR = os.environ.get("VARDIR", DOUGAL_ROOT+"/var")
LOCKFILE = os.environ.get("LOCKFILE", VARDIR+"/runner.lock")
def vars ():
return {
"DOUGAL_ROOT": DOUGAL_ROOT,
"VARDIR": VARDIR,
"LOCKFILE": LOCKFILE
}
def read (file = None):
if file is None:
file = prefix+"/etc/config.yaml"
@@ -64,3 +75,15 @@ def files (globspec = None, include_archived = False):
def surveys (globspec = None, include_archived = False):
return [i[1] for i in files(globspec, include_archived)]
def rxflags (flagstr):
"""
Convert flags string into a Python flags argument.
"""
flags = 0
cases = {
"i": re.I
}
for flag in flagstr:
flags |= cases.get(flag, 0)
return flags

View File

@@ -10,7 +10,7 @@
# be known to the database.
# * PROJECT_NAME is a more descriptive name for human consumption.
# * EPSG_CODE is the EPSG code identifying the CRS for the grid data in the
# navigation files, e.g., 32031.
# navigation files, e.g., 23031.
#
# In addition to this, certain other parameters may be controlled via
# environment variables:

View File

@@ -4,6 +4,7 @@ import psycopg2
import configuration
import preplots
import p111
from hashlib import md5 # Because it's good enough
"""
Interface to the PostgreSQL database.
@@ -11,13 +12,16 @@ Interface to the PostgreSQL database.
def file_hash(file):
"""
Calculate a file hash based on its size, inode, modification and creation times.
Calculate a file hash based on its name, size, modification and creation times.
The hash is used to uniquely identify files in the database and detect if they
have changed.
"""
h = md5()
h.update(file.encode())
name_digest = h.hexdigest()[:16]
st = os.stat(file)
return ":".join([str(v) for v in [st.st_size, st.st_mtime, st.st_ctime, st.st_ino]])
return ":".join([str(v) for v in [st.st_size, st.st_mtime, st.st_ctime, name_digest]])
class Datastore:
"""
@@ -128,6 +132,22 @@ class Datastore:
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def del_hash(self, hash, cursor = None):
"""
Remove a hash from a survey's `file` table.
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
qry = "DELETE FROM files WHERE hash = %s;"
cur.execute(qry, (hash,))
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def list_files(self, cursor = None):
"""
List all files known to a survey.
@@ -147,7 +167,30 @@ class Datastore:
# we assume that we are in the middle of a transaction
return res
def save_preplots(self, lines, path, preplot_class, epsg = 0):
def set_ntbp(self, path, ntbp, cursor = None):
"""
Set or remove a sequence's NTBP flag
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
hash = file_hash(path)
qry = """
UPDATE raw_lines rl
SET ntbp = %s
FROM raw_shots rs, files f
WHERE rs.hash = f.hash AND rs.sequence = rl.sequence AND f.hash = %s;
"""
cur.execute(qry, (ntbp, hash))
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def save_preplots(self, lines, filepath, preplot_class, epsg = 0, filedata = None):
"""
Save preplot data.
@@ -156,7 +199,7 @@ class Datastore:
lines (iterable): should be a collection of lines returned from
one of the preplot-reading functions (see preplots.py).
path (string): the full path to the preplot file from where the lines
filepath (string): the full path to the preplot file from where the lines
have been read. It will be added to the survey's `file` table so that
it can be monitored for changes.
@@ -168,7 +211,9 @@ class Datastore:
"""
with self.conn.cursor() as cursor:
hash = self.add_file(path, cursor)
cursor.execute("BEGIN;")
hash = self.add_file(filepath, cursor)
count=0
for line in lines:
count += 1
@@ -205,6 +250,9 @@ class Datastore:
cursor.executemany(qry, points)
if filedata is not None:
self.save_file_data(filepath, json.dumps(filedata), cursor)
self.maybe_commit()
def save_raw_p190(self, records, fileinfo, filepath, epsg = 0, filedata = None, ntbp = False):
@@ -232,20 +280,13 @@ class Datastore:
"""
with self.conn.cursor() as cursor:
cursor.execute("BEGIN;")
hash = self.add_file(filepath, cursor)
incr = records[0]["point_number"] <= records[-1]["point_number"]
# Start by deleting any online data we may have for this sequence
# FIXME Factor this out into its own function
qry = """
DELETE
FROM raw_lines rl
USING raw_lines_files rlf
WHERE
rl.sequence = rlf.sequence
AND rlf.hash = '*online*'
AND rl.sequence = %s;
"""
self.del_hash("*online*", cursor)
qry = """
INSERT INTO raw_lines (sequence, line, remarks, ntbp, incr)
@@ -307,6 +348,8 @@ class Datastore:
"""
with self.conn.cursor() as cursor:
cursor.execute("BEGIN;")
hash = self.add_file(filepath, cursor)
#print(records[0])
#print(records[-1])
@@ -350,30 +393,37 @@ class Datastore:
def save_raw_p111 (self, records, fileinfo, filepath, epsg = 0, filedata = None, ntbp = False):
with self.conn.cursor() as cursor:
cursor.execute("BEGIN;")
hash = self.add_file(filepath, cursor)
if not records or len(records) == 0:
print("File has no records (or none have been detected)")
# We add the file to the database anyway to signal that we have
# actually seen it.
self.maybe_commit()
return
incr = p111.point_number(records[0]) <= p111.point_number(records[-1])
# Start by deleting any online data we may have for this sequence
# FIXME Factor this out into its own function
qry = """
DELETE
FROM raw_lines rl
USING raw_lines_files rlf
WHERE
rl.sequence = rlf.sequence
AND rlf.hash = '*online*'
AND rl.sequence = %s;
"""
cursor.execute(qry, (fileinfo["sequence"],))
self.del_hash("*online*", cursor)
qry = """
INSERT INTO raw_lines (sequence, line, remarks, ntbp, incr)
VALUES (%s, %s, '', %s, %s)
INSERT INTO raw_lines (sequence, line, remarks, ntbp, incr, meta)
VALUES (%s, %s, '', %s, %s, %s)
ON CONFLICT DO NOTHING;
"""
cursor.execute(qry, (fileinfo["sequence"], fileinfo["line"], ntbp, incr))
cursor.execute(qry, (fileinfo["sequence"], fileinfo["line"], ntbp, incr, json.dumps(fileinfo["meta"])))
qry = """
UPDATE raw_lines
SET meta = meta || %s
WHERE sequence = %s;
"""
cursor.execute(qry, (json.dumps(fileinfo["meta"]), fileinfo["sequence"]))
qry = """
INSERT INTO raw_lines_files (sequence, hash)
@@ -405,15 +455,25 @@ class Datastore:
def save_final_p111 (self, records, fileinfo, filepath, epsg = 0, filedata = None):
with self.conn.cursor() as cursor:
cursor.execute("BEGIN;")
hash = self.add_file(filepath, cursor)
qry = """
INSERT INTO final_lines (sequence, line, remarks)
VALUES (%s, %s, '')
INSERT INTO final_lines (sequence, line, remarks, meta)
VALUES (%s, %s, '', %s)
ON CONFLICT DO NOTHING;
"""
cursor.execute(qry, (fileinfo["sequence"], fileinfo["line"]))
cursor.execute(qry, (fileinfo["sequence"], fileinfo["line"], json.dumps(fileinfo["meta"])))
qry = """
UPDATE raw_lines
SET meta = meta || %s
WHERE sequence = %s;
"""
cursor.execute(qry, (json.dumps(fileinfo["meta"]), fileinfo["sequence"]))
qry = """
INSERT INTO final_lines_files (sequence, hash)
@@ -440,6 +500,51 @@ class Datastore:
if filedata is not None:
self.save_file_data(filepath, json.dumps(filedata), cursor)
cursor.execute("CALL final_line_post_import(%s);", (fileinfo["sequence"],))
self.maybe_commit()
def save_raw_smsrc (self, records, fileinfo, filepath, filedata = None):
with self.conn.cursor() as cursor:
cursor.execute("BEGIN;")
hash = self.add_file(filepath, cursor)
# Start by deleting any online data we may have for this sequence
# NOTE: Do I need to do this?
#self.del_hash("*online*", cursor)
# The shots should already exist, e.g., from a P1 import
# …but what about if the SMSRC file gets read *before* the P1?
# We need to check
qry = "SELECT count(*) FROM raw_shots WHERE sequence = %s AND hash != '*online*';"
values = (fileinfo["sequence"],)
cursor.execute(qry, values)
shotcount = cursor.fetchone()[0]
if shotcount == 0:
# No shots yet or not all imported, so we do *not*
# save the gun data. It will eventually get picked
# up in the next run.
# Let's remove the file from the file list and bail
# out.
print("No raw shots for sequence", fileinfo["sequence"])
self.conn.rollback()
return
values = [ (json.dumps(record), fileinfo["sequence"], record["shot"]) for record in records ]
qry = """
UPDATE raw_shots
SET meta = jsonb_set(meta, '{smsrc}', %s::jsonb, true) - 'qc'
WHERE sequence = %s AND point = %s;
"""
cursor.executemany(qry, values)
if filedata is not None:
self.save_file_data(filepath, json.dumps(filedata), cursor)
self.maybe_commit()
@@ -488,7 +593,7 @@ class Datastore:
INSERT INTO labels (name, data)
SELECT l.key, l.value
FROM file_data fd,
json_each(fd.data->'labels') l
jsonb_each(fd.data->'labels') l
WHERE fd.data::jsonb ? 'labels'
ON CONFLICT (name) DO UPDATE SET data = excluded.data;
"""
@@ -499,3 +604,109 @@ class Datastore:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def add_info(self, key, value, cursor = None):
"""
Add an item of information to the project
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
qry = """
INSERT INTO info (key, value)
VALUES(%s, %s)
ON CONFLICT (key) DO UPDATE
SET value = EXCLUDED.value;
"""
cur.execute(qry, (key, value))
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def get_info(self, key, cursor = None):
"""
Retrieve an item of information from the project
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
qry = "SELECT value FROM info WHERE key = %s;"
cur.execute(qry, (key,))
res = cur.fetchone()
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
return res
def del_info(self, key, cursor = None):
"""
Remove an item of information from the project
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
qry = "DELETE FROM info WHERE key = %s;"
cur.execute(qry, (key,))
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def del_sequence_final(self, sequence, cursor = None):
"""
Remove final data for a sequence.
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
qry = "DELETE FROM files WHERE hash = (SELECT hash FROM final_lines_files WHERE sequence = %s);"
cur.execute(qry, (sequence,))
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def adjust_planner(self, cursor = None):
"""
Adjust estimated times on the planner
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
qry = "CALL adjust_planner();"
cur.execute(qry)
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction
def housekeep_event_log(self, cursor = None):
"""
Call housekeeping actions on the event log
"""
if cursor is None:
cur = self.conn.cursor()
else:
cur = cursor
qry = "CALL augment_event_data();"
cur.execute(qry)
if cursor is None:
self.maybe_commit()
# We do not commit if we've been passed a cursor, instead
# we assume that we are in the middle of a transaction

26 bin/housekeep_database.py Executable file

@@ -0,0 +1,26 @@
#!/usr/bin/python3
"""
Do housekeeping actions on the database.
"""
import configuration
from datastore import Datastore
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
print("Connecting to database")
db = Datastore()
print("Reading surveys")
for survey in surveys:
print(f'Survey: {survey["id"]} ({survey["schema"]})')
db.set_survey(survey["schema"])
db.adjust_planner()
db.housekeep_event_log()
print("Done")

95 bin/human_exports_qc.py Executable file

@@ -0,0 +1,95 @@
#!/usr/bin/python3
"""
Export data that is entered directly into Dougal
as opposed to being read from external sources.
This data will be read back in when the database
is recreated for an existing survey.
"""
import os
from glob import glob
import pathlib
import string
import configuration
import preplots
from datastore import Datastore
def sane_name(filename):
allowed_chars = string.ascii_letters + string.digits + " _-#+&^%$!();:.,"
return ''.join([c for c in filename if c in allowed_chars])
def write_file (filename, items):
filename.parent.mkdir(parents=True, exist_ok=True)
with open(filename, "w") as fd:
for item in items:
sequence = point = line = ""
if type(item["_id"]) == list:
if len(item["_id"]) == 2:
sequence, point = item["_id"]
elif len(item["_id"]) == 3:
sequence, point, line = item["_id"]
else:
sequence = item["_id"]
line = f"{sequence}\t{point}\t{line}\t{item['results']}\n"
fd.write(line)
def qc_item(item, prefixes = [], index = None):
leader = "{:0>2d}".format(index) if index is not None else ""
name = sane_name(leader+" "+item["name"])
if "check" in item:
filename = pathlib.Path(*prefixes, name+".txt")
print("MKFILE", filename)
print("Export", len(item["check"]), "results")
write_file(filename, item["check"])
if "children" in item:
print("MKDIR", pathlib.Path(*prefixes, name))
subindex = 0
for child in item["children"]:
subindex += 1
qc_item(child, [*prefixes, name], subindex)
def qc_data (cursor, prefix):
qc = db.get_info('qc', cursor)
if qc is not None:
qc = qc[0]
else:
print("No QC data found");
return
#print("QC", qc)
index = 0
for item in qc["results"]:
index += 1
qc_item(item, [prefix, "QC"], index)
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
print("Connecting to database")
db = Datastore()
print("Reading surveys")
for survey in surveys:
print(f'Survey: {survey["id"]} ({survey["schema"]})')
db.set_survey(survey["schema"])
with db.conn.cursor() as cursor:
try:
pathPrefix = survey["exports"]["human"]["path"]
except KeyError:
print("Survey does not define an export path for human data")
continue
if not pathlib.Path(pathPrefix).exists():
print(pathPrefix)
raise ValueError("Export path does not exist")
qc_data(cursor, pathPrefix)
print("Done")

74 bin/human_exports_seis.py Executable file

@@ -0,0 +1,74 @@
#!/usr/bin/python3
"""
Export data that is entered directly into Dougal
as opposed to being read from external sources.
This data will be read back in when the database
is recreated for an existing survey.
"""
import os
from glob import glob
import pathlib
import string
import configuration
import requests
import json
#from datastore import Datastore
def sane_name(filename):
allowed_chars = string.ascii_letters + string.digits + " _-#+&^%$!();:.,"
return ''.join([c for c in filename if c in allowed_chars])
def write_file (filename, payload):
print("Writing to", filename)
tmpname = filename.parent / (filename.name + ".tmp")
filename.parent.mkdir(parents=True, exist_ok=True)
with open(tmpname, "w", encoding="utf8") as fd:
json.dump(payload, fd, indent=4, ensure_ascii=False)
os.rename(tmpname, filename)
def seis_data (survey):
try:
pathPrefix = survey["sse"]["path"]
except KeyError:
print("Survey does not define an export path for human data")
return
if not pathlib.Path(pathPrefix).exists():
print(pathPrefix)
raise ValueError("Export path does not exist")
print(f"Requesting sequences for {survey['id']}")
url = f"http://localhost:3000/api/project/{survey['id']}/sequence"
r = requests.get(url)
print(r.status_code, url)
for sequence in r.json():
if sequence['status'] not in ["final", "ntbp"]:
continue
filename = pathlib.Path(pathPrefix, "sequence{:0>3d}.json".format(sequence['sequence']))
if filename.exists():
print(f"Skipping export for sequence {sequence['sequence']} file already exists")
continue
print(f"Processing sequence {sequence['sequence']}")
url = f"http://localhost:3000/api/project/{survey['id']}/event?sequence={sequence['sequence']}&missing=t"
headers = { "Accept": "application/vnd.seis+json" }
r = requests.get(url, headers=headers)
if r.status_code == requests.codes.ok:
write_file(filename, r.json())
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
print("Reading surveys")
for survey in surveys:
print(f'Survey: {survey["id"]} ({survey["schema"]})')
seis_data(survey)
print("Done")

View File

@@ -12,14 +12,47 @@ import os
import sys
import pathlib
import re
import time
import configuration
import p111
from datastore import Datastore
def add_pending_remark(db, sequence):
text = '<!-- @@DGL:PENDING@@ --><h4 style="color:red;cursor:help;" title="Edit the sequence file or directory name to import final data">Marked as <code>PENDING</code>.</h4><!-- @@/DGL:PENDING@@ -->\n'
with db.conn.cursor() as cursor:
qry = "SELECT remarks FROM raw_lines WHERE sequence = %s;"
cursor.execute(qry, (sequence,))
remarks = cursor.fetchone()[0]
rx = re.compile("^(<!-- @@DGL:PENDING@@ -->.*<!-- @@/DGL:PENDING@@ -->\n)")
m = rx.match(remarks)
if m is None:
remarks = text + remarks
qry = "UPDATE raw_lines SET remarks = %s WHERE sequence = %s;"
cursor.execute(qry, (remarks, sequence))
db.maybe_commit()
def del_pending_remark(db, sequence):
with db.conn.cursor() as cursor:
qry = "SELECT remarks FROM raw_lines WHERE sequence = %s;"
cursor.execute(qry, (sequence,))
row = cursor.fetchone()
if row is not None:
remarks = row[0]
rx = re.compile("^(<!-- @@DGL:PENDING@@ -->.*<!-- @@/DGL:PENDING@@ -->\n)")
m = rx.match(remarks)
if m is not None:
remarks = rx.sub("",remarks)
qry = "UPDATE raw_lines SET remarks = %s WHERE sequence = %s;"
cursor.execute(qry, (remarks, sequence))
db.maybe_commit()
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)
print("Connecting to database")
db = Datastore()
@@ -40,6 +73,9 @@ if __name__ == '__main__':
pattern = final_p111["pattern"]
rx = re.compile(pattern["regex"])
if "pending" in survey["final"]:
pendingRx = re.compile(survey["final"]["pending"]["pattern"]["regex"])
for fileprefix in final_p111["paths"]:
print(f"Path prefix: {fileprefix}")
@@ -48,7 +84,17 @@ if __name__ == '__main__':
filepath = str(filepath)
print(f"Found {filepath}")
pending = False
if pendingRx:
pending = pendingRx.search(filepath) is not None
if not db.file_in_db(filepath):
age = time.time() - os.path.getmtime(filepath)
if age < file_min_age:
print("Skipping file because too new", filepath)
continue
print("Importing")
match = rx.match(os.path.basename(filepath))
@@ -59,16 +105,30 @@ if __name__ == '__main__':
continue
file_info = dict(zip(pattern["captures"], match.groups()))
file_info["meta"] = {}
if pending:
print("Skipping / removing final file because marked as PENDING", filepath)
db.del_sequence_final(file_info["sequence"])
add_pending_remark(db, file_info["sequence"])
continue
else:
del_pending_remark(db, file_info["sequence"])
p111_data = p111.from_file(filepath)
print("Saving")
p111_records = p111.p111_type("S", p111_data)
file_info["meta"]["lineName"] = p111.line_name(p111_data)
db.save_final_p111(p111_records, file_info, filepath, survey["epsg"])
else:
print("Already in DB")
if pending:
print("Removing from database because marked as PENDING")
db.del_sequence_final(file_info["sequence"])
add_pending_remark(db, file_info["sequence"])
print("Done")


@@ -12,6 +12,7 @@ import os
import sys
import pathlib
import re
import time
import configuration
import p190
from datastore import Datastore
@@ -20,6 +21,7 @@ if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)
print("Connecting to database")
db = Datastore()
@@ -49,6 +51,12 @@ if __name__ == '__main__':
print(f"Found {filepath}")
if not db.file_in_db(filepath):
age = time.time() - os.path.getmtime(filepath)
if age < file_min_age:
print("Skipping file because too new", filepath)
continue
print("Importing")
match = rx.match(os.path.basename(filepath))


@@ -8,7 +8,9 @@ or modified preplots and (re-)import them into the database.
"""
from glob import glob
import os
import sys
import time
import configuration
import preplots
from datastore import Datastore
@@ -17,6 +19,7 @@ if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)
print("Connecting to database")
db = Datastore()
@@ -28,6 +31,12 @@ if __name__ == '__main__':
for file in survey["preplots"]:
print(f"Preplot: {file['path']}")
if not db.file_in_db(file["path"]):
age = time.time() - os.path.getmtime(file["path"])
if age < file_min_age:
print("Skipping file because too new", file["path"])
continue
print("Importing")
try:
preplot = preplots.from_file(file)
@@ -37,7 +46,7 @@ if __name__ == '__main__':
if type(preplot) is list:
print("Saving to DB")
db.save_preplots(preplot, file["path"], file["class"], survey["epsg"])
db.save_preplots(preplot, file["path"], file["class"], survey["epsg"], file)
elif type(preplot) is str:
print(preplot)
else:


@@ -12,6 +12,7 @@ import os
import sys
import pathlib
import re
import time
import configuration
import p111
from datastore import Datastore
@@ -20,6 +21,7 @@ if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)
print("Connecting to database")
db = Datastore()
@@ -51,7 +53,18 @@ if __name__ == '__main__':
filepath = str(filepath)
print(f"Found {filepath}")
if ntbpRx:
ntbp = ntbpRx.search(filepath) is not None
else:
ntbp = False
if not db.file_in_db(filepath):
age = time.time() - os.path.getmtime(filepath)
if age < file_min_age:
print("Skipping file because too new", filepath)
continue
print("Importing")
match = rx.match(os.path.basename(filepath))
@@ -62,20 +75,27 @@ if __name__ == '__main__':
continue
file_info = dict(zip(pattern["captures"], match.groups()))
if ntbpRx:
ntbp = ntbpRx.match(filepath) is not None
else:
ntbp = False
file_info["meta"] = {}
p111_data = p111.from_file(filepath)
print("Saving")
p111_records = p111.p111_type("S", p111_data)
if len(p111_records):
file_info["meta"]["lineName"] = p111.line_name(p111_data)
db.save_raw_p111(p111_records, file_info, filepath, survey["epsg"], ntbp=ntbp)
db.save_raw_p111(p111_records, file_info, filepath, survey["epsg"], ntbp=ntbp)
else:
print("No source records found in file")
else:
print("Already in DB")
# Update the NTBP status to whatever the latest is,
# as it might have changed.
db.set_ntbp(filepath, ntbp)
if ntbp:
print("Sequence is NTBP")
print("Done")


@@ -12,6 +12,7 @@ import os
import sys
import pathlib
import re
import time
import configuration
import p190
from datastore import Datastore
@@ -20,6 +21,7 @@ if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)
print("Connecting to database")
db = Datastore()
@@ -52,6 +54,12 @@ if __name__ == '__main__':
print(f"Found {filepath}")
if not db.file_in_db(filepath):
age = time.time() - os.path.getmtime(filepath)
if age < file_min_age:
print("Skipping file because too new", filepath)
continue
print("Importing")
match = rx.match(os.path.basename(filepath))

84
bin/import_smsrc.py Executable file

@@ -0,0 +1,84 @@
#!/usr/bin/python3
"""
Import SmartSource data.
For each survey in configuration.surveys(), check for new
or modified final gun header files and (re-)import them into the
database.
"""
import os
import sys
import pathlib
import re
import time
import configuration
import smsrc
from datastore import Datastore
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
file_min_age = configuration.read().get('imports', {}).get('file_min_age', 10)
print("Connecting to database")
db = Datastore()
db.connect()
print("Reading surveys")
for survey in surveys:
print(f'Survey: {survey["id"]} ({survey["schema"]})')
db.set_survey(survey["schema"])
try:
raw_smsrc = survey["raw"]["smsrc"]
except KeyError:
print("No SmartSource data configuration")
continue
flags = 0
if "flags" in raw_smsrc:
configuration.rxflags(raw_smsrc["flags"])
pattern = raw_smsrc["pattern"]
rx = re.compile(pattern["regex"], flags)
for fileprefix in raw_smsrc["paths"]:
print(f"Path prefix: {fileprefix}")
for globspec in raw_smsrc["globs"]:
for filepath in pathlib.Path(fileprefix).glob(globspec):
filepath = str(filepath)
print(f"Found {filepath}")
if not db.file_in_db(filepath):
age = time.time() - os.path.getmtime(filepath)
if age < file_min_age:
print("Skipping file because too new", filepath)
continue
print("Importing")
match = rx.match(os.path.basename(filepath))
if not match:
error_message = f"File path not matching the expected format! ({filepath} ~ {pattern['regex']})"
print(error_message, file=sys.stderr)
print("This file will be ignored!")
continue
file_info = dict(zip(pattern["captures"], match.groups()))
smsrc_records = smsrc.from_file(filepath)
print("Saving")
db.save_raw_smsrc(smsrc_records, file_info, filepath)
else:
print("Already in DB")
print("Done")

48
bin/insert_event.py Executable file

@@ -0,0 +1,48 @@
#!/usr/bin/python3
from datetime import datetime
from datastore import Datastore
def detect_schema (conn):
with conn.cursor() as cursor:
qry = "SELECT meta->>'_schema' AS schema, tstamp, age(current_timestamp, tstamp) age FROM real_time_inputs WHERE meta ? '_schema' AND age(current_timestamp, tstamp) < '02:00:00' ORDER BY tstamp DESC LIMIT 1"
cursor.execute(qry)
res = cursor.fetchone()
if res and len(res):
return res[0]
return None
if __name__ == '__main__':
import argparse
ap = argparse.ArgumentParser()
ap.add_argument("-s", "--schema", required=False, default=None, help="survey where to insert the event")
ap.add_argument("-t", "--tstamp", required=False, default=None, help="event timestamp")
ap.add_argument("-l", "--label", required=False, default=None, action="append", help="event label")
ap.add_argument('remarks', type=str, nargs="+", help="event message")
args = vars(ap.parse_args())
db = Datastore()
db.connect()
if args["schema"]:
schema = args["schema"]
else:
schema = detect_schema(db.conn)
if args["tstamp"]:
tstamp = args["tstamp"]
else:
tstamp = datetime.utcnow().isoformat()
message = " ".join(args["remarks"])
print("new event:", schema, tstamp, message)
if schema and tstamp and message:
db.set_survey(schema)
with db.conn.cursor() as cursor:
qry = "INSERT INTO events_timed (tstamp, remarks) VALUES (%s, %s);"
cursor.execute(qry, (tstamp, message))
db.maybe_commit()
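# NOTE: --label is parsed but not currently stored; only tstamp and remarks are inserted.
# Example invocation (illustrative values):
#   insert_event.py -s survey_x "Crew change boat alongside"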


@@ -153,6 +153,9 @@ def parse_line (string):
return None
def line_name(records):
return set([ r['Acquisition Line Name'] for r in p111_type("S", records) ]).pop()
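# NOTE: assumes all S records in the file share one Acquisition Line Name; if they differ, set(...).pop() returns an arbitrary one.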
def p111_type(type, records):
return [ r for r in records if r["type"] == type ]


@@ -12,7 +12,7 @@ from parse_fwr import parse_fwr
def parse_p190_header (string):
"""Parse a generic P1/90 header record.
Returns a dictionary of fields.
"""
names = [ "record_type", "header_type", "header_type_modifier", "description", "data" ]
@@ -27,7 +27,7 @@ def parse_p190_type1 (string):
"doy", "time", "spare2" ]
record = parse_fwr(string, [1, 12, 3, 1, 1, 1, 6, 10, 11, 9, 9, 6, 3, 6, 1])
return dict(zip(names, record))
def parse_p190_rcv_group (string):
"""Parse a P1/90 Type 1 receiver group record."""
names = [ "record_type",
@@ -37,7 +37,7 @@ def parse_p190_rcv_group (string):
"streamer_id" ]
record = parse_fwr(string, [1, 4, 9, 9, 4, 4, 9, 9, 4, 4, 9, 9, 4, 1])
return dict(zip(names, record))
def parse_line (string):
type = string[0]
if string[:3] == "EOF":
@@ -52,7 +52,7 @@ def parse_line (string):
def p190_type(type, records):
return [ r for r in records if r["record_type"] == type ]
def p190_header(code, records):
return [ h for h in p190_type("H", records) if h["header_type"]+h["header_type_modifier"] == code ]
@@ -86,15 +86,15 @@ def normalise_record(record):
# These are probably strings
elif "strip" in dir(record[key]):
record[key] = record[key].strip()
return record
def normalise(records):
for record in records:
normalise_record(record)
return records
def from_file(path, only_records=None, shot_range=None, with_objrefs=False):
records = []
with open(path) as fd:
@@ -102,10 +102,10 @@ def from_file(path, only_records=None, shot_range=None, with_objrefs=False):
line = fd.readline()
while line:
cnt = cnt + 1
if line == "EOF":
break
record = parse_line(line)
if record is not None:
if only_records:
@@ -121,9 +121,9 @@ def from_file(path, only_records=None, shot_range=None, with_objrefs=False):
records.append(record)
line = fd.readline()
return records
def apply_tstamps(recordset, tstamp=None, fix_bad_seconds=False):
#print("tstamp", tstamp, type(tstamp))
if type(tstamp) is int:
@@ -161,16 +161,16 @@ def apply_tstamps(recordset, tstamp=None, fix_bad_seconds=False):
record["tstamp"] = ts
prev[object_id(record)] = doy
break
return recordset
def dms(value):
# 591544.61N
hemisphere = 1 if value[-1] in "NnEe" else -1
seconds = float(value[-6:-1])
minutes = int(value[-8:-6])
degrees = int(value[:-8])
return (degrees + minutes/60 + seconds/3600) * hemisphere
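# e.g. dms("591544.61N") -> 59 + 15/60 + 44.61/3600 ≈ 59.262392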
def tod(record):
@@ -183,7 +183,7 @@ def tod(record):
m = int(time[2:4])
s = float(time[4:])
return d*86400 + h*3600 + m*60 + s
def duration(record0, record1):
ts0 = tod(record0)
ts1 = tod(record1)
@@ -198,10 +198,10 @@ def azimuth(record0, record1):
x0, y0 = float(record0["easting"]), float(record0["northing"])
x1, y1 = float(record1["easting"]), float(record1["northing"])
return math.degrees(math.atan2(x1-x0, y1-y0)) % 360
def speed(record0, record1, knots=False):
scale = 3600/1852 if knots else 1
t0 = tod(record0)
t1 = tod(record1)
return (distance(record0, record1) / math.fabs(t1-t0)) * scale


@@ -95,15 +95,35 @@ run $BINDIR/import_final_p111.py
print_log "Import final P1/90"
run $BINDIR/import_final_p190.py
if [[ -z "$RUNNER_NOEXPORT" ]]; then
print_log "Export system data"
run $BINDIR/system_exports.py
fi
print_log "Import SmartSource data"
run $BINDIR/import_smsrc.py
if [[ -n "$RUNNER_IMPORT" ]]; then
print_log "Import system data"
run $BINDIR/system_imports.py
fi
# if [[ -z "$RUNNER_NOEXPORT" ]]; then
# print_log "Export system data"
# run $BINDIR/system_exports.py
# fi
# if [[ -n "$RUNNER_IMPORT" ]]; then
# print_log "Import system data"
# run $BINDIR/system_imports.py
# fi
# print_log "Export QC data"
# run $BINDIR/human_exports_qc.py
# print_log "Export sequence data"
# run $BINDIR/human_exports_seis.py
print_log "Process ASAQC queue"
# Run insecure in test mode:
# export NODE_TLS_REJECT_UNAUTHORIZED=0
run $DOUGAL_ROOT/lib/www/server/queues/asaqc/index.js
print_log "Run database housekeeping actions"
run $BINDIR/housekeep_database.py
print_log "Run QCs"
run $DOUGAL_ROOT/lib/www/server/lib/qc/index.js
rm "$LOCKFILE"

95
bin/smsrc.py Normal file

@@ -0,0 +1,95 @@
#!/usr/bin/python3
"""
SmartSource parsing functions.
"""
import mmap
import struct
from collections import namedtuple
def _str (v):
return str(v, 'ascii').strip()
def _tstamp (v):
return str(v) # TODO
def _f10 (v):
return float(v)/10
def _ignore (v):
return None
st_smartsource_header = struct.Struct(">6s 4s 30s 10s 2s 1s 17s 1s 1s 2s 2s 2s 2s 2s 4s 6s 5s 5s 6s 4s 88s")
fn_smartsource_header = (
_str, int, _str, int, int, _str, _tstamp, int, int, int, int, int, int, int, int, int,
float, float, float, int, _str
)
SmartsourceHeader = namedtuple("SmartsourceHeader", "header blk_siz line shot mask trg_mode time src_number num_subarray num_guns num_active num_delta num_auto num_nofire spread volume avg_delta std_delta baro_press manifold spare")
st_smartsource_gun = struct.Struct(">1s 2s 1s 1s 1s 1s 1s 6s 6s 4s 4s 4s 4s 4s")
fn_smartsource_gun = (
int, int, int, _str, _str, lambda v: v == b"Y", _str,  # struct yields bytes, so compare with b"Y"
_f10, _f10, _f10, _f10,
int, int, int
)
SmartsourceGun = namedtuple("SmartsourceGun", "string gun source mode detect autofire spare aim_point firetime delay depth pressure volume filltime")
SmartSourceRecord = namedtuple("SmartSourceRecord", "header guns")
def safe_apply (iter):
def safe_fn (fn, v):
try:
return fn(v)
except ValueError:
return None
return [safe_fn(v[0], v[1]) for v in iter]
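# e.g. safe_apply(zip((int, float), (b"12", b"oops"))) -> [12, None]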
def _check_chunk_size(chunk, size):
return len(chunk) == size
def from_file(path):
records = []
with open(path, "rb") as fd:
with mmap.mmap(fd.fileno(), length=0, access=mmap.ACCESS_READ) as buffer:
while True:
offset = buffer.find(b"*SMSRC")
if offset == -1:
break
buffer = buffer[offset:]
record, length = read_smartsource(buffer)
if record is not None:
records.append(record)
if length != 0:
buffer = buffer[length:]
else:
buffer = buffer[1:]
return records
def read_smartsource(buffer):
length = 0
header = st_smartsource_header.unpack_from(buffer, 0)
length += st_smartsource_header.size
header = SmartsourceHeader(*safe_apply(zip(fn_smartsource_header, header)))
record = dict(header._asdict())
record["guns"] = []
for _ in range(header.num_guns):
gun = st_smartsource_gun.unpack_from(buffer, length)
record["guns"].append(SmartsourceGun(*safe_apply(zip(fn_smartsource_gun, gun))))
length += st_smartsource_gun.size
return (record, length)
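# Example usage (illustrative filename):
#   records = from_file("seq0042.gunlog")
#   depths = [gun.depth for gun in records[0]["guns"]]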


@@ -47,4 +47,5 @@ def from_file(path, spec = None):
line = fd.readline()
del spec["normalisers"]
return records

95
bin/system_dump.py Executable file

@@ -0,0 +1,95 @@
#!/usr/bin/python3
"""
Export data that is entered directly into Dougal
as opposed to being read from external sources.
This data will be read back in when the database
is recreated for an existing survey.
Unlike system_exports.py, which exports whole tables
via COPY, this exports a selection of columns from
tables containing both directly entered and imported
data.
"""
import os
from glob import glob
import configuration
import preplots
from datastore import Datastore
locals().update(configuration.vars())
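# Pulls the configuration variables (e.g. VARDIR, used below) into the module namespace.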
exportables = {
"public": {
"projects": [ "meta" ],
"info": None,
"real_time_inputs": None
},
"survey": {
"final_lines": [ "remarks", "meta" ],
"final_shots": [ "meta" ],
"preplot_lines": [ "remarks", "ntba", "meta" ],
"preplot_points": [ "ntba", "meta" ],
"raw_lines": [ "remarks", "meta" ],
"raw_shots": [ "meta" ],
"planned_lines": None
}
}
def primary_key (table, cursor):
# https://wiki.postgresql.org/wiki/Retrieve_primary_key_columns
qry = """
SELECT a.attname, format_type(a.atttypid, a.atttypmod) AS data_type
FROM pg_index i
JOIN pg_attribute a
ON a.attrelid = i.indrelid
AND a.attnum = ANY(i.indkey)
WHERE i.indrelid = %s::regclass
AND i.indisprimary;
"""
cursor.execute(qry, (table,))
return cursor.fetchall()
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
print("Connecting to database")
db = Datastore()
db.connect()
for table in exportables["public"]:
with db.conn.cursor() as cursor:
pk = [ r[0] for r in primary_key(table, cursor) ]
columns = (pk + exportables["public"][table]) if exportables["public"][table] is not None else None
path = os.path.join(VARDIR, "-"+table)
with open(path, "wb") as fd:
print(" →→ ", path, " ←← ", table, columns)
cursor.copy_to(fd, table, columns=columns)
print("Reading surveys")
for survey in surveys:
print(f'Survey: {survey["id"]} ({survey["schema"]})')
db.set_survey(survey["schema"])
with db.conn.cursor() as cursor:
try:
pathPrefix = survey["exports"]["machine"]["path"]
except KeyError:
print("Survey does not define an export path for machine data")
continue
for table in exportables["survey"]:
pk = [ r[0] for r in primary_key(table, cursor) ]
columns = (pk + exportables["survey"][table]) if exportables["survey"][table] is not None else None
path = os.path.join(pathPrefix, "-"+table)
print(" →→ ", path, " ←← ", table, columns)
with open(path, "wb") as fd:
cursor.copy_to(fd, table, columns=columns)
print("Done")


@@ -36,9 +36,9 @@ if __name__ == '__main__':
with db.conn.cursor() as cursor:
try:
pathPrefix = survey["exports"]["path"]
except ValueError:
print("Survey does not define an export path")
pathPrefix = survey["exports"]["machine"]["path"]
except KeyError:
print("Survey does not define an export path for machine data")
continue
for table in exportables:


@@ -9,7 +9,7 @@ import os
from glob import glob
import configuration
import preplots
from datastore import Datastore
from datastore import Datastore, psycopg2
exportables = [
"events_seq",
@@ -31,20 +31,34 @@ if __name__ == '__main__':
print(f'Survey: {survey["id"]} ({survey["schema"]})')
db.set_survey(survey["schema"])
with db.conn.cursor() as cursor:
cursor.execute("SET session_replication_role = replica;")
try:
pathPrefix = survey["exports"]["path"]
except ValueError:
print("Survey does not define an export path")
pathPrefix = survey["exports"]["machine"]["path"]
except KeyError:
print("Survey does not define an export path for machine data")
continue
for table in exportables:
path = os.path.join(pathPrefix, table)
print("", path, "", table)
with open(path, "rb") as fd:
cursor.copy_from(fd, table);
try:
for table in exportables:
path = os.path.join(pathPrefix, table)
if os.path.exists(path):
cursor.execute(f"DELETE FROM {table};")
for table in exportables:
path = os.path.join(pathPrefix, table)
print("", path, "", table)
with open(path, "rb") as fd:
cursor.copy_from(fd, table)
except psycopg2.errors.UniqueViolation:
print("It looks like data for this survey may have already been imported (unique constraint violation)")
# If we don't commit, the data does not actually get copied
db.conn.commit()
cursor.execute("SET session_replication_role = DEFAULT;")
# Update the sequences that generate event ids
cursor.execute("SELECT reset_events_serials();")
# Let us ensure events_timed_seq is up to date, even though
# the triggers will have taken care of this already.
cursor.execute("CALL events_timed_seq_update_all();")
print("Done")

150
bin/system_load.py Executable file

@@ -0,0 +1,150 @@
#!/usr/bin/python3
"""
Re-import Dougal-exported data created by
system_dump.py
The target tables must already be populated with
imported data in order for the import to succeed.
"""
import os
from glob import glob
import configuration
import preplots
from datastore import Datastore, psycopg2
locals().update(configuration.vars())
exportables = {
"public": {
"projects": [ "meta" ],
"info": None,
"real_time_inputs": None
},
"survey": {
"final_lines": [ "remarks", "meta" ],
"final_shots": [ "meta" ],
"preplot_lines": [ "remarks", "ntba", "meta" ],
"preplot_points": [ "ntba", "meta" ],
"raw_lines": [ "remarks", "meta" ],
"raw_shots": [ "meta" ],
"planned_lines": None
}
}
def primary_key (table, cursor):
# https://wiki.postgresql.org/wiki/Retrieve_primary_key_columns
qry = """
SELECT a.attname, format_type(a.atttypid, a.atttypmod) AS data_type
FROM pg_index i
JOIN pg_attribute a
ON a.attrelid = i.indrelid
AND a.attnum = ANY(i.indkey)
WHERE i.indrelid = %s::regclass
AND i.indisprimary;
"""
cursor.execute(qry, (table,))
return cursor.fetchall()
def import_table(fd, table, columns, cursor):
pk = [ r[0] for r in primary_key(table, cursor) ]
# Create temporary table to import into
temptable = "import_"+table
print("Creating temporary table", temptable)
qry = f"""
CREATE TEMPORARY TABLE {temptable}
ON COMMIT DROP
AS SELECT {', '.join(pk + columns)} FROM {table}
WITH NO DATA;
"""
#print(qry)
cursor.execute(qry)
# Import into the temp table
print("Import data into temporary table")
cursor.copy_from(fd, temptable)
# Update the destination table
print("Updating destination table")
setcols = ", ".join([ f"{c} = t.{c}" for c in columns ])
wherecols = " AND ".join([ f"{table}.{c} = t.{c}" for c in pk ])
qry = f"""
UPDATE {table}
SET {setcols}
FROM {temptable} t
WHERE {wherecols};
"""
#print(qry)
cursor.execute(qry)
if __name__ == '__main__':
print("Reading configuration")
surveys = configuration.surveys()
print("Connecting to database")
db = Datastore()
db.connect()
for table in exportables["public"]:
with db.conn.cursor() as cursor:
columns = exportables["public"][table]
path = os.path.join(VARDIR, "-"+table)
try:
with open(path, "rb") as fd:
print(" →→ ", path, " ←← ", table, columns)
if columns is not None:
import_table(fd, table, columns, cursor)
else:
try:
print(f"Copying from {path} into {table}")
cursor.copy_from(fd, table)
except psycopg2.errors.UniqueViolation:
print(f"It looks like table {table} may have already been imported. Skipping it.")
except FileNotFoundError:
print(f"File not found. Skipping {path}")
db.conn.commit()
print("Reading surveys")
for survey in surveys:
print(f'Survey: {survey["id"]} ({survey["schema"]})')
db.set_survey(survey["schema"])
with db.conn.cursor() as cursor:
try:
pathPrefix = survey["exports"]["machine"]["path"]
except KeyError:
print("Survey does not define an export path for machine data")
continue
for table in exportables["survey"]:
columns = exportables["survey"][table]
path = os.path.join(pathPrefix, "-"+table)
print(" ←← ", path, " →→ ", table, columns)
try:
with open(path, "rb") as fd:
if columns is not None:
import_table(fd, table, columns, cursor)
else:
try:
print(f"Copying from {path} into {table}")
cursor.copy_from(fd, table)
except psycopg2.errors.UniqueViolation:
print(f"It looks like table {table} may have already been imported. Skipping it.")
except FileNotFoundError:
print(f"File not found. Skipping {path}")
# If we don't commit, the data does not actually get copied
db.conn.commit()
print("Done")


@@ -21,4 +21,26 @@ navigation:
# Anything here gets passed as options to the packet
# saving routine.
epsg: 23031 # Assume this CRS for unqualified E/N data
# Heuristics to apply to detect survey when offline
offline_survey_heuristics: "nearest_preplot"
# Apply the heuristics at most once every…
offline_survey_detect_interval: 10000 # ms
imports:
# For a file to be imported, it must have been last modified at
# least this many seconds ago.
file_min_age: 60
queues:
asaqc:
request:
url: "https://api.gateway.equinor.com/vt/v1/api/upload-file-encoded"
args:
method: POST
headers:
Content-Type: application/json
httpsAgent: # The paths here are relative to $DOUGAL_ROOT
cert: etc/ssl/asaqc.crt
key: etc/ssl/asaqc.key


@@ -19,3 +19,124 @@ Created with:
```bash
SCHEMA_NAME=survey_X EPSG_CODE=XXXXX $DOUGAL_ROOT/sbin/dump_schema.sh
```
## To create a new Dougal database
Ensure that the following packages are installed:
* `postgresql*-postgis-utils`
* `postgresql*-postgis`
* `postgresql*-contrib` # For B-trees
```bash
psql -U postgres <./database-template.sql
psql -U postgres <./database-version.sql
```
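After creation, the resulting schema version can be checked with the same query the upgrade scripts rely on (see `etc/db/upgrades/README.md`):
```sql
select value->>'db_schema' as db_schema from public.info where key = 'version';
```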
---
# Upgrading PostgreSQL
The following is based on https://en.opensuse.org/SDB:PostgreSQL#Upgrading_major_PostgreSQL_version
```bash
# The following bash code should be checked and executed
# line by line whenever you do an upgrade. The example
# shows the upgrade process from an original installation
# of version 12 up to version 14.
# install the new server as well as the required postgresql-contrib packages:
zypper in postgresql14-server postgresql14-contrib postgresql12-contrib
# If not yet done, create a new PostgreSQL configuration directory...
mkdir /etc/postgresql
# and copy the original files to this global directory
cd /srv/pgsql/data
for i in pg_hba.conf pg_ident.conf postgresql.conf postgresql.auto.conf ; do cp -a $i /etc/postgresql/$i ; done
# Now create a new data-directory and initialize it for usage with the new server
install -d -m 0700 -o postgres -g postgres /srv/pgsql/data14
cd /srv/pgsql/data14
sudo -u postgres /usr/lib/postgresql14/bin/initdb .
# Replace the newly generated files with symlinks to the global files.
# After doing so, you may compare the backup copies created by `old`
# with the files from the former installation.
for i in pg_hba.conf pg_ident.conf postgresql.conf postgresql.auto.conf ; do old $i ; ln -s /etc/postgresql/$i .; done
# Copy over special thesaurus files, if any exist.
#cp -a /usr/share/postgresql12/tsearch_data/my_thesaurus_german.ths /usr/share/postgresql14/tsearch_data/
# Now it's time to stop the service...
systemctl stop postgresql.service
# And to start the migration. Please ensure the directories match your upgrade path
sudo -u postgres /usr/lib/postgresql14/bin/pg_upgrade --link \
--old-bindir="/usr/lib/postgresql12/bin" \
--new-bindir="/usr/lib/postgresql14/bin" \
--old-datadir="/srv/pgsql/data/" \
--new-datadir="/srv/pgsql/data14/"
# NOTE: If getting the following error:
# lc_collate values for database "postgres" do not match: old "en_US.UTF-8", new "C"
# then:
# cd ..
# rm -rf /srv/pgsql/data14
# install -d -m 0700 -o postgres -g postgres /srv/pgsql/data14
# cd /srv/pgsql/data14
# sudo -u postgres /usr/lib/postgresql14/bin/initdb --locale=en_US.UTF-8 .
#
# and repeat the migration command
# After successfully migrating the data...
cd ..
# If not already symlinked, move the old data to a versioned directory matching
# your old installation...
mv data data12
# and set a symlink to the new data directory
ln -sf data14/ data
# Now start the new service
systemctl start postgresql.service
# If everything has been successful, you should uninstall the old packages...
#zypper rm -u postgresql12 postgresql13
# and remove old data directories
#rm -rf /srv/pgsql/data_OLD_POSTGRES_VERSION_NUMBER
# For good measure:
sudo -u postgres /usr/lib/postgresql14/bin/vacuumdb --all --analyze-in-stages
# If update_extensions.sql exists, apply it.
```
# Restoring from backup
## Whole database
Ensure that nothing is connected to the database.
```bash
psql -U postgres --dbname postgres <<EOF
-- Database: dougal
DROP DATABASE IF EXISTS dougal;
CREATE DATABASE dougal
WITH
OWNER = postgres
ENCODING = 'UTF8'
LC_COLLATE = 'en_GB.UTF-8'
LC_CTYPE = 'en_GB.UTF-8'
TABLESPACE = pg_default
CONNECTION LIMIT = -1;
ALTER DATABASE dougal
SET search_path TO "$user", public, topology;
EOF
# Adjust --jobs according to host machine
pg_restore -U postgres --dbname dougal --clean --if-exists --jobs 32 /path/to/backup
```


@@ -2,8 +2,8 @@
-- PostgreSQL database dump
--
-- Dumped from database version 12.4
-- Dumped by pg_dump version 12.4
-- Dumped from database version 14.2
-- Dumped by pg_dump version 14.2
SET statement_timeout = 0;
SET lock_timeout = 0;
@@ -20,7 +20,7 @@ SET row_security = off;
-- Name: dougal; Type: DATABASE; Schema: -; Owner: postgres
--
CREATE DATABASE dougal WITH TEMPLATE = template0 ENCODING = 'UTF8' LC_COLLATE = 'C' LC_CTYPE = 'en_GB.UTF-8';
CREATE DATABASE dougal WITH TEMPLATE = template0 ENCODING = 'UTF8' LOCALE = 'en_GB.UTF-8';
ALTER DATABASE dougal OWNER TO postgres;
@@ -102,20 +102,6 @@ CREATE EXTENSION IF NOT EXISTS postgis WITH SCHEMA public;
COMMENT ON EXTENSION postgis IS 'PostGIS geometry, geography, and raster spatial types and functions';
--
-- Name: postgis_raster; Type: EXTENSION; Schema: -; Owner: -
--
CREATE EXTENSION IF NOT EXISTS postgis_raster WITH SCHEMA public;
--
-- Name: EXTENSION postgis_raster; Type: COMMENT; Schema: -; Owner:
--
COMMENT ON EXTENSION postgis_raster IS 'PostGIS raster types and functions';
--
-- Name: postgis_sfcgal; Type: EXTENSION; Schema: -; Owner: -
--
@@ -144,6 +130,48 @@ CREATE EXTENSION IF NOT EXISTS postgis_topology WITH SCHEMA topology;
COMMENT ON EXTENSION postgis_topology IS 'PostGIS topology spatial types and functions';
--
-- Name: queue_item_status; Type: TYPE; Schema: public; Owner: postgres
--
CREATE TYPE public.queue_item_status AS ENUM (
'queued',
'cancelled',
'failed',
'sent'
);
ALTER TYPE public.queue_item_status OWNER TO postgres;
--
-- Name: geometry_from_tstamp(timestamp with time zone, numeric); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.geometry_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT geometry public.geometry, OUT delta numeric) RETURNS record
LANGUAGE sql
AS $$
SELECT
geometry,
extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ) AS delta
FROM real_time_inputs
WHERE
geometry IS NOT NULL AND
abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts )) < tolerance
ORDER BY abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ))
LIMIT 1;
$$;
ALTER FUNCTION public.geometry_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT geometry public.geometry, OUT delta numeric) OWNER TO postgres;
--
-- Name: FUNCTION geometry_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT geometry public.geometry, OUT delta numeric); Type: COMMENT; Schema: public; Owner: postgres
--
COMMENT ON FUNCTION public.geometry_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT geometry public.geometry, OUT delta numeric) IS 'Get geometry from timestamp';
--
-- Name: notify(); Type: FUNCTION; Schema: public; Owner: postgres
--
@@ -154,14 +182,19 @@ CREATE FUNCTION public.notify() RETURNS trigger
DECLARE
channel text := TG_ARGV[0];
payload text;
pid text;
BEGIN
SELECT projects.pid INTO pid FROM projects WHERE schema = TG_TABLE_SCHEMA;
payload := json_build_object(
'tstamp', CURRENT_TIMESTAMP,
'operation', TG_OP,
'schema', TG_TABLE_SCHEMA,
'table', TG_TABLE_NAME,
'old', row_to_json(OLD),
'new', row_to_json(NEW)
'new', row_to_json(NEW),
'pid', pid
)::text;
IF octet_length(payload) < 8000 THEN
@@ -177,23 +210,110 @@ $$;
ALTER FUNCTION public.notify() OWNER TO postgres;
--
-- Name: sequence_shot_from_tstamp(timestamp with time zone); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, OUT sequence numeric, OUT point numeric, OUT delta numeric) RETURNS record
LANGUAGE sql
AS $$
SELECT * FROM public.sequence_shot_from_tstamp(ts, 3);
$$;
ALTER FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, OUT sequence numeric, OUT point numeric, OUT delta numeric) OWNER TO postgres;
--
-- Name: FUNCTION sequence_shot_from_tstamp(ts timestamp with time zone, OUT sequence numeric, OUT point numeric, OUT delta numeric); Type: COMMENT; Schema: public; Owner: postgres
--
COMMENT ON FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, OUT sequence numeric, OUT point numeric, OUT delta numeric) IS 'Get sequence and shotpoint from timestamp.
Overloaded form in which the tolerance value is implied and defaults to three seconds.';
--
-- Name: sequence_shot_from_tstamp(timestamp with time zone, numeric); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT sequence numeric, OUT point numeric, OUT delta numeric) RETURNS record
LANGUAGE sql
AS $$
SELECT
(meta->>'_sequence')::numeric AS sequence,
(meta->>'_point')::numeric AS point,
extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ) AS delta
FROM real_time_inputs
WHERE
meta ? '_sequence' AND
abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts )) < tolerance
ORDER BY abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ))
LIMIT 1;
$$;
ALTER FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT sequence numeric, OUT point numeric, OUT delta numeric) OWNER TO postgres;
--
-- Name: FUNCTION sequence_shot_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT sequence numeric, OUT point numeric, OUT delta numeric); Type: COMMENT; Schema: public; Owner: postgres
--
COMMENT ON FUNCTION public.sequence_shot_from_tstamp(ts timestamp with time zone, tolerance numeric, OUT sequence numeric, OUT point numeric, OUT delta numeric) IS 'Get sequence and shotpoint from timestamp.
Given a timestamp this function returns the closest shot to it within the given tolerance value.
This uses the `real_time_inputs` table and it does not give an indication of which project the shotpoint belongs to. It is assumed that a single project is being acquired at a given time.';
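-- Example (illustrative timestamp): find the closest shot within 3 seconds:
-- SELECT * FROM public.sequence_shot_from_tstamp('2022-05-01 12:00:00+00'::timestamptz, 3);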
--
-- Name: set_survey(text); Type: PROCEDURE; Schema: public; Owner: postgres
--
CREATE PROCEDURE public.set_survey(project_id text)
CREATE PROCEDURE public.set_survey(IN project_id text)
LANGUAGE sql
AS $$
SELECT set_config('search_path', (SELECT schema||',public' FROM public.projects WHERE pid = project_id), false);
SELECT set_config('search_path', (SELECT schema||',public' FROM public.projects WHERE pid = lower(project_id)), false);
$$;
ALTER PROCEDURE public.set_survey(project_id text) OWNER TO postgres;
ALTER PROCEDURE public.set_survey(IN project_id text) OWNER TO postgres;
--
-- Name: update_timestamp(); Type: FUNCTION; Schema: public; Owner: postgres
--
CREATE FUNCTION public.update_timestamp() RETURNS trigger
LANGUAGE plpgsql
AS $$
BEGIN
IF NEW.updated_on IS NOT NULL THEN
NEW.updated_on := current_timestamp;
END IF;
RETURN NEW;
EXCEPTION
WHEN undefined_column THEN RETURN NEW;
END;
$$;
ALTER FUNCTION public.update_timestamp() OWNER TO postgres;
SET default_tablespace = '';
SET default_table_access_method = heap;
--
-- Name: info; Type: TABLE; Schema: public; Owner: postgres
--
CREATE TABLE public.info (
key text NOT NULL,
value jsonb
);
ALTER TABLE public.info OWNER TO postgres;
--
-- Name: projects; Type: TABLE; Schema: public; Owner: postgres
--
@@ -208,6 +328,46 @@ CREATE TABLE public.projects (
ALTER TABLE public.projects OWNER TO postgres;
--
-- Name: queue_items; Type: TABLE; Schema: public; Owner: postgres
--
CREATE TABLE public.queue_items (
item_id integer NOT NULL,
status public.queue_item_status DEFAULT 'queued'::public.queue_item_status NOT NULL,
payload jsonb NOT NULL,
results jsonb DEFAULT '{}'::jsonb NOT NULL,
created_on timestamp with time zone DEFAULT CURRENT_TIMESTAMP NOT NULL,
updated_on timestamp with time zone DEFAULT CURRENT_TIMESTAMP NOT NULL,
not_before timestamp with time zone DEFAULT '1970-01-01 00:00:00+00'::timestamp with time zone NOT NULL,
parent_id integer
);
ALTER TABLE public.queue_items OWNER TO postgres;
--
-- Name: queue_items_item_id_seq; Type: SEQUENCE; Schema: public; Owner: postgres
--
CREATE SEQUENCE public.queue_items_item_id_seq
AS integer
START WITH 1
INCREMENT BY 1
NO MINVALUE
NO MAXVALUE
CACHE 1;
ALTER TABLE public.queue_items_item_id_seq OWNER TO postgres;
--
-- Name: queue_items_item_id_seq; Type: SEQUENCE OWNED BY; Schema: public; Owner: postgres
--
ALTER SEQUENCE public.queue_items_item_id_seq OWNED BY public.queue_items.item_id;
--
-- Name: real_time_inputs; Type: TABLE; Schema: public; Owner: postgres
--
@@ -221,6 +381,21 @@ CREATE TABLE public.real_time_inputs (
ALTER TABLE public.real_time_inputs OWNER TO postgres;
--
-- Name: queue_items item_id; Type: DEFAULT; Schema: public; Owner: postgres
--
ALTER TABLE ONLY public.queue_items ALTER COLUMN item_id SET DEFAULT nextval('public.queue_items_item_id_seq'::regclass);
--
-- Name: info info_pkey; Type: CONSTRAINT; Schema: public; Owner: postgres
--
ALTER TABLE ONLY public.info
ADD CONSTRAINT info_pkey PRIMARY KEY (key);
--
-- Name: projects projects_name_key; Type: CONSTRAINT; Schema: public; Owner: postgres
--
@@ -245,6 +420,21 @@ ALTER TABLE ONLY public.projects
ADD CONSTRAINT projects_schema_key UNIQUE (schema);
--
-- Name: queue_items queue_items_pkey; Type: CONSTRAINT; Schema: public; Owner: postgres
--
ALTER TABLE ONLY public.queue_items
ADD CONSTRAINT queue_items_pkey PRIMARY KEY (item_id);
--
-- Name: meta_tstamp_idx; Type: INDEX; Schema: public; Owner: postgres
--
CREATE INDEX meta_tstamp_idx ON public.real_time_inputs USING btree (((meta ->> 'tstamp'::text)) DESC);
--
-- Name: tstamp_idx; Type: INDEX; Schema: public; Owner: postgres
--
@@ -252,6 +442,13 @@ ALTER TABLE ONLY public.projects
CREATE INDEX tstamp_idx ON public.real_time_inputs USING btree (tstamp DESC);
--
-- Name: info info_tg; Type: TRIGGER; Schema: public; Owner: postgres
--
CREATE TRIGGER info_tg AFTER INSERT OR DELETE OR UPDATE ON public.info FOR EACH ROW EXECUTE FUNCTION public.notify('info');
--
-- Name: projects projects_tg; Type: TRIGGER; Schema: public; Owner: postgres
--
@@ -259,6 +456,20 @@ CREATE INDEX tstamp_idx ON public.real_time_inputs USING btree (tstamp DESC);
CREATE TRIGGER projects_tg AFTER INSERT OR DELETE OR UPDATE ON public.projects FOR EACH ROW EXECUTE FUNCTION public.notify('project');
--
-- Name: queue_items queue_items_tg0; Type: TRIGGER; Schema: public; Owner: postgres
--
CREATE TRIGGER queue_items_tg0 BEFORE INSERT OR UPDATE ON public.queue_items FOR EACH ROW EXECUTE FUNCTION public.update_timestamp();
--
-- Name: queue_items queue_items_tg1; Type: TRIGGER; Schema: public; Owner: postgres
--
CREATE TRIGGER queue_items_tg1 AFTER INSERT OR DELETE OR UPDATE ON public.queue_items FOR EACH ROW EXECUTE FUNCTION public.notify('queue_items');
--
-- Name: real_time_inputs real_time_inputs_tg; Type: TRIGGER; Schema: public; Owner: postgres
--
@@ -266,6 +477,14 @@ CREATE TRIGGER projects_tg AFTER INSERT OR DELETE OR UPDATE ON public.projects F
CREATE TRIGGER real_time_inputs_tg AFTER INSERT ON public.real_time_inputs FOR EACH ROW EXECUTE FUNCTION public.notify('realtime');
--
-- Name: queue_items queue_items_parent_id_fkey; Type: FK CONSTRAINT; Schema: public; Owner: postgres
--
ALTER TABLE ONLY public.queue_items
ADD CONSTRAINT queue_items_parent_id_fkey FOREIGN KEY (parent_id) REFERENCES public.queue_items(item_id);
--
-- PostgreSQL database dump complete
--


@@ -0,0 +1,3 @@
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.5"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.5"}' WHERE public.info.key = 'version';

File diff suppressed because it is too large

34
etc/db/upgrades/README.md Normal file

@@ -0,0 +1,34 @@
# Database schema upgrades
When the database schema needs to be upgraded in order to provide new functionality, fix errors, etc., an upgrade script should be added to this directory.
The script can be SQL (preferred) or anything else (Bash, Python, …) in the event of complex upgrades.
The script itself should:
* document what the intended changes are;
* contain instructions on how to run it;
* make the user aware of any non-obvious side effects; and
* say whether it is safe to run the script multiple times on the same schema / database.
## Naming
Script files should be named `upgrade-<index>-<commit-id-old>-<commit-id-new>-v<schema-version>.sql`, where:
* `<index>` is a sequential two-digit index. When it reaches 99, existing files will be renamed to three-digit indices (001-099) and new files will use three digits.
* `<commit-id-old>` is the ID of the Git commit that last introduced a schema change.
* `<commit-id-new>` is the ID of the first Git commit expecting the updated schema.
* `<schema-version>` is the version of the schema.
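For example (illustrative index and version), a file named `upgrade-06-0983abac-81d9ea19-v0.3.5.sql` would take a schema from commit 0983abac to commit 81d9ea19.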
Note: the `<schema-version>` value should be updated with every change and it should be the same as reported by:
```sql
select value->>'db_schema' as db_schema from public.info where key = 'version';
```
If necessary, the expected schema version must also be updated in `package.json`.
## Running
Schema upgrades are always run manually.


@@ -0,0 +1,22 @@
-- Upgrade the database from commit 78adb2be to 7917eeeb.
--
-- This upgrade affects the `public` schema only.
--
-- It creates a new table, `info`, for storing arbitrary JSON
-- data not belonging to a specific project. Currently used
-- for the equipment list, it could also serve to store user
-- details, configuration settings, system state, etc.
--
-- To apply, run as the dougal user:
--
-- psql < $THIS_FILE
--
-- NOTE: It will fail harmlessly if applied twice.
CREATE TABLE IF NOT EXISTS public.info (
key text NOT NULL primary key,
value jsonb
);
CREATE TRIGGER info_tg AFTER INSERT OR DELETE OR UPDATE ON public.info FOR EACH ROW EXECUTE FUNCTION public.notify('info');


@@ -0,0 +1,160 @@
-- Upgrade the database from commit 6e7ba82e to 53f71f70.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This merges two changes to the database.
-- The first one (commit 5de64e6b) modifies the `event` view to return
-- the `meta` column of timed and sequence events.
-- The second one (commit 53f71f70) adds a primary key constraint to
-- events_seq_labels (there is already an equivalent constraint on
-- events_seq_timed).
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It will fail harmlessly if applied twice.
BEGIN;
DROP VIEW events_seq_timed CASCADE; -- Brings down events too
ALTER TABLE ONLY events_seq_labels
ADD CONSTRAINT events_seq_labels_pkey PRIMARY KEY (id, label);
CREATE OR REPLACE VIEW events_seq_timed AS
SELECT s.sequence,
s.point,
s.id,
s.remarks,
rs.line,
rs.objref,
rs.tstamp,
rs.hash,
s.meta,
rs.geometry
FROM (events_seq s
LEFT JOIN raw_shots rs USING (sequence, point));
CREATE OR REPLACE VIEW events AS
WITH qc AS (
SELECT rs.sequence,
rs.point,
ARRAY[jsonb_array_elements_text(q.labels)] AS labels
FROM raw_shots rs,
LATERAL jsonb_path_query(rs.meta, '$."qc".*."labels"'::jsonpath) q(labels)
)
SELECT 'sequence'::text AS type,
false AS virtual,
s.sequence,
s.point,
s.id,
s.remarks,
s.line,
s.objref,
s.tstamp,
s.hash,
s.meta,
(public.st_asgeojson(public.st_transform(s.geometry, 4326)))::jsonb AS geometry,
ARRAY( SELECT esl.label
FROM events_seq_labels esl
WHERE (esl.id = s.id)) AS labels
FROM events_seq_timed s
UNION
SELECT 'timed'::text AS type,
false AS virtual,
rs.sequence,
rs.point,
t.id,
t.remarks,
rs.line,
rs.objref,
t.tstamp,
rs.hash,
t.meta,
(t.meta -> 'geometry'::text) AS geometry,
ARRAY( SELECT etl.label
FROM events_timed_labels etl
WHERE (etl.id = t.id)) AS labels
FROM ((events_timed t
LEFT JOIN events_timed_seq ts USING (id))
LEFT JOIN raw_shots rs USING (sequence, point))
UNION
SELECT 'midnight shot'::text AS type,
true AS virtual,
v1.sequence,
v1.point,
((v1.sequence * 100000) + v1.point) AS id,
''::text AS remarks,
v1.line,
v1.objref,
v1.tstamp,
v1.hash,
'{}'::jsonb meta,
(public.st_asgeojson(public.st_transform(v1.geometry, 4326)))::jsonb AS geometry,
ARRAY[v1.label] AS labels
FROM events_midnight_shot v1
UNION
SELECT 'qc'::text AS type,
true AS virtual,
rs.sequence,
rs.point,
((10000000 + (rs.sequence * 100000)) + rs.point) AS id,
(q.remarks)::text AS remarks,
rs.line,
rs.objref,
rs.tstamp,
rs.hash,
'{}'::jsonb meta,
(public.st_asgeojson(public.st_transform(rs.geometry, 4326)))::jsonb AS geometry,
('{QC}'::text[] || qc.labels) AS labels
FROM (raw_shots rs
LEFT JOIN qc USING (sequence, point)),
LATERAL jsonb_path_query(rs.meta, '$."qc".*."results"'::jsonpath) q(remarks)
WHERE (rs.meta ? 'qc'::text);
CREATE OR REPLACE VIEW final_lines_summary AS
WITH summary AS (
SELECT DISTINCT fs.sequence,
first_value(fs.point) OVER w AS fsp,
last_value(fs.point) OVER w AS lsp,
first_value(fs.tstamp) OVER w AS ts0,
last_value(fs.tstamp) OVER w AS ts1,
count(fs.point) OVER w AS num_points,
public.st_distance(first_value(fs.geometry) OVER w, last_value(fs.geometry) OVER w) AS length,
((public.st_azimuth(first_value(fs.geometry) OVER w, last_value(fs.geometry) OVER w) * (180)::double precision) / pi()) AS azimuth
FROM final_shots fs
WINDOW w AS (PARTITION BY fs.sequence ORDER BY fs.tstamp ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)
)
SELECT fl.sequence,
fl.line,
s.fsp,
s.lsp,
s.ts0,
s.ts1,
(s.ts1 - s.ts0) AS duration,
s.num_points,
(( SELECT count(*) AS count
FROM preplot_points
WHERE ((preplot_points.line = fl.line) AND (((preplot_points.point >= s.fsp) AND (preplot_points.point <= s.lsp)) OR ((preplot_points.point >= s.lsp) AND (preplot_points.point <= s.fsp))))) - s.num_points) AS missing_shots,
s.length,
s.azimuth,
fl.remarks,
fl.meta
FROM (summary s
JOIN final_lines fl USING (sequence));
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,171 @@
-- Upgrade the database from commit 53f71f70 to 4d977848.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This adds:
--
-- * label_in_sequence (_sequence integer, _label text):
-- Returns events containing the specified label.
--
-- * handle_final_line_events (_seq integer, _label text, _column text):
-- - If _label does not exist in the events for sequence _seq:
-- it adds a new _label label at the shotpoint obtained from
-- final_lines_summary[_column].
-- - If _label does exist (and hasn't been auto-added by this function
-- in a previous run), it will add information about it to the final
-- line's metadata.
--
-- * final_line_post_import (_seq integer):
-- Calls handle_final_line_events() on the given sequence to check
-- for FSP, FGSP, LGSP and LSP labels.
--
-- * events_seq_labels_single ():
-- Trigger function to ensure that labels that have the attribute
-- `model.multiple` set to `false` occur at most only once per
-- sequence. If a new instance is added to a sequence, the previous
-- instance is deleted.
--
-- * Trigger on events_seq_labels that calls events_seq_labels_single().
--
-- * Trigger on events_timed_labels that calls events_seq_labels_single().
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It will fail harmlessly if applied twice.
BEGIN;
CREATE OR REPLACE FUNCTION label_in_sequence (_sequence integer, _label text)
RETURNS events
LANGUAGE sql
AS $$
SELECT * FROM events WHERE sequence = _sequence AND _label = ANY(labels);
$$;
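-- Example (illustrative sequence number):
-- SELECT * FROM label_in_sequence(2045, 'FSP');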
CREATE OR REPLACE PROCEDURE handle_final_line_events (_seq integer, _label text, _column text)
LANGUAGE plpgsql
AS $$
DECLARE
_line final_lines_summary%ROWTYPE;
_column_value integer;
_tg_name text := 'final_line';
_event events%ROWTYPE;
event_id integer;
BEGIN
SELECT * INTO _line FROM final_lines_summary WHERE sequence = _seq;
_event := label_in_sequence(_seq, _label);
_column_value := row_to_json(_line)->>_column;
--RAISE NOTICE '% is %', _label, _event;
--RAISE NOTICE 'Line is %', _line;
--RAISE NOTICE '% is % (%)', _column, _column_value, _label;
IF _event IS NULL THEN
--RAISE NOTICE 'We will populate the event log from the sequence data';
SELECT id INTO event_id FROM events_seq WHERE sequence = _seq AND point = _column_value ORDER BY id LIMIT 1;
IF event_id IS NULL THEN
--RAISE NOTICE '… but there is no existing event so we create a new one for sequence % and point %', _line.sequence, _column_value;
INSERT INTO events_seq (sequence, point, remarks)
VALUES (_line.sequence, _column_value, format('%s %s', _label, (SELECT meta->>'lineName' FROM final_lines WHERE sequence = _seq)))
RETURNING id INTO event_id;
--RAISE NOTICE 'Created event_id %', event_id;
END IF;
--RAISE NOTICE 'Remove any other auto-inserted % labels in sequence %', _label, _seq;
DELETE FROM events_seq_labels
WHERE label = _label AND id = (SELECT id FROM events_seq WHERE sequence = _seq AND meta->'auto' ? _label);
--RAISE NOTICE 'We now add a label to the event (id, label) = (%, %)', event_id, _label;
INSERT INTO events_seq_labels (id, label) VALUES (event_id, _label) ON CONFLICT ON CONSTRAINT events_seq_labels_pkey DO NOTHING;
--RAISE NOTICE 'And also clear the %: % flag from meta.auto for any existing events for sequence %', _label, _tg_name, _seq;
UPDATE events_seq
SET meta = meta #- ARRAY['auto', _label]
WHERE meta->'auto' ? _label AND sequence = _seq AND id <> event_id;
--RAISE NOTICE 'Finally, flag the event as having been had label % auto-created by %', _label, _tg_name;
UPDATE events_seq
SET meta = jsonb_set(jsonb_set(meta, '{auto}', COALESCE(meta->'auto', '{}')), ARRAY['auto', _label], to_jsonb(_tg_name))
WHERE id = event_id;
ELSE
--RAISE NOTICE 'We may populate the sequence meta from the event log';
--RAISE NOTICE 'Unless the event log was populated by us previously';
--RAISE NOTICE 'Populated by us previously? %', _event.meta->'auto'->>_label = _tg_name;
IF _event.meta->'auto'->>_label IS DISTINCT FROM _tg_name THEN
--RAISE NOTICE 'Adding % found in events log to final_line meta', _label;
UPDATE final_lines
SET meta = jsonb_set(meta, ARRAY[_label], to_jsonb(_event.point))
WHERE sequence = _seq;
--RAISE NOTICE 'Clearing the %: % flag from meta.auto for any existing events in sequence %', _label, _tg_name, _seq;
UPDATE events_seq
SET meta = meta #- ARRAY['auto', _label]
WHERE sequence = _seq AND meta->'auto'->>_label = _tg_name;
END IF;
END IF;
END;
$$;
CREATE OR REPLACE PROCEDURE final_line_post_import (_seq integer)
LANGUAGE plpgsql
AS $$
BEGIN
CALL handle_final_line_events(_seq, 'FSP', 'fsp');
CALL handle_final_line_events(_seq, 'FGSP', 'fsp');
CALL handle_final_line_events(_seq, 'LGSP', 'lsp');
CALL handle_final_line_events(_seq, 'LSP', 'lsp');
END;
$$;
CREATE OR REPLACE FUNCTION events_seq_labels_single ()
RETURNS trigger
LANGUAGE plpgsql
AS $$
DECLARE _sequence integer;
BEGIN
IF EXISTS(SELECT 1 FROM labels WHERE name = NEW.label AND (data->'model'->'multiple')::boolean IS FALSE) THEN
SELECT sequence INTO _sequence FROM events WHERE id = NEW.id;
DELETE
FROM events_seq_labels
WHERE
id <> NEW.id
AND label = NEW.label
AND id IN (SELECT id FROM events_seq WHERE sequence = _sequence);
DELETE
FROM events_timed_labels
WHERE
id <> NEW.id
AND label = NEW.label
AND id IN (SELECT id FROM events_timed_seq WHERE sequence = _sequence);
END IF;
RETURN NULL;
END;
$$;
CREATE TRIGGER events_seq_labels_single_tg AFTER INSERT OR UPDATE ON events_seq_labels FOR EACH ROW EXECUTE FUNCTION events_seq_labels_single();
CREATE TRIGGER events_seq_labels_single_tg AFTER INSERT OR UPDATE ON events_timed_labels FOR EACH ROW EXECUTE FUNCTION events_seq_labels_single();
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,94 @@
-- Upgrade the database from commit 4d977848 to 3d70a460.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This adds the `meta` column to the output of the following views:
--
-- * raw_lines_summary; and
-- * sequences_summary
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
BEGIN;
CREATE OR REPLACE VIEW raw_lines_summary AS
WITH summary AS (
SELECT DISTINCT rs.sequence,
first_value(rs.point) OVER w AS fsp,
last_value(rs.point) OVER w AS lsp,
first_value(rs.tstamp) OVER w AS ts0,
last_value(rs.tstamp) OVER w AS ts1,
count(rs.point) OVER w AS num_points,
count(pp.point) OVER w AS num_preplots,
public.st_distance(first_value(rs.geometry) OVER w, last_value(rs.geometry) OVER w) AS length,
((public.st_azimuth(first_value(rs.geometry) OVER w, last_value(rs.geometry) OVER w) * (180)::double precision) / pi()) AS azimuth
FROM (raw_shots rs
LEFT JOIN preplot_points pp USING (line, point))
WINDOW w AS (PARTITION BY rs.sequence ORDER BY rs.tstamp ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)
)
SELECT rl.sequence,
rl.line,
s.fsp,
s.lsp,
s.ts0,
s.ts1,
(s.ts1 - s.ts0) AS duration,
s.num_points,
s.num_preplots,
(( SELECT count(*) AS count
FROM preplot_points
WHERE ((preplot_points.line = rl.line) AND (((preplot_points.point >= s.fsp) AND (preplot_points.point <= s.lsp)) OR ((preplot_points.point >= s.lsp) AND (preplot_points.point <= s.fsp))))) - s.num_preplots) AS missing_shots,
s.length,
s.azimuth,
rl.remarks,
rl.ntbp,
rl.meta
FROM (summary s
JOIN raw_lines rl USING (sequence));
DROP VIEW sequences_summary;
CREATE OR REPLACE VIEW sequences_summary AS
SELECT rls.sequence,
rls.line,
rls.fsp,
rls.lsp,
fls.fsp AS fsp_final,
fls.lsp AS lsp_final,
rls.ts0,
rls.ts1,
fls.ts0 AS ts0_final,
fls.ts1 AS ts1_final,
rls.duration,
fls.duration AS duration_final,
rls.num_preplots,
COALESCE(fls.num_points, rls.num_points) AS num_points,
COALESCE(fls.missing_shots, rls.missing_shots) AS missing_shots,
COALESCE(fls.length, rls.length) AS length,
COALESCE(fls.azimuth, rls.azimuth) AS azimuth,
rls.remarks,
fls.remarks AS remarks_final,
rls.meta,
fls.meta AS meta_final,
CASE
WHEN (rls.ntbp IS TRUE) THEN 'ntbp'::text
WHEN (fls.sequence IS NULL) THEN 'raw'::text
ELSE 'final'::text
END AS status
FROM (raw_lines_summary rls
LEFT JOIN final_lines_summary fls USING (sequence));
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,33 @@
-- Upgrade the database from commit 3d70a460 to 0983abac.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This:
--
-- * makes the primary key on planned_lines deferrable; and
-- * changes the planned_lines trigger from statement to row.
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
BEGIN;
ALTER TABLE planned_lines DROP CONSTRAINT planned_lines_pkey;
ALTER TABLE planned_lines ADD CONSTRAINT planned_lines_pkey PRIMARY KEY (sequence) DEFERRABLE;
DROP TRIGGER planned_lines_tg ON planned_lines;
CREATE TRIGGER planned_lines_tg AFTER INSERT OR DELETE OR UPDATE ON planned_lines FOR EACH ROW EXECUTE FUNCTION public.notify('planned_lines');
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,207 @@
-- Upgrade the database from commit 0983abac to 81d9ea19.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This defines a new procedure adjust_planner() which resolves some
-- conflicts between shot sequences and the planner, such as removing
-- sequences that have been shot, renumbering, or adjusting the planned
-- times.
--
-- It is meant to be called at regular intervals by an external process,
-- such as the runner (software/bin/runner.sh).
--
-- A trigger for changes to the schema's `info` table is also added.
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
BEGIN;
CREATE OR REPLACE PROCEDURE adjust_planner ()
LANGUAGE plpgsql
AS $$
DECLARE
_planner_config jsonb;
_planned_line planned_lines%ROWTYPE;
_lag interval;
_last_sequence sequences_summary%ROWTYPE;
_deltatime interval;
_shotinterval interval;
_tstamp timestamptz;
_incr integer;
BEGIN
SET CONSTRAINTS planned_lines_pkey DEFERRED;
SELECT data->'planner'
INTO _planner_config
FROM file_data
WHERE data ? 'planner';
SELECT *
INTO _last_sequence
FROM sequences_summary
ORDER BY sequence DESC
LIMIT 1;
SELECT *
INTO _planned_line
FROM planned_lines
WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
SELECT
COALESCE(
((lead(ts0) OVER (ORDER BY sequence)) - ts1),
make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer)
)
INTO _lag
FROM planned_lines
WHERE sequence = _last_sequence.sequence AND line = _last_sequence.line;
_incr = sign(_last_sequence.lsp - _last_sequence.fsp);
RAISE NOTICE '_planner_config: %', _planner_config;
RAISE NOTICE '_last_sequence: %', _last_sequence;
RAISE NOTICE '_planned_line: %', _planned_line;
RAISE NOTICE '_incr: %', _incr;
-- Does the latest sequence match a planned sequence?
IF _planned_line IS NULL THEN -- No it doesn't
RAISE NOTICE 'Latest sequence shot does not match a planned sequence';
SELECT * INTO _planned_line FROM planned_lines ORDER BY sequence ASC LIMIT 1;
RAISE NOTICE '_planned_line: %', _planned_line;
IF _planned_line.sequence <= _last_sequence.sequence THEN
RAISE NOTICE 'Renumbering the planned sequences starting from %', _planned_line.sequence + 1;
-- Renumber the planned sequences starting from last shot sequence number + 1
UPDATE planned_lines
SET sequence = sequence + _last_sequence.sequence - _planned_line.sequence + 1;
END IF;
-- The correction to make to the first planned line's ts0 will be based on either the last
-- sequence's EOL + default line change time or the current time, whichever is later.
_deltatime := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1) + make_interval(mins => (_planner_config->>'defaultLineChangeDuration')::integer), current_timestamp) - _planned_line.ts0;
-- Is the first planned line's start time in the past? (±5 mins)
IF _planned_line.ts0 < (current_timestamp - make_interval(mins => 5)) THEN
RAISE NOTICE 'First planned line is in the past. Adjusting times by %', _deltatime;
-- Adjust the start / end time of the planned lines by assuming that we are at
-- `defaultLineChangeDuration` minutes away from SOL of the first planned line.
UPDATE planned_lines
SET
ts0 = ts0 + _deltatime,
ts1 = ts1 + _deltatime;
END IF;
ELSE -- Yes it does
RAISE NOTICE 'Latest sequence does match a planned sequence: %, %', _planned_line.sequence, _planned_line.line;
-- Is it online?
IF EXISTS(SELECT 1 FROM raw_lines_files WHERE sequence = _last_sequence.sequence AND hash = '*online*') THEN
-- Yes it is
RAISE NOTICE 'Sequence % is online', _last_sequence.sequence;
-- Let us get the SOL from the events log if we can
RAISE NOTICE 'Trying to set fsp, ts0 from events log FSP, FGSP';
WITH e AS (
SELECT * FROM events
WHERE
sequence = _last_sequence.sequence
AND ('FSP' = ANY(labels) OR 'FGSP' = ANY(labels))
ORDER BY tstamp LIMIT 1
)
UPDATE planned_lines
SET
fsp = COALESCE(e.point, fsp),
ts0 = COALESCE(e.tstamp, ts0)
FROM e
WHERE planned_lines.sequence = _last_sequence.sequence;
-- Shot interval
_shotinterval := (_last_sequence.ts1 - _last_sequence.ts0) / abs(_last_sequence.lsp - _last_sequence.fsp);
RAISE NOTICE 'Estimating EOL from current shot interval: %', _shotinterval;
SELECT (abs(lsp-fsp) * _shotinterval + ts0) - ts1
INTO _deltatime
FROM planned_lines
WHERE sequence = _last_sequence.sequence;
---- Set ts1 for the current sequence
--UPDATE planned_lines
--SET
--ts1 = (abs(lsp-fsp) * _shotinterval) + ts0
--WHERE sequence = _last_sequence.sequence;
RAISE NOTICE 'Adjustment is %', _deltatime;
IF abs(EXTRACT(EPOCH FROM _deltatime)) < 8 THEN
RAISE NOTICE 'Adjustment too small (< 8 s), so not applying it';
RETURN;
END IF;
-- Adjust ts1 for the current sequence
UPDATE planned_lines
SET ts1 = ts1 + _deltatime
WHERE sequence = _last_sequence.sequence;
-- Now shift all sequences after
UPDATE planned_lines
SET ts0 = ts0 + _deltatime, ts1 = ts1 + _deltatime
WHERE sequence > _last_sequence.sequence;
RAISE NOTICE 'Deleting planned sequences before %', _planned_line.sequence;
-- Remove all previous planner entries.
DELETE
FROM planned_lines
WHERE sequence < _last_sequence.sequence;
ELSE
-- No it isn't
RAISE NOTICE 'Sequence % is offline', _last_sequence.sequence;
-- We were supposed to finish at _planned_line.ts1 but we finished at:
_tstamp := GREATEST(COALESCE(_last_sequence.ts1_final, _last_sequence.ts1), current_timestamp);
-- WARNING Next line is for testing only
--_tstamp := COALESCE(_last_sequence.ts1_final, _last_sequence.ts1);
-- So we need to adjust timestamps by:
_deltatime := _tstamp - _planned_line.ts1;
RAISE NOTICE 'Planned end: %, actual end: % (%, %)', _planned_line.ts1, _tstamp, _planned_line.sequence, _last_sequence.sequence;
RAISE NOTICE 'Shifting times by % for sequences > %', _deltatime, _planned_line.sequence;
-- NOTE: This won't work if sequences are not, err… sequential.
-- NOTE: This has been known to happen in 2020.
UPDATE planned_lines
SET
ts0 = ts0 + _deltatime,
ts1 = ts1 + _deltatime
WHERE sequence > _planned_line.sequence;
RAISE NOTICE 'Deleting planned sequences up to %', _planned_line.sequence;
-- Remove all previous planner entries.
DELETE
FROM planned_lines
WHERE sequence <= _last_sequence.sequence;
END IF;
END IF;
END;
$$;
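--
-- Example invocation (as done by the runner; the schema name is hypothetical):
--
--   SET search_path TO survey_1,public;
--   CALL adjust_planner();
--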
DROP TRIGGER IF EXISTS info_tg ON info;
CREATE TRIGGER info_tg AFTER INSERT OR DELETE OR UPDATE ON info FOR EACH ROW EXECUTE FUNCTION public.notify('info');
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,91 @@
-- Upgrade the database from commit 81d9ea19 to 0a10c897.
--
-- NOTE: This upgrade must be applied to every schema in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This defines a new function ij_error(line, point, geometry) which
-- returns the crossline and inline distance (in metres) between the
-- geometry (which must be a point) and the preplot corresponding to
-- line / point.
--
-- To apply, run as the dougal user, for every schema in the database:
--
-- psql <<EOF
-- SET search_path TO survey_*,public;
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
BEGIN;
-- Return the crossline, inline error of `geom` with respect to `line` and `point`
-- in the project's binning grid.
CREATE OR REPLACE FUNCTION ij_error(line double precision, point double precision, geom public.geometry)
RETURNS public.geometry(Point, 0)
LANGUAGE plpgsql STABLE LEAKPROOF
AS $$
DECLARE
bp jsonb := binning_parameters();
ij public.geometry := to_binning_grid(geom, bp);
theta numeric := (bp->>'theta')::numeric * pi() / 180;
I_inc numeric DEFAULT 1;
J_inc numeric DEFAULT 1;
I_width numeric := (bp->>'I_width')::numeric;
J_width numeric := (bp->>'J_width')::numeric;
a numeric := (I_inc/I_width) * cos(theta);
b numeric := (I_inc/I_width) * -sin(theta);
c numeric := (J_inc/J_width) * sin(theta);
d numeric := (J_inc/J_width) * cos(theta);
xoff numeric := (bp->'origin'->>'I')::numeric;
yoff numeric := (bp->'origin'->>'J')::numeric;
E0 numeric := (bp->'origin'->>'easting')::numeric;
N0 numeric := (bp->'origin'->>'northing')::numeric;
error_i double precision;
error_j double precision;
BEGIN
error_i := (public.st_x(ij) - line) * I_width;
error_j := (public.st_y(ij) - point) * J_width;
RETURN public.ST_MakePoint(error_i, error_j);
END
$$;
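--
-- Example (a sketch): crossline / inline error of one raw shot against
-- its preplot, in metres.
--
--   SELECT public.st_x(e) AS crossline_m, public.st_y(e) AS inline_m
--   FROM (SELECT ij_error(rs.line, rs.point, rs.geometry) AS e
--         FROM raw_shots rs LIMIT 1) q;
--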
-- Return the list of points and metadata for all sequences.
-- Only points which have a corresponding preplot are returned.
-- If available, final positions are returned as well, if not they
-- are NULL.
-- Likewise, crossline / inline errors are also returned as a PostGIS
-- 2D point both for raw and final data.
CREATE OR REPLACE VIEW sequences_detail AS
SELECT
rl.sequence, rl.line AS sailline,
rs.line, rs.point,
rs.tstamp,
rs.objref objRefRaw, fs.objref objRefFinal,
ST_Transform(pp.geometry, 4326) geometryPreplot,
ST_Transform(rs.geometry, 4326) geometryRaw,
ST_Transform(fs.geometry, 4326) geometryFinal,
ij_error(rs.line, rs.point, rs.geometry) errorRaw,
ij_error(rs.line, rs.point, fs.geometry) errorFinal,
json_build_object('preplot', pp.meta, 'raw', rs.meta, 'final', fs.meta) meta
FROM
raw_lines rl
INNER JOIN raw_shots rs USING (sequence)
INNER JOIN preplot_points pp ON rs.line = pp.line AND rs.point = pp.point
LEFT JOIN final_shots fs ON rl.sequence = fs.sequence AND rs.point = fs.point;
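--
-- Example query (hypothetical sequence number): compare raw and final errors.
--
--   SELECT point, errorRaw, errorFinal
--   FROM sequences_detail
--   WHERE sequence = 9
--   ORDER BY point;
--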
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,75 @@
-- Upgrade the database from commit 81d9ea19 to 74b3de5c.
--
-- This upgrade affects the `public` schema only.
--
-- It creates a new table, `queue_items`, for storing
-- requests and responses related to inter-API communication.
-- At the moment this means Equinor's ASAQC API, but it
-- should be applicable to others as well if the need
-- arises.
--
-- As well as the table, it adds:
--
-- * `queue_item_status`, an ENUM type.
-- * `update_timestamp`, a trigger function.
-- * Two triggers on `queue_items`.
--
-- To apply, run as the dougal user:
--
-- psql < $THIS_FILE
--
-- NOTE: It will fail harmlessly if applied twice.
-- Queues are global, not per project,
-- so they go in the `public` schema.
CREATE TYPE queue_item_status
AS ENUM (
'queued',
'cancelled',
'failed',
'sent'
);
CREATE TABLE IF NOT EXISTS queue_items (
item_id serial NOT NULL PRIMARY KEY,
-- One day we may want multiple queues, in that case we will
-- have a queue_id and a relation of queue definitions.
-- But not right now.
-- queue_id integer NOT NULL REFERENCES queues (queue_id),
status queue_item_status NOT NULL DEFAULT 'queued',
payload jsonb NOT NULL,
results jsonb NOT NULL DEFAULT '{}'::jsonb,
created_on timestamptz NOT NULL DEFAULT current_timestamp,
updated_on timestamptz NOT NULL DEFAULT current_timestamp,
not_before timestamptz NOT NULL DEFAULT '1970-01-01T00:00:00Z',
parent_id integer NULL REFERENCES queue_items (item_id)
);
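-- Example (the payload shape is hypothetical): enqueue a request for the
-- external API.
--
--   INSERT INTO public.queue_items (payload)
--   VALUES ('{"endpoint": "asaqc", "action": "submit"}'::jsonb);
--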
-- Sets `updated_on` to current_timestamp unless an explicit
-- timestamp is part of the update.
--
-- This function can be reused with any table that has (or could have)
-- an `updated_on` column of type timestamptz.
CREATE OR REPLACE FUNCTION update_timestamp () RETURNS trigger AS
$$
BEGIN
IF NEW.updated_on IS NOT NULL THEN
NEW.updated_on := current_timestamp;
END IF;
RETURN NEW;
EXCEPTION
WHEN undefined_column THEN RETURN NEW;
END;
$$
LANGUAGE plpgsql;
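-- Example: the same function can guard any table with an `updated_on`
-- timestamptz column (the table name below is hypothetical):
--
--   CREATE TRIGGER my_table_tg0 BEFORE INSERT OR UPDATE ON my_table
--   FOR EACH ROW EXECUTE FUNCTION public.update_timestamp();
--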
CREATE TRIGGER queue_items_tg0
BEFORE INSERT OR UPDATE ON public.queue_items
FOR EACH ROW EXECUTE FUNCTION public.update_timestamp();
CREATE TRIGGER queue_items_tg1
AFTER INSERT OR DELETE OR UPDATE ON public.queue_items
FOR EACH ROW EXECUTE FUNCTION public.notify('queue_items');


@@ -0,0 +1,24 @@
-- Upgrade the database from commit 74b3de5c to commit 83be83e4.
--
-- NOTE: This upgrade only affects the `public` schema.
--
-- This inserts a database schema version into the database.
-- Note that we are not otherwise changing the schema, so older
-- server code will continue to run against this version.
--
-- ATTENTION!
--
-- This value should be incremented every time that the database
-- schema changes (either `public` or any of the survey schemas)
-- and is used by the server at start-up to detect if it is
-- running against a compatible schema version.
--
-- To apply, run as the dougal user:
--
-- psql < $THIS_FILE
--
-- NOTE: It can be applied multiple times without ill effect.
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.1.0"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.1.0"}' WHERE public.info.key = 'version';
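-- To verify the recorded version afterwards:
--
--   SELECT value->>'db_schema' FROM public.info WHERE key = 'version';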


@@ -0,0 +1,84 @@
-- Upgrade the database from commit 83be83e4 to 53ed096e.
--
-- New schema version: 0.2.0
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This migrates the file hashes to address issue #173.
-- The new hashes use size, modification time, creation time and the
-- first half of the MD5 hex digest of the file's absolute path.
--
-- It's a minor (rather than patch) version number increment because
-- the accompanying changes to `bin/datastore.py` mean that existing
-- hashes are no longer compatible with the new hashing function.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can take a while if run on a large database.
-- NOTE: It can be applied multiple times without ill effect.
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE migrate_hashes (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Migrating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
EXECUTE format('UPDATE %I.files SET hash = array_to_string(array_append(trim_array(string_to_array(hash, '':''), 1), left(md5(path), 16)), '':'')', schema_name);
EXECUTE 'SET search_path TO public'; -- Back to the default search path for good measure
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE upgrade_10 () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL migrate_hashes(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL upgrade_10();
CALL show_notice('Cleaning up');
DROP PROCEDURE migrate_hashes (schema_name text);
DROP PROCEDURE upgrade_10 ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.2.0"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.2.0"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,189 @@
-- Add function to retrieve sequence/shotpoint from timestamps and vice-versa
--
-- New schema version: 0.2.1
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects the public schema and all survey schemas.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- Two new functions are defined:
--
-- sequence_shot_from_tstamp(tstamp, [tolerance]) → sequence, point, delta
--
-- Returns a sequence + shotpoint if one falls within `tolerance` seconds
-- of `tstamp`. The tolerance may be omitted in which case it defaults to
-- three seconds. If multiple values match, it returns the closest in time.
--
-- tstamp_from_sequence_shot(sequence, point) → tstamp
--
-- Returns a timestamp given a sequence and point number.
--
-- NOTE: This last function must be called from a search path including a
-- project schema, as it accesses the raw_shots table.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can take a while if run on a large database.
-- NOTE: It can be applied multiple times without ill effect.
-- NOTE: This will lock the database while the transaction is active.
--
-- NOTE: This is a patch version change so it does not require a
-- backend restart.
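--
-- Example usage (hypothetical values), with a project schema in the
-- search path:
--
--   SELECT * FROM public.sequence_shot_from_tstamp('2020-09-07T04:10:13Z'::timestamptz, 5);
--   SELECT tstamp_from_sequence_shot(9, 2000);
--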
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION tstamp_from_sequence_shot(
IN s numeric,
IN p numeric,
OUT "ts" timestamptz)
AS $inner$
SELECT tstamp FROM raw_shots WHERE sequence = s AND point = p LIMIT 1;
$inner$ LANGUAGE SQL;
COMMENT ON FUNCTION tstamp_from_sequence_shot(numeric, numeric)
IS 'Get the timestamp of an existing shotpoint.';
CREATE OR REPLACE FUNCTION tstamp_interpolate(s numeric, p numeric) RETURNS timestamptz
AS $inner$
DECLARE
ts0 timestamptz;
ts1 timestamptz;
pt0 numeric;
pt1 numeric;
BEGIN
SELECT tstamp, point
INTO ts0, pt0
FROM raw_shots
WHERE sequence = s AND point < p
ORDER BY point DESC LIMIT 1;
SELECT tstamp, point
INTO ts1, pt1
FROM raw_shots
WHERE sequence = s AND point > p
ORDER BY point ASC LIMIT 1;
RETURN (ts1-ts0)/abs(pt1-pt0)*abs(p-pt0)+ts0;
END;
$inner$ LANGUAGE PLPGSQL;
COMMENT ON FUNCTION tstamp_interpolate(numeric, numeric)
IS 'Interpolate a timestamp given sequence and point values.
It will try to find the points immediately before and after in the sequence and interpolate into the gap, which may consist of multiple missed shots.
If called on an existing shotpoint it will return an interpolated timestamp as if the shotpoint did not exist, as opposed to returning its actual timestamp.
Returns NULL if it is not possible to interpolate.';
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_database () AS $$
DECLARE
row RECORD;
BEGIN
CREATE OR REPLACE FUNCTION public.sequence_shot_from_tstamp(
IN ts timestamptz,
IN tolerance numeric,
OUT "sequence" numeric,
OUT "point" numeric,
OUT "delta" numeric)
AS $inner$
SELECT
(meta->>'_sequence')::numeric AS sequence,
(meta->>'_point')::numeric AS point,
extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ) AS delta
FROM real_time_inputs
WHERE
meta ? '_sequence' AND
abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts )) < tolerance
ORDER BY abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ))
LIMIT 1;
$inner$ LANGUAGE SQL;
COMMENT ON FUNCTION public.sequence_shot_from_tstamp(timestamptz, numeric)
IS 'Get sequence and shotpoint from timestamp.
Given a timestamp this function returns the closest shot to it within the given tolerance value.
This uses the `real_time_inputs` table and it does not give an indication of which project the shotpoint belongs to. It is assumed that a single project is being acquired at a given time.';
CREATE OR REPLACE FUNCTION public.sequence_shot_from_tstamp(
IN ts timestamptz,
OUT "sequence" numeric,
OUT "point" numeric,
OUT "delta" numeric)
AS $inner$
SELECT * FROM public.sequence_shot_from_tstamp(ts, 3);
$inner$ LANGUAGE SQL;
COMMENT ON FUNCTION public.sequence_shot_from_tstamp(timestamptz)
IS 'Get sequence and shotpoint from timestamp.
Overloaded form in which the tolerance value is implied and defaults to three seconds.';
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_database();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_database ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.2.1"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.2.1"}' WHERE public.info.key = 'version';
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,360 @@
-- Add new event log schema.
--
-- New schema version: 0.2.2
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
-- REQUIRES POSTGRESQL VERSION 14 OR NEWER
-- (Because of CREATE OR REPLACE TRIGGER)
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This is a redesign of the event logging mechanism. The old mechanism
-- relied on a distinction between sequence events (i.e., those which can
-- be associated to a shotpoint within a sequence), timed events (those
-- which occur outside any acquisition sequence) and so-called virtual
-- events (deduced from the data). It was inflexible and inefficient,
-- as most of the time we needed to merge these event types into
-- a single view.
--
-- The new mechanism:
-- - uses a single table
-- - accepts sequence event entries for shots or sequences which may not (yet)
-- exist. (https://gitlab.com/wgp/dougal/software/-/issues/170)
-- - keeps edit history (https://gitlab.com/wgp/dougal/software/-/issues/138)
-- - keeps track of when an entry was made or subsequently edited.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can take a while if run on a large database.
-- NOTE: It can be applied multiple times without ill effect, as long
-- as the new tables did not previously exist. If they did, they will
-- be emptied before migrating the data.
--
-- WARNING: Applying this upgrade migrates the old event data. It does
-- NOT yet drop the old tables, which is handled in a separate script,
-- leaving the actions here technically reversible without having to
-- restore from backup.
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE SEQUENCE IF NOT EXISTS event_log_uid_seq
AS integer
START WITH 1
INCREMENT BY 1
NO MINVALUE
NO MAXVALUE
CACHE 1;
CREATE TABLE IF NOT EXISTS event_log_full (
-- uid is a unique id for each entry in the table,
-- including revisions of an existing entry.
uid integer NOT NULL PRIMARY KEY DEFAULT nextval('event_log_uid_seq'),
-- All revisions of an entry share the same id.
-- If inserting a new entry, id = uid.
id integer NOT NULL,
-- No default tstamp because, for instance, a user could
-- enter a sequence/point event referring to the future.
-- An external process should scan those at regular intervals
-- and populate the tstamp as needed.
tstamp timestamptz NULL,
sequence integer NULL,
point integer NULL,
remarks text NOT NULL DEFAULT '',
labels text[] NOT NULL DEFAULT ARRAY[]::text[],
-- TODO: Need a geometry column? Let us check performance as it is
-- and if needed add a geometry column plus a spatial index.
meta jsonb NOT NULL DEFAULT '{}'::jsonb,
validity tstzrange NOT NULL CHECK (NOT isempty(validity)),
-- We accept either:
-- - Just a tstamp
-- - Just a sequence / point pair
-- - All three
-- We don't accept:
-- - A sequence without a point or vice-versa
-- - Nothing being provided
CHECK (
(tstamp IS NOT NULL AND sequence IS NOT NULL AND point IS NOT NULL) OR
(tstamp IS NOT NULL AND sequence IS NULL AND point IS NULL) OR
(tstamp IS NULL AND sequence IS NOT NULL AND point IS NOT NULL)
)
);
CREATE INDEX IF NOT EXISTS event_log_id ON event_log_full USING btree (id);
CREATE OR REPLACE FUNCTION event_log_full_insert() RETURNS TRIGGER AS $inner$
BEGIN
NEW.id := COALESCE(NEW.id, NEW.uid);
NEW.validity := tstzrange(current_timestamp, NULL);
NEW.meta = COALESCE(NEW.meta, '{}'::jsonb);
NEW.labels = COALESCE(NEW.labels, ARRAY[]::text[]);
IF cardinality(NEW.labels) > 0 THEN
-- Remove duplicates
SELECT array_agg(DISTINCT elements)
INTO NEW.labels
FROM (SELECT unnest(NEW.labels) AS elements) AS labels;
END IF;
RETURN NEW;
END;
$inner$ LANGUAGE plpgsql;
CREATE OR REPLACE TRIGGER event_log_full_insert_tg
BEFORE INSERT ON event_log_full
FOR EACH ROW EXECUTE FUNCTION event_log_full_insert();
-- The public.notify() trigger to alert clients that something has changed
CREATE OR REPLACE TRIGGER event_log_full_notify_tg
AFTER INSERT OR DELETE OR UPDATE
ON event_log_full FOR EACH ROW EXECUTE FUNCTION public.notify('event');
--
-- VIEW event_log
--
-- This is what is exposed to the user most of the time.
-- It shows the current version of records in the event_log_full
-- table.
--
-- The user applies edits to this table directly, which are
-- processed via triggers.
--
CREATE OR REPLACE VIEW event_log AS
SELECT
id, tstamp, sequence, point, remarks, labels, meta,
uid <> id AS has_edits,
lower(validity) AS modified_on
FROM event_log_full
WHERE validity @> current_timestamp;
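--
-- Example (hypothetical event id): the full edit history of one event
-- can still be read from event_log_full.
--
--   SELECT uid, lower(validity) AS modified_on, remarks
--   FROM event_log_full
--   WHERE id = 42
--   ORDER BY lower(validity);
--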
CREATE OR REPLACE FUNCTION event_log_update() RETURNS TRIGGER AS $inner$
BEGIN
IF (TG_OP = 'INSERT') THEN
-- Complete the tstamp if possible
IF NEW.sequence IS NOT NULL AND NEW.point IS NOT NULL AND NEW.tstamp IS NULL THEN
SELECT COALESCE(
tstamp_from_sequence_shot(NEW.sequence, NEW.point),
tstamp_interpolate(NEW.sequence, NEW.point)
)
INTO NEW.tstamp;
END IF;
-- Any id that is provided will be ignored. The generated
-- id will match uid.
INSERT INTO event_log_full
(tstamp, sequence, point, remarks, labels, meta)
VALUES (NEW.tstamp, NEW.sequence, NEW.point, NEW.remarks, NEW.labels, NEW.meta);
RETURN NEW;
ELSIF (TG_OP = 'UPDATE') THEN
-- Set end of validity and create a new entry with id
-- matching that of the old entry.
-- NOTE: Do not allow updating an event that has meta.readonly = true
IF EXISTS
(SELECT *
FROM event_log_full
WHERE id = OLD.id AND (meta->>'readonly')::boolean IS TRUE)
THEN
RAISE check_violation USING MESSAGE = 'Cannot modify read-only entry';
RETURN NULL;
END IF;
-- If the sequence / point has changed, and no new tstamp is provided, get one
IF NEW.sequence <> OLD.sequence OR NEW.point <> OLD.point
AND NEW.sequence IS NOT NULL AND NEW.point IS NOT NULL
AND NEW.tstamp IS NULL OR NEW.tstamp = OLD.tstamp THEN
SELECT COALESCE(
tstamp_from_sequence_shot(NEW.sequence, NEW.point),
tstamp_interpolate(NEW.sequence, NEW.point)
)
INTO NEW.tstamp;
END IF;
UPDATE event_log_full
SET validity = tstzrange(lower(validity), current_timestamp)
WHERE validity @> current_timestamp AND id = OLD.id;
-- Any attempt to modify id will be ignored.
INSERT INTO event_log_full
(id, tstamp, sequence, point, remarks, labels, meta)
VALUES (OLD.id, NEW.tstamp, NEW.sequence, NEW.point, NEW.remarks, NEW.labels, NEW.meta);
RETURN NEW;
ELSIF (TG_OP = 'DELETE') THEN
-- Set end of validity.
-- NOTE: We *do* allow deleting an event that has meta.readonly = true
-- This could be of interest if for instance we wanted to keep the history
-- of QC results for a point, provided that the QC routines write to
-- event_log and not event_log_full
UPDATE event_log_full
SET validity = tstzrange(lower(validity), current_timestamp)
WHERE validity @> current_timestamp AND id = OLD.id;
RETURN NULL;
END IF;
END;
$inner$ LANGUAGE plpgsql;
CREATE OR REPLACE TRIGGER event_log_tg
INSTEAD OF INSERT OR UPDATE OR DELETE ON event_log
FOR EACH ROW EXECUTE FUNCTION event_log_update();
-- NOTE
-- This is where we migrate the actual data
RAISE NOTICE 'Migrating schema %', schema_name;
-- We start by deleting any data that the new tables might
-- have had if they already existed.
DELETE FROM event_log_full;
-- We purposefully bypass event_log here, as the tables we're
-- migrating from only contain a single version of each event.
INSERT INTO event_log_full (tstamp, sequence, point, remarks, labels, meta)
SELECT
tstamp, sequence, point, remarks, labels,
meta || json_build_object('geometry', geometry, 'readonly', virtual)::jsonb
FROM events;
UPDATE event_log_full SET meta = meta - 'geometry' WHERE meta->>'geometry' IS NULL;
UPDATE event_log_full SET meta = meta - 'readonly' WHERE (meta->'readonly')::boolean IS false;
-- This function used the superseded `events` view.
-- We need to drop it because we're changing the return type.
DROP FUNCTION IF EXISTS label_in_sequence (_sequence integer, _label text);
CREATE OR REPLACE FUNCTION label_in_sequence (_sequence integer, _label text)
RETURNS event_log
LANGUAGE sql
AS $inner$
SELECT * FROM event_log WHERE sequence = _sequence AND _label = ANY(labels);
$inner$;
-- This function used the superseded `events` view (and a strange logic).
CREATE OR REPLACE PROCEDURE handle_final_line_events (_seq integer, _label text, _column text)
LANGUAGE plpgsql
AS $inner$
DECLARE
_line final_lines_summary%ROWTYPE;
_column_value integer;
_tg_name text := 'final_line';
_event event_log%ROWTYPE;
event_id integer;
BEGIN
SELECT * INTO _line FROM final_lines_summary WHERE sequence = _seq;
_event := label_in_sequence(_seq, _label);
_column_value := row_to_json(_line)->>_column;
--RAISE NOTICE '% is %', _label, _event;
--RAISE NOTICE 'Line is %', _line;
--RAISE NOTICE '% is % (%)', _column, _column_value, _label;
IF _event IS NULL THEN
--RAISE NOTICE 'We will populate the event log from the sequence data';
INSERT INTO event_log (sequence, point, remarks, labels, meta)
VALUES (
-- The sequence
_seq,
-- The shotpoint
_column_value,
-- Remark. Something like "FSP <linename>"
format('%s %s', _label, (SELECT meta->>'lineName' FROM final_lines WHERE sequence = _seq)),
-- Label
ARRAY[_label],
-- Meta. Something like {"auto" : {"FSP" : "final_line"}}
json_build_object('auto', json_build_object(_label, _tg_name))
);
ELSE
--RAISE NOTICE 'We may populate the sequence meta from the event log';
--RAISE NOTICE 'Unless the event log was populated by us previously';
--RAISE NOTICE 'Populated by us previously? %', _event.meta->'auto'->>_label = _tg_name;
IF _event.meta->'auto'->>_label IS DISTINCT FROM _tg_name THEN
--RAISE NOTICE 'Adding % found in events log to final_line meta', _label;
UPDATE final_lines
SET meta = jsonb_set(meta, ARRAY[_label], to_jsonb(_event.point))
WHERE sequence = _seq;
END IF;
END IF;
END;
$inner$;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_12 () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_12();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_12 ();
CALL show_notice('Updating db_schema version');
-- This is technically still compatible with 0.2.0 as we are only adding
-- some more tables and views but not yet dropping the old ones, which we
-- will do separately so that these scripts do not get too big.
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.2.2"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.2.2"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,98 @@
-- Migrate events to new schema
--
-- New schema version: 0.3.0
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This migrates the data from the old event log tables to the new schema.
-- It is a *very* good idea to review the data manually after the migration
-- as issues with the logs that had gone unnoticed may become evident now.
--
-- WARNING: If data exists in the new event tables, IT WILL BE TRUNCATED.
--
-- Other than that, this migration is fairly benign as it does not modify
-- the old data.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can take a while if run on a large database.
-- NOTE: It can be applied multiple times without ill effect.
-- NOTE: This will lock the new event tables while the transaction is active.
--
-- WARNING: This is a minor (not patch) version change, meaning that it requires
-- an upgrade and restart of the backend server.
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
TRUNCATE event_log_full;
-- NOTE: meta->>'readonly' = true means that the event was created algorithmically
-- and should not be user editable.
INSERT INTO event_log_full (tstamp, sequence, point, remarks, labels, meta)
SELECT
tstamp, sequence, point, remarks, labels,
meta || json_build_object('geometry', geometry, 'readonly', virtual)::jsonb
FROM events;
-- We purposefully bypass event_log here
UPDATE event_log_full SET meta = meta - 'geometry' WHERE meta->>'geometry' IS NULL;
UPDATE event_log_full SET meta = meta - 'readonly' WHERE (meta->'readonly')::boolean IS false;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_database () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_database();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_database ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.0"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.0"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,99 @@
-- Drop old event tables.
--
-- New schema version: 0.3.1
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This completes the migration from the old event logging mechanism by
-- DROPPING THE OLD DATABASE OBJECTS, MAKING THE MIGRATION IRREVERSIBLE,
-- other than by restoring from backup and manually transferring any new
-- data that may have been created in the meanwhile.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can take a while if run on a large database.
-- NOTE: It can be applied multiple times without ill effect.
-- NOTE: This will lock the database while the transaction is active.
--
-- WARNING: Applying this upgrade drops the old tables. Ensure that you
-- have migrated the data first.
--
-- NOTE: This is a patch version change so it does not require a
-- backend restart.
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
DROP FUNCTION IF EXISTS
label_in_sequence(integer,text), reset_events_serials();
DROP VIEW IF EXISTS
events_midnight_shot, events_seq_timed, events_labels, "events";
DROP TABLE IF EXISTS
events_seq_labels, events_timed_labels, events_timed_seq, events_seq, events_timed;
DROP SEQUENCE IF EXISTS
events_seq_id_seq, events_timed_id_seq;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_database () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_database();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_database ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.1"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.1"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,136 @@
-- Fix project_summary view.
--
-- New schema version: 0.3.2
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This fixes a problem with the project_summary view. In its common table
-- expression, the view definition tried to search public.projects based on
-- the search path value with the following expression:
--
-- (current_setting('search_path'::text) ~~ (p.schema || '%'::text))
--
-- That is of course bound to fail as soon as the schema goes above `survey_9`
-- because `survey_10 LIKE ('survey_1' || '%')` is TRUE.
--
-- The new mechanism relies on splitting the search_path.
--
-- NOTE: The survey schema needs to be the leftmost element in search_path.
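--
-- Illustration of the difference:
--
--   SELECT 'survey_10,public' LIKE 'survey_1' || '%';            -- true (the old, wrong match)
--   SELECT split_part('survey_10,public', ',', 1) = 'survey_1';  -- false (the new, correct test)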
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE VIEW project_summary AS
WITH fls AS (
SELECT avg((final_lines_summary.duration / ((final_lines_summary.num_points - 1))::double precision)) AS shooting_rate,
avg((final_lines_summary.length / date_part('epoch'::text, final_lines_summary.duration))) AS speed,
sum(final_lines_summary.duration) AS prod_duration,
sum(final_lines_summary.length) AS prod_distance
FROM final_lines_summary
), project AS (
SELECT p.pid,
p.name,
p.schema
FROM public.projects p
WHERE (split_part(current_setting('search_path'::text), ','::text, 1) = p.schema)
)
SELECT project.pid,
project.name,
project.schema,
( SELECT count(*) AS count
FROM preplot_lines
WHERE (preplot_lines.class = 'V'::bpchar)) AS lines,
ps.total,
ps.virgin,
ps.prime,
ps.other,
ps.ntba,
ps.remaining,
( SELECT to_json(fs.*) AS to_json
FROM final_shots fs
ORDER BY fs.tstamp
LIMIT 1) AS fsp,
( SELECT to_json(fs.*) AS to_json
FROM final_shots fs
ORDER BY fs.tstamp DESC
LIMIT 1) AS lsp,
( SELECT count(*) AS count
FROM raw_lines rl) AS seq_raw,
( SELECT count(*) AS count
FROM final_lines rl) AS seq_final,
fls.prod_duration,
fls.prod_distance,
fls.speed AS shooting_rate
FROM preplot_summary ps,
fls,
project;
ALTER TABLE project_summary OWNER TO postgres;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_15 () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_15();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_15 ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.2"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.2"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,169 @@
-- Fix not being able to edit a time-based event.
--
-- New schema version: 0.3.3
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- The event_log_update() function that gets called when trying to update
-- the event_log view will not work if the caller does not provide a timestamp
-- or sequence + point in the list of fields to be updated. See:
-- https://gitlab.com/wgp/dougal/software/-/issues/198
--
-- This fixes the problem by liberally using COALESCE() to merge the OLD
-- and NEW records.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION event_log_update() RETURNS trigger
LANGUAGE plpgsql
AS $inner$
BEGIN
IF (TG_OP = 'INSERT') THEN
-- Complete the tstamp if possible
IF NEW.sequence IS NOT NULL AND NEW.point IS NOT NULL AND NEW.tstamp IS NULL THEN
SELECT COALESCE(
tstamp_from_sequence_shot(NEW.sequence, NEW.point),
tstamp_interpolate(NEW.sequence, NEW.point)
)
INTO NEW.tstamp;
END IF;
-- Any id that is provided will be ignored. The generated
-- id will match uid.
INSERT INTO event_log_full
(tstamp, sequence, point, remarks, labels, meta)
VALUES (NEW.tstamp, NEW.sequence, NEW.point, NEW.remarks, NEW.labels, NEW.meta);
RETURN NEW;
ELSIF (TG_OP = 'UPDATE') THEN
-- Set end of validity and create a new entry with id
-- matching that of the old entry.
-- NOTE: Do not allow updating an event that has meta.readonly = true
IF EXISTS
(SELECT *
FROM event_log_full
WHERE id = OLD.id AND (meta->>'readonly')::boolean IS TRUE)
THEN
RAISE check_violation USING MESSAGE = 'Cannot modify read-only entry';
RETURN NULL;
END IF;
-- If the sequence / point has changed, and no new tstamp is provided, get one
IF NEW.sequence <> OLD.sequence OR NEW.point <> OLD.point
AND NEW.sequence IS NOT NULL AND NEW.point IS NOT NULL
AND NEW.tstamp IS NULL OR NEW.tstamp = OLD.tstamp THEN
SELECT COALESCE(
tstamp_from_sequence_shot(NEW.sequence, NEW.point),
tstamp_interpolate(NEW.sequence, NEW.point)
)
INTO NEW.tstamp;
END IF;
UPDATE event_log_full
SET validity = tstzrange(lower(validity), current_timestamp)
WHERE validity @> current_timestamp AND id = OLD.id;
-- Any attempt to modify id will be ignored.
INSERT INTO event_log_full
(id, tstamp, sequence, point, remarks, labels, meta)
VALUES (
OLD.id,
COALESCE(NEW.tstamp, OLD.tstamp),
COALESCE(NEW.sequence, OLD.sequence),
COALESCE(NEW.point, OLD.point),
COALESCE(NEW.remarks, OLD.remarks),
COALESCE(NEW.labels, OLD.labels),
COALESCE(NEW.meta, OLD.meta)
);
RETURN NEW;
ELSIF (TG_OP = 'DELETE') THEN
-- Set end of validity.
-- NOTE: We *do* allow deleting an event that has meta.readonly = true
-- This could be of interest if for instance we wanted to keep the history
-- of QC results for a point, provided that the QC routines write to
-- event_log and not event_log_full
UPDATE event_log_full
SET validity = tstzrange(lower(validity), current_timestamp)
WHERE validity @> current_timestamp AND id = OLD.id;
RETURN NULL;
END IF;
END;
$inner$;
CREATE OR REPLACE TRIGGER event_log_tg INSTEAD OF INSERT OR DELETE OR UPDATE ON event_log FOR EACH ROW EXECUTE FUNCTION event_log_update();
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_16 () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_16();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_16 ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.3"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.3"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,163 @@
-- Add procedure to populate missing event data.
--
-- New schema version: 0.3.4
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- This creates a new procedure augment_event_data() which tries to
-- populate missing event_log data, namely timestamps and geometries.
--
-- To do this it also adds a function public.geometry_from_tstamp()
-- which, given a timestamp, tries to fetch a geometry from real_time_inputs.
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE PROCEDURE augment_event_data ()
LANGUAGE sql
AS $inner$
-- Populate the timestamp of sequence / point events
UPDATE event_log_full
SET tstamp = tstamp_from_sequence_shot(sequence, point)
WHERE
tstamp IS NULL AND sequence IS NOT NULL AND point IS NOT NULL;
-- Populate the geometry of sequence / point events for which
-- there is raw_shots data.
UPDATE event_log_full
SET meta = meta ||
jsonb_build_object(
'geometry',
(
SELECT st_transform(geometry, 4326)::jsonb
FROM raw_shots rs
WHERE rs.sequence = event_log_full.sequence AND rs.point = event_log_full.point
)
)
WHERE
sequence IS NOT NULL AND point IS NOT NULL AND
NOT meta ? 'geometry';
-- Populate the geometry of time-based events
UPDATE event_log_full e
SET
meta = meta || jsonb_build_object('geometry',
(SELECT st_transform(g.geometry, 4326)::jsonb
FROM geometry_from_tstamp(e.tstamp, 3) g))
WHERE
tstamp IS NOT NULL AND
sequence IS NULL AND point IS NULL AND
NOT meta ? 'geometry';
-- Get rid of null geometries
UPDATE event_log_full
SET
meta = meta - 'geometry'
WHERE
jsonb_typeof(meta->'geometry') = 'null';
-- Simplify the GeoJSON when the CRS is EPSG:4326
UPDATE event_log_full
SET
meta = meta #- '{geometry, crs}'
WHERE
meta->'geometry'->'crs'->'properties'->>'name' = 'EPSG:4326';
$inner$;
COMMENT ON PROCEDURE augment_event_data()
IS 'Populate missing timestamps and geometries in event_log_full';
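-- Example invocation (normally done by the runner; the schema name is
-- hypothetical):
--
--   SET search_path TO survey_1,public;
--   CALL augment_event_data();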
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_17 () AS $$
DECLARE
row RECORD;
BEGIN
CALL show_notice('Adding index to real_time_inputs.meta->tstamp');
CREATE INDEX IF NOT EXISTS meta_tstamp_idx
ON public.real_time_inputs
USING btree ((meta->>'tstamp') DESC);
CALL show_notice('Creating function geometry_from_tstamp');
CREATE OR REPLACE FUNCTION public.geometry_from_tstamp(
IN ts timestamptz,
IN tolerance numeric,
OUT "geometry" geometry,
OUT "delta" numeric)
AS $inner$
SELECT
geometry,
extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ) AS delta
FROM real_time_inputs
WHERE
geometry IS NOT NULL AND
abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts )) < tolerance
ORDER BY abs(extract('epoch' FROM (meta->>'tstamp')::timestamptz - ts ))
LIMIT 1;
$inner$ LANGUAGE SQL;
COMMENT ON FUNCTION public.geometry_from_tstamp(timestamptz, numeric)
IS 'Get geometry from timestamp';
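-- Example (hypothetical timestamp): nearest geometry within 3 seconds.
--
--   SELECT * FROM public.geometry_from_tstamp('2020-09-07T04:10:13Z'::timestamptz, 3);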
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_17();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_17 ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.4"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.4"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--


@@ -0,0 +1,158 @@
-- Restore the missing label_in_sequence() function.
--
-- New schema version: 0.3.5
--
-- ATTENTION:
--
-- ENSURE YOU HAVE BACKED UP THE DATABASE BEFORE RUNNING THIS SCRIPT.
--
--
-- NOTE: This upgrade affects all schemas in the database.
-- NOTE: Each application starts a transaction, which must be committed
-- or rolled back.
--
-- The function label_in_sequence(integer, text) was missing for the
-- production schemas. This patch (re-)defines the function as well
-- as the other functions that depend on it (otherwise the new
-- definition does not get picked up).
--
-- To apply, run as the dougal user:
--
-- psql <<EOF
-- \i $THIS_FILE
-- COMMIT;
-- EOF
--
-- NOTE: It can be applied multiple times without ill effect.
--
BEGIN;
CREATE OR REPLACE PROCEDURE show_notice (notice text) AS $$
BEGIN
RAISE NOTICE '%', notice;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_survey_schema (schema_name text) AS $$
BEGIN
RAISE NOTICE 'Updating schema %', schema_name;
-- We need to set the search path because some of the trigger
-- functions reference other tables in survey schemas assuming
-- they are in the search path.
EXECUTE format('SET search_path TO %I,public', schema_name);
CREATE OR REPLACE FUNCTION label_in_sequence(_sequence integer, _label text) RETURNS event_log
LANGUAGE sql
AS $inner$
SELECT * FROM event_log WHERE sequence = _sequence AND _label = ANY(labels);
$inner$;
-- We need to redefine the functions / procedures that call label_in_sequence
CREATE OR REPLACE PROCEDURE handle_final_line_events(IN _seq integer, IN _label text, IN _column text)
LANGUAGE plpgsql
AS $inner$
DECLARE
_line final_lines_summary%ROWTYPE;
_column_value integer;
_tg_name text := 'final_line';
_event event_log%ROWTYPE;
event_id integer;
BEGIN
SELECT * INTO _line FROM final_lines_summary WHERE sequence = _seq;
_event := label_in_sequence(_seq, _label);
_column_value := row_to_json(_line)->>_column;
--RAISE NOTICE '% is %', _label, _event;
--RAISE NOTICE 'Line is %', _line;
--RAISE NOTICE '% is % (%)', _column, _column_value, _label;
IF _event IS NULL THEN
--RAISE NOTICE 'We will populate the event log from the sequence data';
INSERT INTO event_log (sequence, point, remarks, labels, meta)
VALUES (
-- The sequence
_seq,
-- The shotpoint
_column_value,
-- Remark. Something like "FSP <linename>"
format('%s %s', _label, (SELECT meta->>'lineName' FROM final_lines WHERE sequence = _seq)),
-- Label
ARRAY[_label],
-- Meta. Something like {"auto" : {"FSP" : "final_line"}}
json_build_object('auto', json_build_object(_label, _tg_name))
);
ELSE
--RAISE NOTICE 'We may populate the sequence meta from the event log';
--RAISE NOTICE 'Unless the event log was populated by us previously';
--RAISE NOTICE 'Populated by us previously? %', _event.meta->'auto'->>_label = _tg_name;
IF _event.meta->'auto'->>_label IS DISTINCT FROM _tg_name THEN
--RAISE NOTICE 'Adding % found in events log to final_line meta', _label;
UPDATE final_lines
SET meta = jsonb_set(meta, ARRAY[_label], to_jsonb(_event.point))
WHERE sequence = _seq;
END IF;
END IF;
END;
$inner$;
CREATE OR REPLACE PROCEDURE final_line_post_import(IN _seq integer)
LANGUAGE plpgsql
AS $inner$
BEGIN
CALL handle_final_line_events(_seq, 'FSP', 'fsp');
CALL handle_final_line_events(_seq, 'FGSP', 'fsp');
CALL handle_final_line_events(_seq, 'LGSP', 'lsp');
CALL handle_final_line_events(_seq, 'LSP', 'lsp');
END;
$inner$;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE PROCEDURE pg_temp.upgrade_18 () AS $$
DECLARE
row RECORD;
BEGIN
FOR row IN
SELECT schema_name FROM information_schema.schemata
WHERE schema_name LIKE 'survey_%'
ORDER BY schema_name
LOOP
CALL pg_temp.upgrade_survey_schema(row.schema_name);
END LOOP;
END;
$$ LANGUAGE plpgsql;
CALL pg_temp.upgrade_18();
CALL show_notice('Cleaning up');
DROP PROCEDURE pg_temp.upgrade_survey_schema (schema_name text);
DROP PROCEDURE pg_temp.upgrade_18 ();
CALL show_notice('Updating db_schema version');
INSERT INTO public.info VALUES ('version', '{"db_schema": "0.3.5"}')
ON CONFLICT (key) DO UPDATE
SET value = public.info.value || '{"db_schema": "0.3.5"}' WHERE public.info.key = 'version';
CALL show_notice('All done. You may now run "COMMIT;" to persist the changes');
DROP PROCEDURE show_notice (notice text);
--
--NOTE Run `COMMIT;` now if all went well
--

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

etc/qc/README.md Normal file

@@ -0,0 +1,187 @@
# QC tests
## Introduction
QC tests are defined and parametrised out of source in YAML files. In the project definition file, the `qc.definitions` and `qc.parameters` keys point to, respectively, the definition and parametrisation files for the QCs to be applied to a given project.
Different QCs may be defined for different projects by saving them to separate definition files; conversely, the same QCs may be reused across projects.
The parameters for each QC test are saved to a separate file. This is to allow QCs to be reused across projects, possibly with different parameters.
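For instance, a project definition might include something like the following (a minimal sketch; the keys are as described above, but the paths are hypothetical):
```yaml
qc:
  definitions: etc/qc/definitions.yaml
  parameters: etc/qc/parameters.yaml
```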
## Running
For all projects that have QCs defined, the tests can be run by calling the [`lib/www/server/lib/qc.js`](/lib/www/server/lib/qc.js) script. This can be done from a cronjob, e.g.:
```cron
# max-old-space-size increases the memory available to Node.js, to deal with complicated tests or large projects.
*/5 * * * * NODE_OPTIONS="--max-old-space-size=4096" node $HOME/software/lib/www/server/lib/qc.js >/dev/null
```
## QC definition file
The QC definition YAML file should consist of a list of tests. These may be organised hierarchically if the user wishes to do so.
A QC definition consists of the following attributes:
Attribute | Description
--------------|-------------
`name` | A short name for the test or hierarchical group.
`description` | A more detailed description. Markdown is accepted.
`disabled` | If `true`, the test or branch will not be run.
`iterate` | What to iterate over. It can take one of three values: `shots`, `sequences` or `lines`. If not present it defaults to `shots`, which means the script will be called once for every shot in the prospect.
`labels` | Array of labels to apply to the test or to the branch.
`children` | Any element having a `children` attribute becomes a branch. The contents of this attribute are a list of tests or branches, same as the top-level list.
`check` | A script consisting of JavaScript code which defines the test that is to be run. Tests that pass should return **`true`** whereas failing tests should return a string with a message describing the failure. Note that it is valid to have both `check` and `children` in the same item.
### Test definitions
Tests can be defined as arbitrary JavaScript, which will be run in a sandbox. The sandboxed code does not have access to the filesystem or the `console` object.
The result of the test is the last expression evaluated by the script. This should be the primitive **`true`** if the test is successful, or a string describing the failure if it is not.
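For example, a minimal `check` script might look like this (a sketch built from the `parameters` and `currentItem` variables described below, using the `crosslineError` parameter and the shot's `error_i` field shown in the examples):
```javascript
// The last expression evaluated is the result: `true` on pass,
// or a string describing the failure.
Math.abs(currentItem.error_i) <= parameters.crosslineError ||
  `Crossline error ${currentItem.error_i.toFixed(2)} m exceeds ${parameters.crosslineError} m`;
```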
The script has access to the following variables:
#### `parameters`
An object containing all the parameter definitions for the project, for instance:
```javascript
{
gunDepth: 6,
gunDepthTolerance: 0.5,
gunTiming: 0.9999,
gunTimingSubarrayAverage: 0.5,
gunPressureNominal: 2000,
gunPressureToleranceRatio: 0.025,
crosslineError: 12,
crosslineErrorAverage: 9,
inlineErrorRunningAverageValue: 2,
inlineErrorRunningAverageShots: 40
}
```
#### `currentItem`
The item being iterated over. This can be of type `Shot`, `Sequence`, or `Preplot` depending on the value of this test's `iterate` attribute.
#### `shots`
An array of `Shot` objects.
Example:
```javascript
{
type: 'final',
_id: [ 7, 1764 ],
sequence: 7,
line: 5500,
point: 1764,
objref: 3,
tstamp: "2020-09-07T04:10:13.680Z",
hash: '2173892:1599458941.3070147:1599458941.3070147:80621034',
geometry: '{"type":"Point","coordinates":[2.471571453,59.169725413]}',
error_i: 2.7102108739654795,
error_j: -0.06460360411324473,
preplot_geometry: '{"type":"Point","crs":{"type":"name","properties":{"name":"EPSG:23031"}},"coordinates":[469883.4,6559284.9]}',
raw_geometry: '{"type":"Point","crs":{"type":"name","properties":{"name":"EPSG:23031"}},"coordinates":[469881.87,6559285.76]}',
final_geometry: '{"type":"Point","crs":{"type":"name","properties":{"name":"EPSG:23031"}},"coordinates":[469881.09,6559286.22]}',
pp_meta: {},
raw_meta: {
smsrc: {
guns: [Array],
line: '1054980007S00000',
mask: 38,
shot: 1764,
time: "b'20/09/07:04:10:13'",
spare: '',
header: '*SMSRC',
spread: 3,
volume: 3050,
blk_siz: 2282,
manifold: 2027,
num_auto: 0,
num_guns: 52,
trg_mode: 'E',
avg_delta: 0,
num_delta: 0,
std_delta: 0.073,
baro_press: null,
num_active: 52,
num_nofire: 0,
src_number: 2,
num_subarray: 6
}
},
final_meta: {},
_: [Function]
}
```
#### `sequences`
An array of `Sequence` objects.
Example:
```javascript
{
"_id": 9,
"sequence": 9,
"line": 5562,
"fsp": 2580,
"lsp": 996,
"fsp_final": 2548,
"lsp_final": 1000,
"ts0": "2020-09-07T08:34:13.112Z",
"ts1": "2020-09-07T10:47:49.116Z",
"ts0_final": "2020-09-07T08:36:58.984Z",
"ts1_final": "2020-09-07T10:47:28.608Z",
"duration": {
"hours": 2,
"minutes": 13,
"seconds": 36,
"milliseconds": 4
},
"duration_final": {
"hours": 2,
"minutes": 10,
"seconds": 29,
"milliseconds": 624
},
"num_preplots": 775,
"num_points": 775,
"missing_shots": 0,
"length": 19350.1845360761,
"azimuth": 26.443105805883572,
"remarks": "",
"remarks_final": "",
"status": "final"
_: [Function]
}
```
#### `preplots`
An array of `Preplot` objects.
Example:
```javascript
{
_id: [ null, 2348, 5130 ],
line: 5130,
point: 2348,
class: 'V',
ntba: false,
geometry: '{"type":"Point","crs":{"type":"name","properties":{"name":"EPSG:23031"}},"coordinates":[470769.8,6550688.8]}',
meta: {},
count: 0
}
```
#### The `_` function
Each of the above objects has a function named `_`. This is a helper for quickly accessing an object's own nested attributes without having to check whether the attribute or any of its parents exists. For instance, on a `Shot` item, `currentItem._('raw_meta.smsrc')` will return the SmartSource gun data if it exists, or `undefined` if either `raw_meta` or `smsrc` is not defined.
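As a sketch, the helper is roughly equivalent to optional chaining over a dotted path:
```javascript
// Using the helper:
const guns = currentItem._('raw_meta.smsrc.guns');

// Roughly equivalent plain JavaScript using optional chaining:
const gunsEquivalent = currentItem.raw_meta?.smsrc?.guns;
```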


@@ -0,0 +1,276 @@
# QC definition file
-
name: "Missing shots"
iterate: "sequences"
labels: [ "QC" ]
id: missing_shots
check: |
const sequence = currentItem;
let results;
if (sequence.missing_shots) {
results = {
shots: {}
}
const missing_shots = missingShotpoints.filter(i => !i.ntba);
for (const shot of missing_shots) {
results.shots[shot.point] = { remarks: "Missed shot", labels: [ "QC", "QCAcq" ] };
}
} else {
results = true;
}
results;
-
name: "Gun QC"
disabled: false
labels: [ "QC", "QCGuns" ]
children:
-
name: "Sequences without gun data"
iterate: "sequences"
id: seq_no_gun_data
check: |
shotpoints.some(i => i.meta?.raw?.smsrc) || "Sequence has no gun data"
-
name: "Missing gun data"
id: missing_gun_data
ignoreAllFailed: true
check: |
!!currentItem._("raw_meta.smsrc.guns")
? true
: "Missing gun data"
-
name: "No fire"
id: no_fire
check: |
const currentShot = currentItem;
const gunData = currentItem._("raw_meta.smsrc");
(gunData && gunData.guns && gunData.guns.length != gunData.num_active)
? `Source ${gunData.src_number}: No fire (${gunData.guns.length - gunData.num_active} guns)`
: true;
-
name: "Pressure errors"
id: pressure_errors
check: |
const pressure=11;
const gunData = currentItem._("raw_meta.smsrc");
const results = gunData &&
gunData
.guns
.filter(gun => ((gun[2] == gunData.src_number) && Math.abs(gun[pressure]/parameters.gunPressureNominal - 1) > parameters.gunPressureToleranceRatio))
.map(gun =>
`source ${gun[2]}, string ${gun[0]}, gun ${gun[1]}, pressure: ${gun[pressure]} / ${parameters.gunPressureNominal} = ${(Math.abs(gun[pressure]/parameters.gunPressureNominal - 1)*100).toFixed(2)}% > ${(parameters.gunPressureToleranceRatio*100).toFixed(2)}%`
).join(" \n");
results && results.length
? results
: true
-
name: "Single gun / cluster"
children:
-
name: "Source depth"
id: source_depth
check: |
const currentShot = currentItem;
let _result_;
const _depth = 10;
const gunData = currentShot._("raw_meta.smsrc.guns");
if (!gunData) {
// We check for missing data elsewhere, so don't fail this test
_result_ = true
} else if (gunData.every(gun => Math.abs(gun[_depth]-parameters.gunDepth) <= parameters.gunDepthTolerance)) {
_result_ = true;
} else {
const bad_guns = gunData.filter(gun => Math.abs(gun[_depth]-parameters.gunDepth) > parameters.gunDepthTolerance).map(gun => {
return `source ${gun[2]}, string ${gun[0]}, gun ${gun[1]}, depth: ${gun[10]}`;
});
_result_ = `Depth error: ${bad_guns.join("; ")}`;
}
_result_
-
name: "Synchronisation (error)"
id: sync_error
check: |
const currentShot = currentItem;
const gunData = currentShot._("raw_meta.smsrc");
let result = [];
if (gunData && gunData.num_nofire == 0) {
// These are the indices into the gun array for the different
// values of interest.
const subarray = 0;
const aimpoint = 7;
const firetime = 8;
// We only care about the source which actually fired (or was supposed to)
const sourceFired = gunData.guns.filter(g => g[2] == gunData.src_number);
// Let us check if the average delta for each string is within spec
let subarrayAverages = [];
sourceFired.forEach(g => {
const idx = g[subarray]-1;
const delta = g[firetime]-g[aimpoint];
if (!subarrayAverages[idx]) {
subarrayAverages[idx] = [];
}
subarrayAverages[idx].push(delta);
});
subarrayAverages = subarrayAverages.map(s => s.reduce( (a, b) => a+b, 0 ) / s.length);
subarrayAverages.forEach((value, idx) => {
if (value > parameters.gunTimingSubarrayAverage) {
result.push(`Average delta error: string ${idx+1}: ${value.toFixed(2)} > ${parameters.gunTimingSubarrayAverage}`);
}
});
// Let us see about individual guns
sourceFired
.filter(gun => Math.abs(gun[firetime]-gun[aimpoint]) > parameters.gunTiming)
.forEach(gun => {
const value = Math.abs(gun[firetime]-gun[aimpoint]);
result.push(`Delta error: source ${gun[2]}, string ${gun[0]}, gun ${gun[1]}: ${value.toFixed(2)} > ${parameters.gunTiming}`);
});
}
if (result.length) {
result.join("; ");
} else {
// Either there were no errors or the gun data was missing, which we take care of elsewhere
true;
}
-
name: "Synchronisation (warning)"
id: sync_warn
check: |
const currentShot = currentItem;
const gunData = currentShot._("raw_meta.smsrc");
let result = [];
if (gunData && gunData.num_nofire == 0) {
// These are the indices into the gun array for the different
// values of interest.
const subarray = 0;
const aimpoint = 7;
const firetime = 8;
// We only care about the source which actually fired (or was supposed to)
const sourceFired = gunData.guns.filter(g => g[2] == gunData.src_number);
sourceFired
.filter(gun => Math.abs(gun[firetime]-gun[aimpoint]) >= parameters.gunTimingWarning && Math.abs(gun[firetime]-gun[aimpoint]) <= parameters.gunTiming)
.forEach(gun => {
const value = Math.abs(gun[firetime]-gun[aimpoint]);
result.push(`Delta warning: source ${gun[2]}, string ${gun[0]}, gun ${gun[1]}: ${parameters.gunTimingWarning} ≤ ${value.toFixed(2)} ≤ ${parameters.gunTiming}`);
});
}
if (result.length) {
result.join("; ");
} else {
// Either there were no errors or the gun data was missing, which we take care of elsewhere
true;
}
-
name: "Autofire"
id: autofire
check: |
const currentShot = currentItem;
let _result_;
const _autofire = 5;
const gunData = currentShot._("raw_meta.smsrc.guns");
if (!gunData) {
// We check for missing data elsewhere, so don't fail this test
_result_ = true;
} else if (gunData.every(gun => gun[_autofire] == false)) {
_result_ = true;
} else {
const bad_guns = gunData.filter(gun => gun[_autofire]).map(gun => {
return `source ${gun[2]}, string ${gun[0]}, gun ${gun[1]}`;
});
_result_ = `Autofire: ${bad_guns.join(";\n")}`;
}
_result_
-
name: "Centre of source preplot deviation (single shots)"
labels: [ "QC", "QCNav" ]
disabled: false
children:
-
name: "Crossline"
id: crossline
check: |
const currentShot = currentItem;
Math.abs(currentShot.error_i) <= parameters.crosslineError
|| `Crossline error (${currentShot.type}): ${currentShot.error_i.toFixed(2)} > ${parameters.crosslineError}`
-
name: "Inline"
id: inline
check: |
const currentShot = currentItem;
Math.abs(currentShot.error_j) <= parameters.inlineError
|| `Inline error (${currentShot.type}): ${currentShot.error_j.toFixed(2)} > ${parameters.inlineError}`
-
name: "Centre of source preplot deviation (moving average)"
labels: [ "QC", "QCNav" ]
children:
-
name: "Crossline"
iterate: "sequences"
parameters: [ "crosslineErrorAverage" ]
id: crossline_average
check: |
const currentSequence = currentItem;
//const i_err = shotpoints.filter(s => s.error_i != null).map(a => a.error_i);
const i_err = shotpoints.map(i =>
(i.errorfinal?.coordinates ?? i.errorraw?.coordinates)?.[0]
)
.filter(i => !isNaN(i));
if (i_err.length) {
const avg = i_err.reduce( (a, b) => a+b)/i_err.length;
Math.abs(avg) <= parameters.crosslineErrorAverage ||
`Average crossline error: ${avg.toFixed(2)} > ${parameters.crosslineErrorAverage}`
} else {
`Sequence ${currentSequence.sequence} has no shots within preplot`
}
-
name: "Inline"
iterate: "sequences"
parameters: [ "inlineErrorRunningAverageShots" ]
id: inline_average
check: |
const currentSequence = currentItem;
const n = parameters.inlineErrorRunningAverageShots; // For brevity
const results = shotpoints.slice(n/2, -n/2).map( (shot, index) => {
const shots = shotpoints.slice(index, index+n).map(i =>
(i.errorfinal?.coordinates ?? i.errorraw?.coordinates)?.[1]
).filter(i => i != null);
if (!shots.length) {
// We are outside the preplot
// Nothing to see here, move along
return true;
}
const mean = shots.reduce( (a, b) => a+b ) / shots.length;
return Math.abs(mean) <= parameters.inlineErrorRunningAverageValue || [
shot.point,
{
remarks: `Running average inline error: ${mean.toFixed(2)} > ${parameters.inlineErrorRunningAverageValue}`,
labels: [ "QC", "QCNav" ]
}
]
}).filter(i => i !== true);
results.length == 0 || {
remarks: "Sequence exceeds inline error running average limit",
shots: Object.fromEntries(results)
}


@@ -0,0 +1,15 @@
gunDepth: 6.0
gunDepthTolerance: 0.5
gunTimingWarning: 1.0
gunTiming: 1.5
gunTimingSubarrayAverage: 0.5
gunPressureNominal: 2000
gunPressureToleranceRatio: 0.025
crosslineError: 12
inlineError: 2
crosslineErrorAverage: 9
inlineErrorRunningAverageValue: 1
inlineErrorRunningAverageShots: 40

etc/ssl/README.md Normal file

@@ -0,0 +1,3 @@
# TLS certificates directory
Drop TLS certificates required by Dougal in this directory. It is excluded by [`.gitignore`](../../.gitignore) so its contents should never be committed by accident (and shouldn't be committed on purpose!).


@@ -1,6 +1,9 @@
 {
   "jwt": {
-    "secret": ""
+    "secret": "",
+    "options": {
+      "expiresIn": 1800
+    }
   },
   "db": {
     "user": "postgres",


@@ -1,5 +1,8 @@
 module.exports = {
   presets: [
     '@vue/cli-plugin-babel/preset'
-  ]
+  ],
+  plugins: [
+    '@babel/plugin-proposal-logical-assignment-operators'
+  ]
 }

File diff suppressed because it is too large


@@ -1,35 +1,54 @@
 {
   "name": "dougal-web",
-  "version": "0.1.0",
+  "version": "0.0.0",
   "private": true,
   "scripts": {
     "serve": "vue-cli-service serve",
     "build": "vue-cli-service build"
   },
   "dependencies": {
-    "@mdi/font": "^5.3.45",
+    "@mdi/font": "^5.6.55",
     "core-js": "^3.6.5",
-    "jwt-decode": "^2.2.0",
-    "leaflet": "^1.6.0",
+    "d3": "^7.0.1",
+    "jwt-decode": "^3.0.0",
+    "leaflet": "^1.7.1",
+    "leaflet-arrowheads": "^1.2.2",
+    "leaflet-realtime": "^2.2.0",
+    "leaflet.markercluster": "^1.4.1",
+    "marked": "^2.0.3",
+    "plotly.js-dist": "^2.5.0",
+    "suncalc": "^1.8.0",
     "typeface-roboto": "0.0.75",
-    "vue": "^2.6.11",
-    "vue-debounce": "^2.5.7",
-    "vue-router": "^3.2.0",
-    "vuetify": "^2.3.4",
-    "vuex": "^3.5.1"
+    "vue": "^2.6.12",
+    "vue-debounce": "^2.6.0",
+    "vue-router": "^3.5.1",
+    "vuetify": "^2.5.0",
+    "vuex": "^3.6.2"
   },
   "devDependencies": {
+    "@babel/plugin-proposal-logical-assignment-operators": "^7.14.5",
     "@vue/cli-plugin-babel": "~4.4.0",
     "@vue/cli-plugin-router": "~4.4.0",
     "@vue/cli-plugin-vuex": "~4.4.0",
-    "@vue/cli-service": "~4.4.0",
-    "sass": "^1.19.0",
+    "@vue/cli-service": "^4.5.13",
+    "sass": "~1.32",
     "sass-loader": "^8.0.0",
-    "stylus": "^0.54.7",
+    "stylus": "^0.54.8",
     "stylus-loader": "^3.0.2",
-    "vue-cli-plugin-vuetify": "~2.0.6",
-    "vue-template-compiler": "^2.6.11",
+    "vue-cli-plugin-vuetify": "^2.0.7",
+    "vue-template-compiler": "^2.6.12",
     "vuetify-loader": "^1.3.0"
-  }
+  },
+  "description": "User interface for the Dougal system.",
+  "main": "babel.config.js",
+  "repository": {
+    "type": "git",
+    "url": "git+https://gitlab.com/wgp/dougal/software.git"
+  },
+  "author": "Aaltronav s.r.o.",
+  "license": "UNLICENSED",
+  "bugs": {
+    "url": "https://gitlab.com/wgp/dougal/software/issues"
+  },
+  "homepage": "https://gitlab.com/wgp/dougal/software#readme"
 }

Binary file not shown (image changed: 4.2 KiB before, 210 KiB after).

Binary file not shown (image added: 13 KiB).


@@ -26,9 +26,16 @@
<style lang="stylus">
@import '../node_modules/typeface-roboto/index.css'
@import '../node_modules/@mdi/font/css/materialdesignicons.css'
.markdown.v-textarea textarea
font-family monospace
line-height 1.1 !important
</style>
</style>
<script>
import { mapActions } from 'vuex';
import DougalNavigation from './components/navigation';
import DougalFooter from './components/footer';
@@ -58,12 +65,27 @@ export default {
snackText (newVal) {
this.snack = !!newVal;
},
snack (newVal) {
// When the snack is hidden (one way or another), clear
// the text so that if we receive the same message again
// afterwards it will be shown. This way, if we get spammed
// we're also not triggering the snack too often.
if (!newVal) {
this.$store.commit('setSnackText', "");
}
}
},
methods: {
...mapActions(["setCredentials"])
},
mounted () {
// Local Storage values are always strings
this.$vuetify.theme.dark = localStorage.getItem("darkTheme") == "true";
this.setCredentials()
}
};


@@ -1,6 +1,7 @@
<template>
<v-menu
v-model="show"
:value="value"
@input="(e) => $emit('input', e)"
:position-x="absolute && x || undefined"
:position-y="absolute && y || undefined"
:absolute="absolute"
@@ -20,6 +21,7 @@
<dougal-context-menu v-if="item.items"
:value="showSubmenu"
:items="item.items"
+:labels="labels.concat(item.labels||[])"
@input="selected"
submenu>
<template v-slot:activator="{ on, attrs }">
@@ -55,14 +57,14 @@ export default {
props: {
value: { type: [ MouseEvent, Object, Boolean ] },
labels: { type: [ Array ], default: () => [] },
absolute: { type: Boolean, default: false },
submenu: { type: Boolean, default: false },
-items: { type: Array, default: [] }
+items: { type: Array, default: () => [] }
},
data () {
return {
show: false,
x: 0,
y: 0,
showSubmenu: false
@@ -97,7 +99,12 @@ export default {
selected (item) {
this.show = false;
-this.$emit('input', item);
+if (typeof item === 'object' && item !== null) {
+  const labels = this.labels.concat(item.labels??[]);
+  this.$emit('input', {...item, labels});
+} else {
+  this.$emit('input', item);
+}
}
}


@@ -1,395 +0,0 @@
<template>
<v-dialog
v-model="show"
max-width="600px"
>
<template v-slot:activator="{ on, attrs }">
<v-btn
class="mx-2"
fab dark
x-small
color="primary"
title="Add event"
v-bind="attrs"
v-on="on"
>
<v-icon dark>mdi-plus</v-icon>
</v-btn>
</template>
<v-card>
<v-card-title>
<span class="headline">{{ formTitle }}</span>
</v-card-title>
<v-card-text>
<v-container>
<v-row>
<v-col>
<v-textarea
v-model="remarks"
label="Description"
rows="1"
auto-grow
clearable
autofocus
filled
:hint="presetRemarks ? 'Enter your own comment or select a preset one from the menu on the left' : 'Enter a comment'"
@keyup="handleKeys"
>
<template v-slot:prepend v-if="presetRemarks">
<v-icon
title="Select predefined comments"
color="primary"
@click="showRemarksMenu"
>
mdi-dots-vertical
</v-icon>
</template>
<template v-slot:prepend v-else>
<v-icon
color="disabled"
>
mdi-dots-vertical
</v-icon>
</template>
</v-textarea>
<dougal-context-menu
:value="remarksMenu"
@input="addRemark"
:items="presetRemarks"
absolute
></dougal-context-menu>
</v-col>
</v-row>
<v-row dense>
<v-col>
<v-autocomplete
ref="labels"
v-model="labels"
:items="Object.keys(allowedLabels)"
chips
deletable-chips
multiple
label="Labels"
@input="labelSearch=null; $refs.labels.isMenuActive=false"
:search-input.sync="labelSearch"
>
<template v-slot:selection="data">
<v-chip
v-bind="data.attrs"
:input-value="data.selected"
close
@click="data.select"
@click:close="remove(data.item)"
:color="allowedLabels[data.item].view.colour"
:title="allowedLabels[data.item].view.description"
>{{data.item}}</v-chip>
</template>
<template v-slot:prepend v-if="presetLabels">
<v-icon
title="Select labels"
color="primary"
@click="showLabelsMenu"
>
mdi-dots-vertical
</v-icon>
</template>
<template v-slot:prepend v-else>
<v-icon
color="disabled"
>
mdi-dots-vertical
</v-icon>
</template>
</v-autocomplete>
</v-col>
</v-row>
<v-row dense>
<v-col>
<v-switch label="Change time" v-model="timeInput" :disabled="shotInput"></v-switch>
</v-col>
<v-col>
<v-switch label="Enter shotpoint" v-model="shotInput" :disabled="timeInput"></v-switch>
</v-col>
</v-row>
<v-row dense>
<v-col :style="{visibility: timeInput ? 'visible' : 'hidden'}">
<v-text-field v-model="tsTime" type="time" step="1" label="Time">
</v-text-field>
</v-col>
<v-col :style="{visibility: timeInput ? 'visible' : 'hidden'}">
<v-text-field v-model="tsDate" type="date" label="Date">
</v-text-field>
</v-col>
<v-col :style="{visibility: shotInput ? 'visible' : 'hidden'}">
<v-autocomplete
:items="sequenceList"
v-model="sequence"
label="Sequence"
></v-autocomplete>
</v-col>
<v-col :style="{visibility: shotInput ? 'visible' : 'hidden'}">
<v-text-field v-model="point" type="number" label="Shot">
</v-text-field>
</v-col>
</v-row>
</v-container>
</v-card-text>
<v-card-actions>
<v-spacer></v-spacer>
<v-btn color="blue darken-1" text @click="close">Cancel</v-btn>
<v-btn color="blue darken-1" text @click="save" :disabled="!isValid">Save</v-btn>
</v-card-actions>
</v-card>
</v-dialog>
</template>
<style>
</style>
<script>
import { mapActions } from 'vuex';
import DougalContextMenu from '@/components/context-menu';
export default {
name: 'DougalEventEditDialog',
components: {
DougalContextMenu
},
props: {
value: Boolean,
allowedLabels: { type: Object, default: () => {} },
sequences: { type: Object, default: null },
defaultTimestamp: { type: [ Date, String, Number, Function ], default: null },
defaultSequence: { type: Number, default: null },
defaultShotpoint: { type: Number, default: null },
eventMode: { type: String, default: "timed" },
presetRemarks: { type: [ Object, Array ], default: null },
presetLabels: { type: [ Object, Array ], default: null }
},
data () {
const tsNow = new Date;
return {
show: false,
tsDate: tsNow.toISOString().substring(0, 10),
tsTime: tsNow.toISOString().substring(11, 19),
sequenceData: null,
sequence: null,
point: null,
remarks: "",
labels: [],
labelSearch: null,
timer: null,
timeInput: false,
shotInput: false,
remarksMenu: false,
menuX: 0,
menuY: 0,
}
},
computed: {
eventType () {
return this.timeInput
? "timed"
: this.shotInput
? "seq"
: this.eventMode;
},
formTitle () {
if (this.eventType == "seq") {
return `New event at shotpoint ${this.shot.point}`;
} else {
return "New event at time "+this.tstamp.toISOString().replace(/(.{10})T(.{8}).{4}Z$/, "$1 $2");
}
},
defaultTimestampAsDate () {
if (this.defaultTimestamp instanceof Date) {
return this.defaultTimestamp;
} else if (typeof this.defaultTimestamp == 'string') {
return new Date(this.defaultTimestamp);
} else if (typeof this.defaultTimestamp == 'number') {
return new Date(this.defaultTimestamp);
} else if (typeof this.defaultTimestamp == 'function') {
return new Date(this.defaultTimestamp());
}
},
tstamp () {
return this.timeInput
? new Date(this.tsDate+"T"+this.tsTime+"Z")
: this.defaultTimestampAsDate || new Date();
},
shot () {
return this.shotInput
? { sequence: this.sequence, point: Number(this.point) }
: { sequence: this.defaultSequence, point: this.defaultShotpoint };
},
isTimedEvent () {
return Boolean((this.timeInput && this.tstamp) ||
(this.defaultTimestampAsDate && !this.shotInput));
},
isShotEvent () {
return Boolean((this.shotInput && this.shot.sequence && this.shot.point) ||
(this.defaultSequence && this.defaultShotpoint && !this.timeInput));
},
isValid () {
if (this.isTimedEvent) {
return !isNaN(this.tstamp) &&
((this.remarks && this.remarks.trim()) || this.labels.length);
}
if (this.isShotEvent) {
return Number(this.sequence) && Number(this.point) &&
((this.remarks && this.remarks.trim()) || this.labels.length);
}
return false;
},
sequenceList () {
const seq = this.sequences || this.sequenceData || [];
return seq.map(s => s.sequence).sort((a,b) => b-a);
},
eventData () {
if (!this.isValid) {
return null;
}
const data = {}
data.remarks = this.remarks.trim();
if (this.labels) {
data.labels = this.labels;
}
if (this.isTimedEvent) {
data.tstamp = this.tstamp;
} else if (this.isShotEvent) {
data.sequence = this.shot.sequence;
data.point = this.shot.point;
}
return data;
}
},
watch: {
async show (value) {
this.$emit('input', value);
if (value) {
this.updateTimeFields();
await this.updateSequences();
this.sequence = this.defaultSequence;
this.point = this.defaultShotpoint;
this.shotInput = this.eventMode == "seq";
}
},
value (v) {
if (v != this.show) {
this.show = v;
}
}
},
methods: {
clear () {
this.timeInput = false;
this.shotInput = false;
this.remarks = "";
this.labels = [];
},
close () {
this.show = false;
this.clear();
},
save () {
this.$emit('save', this.eventData);
this.close();
},
remove (item) {
this.labels.splice(this.labels.indexOf(item), 1);
},
updateTimeFields () {
const tsNow = new Date;
this.tsDate = tsNow.toISOString().substring(0, 10);
this.tsTime = tsNow.toISOString().substring(11, 19);
},
async updateSequences () {
if (this.sequences == null) {
const url = `/project/${this.$route.params.project}/sequence`;
this.sequenceData = await this.api([url]) || null
}
this.sequence = this.sequenceList.reduce( (a, b) => Math.max(a, b) );
},
showRemarksMenu (e) {
this.remarksMenu = e;
},
addRemark ({text}) {
if (text) {
if (this.remarks === null) {
this.remarks = "";
}
if (this.remarks.length && this.remarks[this.remarks.length-1] != "\n") {
this.remarks += "\n";
}
this.remarks += text;
}
},
handleKeys (e) {
if (e.ctrlKey && !e.altKey && !e.shiftKey && !e.metaKey && e.keyCode == 13) {
// Ctrl+Enter
if (this.isValid) {
this.save();
}
}
},
...mapActions(["api"])
}
};
</script>


@@ -0,0 +1,240 @@
<template>
<v-dialog
v-model="dialog"
style="z-index:2020;"
>
<template v-slot:activator="{ on, attrs }">
<v-btn
class="hover"
icon
small
title="This entry has edits. Click to view history."
:disabled="disabled"
v-on="on"
>
<v-icon small>mdi-playlist-edit</v-icon>
</v-btn>
</template>
<v-card>
<v-card-title class="headline">
Event history
</v-card-title>
<v-card-text>
<p>Event ID: {{ id }}</p>
<v-data-table
dense
class="small"
:headers="headers"
:items="rows"
item-key="uid"
sort-by="uid"
:sort-desc="true"
:loading="loading"
fixed-header
:footer-props='{itemsPerPageOptions: [ 10, 25, 50, 100, 500, -1 ]}'
>
<template v-slot:item.tstamp="{value}">
<span style="white-space:nowrap;" v-if="value">
{{ value.replace(/(.{10})T(.{8}).{4}Z$/, "$1 $2") }}
</span>
</template>
<template v-slot:item.remarks="{item}">
<template>
<div>
<span v-if="item.labels.length">
<v-chip v-for="label in item.labels"
class="mr-1 px-2 underline-on-hover"
x-small
:color="labels[label] && labels[label].view.colour"
:title="labels[label] && labels[label].view.description"
:key="label"
:href="$route.path+'?label='+encodeURIComponent(label)"
>{{label}}</v-chip>
</span>
<span v-html="$options.filters.markdownInline(item.remarks)">
</span>
</div>
</template>
</template>
<template v-slot:item.valid_from="{item}">
<span style="white-space:nowrap;" v-if="item.validity[1]">
{{ item.validity[1].replace(/(.{10})[T ](.{8}).{4,}(Z|[+-][\d]+)$/, "$1 $2") }}
</span>
<span v-else>
</span>
</template>
<template v-slot:item.valid_until="{item}">
<span style="white-space:nowrap;" v-if="item.validity[2]">
{{ item.validity[2].replace(/(.{10})[T ](.{8}).{4,}(Z|[+-][\d]+)$/, "$1 $2") }}
</span>
<span v-else>
</span>
</template>
<!-- Actions column -->
<template v-slot:item.actions="{ item }">
<div style="white-space:nowrap;">
<!-- NOTE Kind of cheating here by assuming that there will be
no items with *future* validity. -->
<template v-if="item.validity[2]">
<v-btn v-if="!item.meta.readonly"
class="hover"
icon
small
title="Restore"
:disabled="loading"
@click=restoreEvent(item)
>
<v-icon small>mdi-history</v-icon>
</v-btn>
<v-btn v-else
class="hover off"
icon
small
title="This event is read-only"
:disabled="loading"
>
<v-icon small>mdi-lock-reset</v-icon>
</v-btn>
</template>
</div>
</template>
</v-data-table>
</v-card-text>
</v-card>
</v-dialog>
</template>
<style scoped>
.hover {
opacity: 0.4;
}
.hover:hover {
opacity: 1;
}
.hover.off:hover {
opacity: 0.4;
}
.small >>> td, .small >>> th {
font-size: 85% !important;
}
</style>
<script>
import { mapActions, mapGetters } from 'vuex';
export default {
name: 'DougalEventEditHistory',
props: {
id: { type: Number },
disabled: { type: Boolean, default: false },
labels: { default: () => ({}) }
},
data () {
return {
dialog: false,
rows: [],
headers: [
{
value: "tstamp",
text: "Timestamp",
width: "20ex"
},
{
value: "sequence",
text: "Sequence",
align: "end",
width: "10ex"
},
{
value: "point",
text: "Shotpoint",
align: "end",
width: "10ex"
},
{
value: "remarks",
text: "Text",
width: "100%"
},
{
value: "valid_from",
text: "Valid From"
},
{
value: "valid_until",
text: "Valid Until"
},
{
value: "actions",
text: "Actions",
sortable: false
}
]
};
},
computed: {
...mapGetters(['loading', 'serverEvent'])
},
watch: {
dialog (val) {
if (!val) {
this.rows = [];
} else {
this.getEventHistory();
}
},
async serverEvent (event) {
if (event.channel == "event" &&
(event.payload?.new?.id ?? event.payload?.old?.id) == this.id) {
// The event that we're viewing has been refreshed (possibly by us)
this.getEventHistory();
}
}
},
methods: {
async getEventHistory () {
const url = `/project/${this.$route.params.project}/event/${this.id}`;
this.rows = (await this.api([url]) || []).map(row => {
row.valid_from = row.validity[1] ?? -Infinity;
row.valid_until = row.validity[2] ?? +Infinity;
return row;
});
},
async restoreEvent (item) {
if (item.id) {
const url = `/project/${this.$route.params.project}/event/${item.id}`;
await this.api([url, {
method: "PUT",
body: item // NOTE Sending extra attributes in the body may cause trouble down the line
}]);
}
},
...mapActions(["api"])
}
};
</script>


@@ -0,0 +1,208 @@
<template>
<v-dialog
:value="value"
@input="(e) => $emit('input', e)"
max-width="600"
>
<v-card>
<v-toolbar
flat
color="transparent"
>
<v-toolbar-title>Event labels</v-toolbar-title>
<v-spacer></v-spacer>
<v-btn
icon
@click="$refs.search.focus()"
>
<v-icon>mdi-magnify</v-icon>
</v-btn>
</v-toolbar>
<v-container class="py-0">
<v-row
align="center"
justify="start"
>
<v-col
v-for="(item, i) in selection"
:key="item.text"
class="shrink"
>
<v-chip
:disabled="loading"
small
:color="item.colour"
:title="item.title"
close
@click:close="selection.splice(i, 1)"
>
<v-icon
left
v-text="item.icon"
></v-icon>
{{ item.text }}
</v-chip>
</v-col>
<v-col v-if="!allSelected"
cols="12"
>
<v-text-field
ref="search"
v-model="search"
full-width
hide-details
label="Search"
single-line
></v-text-field>
</v-col>
</v-row>
</v-container>
<v-divider v-if="!allSelected"></v-divider>
<v-list dense style="max-height:600px;overflow-y:auto;">
<template v-for="item in categories">
<v-list-item v-if="!selection.find(i => i.text == item.text)"
dense
:key="item.text"
:disabled="loading"
@click="selection.push(item)"
>
<v-list-item-avatar
class="my-0"
width="12ex"
>
<v-chip
x-small
:color="item.colour"
:title="item.title"
>{{item.text}}</v-chip>
</v-list-item-avatar>
<v-list-item-title v-text="item.title"></v-list-item-title>
</v-list-item>
</template>
</v-list>
<v-divider></v-divider>
<v-card-actions>
<v-btn
:loading="loading"
color="warning"
text
@click="close"
>
Cancel
</v-btn>
<v-spacer></v-spacer>
<v-btn
:disabled="!dirty"
:loading="loading"
color="primary"
text
@click="save"
>
Save
</v-btn>
</v-card-actions>
</v-card>
</v-dialog>
</template>
<script>
function stringSort (a, b) {
return a == b
? 0
: a < b
? -1
: +1;
}
export default {
name: 'DougalEventEditLabels',
props: {
value: { default: false },
labels: { type: Object },
selected: {type: Array },
loading: { type: Boolean, default: false }
},
data: () => ({
dialog: false,
search: '',
selection: [],
}),
computed: {
allSelected () {
return this.selection.length === this.items.length
},
dirty () {
// Checks if the arrays have the same elements
return !this.selection.every(i => this.selected.includes(i.text)) ||
!this.selected.every(i => this.selection.find(j => j.text==i));
},
categories () {
const search = this.search.toLowerCase()
if (!search) return this.items
return this.items.filter(item => {
const text = item.text.toLowerCase();
const title = item.title.toLowerCase();
return text.includes(search) || title.includes(search);
}).sort( (a, b) => stringSort(a.text, b.text) )
},
items () {
return Object.keys(this.labels).map(this.labelToItem);
}
},
watch: {
value () {
this.dialog = this.value;
if (this.dialog) {
this.$nextTick(() => this.$refs.search?.focus());
}
},
selected () {
this.selection = this.selected.map(this.labelToItem)
},
selection () {
this.search = '';
this.$refs.search?.focus();
},
},
methods: {
labelToItem (k) {
return {
text: k,
icon: this.labels[k].view?.icon,
colour: this.labels[k].view?.colour,
title: this.labels[k].view?.description
};
},
close () {
this.selection = this.selected.map(this.labelToItem)
this.$emit("input", false);
},
save () {
this.$emit("selectionChanged", {labels: this.selection.map(i => i.text)});
this.$emit("input", false);
},
},
}
</script>


@@ -0,0 +1,679 @@
<template>
<v-dialog
:value="value"
@input="(e) => $emit('input', e)"
max-width="600"
>
<template v-slot:activator="{ on, attrs }">
<v-btn
class="mx-2"
fab dark
x-small
color="primary"
title="Add event"
@click="(e) => $emit('new', e)"
v-bind="attrs"
v-on="on"
>
<v-icon dark>mdi-plus</v-icon>
</v-btn>
</template>
<v-card>
<v-toolbar
flat
color="transparent"
>
<v-toolbar-title>Event</v-toolbar-title>
<v-spacer></v-spacer>
</v-toolbar>
<v-container class="py-0">
<v-row dense>
<v-col>
<v-menu
v-model="dateMenu"
:close-on-content-click="false"
:nudge-right="40"
transition="scale-transition"
offset-y
min-width="auto"
>
<template v-slot:activator="{ on, attrs }">
<v-text-field
v-model="tsDate"
:disabled="!!(sequence || point || entrySequence || entryPoint)"
label="Date"
suffix="UTC"
prepend-icon="mdi-calendar"
readonly
v-bind="attrs"
v-on="on"
@change="updateAncillaryData"
></v-text-field>
</template>
<v-date-picker
v-model="tsDate"
@input="dateMenu = false"
></v-date-picker>
</v-menu>
</v-col>
<v-col>
<v-text-field
v-model="tsTime"
:disabled="!!(sequence || point || entrySequence || entryPoint)"
label="Time"
suffix="UTC"
prepend-icon="mdi-clock-outline"
type="time"
step="1"
@change="updateAncillaryData"
>
<template v-slot:prepend>
<v-menu
v-model="timeMenu"
:close-on-content-click="false"
:nudge-right="40"
transition="scale-transition"
offset-y
min-width="auto"
>
<template v-slot:activator="{ on, attrs }">
<v-icon v-on="on" v-bind="attrs">mdi-clock-outline</v-icon>
</template>
<v-time-picker
v-model="tsTime"
format="24hr"
></v-time-picker>
</v-menu>
</template>
</v-text-field>
</v-col>
</v-row>
<v-row dense>
<v-col>
<v-text-field
v-model="entrySequence"
type="number"
min="1"
step="1"
label="Sequence"
prepend-icon="mdi-format-list-bulleted"
@change="updateAncillaryData"
>
</v-text-field>
</v-col>
<v-col>
<v-text-field
v-model="entryPoint"
type="number"
min="1"
step="1"
label="Point"
prepend-icon="mdi-map-marker-circle"
@change="updateAncillaryData"
>
</v-text-field>
</v-col>
</v-row>
<v-row dense>
<v-col cols="12">
<v-combobox
ref="remarks"
v-model="entryRemarks"
:disabled="loading"
:search-input.sync="entryRemarksInput"
:items="remarksAvailable"
:filter="searchRemarks"
item-text="text"
return-object
label="Remarks"
prepend-icon="mdi-text-box-outline"
append-outer-icon="mdi-magnify"
@click:append-outer="(e) => remarksMenu = e"
></v-combobox>
<dougal-context-menu
:value="remarksMenu"
@input="handleRemarksMenu"
:items="presetRemarks"
absolute
></dougal-context-menu>
</v-col>
</v-row>
<v-row dense>
<v-col cols="12">
<v-autocomplete
ref="labels"
v-model="entryLabels"
:items="categories"
multiple
menu-props="closeOnClick, closeOnContentClick"
attach
chips
label="Labels"
prepend-icon="mdi-tag-multiple"
append-outer-icon="mdi-magnify"
@click:append-outer="() => $refs.labels.focus()"
>
<template v-slot:selection="{ item, index, select, selected, disabled }">
<v-chip
:disabled="loading"
small
light
:color="item.colour"
:title="item.title"
close
@click:close="entryLabels.splice(index, 1)"
>
<v-icon
left
v-text="item.icon"
></v-icon>
{{ item.text }}
</v-chip>
</template>
<template v-slot:item="{ item }">
<v-list-item-avatar
class="my-0"
width="12ex"
>
<v-chip
x-small
light
:color="item.colour"
:title="item.title"
>{{item.text}}</v-chip>
</v-list-item-avatar>
<v-list-item-title v-text="item.title"></v-list-item-title>
</template>
</v-autocomplete>
</v-col>
</v-row>
<v-row dense>
<v-col>
<v-text-field
v-model="entryLatitude"
label="Latitude"
prepend-icon="φ"
disabled
>
<template v-slot:append-outer>
<v-icon v-if="false/*TODO*/"
title="Click to set position"
@click="1==1/*TODO*/"
>mdi-crosshairs-gps</v-icon>
<v-icon v-else
disabled
title="No GNSS available"
>mdi-crosshairs</v-icon>
</template>
</v-text-field>
</v-col>
<v-col>
<v-text-field
v-model="entryLongitude"
label="Longitude"
prepend-icon="λ"
disabled
>
<template v-slot:append-outer>
<v-icon v-if="false"
title="Click to set position"
@click="getPosition"
>mdi-crosshairs-gps</v-icon>
<v-icon v-else
title="No GNSS available"
disabled
>mdi-crosshairs</v-icon>
</template>
</v-text-field>
</v-col>
</v-row>
</v-container>
<v-divider></v-divider>
<v-card-actions>
<v-btn
color="warning"
text
@click="close"
>
Cancel
</v-btn>
<v-spacer></v-spacer>
<v-btn
:disabled="!canSave"
:loading="loading"
color="primary"
text
@click="save"
>
Save
</v-btn>
</v-card-actions>
</v-card>
</v-dialog>
</template>
<style>
/* https://github.com/vuetifyjs/vuetify/issues/471 */
.v-dialog {
overflow-y: initial;
}
</style>
<script>
import { mapActions } from 'vuex';
import DougalContextMenu from '@/components/context-menu';
function stringSort (a, b) {
return a == b
? 0
: a < b
? -1
: +1;
}
function flattenRemarks(items, keywords=[], labels=[]) {
const result = [];
if (items) {
for (const item of items) {
if (!item.items) {
result.push({
text: item.text,
labels: labels.concat(item.labels??[]),
keywords
})
} else {
const k = [...keywords, item.text];
const l = [...labels, ...(item.labels??[])];
result.push(...flattenRemarks(item.items, k, l))
}
}
}
return result;
}
/** Compare two arrays
 *
 * @param a First array
 * @param b Second array
 * @param cbB Callback to transform elements of `b`
 *
 * @return true if the sets are distinct, false otherwise
 *
 * Note that this will not work with object or other complex
 * elements unless the array members are the same object (as
 * opposed to merely identical).
 */
function distinctSets(a, b, cbB = (i) => i) {
return !a.every(i => b.map(cbB).includes(i)) ||
!b.map(cbB).every(i => a.find(j => j==i));
}
export default {
name: 'DougalEventEdit',
components: {
DougalContextMenu
},
props: {
value: { default: false },
availableLabels: { type: Object, default: () => ({}) },
presetRemarks: { type: Array, default: () => [] },
id: { type: Number },
tstamp: { type: String },
sequence: { type: Number },
point: { type: Number },
remarks: { type: String },
labels: { type: Array, default: () => [] },
latitude: { type: Number },
longitude: { type: Number },
loading: { type: Boolean, default: false }
},
data: () => ({
dateMenu: false,
timeMenu: false,
remarksMenu: false,
search: '',
entryLabels: [],
tsDate: null,
tsTime: null,
entrySequence: null,
entryPoint: null,
entryRemarks: null,
entryRemarksInput: null,
entryLatitude: null,
entryLongitude: null
}),
computed: {
remarksAvailable () {
return this.entryRemarksInput == this.entryRemarks?.text ||
this.entryRemarksInput == this.entryRemarks
? []
: flattenRemarks(this.presetRemarks);
},
allSelected () {
return this.entryLabels.length === this.items.length
},
dirty () {
// Selected remark distinct from input remark
if (this.entryRemarksText != this.remarks) {
return true;
}
// The user is editing the remarks
if (this.entryRemarksText != this.entryRemarksInput) {
return true;
}
// Selected label set distinct from input labels
if (distinctSets(this.selectedLabels, this.entryLabels, (i) => i.text)) {
return true;
}
// Selected seqpoint distinct from input seqpoint (if seqpoint present)
if ((this.entrySequence || this.entryPoint)) {
if (this.entrySequence != this.sequence || this.entryPoint != this.point) {
return true;
}
} else {
// Selected timestamp distinct from input timestamp (if no seqpoint)
const epoch = Date.parse(this.tstamp);
const entryEpoch = Date.parse(`${this.tsDate} ${this.tsTime}Z`);
// Ignore difference of less than one second
if (Math.abs(entryEpoch - epoch) > 1000) {
return true;
}
}
return false;
},
canSave () {
// There is either tstamp or seqpoint, latter wins
if (!(this.entrySequence && this.entryPoint) && !this.entryTstamp) {
return false;
}
// There are remarks and/or labels
if (!this.entryRemarksText && !this.entryLabels.length) {
return false;
}
// Form is dirty
if (!this.dirty) {
return false;
}
return true;
},
categories () {
const search = this.search.toLowerCase()
if (!search) return this.items
return this.items.filter(item => {
const text = item.text.toLowerCase();
const title = item.title.toLowerCase();
return text.includes(search) || title.includes(search);
}).sort( (a, b) => stringSort(a.text, b.text) )
},
items () {
return Object.keys(this.availableLabels).map(this.labelToItem);
},
selectedLabels () {
return this.event?.labels ?? [];
},
entryTstamp () {
const ts = new Date(Date.parse(`${this.tsDate} ${this.tsTime}Z`));
if (isNaN(ts)) {
return null;
}
return ts.toISOString();
},
entryRemarksText () {
return typeof this.entryRemarks === 'string'
? this.entryRemarks
: this.entryRemarks?.text;
}
},
watch: {
value () {
if (this.value) {
// Populate fields from properties
if (!this.tstamp && !this.sequence && !this.point) {
const ts = (new Date()).toISOString();
this.tsDate = ts.substr(0, 10);
this.tsTime = ts.substr(11, 8);
} else if (this.tstamp) {
this.tsDate = this.tstamp.substr(0, 10);
this.tsTime = this.tstamp.substr(11, 8);
}
// NOTE Dead code
if (this.meta?.geometry?.type == "Point") {
this.entryLongitude = this.meta.geometry.coordinates[0];
this.entryLatitude = this.meta.geometry.coordinates[1];
}
this.entryLatitude = this.latitude;
this.entryLongitude = this.longitude;
this.entrySequence = this.sequence;
this.entryPoint = this.point;
this.entryRemarks = this.remarks;
this.entryLabels = [...(this.labels??[])];
// Focus remarks field
this.$nextTick(() => this.$refs.remarks.focus());
}
},
tstamp () {
if (this.tstamp) {
this.tsDate = this.tstamp.substr(0, 10);
this.tsTime = this.tstamp.substr(11, 8);
} else if (this.sequence || this.point) {
this.tsDate = null;
this.tsTime = null;
} else {
const ts = (new Date()).toISOString();
this.tsDate = ts.substr(0, 10);
this.tsTime = ts.substr(11, 8);
}
},
sequence () {
if (this.sequence && !this.tstamp) {
this.tsDate = null;
this.tsTime = null;
}
},
point () {
if (this.point && !this.tstamp) {
this.tsDate = null;
this.tsTime = null;
}
},
entryTstamp (n, o) {
//this.updateAncillaryData();
},
entrySequence (n, o) {
//this.updateAncillaryData();
},
entryPoint (n, o) {
//this.updateAncillaryData();
},
entryRemarks () {
if (this.entryRemarks?.labels) {
this.entryLabels = [...this.entryRemarks.labels];
} else if (!this.entryRemarks) {
this.entryLabels = [];
}
},
selectedLabels () {
this.entryLabels = this.selectedLabels.map(this.labelToItem)
},
entryLabels () {
this.search = '';
},
},
methods: {
labelToItem (k) {
return {
text: k,
icon: this.availableLabels[k].view?.icon,
colour: this.availableLabels[k].view?.colour,
title: this.availableLabels[k].view?.description
};
},
searchRemarks (item, queryText, itemText) {
const needle = queryText.toLowerCase();
const text = item.text.toLowerCase();
const keywords = item.keywords.map(i => i.toLowerCase());
const labels = item.labels.map(i => i.toLowerCase());
return text.includes(needle) ||
keywords.some(i => i.includes(needle)) ||
labels.some(i => i.includes(needle));
},
handleRemarksMenu (event) {
if (typeof event == 'boolean') {
this.remarksMenu = event;
} else {
this.entryRemarks = event;
this.remarksMenu = false;
}
},
async getPointData () {
const url = `/project/${this.$route.params.project}/sequence/${this.entrySequence}/${this.entryPoint}`;
return await this.api([url]);
},
async getTstampData () {
const url = `/navdata?q=tstamp:${this.entryTstamp}&tolerance:2500`;
return await this.api([url]);
},
async updateAncillaryData () {
if (this.entrySequence && this.entryPoint) {
// Fetch data for this sequence / point
const data = await this.getPointData();
if (data?.tstamp) {
this.tsDate = data.tstamp.substr(0, 10);
this.tsTime = data.tstamp.substr(11, 8);
}
if (data?.geometry) {
this.entryLongitude = (data?.geometry?.coordinates??[])[0];
this.entryLatitude = (data?.geometry?.coordinates??[])[1];
}
} else if (!this.entrySequence && !this.entryPoint && this.entryTstamp) {
// Fetch data for this timestamp
const data = ((await this.getTstampData())??[])[0];
console.log("TS DATA", data);
if (data?._sequence && data?.shot) {
this.entrySequence = Number(data._sequence);
this.entryPoint = data.shot;
}
if (data?.tstamp) {
this.tsDate = data.tstamp.substr(0, 10);
this.tsTime = data.tstamp.substr(11, 8);
}
if (data?.longitude && data?.latitude) {
this.entryLongitude = data.longitude;
this.entryLatitude = data.latitude;
}
}
},
close () {
this.entryLabels = this.selectedLabels.map(this.labelToItem)
this.$emit("input", false);
},
save () {
// In case the focus goes directly from the remarks field
// to the Save button.
if (this.entryRemarksInput != this.entryRemarksText) {
this.entryRemarks = this.entryRemarksInput;
}
const data = {
id: this.id,
remarks: this.entryRemarksText,
labels: this.entryLabels
};
/* NOTE This is the purist way.
* Where we expect that the server will match
* timestamps with shotpoints and so on
*
if (this.entrySequence && this.entryPoint) {
data.sequence = this.entrySequence;
data.point = this.entryPoint;
} else {
data.tstamp = this.entryTstamp;
}
*/
/* NOTE And this is the pragmatic way.
*/
data.tstamp = this.entryTstamp;
if (this.entrySequence && this.entryPoint) {
data.sequence = this.entrySequence;
data.point = this.entryPoint;
}
this.$emit("changed", data);
this.$emit("input", false);
},
...mapActions(["api"])
},
}
</script>


@@ -5,10 +5,15 @@
<v-spacer></v-spacer>
-<small>&copy; {{year}} <a href="https://aaltronav.eu/" target="_blank" class="brand">Aaltronav</a></small>
+<small class="d-none d-sm-inline">&copy; {{year}} <a href="https://aaltronav.eu/" target="_blank" class="brand">Aaltronav</a></small>
<v-spacer></v-spacer>
<v-icon v-if="serverConnected" class="mr-6" small title="Connected to server">mdi-lan-connect</v-icon>
<v-icon v-else class="mr-6" small color="red" title="Server connection lost (we'll reconnect automatically when the server comes back)">mdi-lan-disconnect</v-icon>
<dougal-notifications-control class="mr-6"></dougal-notifications-control>
<div title="Night mode">
<v-switch
class="ma-auto"
@@ -26,28 +31,33 @@
font-family: "Bank Gothic Medium";
src: local("Bank Gothic Medium"), url("/fonts/bank-gothic-medium.woff");
}
.brand {
font-family: "Bank Gothic Medium";
}
</style>
<script>
import { mapState } from 'vuex';
import DougalHelpDialog from '@/components/help-dialog';
+import DougalNotificationsControl from '@/components/notifications-control';
export default {
name: 'DougalFooter',
components: {
-DougalHelpDialog
+DougalHelpDialog,
+DougalNotificationsControl
},
computed: {
year () {
const date = new Date();
return date.getUTCFullYear();
}
},
...mapState({serverConnected: state => state.notify.serverConnected})
}
};
</script>


@@ -0,0 +1,363 @@
<template>
<v-card style="min-height:400px;">
<v-card-title class="headline">
Array inline / crossline error
<v-spacer></v-spacer>
<v-switch v-model="scatterplot" label="Scatterplot"></v-switch>
<v-switch class="ml-4" v-model="histogram" label="Histogram"></v-switch>
</v-card-title>
<v-container fluid fill-height>
<v-row>
<v-col>
<div class="graph-container" ref="graph0"></div>
</v-col>
</v-row>
<v-row v-show="scatterplot">
<v-col>
<div class="graph-container" ref="graph1"></div>
</v-col>
</v-row>
<v-row v-show="histogram">
<v-col>
<div class="graph-container" ref="graph2"></div>
</v-col>
</v-row>
</v-container>
<v-overlay :value="busy" absolute z-index="1">
<v-progress-circular indeterminate></v-progress-circular>
</v-overlay>
</v-card>
</template>
<style scoped>
.graph-container {
background-color: red;
width: 100%;
height: 100%;
}
</style>
<script>
import Plotly from 'plotly.js-dist';
import { mapActions, mapGetters } from 'vuex';
import unpack from '@/lib/unpack.js';
export default {
name: 'DougalGraphArraysIJScatter',
props: [ "data", "settings" ],
data () {
return {
graph: [],
busy: false,
resizeObserver: null,
scatterplot: false,
histogram: false
};
},
computed: {
//...mapGetters(['apiUrl'])
},
watch: {
data (newVal, oldVal) {
if (newVal === null) {
this.busy = true;
} else {
this.busy = false;
this.plot();
}
},
settings () {
for (const key in this.settings) {
this[key] = this.settings[key];
}
},
histogram () {
this.plot();
this.$emit("update:settings", {[`${this.$options.name}.histogram`]: this.histogram});
},
scatterplot () {
this.plot();
this.$emit("update:settings", {[`${this.$options.name}.scatterplot`]: this.scatterplot});
}
},
methods: {
plot () {
this.plotSeries();
if (this.histogram) {
this.plotHistogram();
}
if (this.scatterplot) {
this.plotScatter();
}
},
plotSeries () {
if (!this.data) {
return;
}
function transform (d, idx=0, otherParams={}) {
const errortype = d.errorfinal ? "errorfinal" : "errorraw";
const coords = unpack(unpack(d, errortype), "coordinates");
const x = unpack(d, "point");
const y = unpack(coords, idx);
const data = {
type: "scatter",
mode: "lines",
x,
y,
transforms: [{
type: "groupby",
groups: unpack(unpack(d, "meta"), "src_number"),
styles: [
{target: 1, value: {line: {color: "green"}}},
{target: 2, value: {line: {color: "red"}}}
]
}],
...otherParams
};
return data;
}
const data = [
transform(this.data.items, 1, {
xaxis: 'x',
yaxis: 'y',
name: 'Crossline'
}),
transform(this.data.items, 0, {
xaxis: 'x',
yaxis: 'y2',
name: 'Inline'
})
];
this.busy = false;
const layout = {
//autosize: true,
title: {text: "Inline / crossline error sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
yaxis2: {
title: "Crossline (m)",
anchor: "y2",
domain: [ 0.55, 1 ]
},
yaxis: {
title: "Inline (m)",
anchor: "y1",
domain: [ 0, 0.45 ]
},
xaxis: {
title: "Shotpoint",
anchor: "x1"
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph[0] = Plotly.newPlot(this.$refs.graph0, data, layout, config);
},
plotScatter () {
console.log("plot");
if (!this.data) {
console.log("missing data");
return;
}
console.log("Will plot sequence", this.data.meta.project, this.data.meta.sequence);
function transform (d) {
const errortype = d.errorfinal ? "errorfinal" : "errorraw";
const coords = unpack(unpack(d, errortype), "coordinates");
const x = unpack(coords, 0);
const y = unpack(coords, 1);
const data = [{
type: "scatter",
mode: "markers",
x,
y,
transforms: [{
type: "groupby",
groups: unpack(unpack(d, "meta"), "src_number"),
styles: [
{target: 1, value: {line: {color: "green"}}},
{target: 2, value: {line: {color: "red"}}}
]
}]
}];
return data;
}
const data = transform(this.data.items);
this.busy = false;
const layout = {
//autosize: true,
//title: {text: "Inline / crossline error sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
yaxis: {
title: "Inline (m)",
//zeroline: false
},
xaxis: {
title: "Crossline (m)"
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph[1] = Plotly.newPlot(this.$refs.graph1, data, layout, config);
},
plotHistogram () {
if (!this.data) {
console.log("missing data");
return;
}
function transform (d, idx=0, otherParams={}) {
const errortype = d.errorfinal ? "errorfinal" : "errorraw";
const coords = unpack(unpack(d, errortype), "coordinates");
const x = unpack(coords, idx);
const data = {
type: "histogram",
histnorm: 'probability',
x,
transforms: [{
type: "groupby",
groups: unpack(unpack(d, "meta"), "src_number"),
styles: [
{target: 1, value: {marker: {color: "rgba(129, 199, 132, 0.9)"}}},
{target: 2, value: {marker: {color: "rgba(229, 115, 115, 0.9)"}}}
]
}],
...otherParams
};
return data;
}
const data = [
transform(this.data.items, 0, {
xaxis: 'x',
yaxis: 'y',
name: 'Crossline'
}),
transform(this.data.items, 1, {
xaxis: 'x2',
yaxis: 'y',
name: 'Inline'
})
];
const layout = {
//autosize: true,
//title: {text: "Inline / crossline error sequence %{meta.sequence}"},
legend: {
title: { text: "Array" }
},
xaxis: {
title: "Crossline distance (m)",
domain: [ 0, 0.45 ],
anchor: 'x1'
},
yaxis: {
title: "Frequency (01)",
domain: [ 0, 1 ],
anchor: 'y1'
},
xaxis2: {
title: "Inline distance (m)",
domain: [ 0.55, 1 ],
anchor: 'x2'
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.busy = false;
console.log(data);
console.log(layout);
this.graph[2] = Plotly.newPlot(this.$refs.graph2, data, layout, config);
},
replot () {
if (!this.graph.length) {
return;
}
console.log("Replotting");
this.graph.forEach( (graph, idx) => {
const ref = this.$refs["graph"+idx];
Plotly.relayout(ref, {
width: ref.clientWidth,
height: ref.clientHeight
});
});
},
},
async mounted () {
if (this.data) {
this.plot();
} else {
this.busy = true;
}
this.resizeObserver = new ResizeObserver(this.replot)
this.resizeObserver.observe(this.$refs.graph0);
this.resizeObserver.observe(this.$refs.graph1);
this.resizeObserver.observe(this.$refs.graph2);
},
beforeDestroy () {
if (this.resizeObserver) {
this.resizeObserver.unobserve(this.$refs.graph2);
this.resizeObserver.unobserve(this.$refs.graph1);
this.resizeObserver.unobserve(this.$refs.graph0);
}
}
};
</script>


@@ -0,0 +1,364 @@
<template>
<v-card style="min-height:400px;">
<v-card-title class="headline">
Gun depth
<v-spacer></v-spacer>
<v-switch v-model="shotpoint" label="Shotpoint"></v-switch>
<v-switch class="ml-4" v-model="violinplot" label="Violin plot"></v-switch>
</v-card-title>
<v-container fluid fill-height>
<v-row>
<v-col>
<div class="graph-container" ref="graphSeries"></div>
</v-col>
</v-row>
<v-row v-show="shotpoint">
<v-col>
<div class="graph-container" ref="graphBar"></div>
</v-col>
</v-row>
<v-row v-show="violinplot">
<v-col>
<div class="graph-container" ref="graphViolin"></div>
</v-col>
</v-row>
</v-container>
<v-overlay :value="busy" absolute z-index="1">
<v-progress-circular indeterminate></v-progress-circular>
</v-overlay>
</v-card>
</template>
<style scoped>
.graph-container {
width: 100%;
height: 100%;
}
</style>
<script>
import * as d3a from 'd3-array';
import Plotly from 'plotly.js-dist';
import { mapActions, mapGetters } from 'vuex';
import unpack from '@/lib/unpack.js';
import * as aes from '@/lib/graphs/aesthetics.js';
export default {
name: 'DougalGraphGunsDepth',
props: [ "data", "settings" ],
data () {
return {
graph: null,
graphHover: null,
busy: false,
resizeObserver: null,
shotpoint: true,
violinplot: false
};
},
computed: {
//...mapGetters(['apiUrl'])
},
watch: {
data (newVal, oldVal) {
console.log("data changed");
if (newVal === null) {
this.busy = true;
} else {
this.busy = false;
this.plot();
}
},
settings () {
for (const key in this.settings) {
this[key] = this.settings[key];
}
},
shotpoint () {
if (this.shotpoint) {
this.replot();
}
this.$emit("update:settings", {[`${this.$options.name}.shotpoint`]: this.shotpoint});
},
violinplot () {
if (this.violinplot) {
this.plotViolin();
}
this.$emit("update:settings", {[`${this.$options.name}.violinplot`]: this.violinplot});
}
},
methods: {
plot () {
this.plotSeries();
if (this.violinplot) {
this.plotViolin();
}
},
async plotSeries () {
function transformSeries (d, src_number, otherParams={}) {
const meta = src_number
? unpack(d, "meta").filter( s => s.src_number == src_number )
: unpack(d, "meta");
const guns = unpack(meta, "guns").map(s => s.filter(g => g[2] == src_number));;
const gunDepths = guns.map(s => s.map(g => g[10]));
const gunDepthsSorted = gunDepths.map(s => d3a.sort(s));
const gunsAvgDepth = gunDepths.map( (s, sidx) => d3a.mean(s) );
const x = src_number
? unpack(d.filter(s => s.meta.src_number == src_number), "point")
: unpack(d, "point");
const tracesGunDepths = [{
type: "scatter",
mode: "lines",
x,
y: gunDepthsSorted.map(s => d3a.quantileSorted(s, 0.25)),
...aes.gunArrays[src_number || 1].min
},
{
type: "scatter",
mode: "lines",
fill: "tonexty",
x,
y: gunsAvgDepth,
...aes.gunArrays[src_number || 1].avg
},
{
type: "scatter",
mode: "lines",
fill: "tonexty",
x,
y: gunDepthsSorted.map(s => d3a.quantileSorted(s, 0.75)),
...aes.gunArrays[src_number || 1].max
}];
const tracesGunsDepthsIndividual = {
//name: `Array ${src_number} outliers`,
type: "scatter",
mode: "markers",
marker: {size: 2 },
hoverinfo: "skip",
x: gunDepthsSorted.map( (s, idx) =>
s.filter( g => g < d3a.quantileSorted(s, 0.05) || g > d3a.quantileSorted(s, 0.95))
.map( f => Array(f.length).fill(x[idx]) ).flat()
).flat(),
y: gunDepthsSorted.map( (s, idx) =>
s.filter( g => g < d3a.quantileSorted(s, 0.05) || g > d3a.quantileSorted(s, 0.95))
).flat(),
...aes.gunArrays[src_number || 1].out
};
const data = [ ...tracesGunDepths, tracesGunsDepthsIndividual ]
return data;
}
if (!this.data) {
console.log("missing data");
return;
}
const sources = [ ...new Set(unpack(this.data.items, "meta").map( s => s.src_number ))];
const data = sources.map( src_number => transformSeries(this.data.items, src_number) ).flat();
console.log("Sources", sources);
console.log(data);
this.busy = false;
const layout = {
//autosize: true,
title: {text: "Gun depths sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
hovermode: "x",
yaxis: {
title: "Depth (m)",
//zeroline: false
},
xaxis: {
title: "Shotpoint",
showspikes: true
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphSeries, data, layout, config);
this.$refs.graphSeries.on('plotly_hover', (d) => {
const point = d.points[0].x;
const item = this.data.items.find(s => s.point == point);
const guns = item.meta.guns.filter( g => g[2] == item.meta.src_number );
const gunIds = guns.map( g => "G"+g[1] );
const depths = unpack(guns, 10);
const data = [{
type: "bar",
x: gunIds,
y: depths,
transforms: [{
type: "groupby",
groups: unpack(guns, 0)
}],
}];
const layout = {
title: {text: "Gun depths shot %{meta.point}"},
height: 300,
yaxis: {
title: "Depth (m)",
range: [ Math.min(d3a.min(depths)-0.1, 5), Math.max(d3a.max(depths)+0.1, 7) ]
},
xaxis: {
title: "Gun number",
type: 'category'
},
meta: {
point
}
};
const config = { displaylogo: false };
Plotly.react(this.$refs.graphBar, data, layout, config);
});
},
async plotViolin () {
function transformViolin (d, opts = {}) {
const styles = [];
unpack(unpack(d, "meta"), "guns").flat().forEach(i => {
const gunId = i[1];
const arrayId = i[2];
if (!styles[gunId]) {
styles[gunId] = Object.assign({target: gunId}, aes.gunArrayViolins[arrayId]);
}
});
const data = {
type: 'violin',
x: unpack(unpack(unpack(d, "meta"), "guns").flat(), 1), // Gun number
y: unpack(unpack(unpack(d, "meta"), "guns").flat(), 10), // Gun depth
points: 'none',
box: {
visible: true
},
line: {
color: 'green',
},
meanline: {
visible: true
},
transforms: [{
type: 'groupby',
groups: unpack(unpack(unpack(d, "meta"), "guns").flat(), 1),
styles: styles.filter(i => !!i)
}]
};
return data;
}
console.log("plot violin");
if (!this.data) {
console.log("missing data");
return;
}
console.log("Will plot sequence", this.data.meta.project, this.data.meta.sequence);
const data = [ transformViolin(this.data.items) ];
this.busy = false;
const layout = {
//autosize: true,
showlegend: false,
title: {text: "Individual gun depths sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
yaxis: {
title: "Depth (m)",
zeroline: false
},
xaxis: {
title: "Gun number"
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphViolin, data, layout, config);
},
replot () {
if (!this.graph) {
return;
}
console.log("Replotting");
Object.values(this.$refs).forEach( ref => {
if (ref.data) {
console.log("Replotting", ref, ref.clientWidth, ref.clientHeight);
Plotly.relayout(ref, {
width: ref.clientWidth,
height: ref.clientHeight
});
}
});
},
...mapActions(["api"])
},
mounted () {
if (this.data) {
this.plot();
} else {
this.busy = true;
}
this.resizeObserver = new ResizeObserver(this.replot)
this.resizeObserver.observe(this.$refs.graphSeries);
this.resizeObserver.observe(this.$refs.graphViolin);
this.resizeObserver.observe(this.$refs.graphBar);
},
beforeDestroy () {
if (this.resizeObserver) {
this.resizeObserver.unobserve(this.$refs.graphBar);
this.resizeObserver.unobserve(this.$refs.graphViolin);
this.resizeObserver.unobserve(this.$refs.graphSeries);
}
}
};
</script>
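
All three series traces above are built the same way: the per-shot gun values are sorted once, so `d3a.quantileSorted` can read the Q1/Q3 band and the 5%/95% outlier cut-offs straight off the sorted array. A minimal standalone sketch of that transform, with made-up depths rather than data from the component:

import * as d3a from 'd3-array';

// One row per shotpoint: the depth reported by each gun on that shot (illustrative).
const depthsPerShot = [
  [5.9, 6.0, 6.1, 6.4],
  [6.0, 6.0, 6.2, 7.1]
];
const x = [1001, 1002]; // shotpoint numbers

const sorted = depthsPerShot.map(s => d3a.sort(s));
const band = {
  q1:   sorted.map(s => d3a.quantileSorted(s, 0.25)),
  mean: depthsPerShot.map(s => d3a.mean(s)),
  q3:   sorted.map(s => d3a.quantileSorted(s, 0.75))
};

// Outliers: values outside the 5%/95% quantiles, paired with their shotpoint.
const outliers = sorted.flatMap((s, i) =>
  s.filter(g => g < d3a.quantileSorted(s, 0.05) || g > d3a.quantileSorted(s, 0.95))
   .map(g => ({ x: x[i], y: g }))
);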


@@ -0,0 +1,405 @@
<template>
<v-card style="min-height:400px;">
<v-card-title class="headline">
Gun details
</v-card-title>
<v-container fluid fill-height>
<v-row>
<v-col>
<div class="graph-container" ref="graphHeat"></div>
</v-col>
</v-row>
</v-container>
<v-overlay :value="busy" absolute z-index="1">
<v-progress-circular indeterminate></v-progress-circular>
</v-overlay>
</v-card>
</template>
<style scoped>
.graph-container {
width: 100%;
height: 100%;
}
</style>
<script>
import * as d3a from 'd3-array';
import Plotly from 'plotly.js-dist';
import { mapActions, mapGetters } from 'vuex';
import unpack from '@/lib/unpack.js';
import * as aes from '@/lib/graphs/aesthetics.js';
export default {
name: 'DougalGraphGunsHeatmap',
props: [ "data" ],
data () {
return {
graph: null,
graphHover: null,
busy: false,
resizeObserver: null,
// TODO: aspects should be a prop
aspects: [
"Mode", "Detect", "Autofire", "Aimpoint", "Firetime", "Delay",
"Delta",
"Depth", "Pressure", "Volume", "Filltime"
]
};
},
computed: {
//...mapGetters(['apiUrl'])
},
watch: {
data (newVal, oldVal) {
console.log("data changed");
if (newVal === null) {
this.busy = true;
} else {
this.busy = false;
this.plot();
}
},
},
methods: {
plot () {
this.plotHeat();
},
async plotHeat () {
if (!this.data) {
console.log("missing data");
return;
}
function transform (data, aspects=["Depth", "Pressure"]) {
const facets = [
// Mode
{
params: {
name: "Mode",
hovertemplate: "SP%{x}<br>%{y}<br>%{text}",
},
text: [ "Off", "Auto", "Manual", "Disabled" ],
conversion: (gun, shot) => {
switch (gun[3]) {
case "A":
return 1;
case "M":
return 2;
case "O":
return 0;
case "D":
return 3;
}
}
},
// Detect
{
params: {
name: "Detect",
hovertemplate: "SP%{x}<br>%{y}<br>%{text}",
},
text: [ "Zero", "Peak", "Level" ],
conversion: (gun, shot) => {
switch (gun[4]) {
case "P":
return 1;
case "Z":
return 0;
case "L":
return 2;
}
}
},
// Autofire
{
params: {
name: "Autofire",
hovertemplate: "SP%{x}<br>%{y}<br>%{text}",
},
text: [ "False", "True" ],
conversion: (gun, shot) => {
return gun[5] ? 1 : 0;
}
},
// Aimpoint
{
params: {
name: "Aimpoint",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} ms"
},
conversion: (gun, shot) => gun[7]
},
// Firetime
{
params: {
name: "Firetime",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} ms"
},
conversion: (gun, shot) => gun[2] == shot.meta.src_number ? gun[8] : null
},
// Delta
{
params: {
name: "Delta",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} ms",
// NOTE: These values are based on
// Grane + Snorre's ±1.5 ms spec. While a fairly
// common range, I still consider these min / max
// numbers to have been chosen semi-arbitrarily.
zmin: -2,
zmax: 2
},
conversion: (gun, shot) => gun[2] == shot.meta.src_number ? gun[7]-gun[8] : null
},
// Delay
{
params: {
name: "Delay",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} ms"
},
conversion: (gun, shot) => gun[9]
},
// Depth
{
params: {
name: "Depth",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} m"
},
conversion: (gun, shot) => gun[10]
},
// Pressure
{
params: {
name: "Pressure",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} psi"
},
conversion: (gun, shot) => gun[11]
},
// Volume
{
params: {
name: "Volume",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} in³"
},
conversion: (gun, shot) => gun[12]
},
// Filltime
{
params: {
name: "Filltime",
hovertemplate: "SP%{x}<br>%{y}<br>%{z} ms"
},
// NOTE that filltime is applicable to the *non* firing guns
conversion: (gun, shot) => gun[2] == shot.meta.src_number ? null : gun[13]
}
];
// Get gun numbers
const guns = [...new Set(data.map( s => s.meta.guns.map( g => g[1] ) ).flat())];
// z eventually will have the structure:
// z = {
// [aspect]: [ // First shotpoint
// [ // Value for gun 0, gun 1, … ],
// …more shotpoints…
// ]
// }
const z = {};
// x is an array of shotpoints
const x = [];
// y is an array of gun numbers
const y = guns.map( gun => `G${gun}` );
// Build array of guns (i.e., populate z)
// We prefer to do this outside the shot-to-shot loop
// for efficiency
for (const facet of facets) {
const label = facet.params.name;
z[label] = Array(guns.length);
for (let i=0; i<guns.length; i++) {
z[label][i] = [];
}
}
// Populate array of guns with shotpoint data
for (let shot of data) {
x.push(shot.point);
for (const facet of facets) {
const label = facet.params.name;
const facetGunsArray = z[label];
for (const gun of shot.meta.guns) {
const gunIndex = gun[1]-1;
const facetGun = facetGunsArray[gunIndex];
facetGun.push(facet.conversion(gun, shot));
}
}
}
return aspects.map( (aspect, idx) => {
const facet = facets.find(el => el.params.name == aspect) || {};
const defaultParams = {
name: aspect,
type: "heatmap",
showscale: false,
x,
y,
z: z[aspect],
text: facet.text ? z[aspect].map(row => row.map(v => facet.text[v])) : undefined,
xaxis: "x",
yaxis: "y" + (idx > 0 ? idx+1 : "")
}
return Object.assign({}, defaultParams, facet.params);
});
}
const data = transform(this.data.items, this.aspects);
this.busy = false;
const layout = {
title: {text: "Gun details sequence %{meta.sequence}"},
height: 200*this.aspects.length,
//autocolorscale: true,
/*
grid: {
rows: this.aspects.length,
columns: 1,
pattern: "coupled",
roworder: "bottom to top"
},
*/
//autosize: true,
// colorscale: "sequential",
xaxis: {
title: "Shotpoint",
showspikes: true
},
meta: this.data.meta
};
this.aspects.forEach ( (aspect, idx) => {
const num = idx+1;
const key = "yaxis" + num;
const anchor = "y" + num;
const segment = (1/this.aspects.length);
const margin = segment/20;
const domain = [
segment*idx + margin,
segment*num - margin
];
layout[key] = {
title: aspect,
anchor,
domain
}
});
const config = {
//editable: true,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphHeat, data, layout, config);
},
replot () {
if (!this.graph) {
return;
}
console.log("Replotting");
Object.values(this.$refs).forEach( ref => {
if (ref.data) {
console.log("Replotting", ref, ref.clientWidth, ref.clientHeight);
Plotly.relayout(ref, {
width: ref.clientWidth,
height: ref.clientHeight
});
}
});
},
...mapActions(["api"])
},
mounted () {
if (this.data) {
this.plot();
} else {
this.busy = true;
}
this.resizeObserver = new ResizeObserver(this.replot)
this.resizeObserver.observe(this.$refs.graphHeat);
},
beforeDestroy () {
if (this.resizeObserver) {
this.resizeObserver.unobserve(this.$refs.graphHeat);
}
}
};
</script>
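
Each aspect in the heatmap gets its own stacked y-axis; the `domain` arithmetic in `plotHeat` slices [0, 1] into equal bands with a one-twentieth gap at each edge. What that yields for three aspects (values rounded):

const aspects = ["Depth", "Pressure", "Volume"];
const segment = 1 / aspects.length; // 0.3333
const margin = segment / 20;        // 0.0167
const domains = aspects.map((aspect, idx) => [
  segment * idx + margin,
  segment * (idx + 1) - margin
]);
// domains ≈ [[0.017, 0.317], [0.350, 0.650], [0.683, 0.983]]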


@@ -0,0 +1,381 @@
<template>
<v-card style="min-height:400px;">
<v-card-title class="headline">
Gun pressures
<v-spacer></v-spacer>
<v-switch v-model="shotpoint" label="Shotpoint"></v-switch>
<v-switch class="ml-4" v-model="violinplot" label="Violin plot"></v-switch>
</v-card-title>
<v-container fluid fill-height>
<v-row>
<v-col>
<div class="graph-container" ref="graphSeries"></div>
</v-col>
</v-row>
<v-row v-show="shotpoint">
<v-col>
<div class="graph-container" ref="graphBar"></div>
</v-col>
</v-row>
<v-row v-show="violinplot">
<v-col>
<div class="graph-container" ref="graphViolin"></div>
</v-col>
</v-row>
</v-container>
<v-overlay :value="busy" absolute z-index="1">
<v-progress-circular indeterminate></v-progress-circular>
</v-overlay>
</v-card>
</template>
<style scoped>
.graph-container {
width: 100%;
height: 100%;
}
</style>
<script>
import * as d3a from 'd3-array';
import Plotly from 'plotly.js-dist';
import { mapActions, mapGetters } from 'vuex';
import unpack from '@/lib/unpack.js';
import * as aes from '@/lib/graphs/aesthetics.js';
export default {
name: 'DougalGraphGunsPressure',
props: [ "data", "settings" ],
data () {
return {
graph: null,
graphHover: null,
busy: false,
resizeObserver: null,
shotpoint: true,
violinplot: false
};
},
computed: {
//...mapGetters(['apiUrl'])
},
watch: {
data (newVal, oldVal) {
console.log("data changed");
if (newVal === null) {
this.busy = true;
} else {
this.busy = false;
this.plot();
}
},
settings () {
for (const key in this.settings) {
this[key] = this.settings[key];
}
},
shotpoint () {
if (this.shotpoint) {
this.replot();
}
this.$emit("update:settings", {[`${this.$options.name}.shotpoint`]: this.shotpoint});
},
violinplot () {
if (this.violinplot) {
this.plotViolin();
}
this.$emit("update:settings", {[`${this.$options.name}.violinplot`]: this.violinplot});
}
},
methods: {
plot () {
this.plotSeries();
if (this.violinplot) {
this.plotViolin();
}
},
async plotSeries () {
function transformSeries (d, src_number, otherParams={}) {
const meta = src_number
? unpack(d, "meta").filter( s => s.src_number == src_number )
: unpack(d, "meta");
const guns = unpack(meta, "guns").map(s => s.filter(g => g[2] == src_number));
const gunPressures = guns.map(s => s.map(g => g[11]));
const gunPressuresSorted = gunPressures.map(s => d3a.sort(s));
const gunVolumes = guns.map(s => s.map(g => g[12]));
const gunPressureWeights = gunVolumes.map( (s, sidx) => s.map( v => v/meta[sidx].volume ));
const gunsWeightedAvgPressure = gunPressures.map( (s, sidx) =>
d3a.sum(s.map( (pressure, gidx) => pressure * gunPressureWeights[sidx][gidx] )) / d3a.sum(gunPressureWeights[sidx])
);
const manifold = unpack(meta, "manifold");
const x = src_number
? unpack(d.filter(s => s.meta.src_number == src_number), "point")
: unpack(d, "point");
const traceManifold = {
name: "Manifold",
type: "scatter",
mode: "lines",
line: { ...aes.gunArrays[src_number || 1].avg.line, dash: "dot", width: 1 },
x,
y: manifold,
};
const tracesGunPressures = [{
type: "scatter",
mode: "lines",
x,
y: gunPressuresSorted.map(s => d3a.quantileSorted(s, 0.25)),
...aes.gunArrays[src_number || 1].min
},
{
type: "scatter",
mode: "lines",
fill: "tonexty",
x,
y: gunsWeightedAvgPressure,
...aes.gunArrays[src_number || 1].avg
},
{
type: "scatter",
mode: "lines",
fill: "tonexty",
x,
y: gunPressuresSorted.map(s => d3a.quantileSorted(s, 0.75)),
...aes.gunArrays[src_number || 1].max
}];
const tracesGunsPressuresIndividual = {
//name: `Array ${src_number} outliers`,
type: "scatter",
mode: "markers",
marker: {size: 2 },
hoverinfo: "skip",
x: gunPressuresSorted.map( (s, idx) =>
s.filter( g => g < d3a.quantileSorted(s, 0.05) || g > d3a.quantileSorted(s, 0.95))
.map( () => x[idx] ) // one x value per outlying gun
).flat(),
y: gunPressuresSorted.map( (s, idx) =>
s.filter( g => g < d3a.quantileSorted(s, 0.05) || g > d3a.quantileSorted(s, 0.95))
).flat(),
...aes.gunArrays[src_number || 1].out
};
const data = [ traceManifold, ...tracesGunPressures, tracesGunsPressuresIndividual ];
return data;
}
if (!this.data) {
console.log("missing data");
return;
}
const sources = [ ...new Set(unpack(this.data.items, "meta").map( s => s.src_number ))];
const data = sources.map( src_number => transformSeries(this.data.items, src_number) ).flat();
console.log("Sources", sources);
console.log(data);
this.busy = false;
const layout = {
//autosize: true,
title: {text: "Gun pressures sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
hovermode: "x",
yaxis: {
title: "Pressure (psi)",
//zeroline: false
},
xaxis: {
title: "Shotpoint",
showspikes: true
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphSeries, data, layout, config);
this.$refs.graphSeries.on('plotly_hover', (d) => {
const point = d.points[0].x;
const item = this.data.items.find(s => s.point == point);
const guns = item.meta.guns.filter( g => g[2] == item.meta.src_number );
const gunIds = guns.map( g => "G"+g[1] );
const pressures = unpack(guns, 11);
const volumes = unpack(guns, 12);
const maxVolume = d3a.max(volumes);
const data = [{
type: "bar",
x: gunIds,
y: pressures,
width: volumes.map( v => v/maxVolume ),
transforms: [{
type: "groupby",
groups: unpack(guns, 0)
}],
}];
const layout = {
title: {text: "Gun pressures shot %{meta.point}"},
height: 300,
yaxis: {
title: "Pressure (psi)",
range: [ Math.min(d3a.min(pressures), 1950), Math.max(d3a.max(pressures), 2050) ]
},
xaxis: {
title: "Gun number",
type: 'category'
},
meta: {
point
}
};
const config = { displaylogo: false };
Plotly.react(this.$refs.graphBar, data, layout, config);
});
},
async plotViolin () {
function transformViolin (d, opts = {}) {
const styles = [];
unpack(unpack(d, "meta"), "guns").flat().forEach(i => {
const gunId = i[1];
const arrayId = i[2];
if (!styles[gunId]) {
styles[gunId] = Object.assign({target: gunId}, aes.gunArrayViolins[arrayId]);
}
});
const data = {
type: 'violin',
x: unpack(unpack(unpack(d, "meta"), "guns").flat(), 1), // Gun number
y: unpack(unpack(unpack(d, "meta"), "guns").flat(), 11), // Gun pressure
points: 'none',
box: {
visible: true
},
line: {
color: 'green',
},
meanline: {
visible: true
},
transforms: [{
type: 'groupby',
groups: unpack(unpack(unpack(d, "meta"), "guns").flat(), 1),
styles: styles.filter(i => !!i)
}]
};
return data;
}
console.log("plot violin");
if (!this.data) {
console.log("missing data");
return;
}
console.log("Will plot sequence", this.data.meta.project, this.data.meta.sequence);
const data = [ transformViolin(this.data.items) ];
this.busy = false;
const layout = {
//autosize: true,
showlegend: false,
title: {text: "Individual gun pressures sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
yaxis: {
title: "Pressure (psi)",
zeroline: false
},
xaxis: {
title: "Gun number"
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphViolin, data, layout, config);
},
replot () {
if (!this.graph) {
return;
}
console.log("Replotting");
Object.values(this.$refs).forEach( ref => {
if (ref.data) {
console.log("Replotting", ref, ref.clientWidth, ref.clientHeight);
Plotly.relayout(ref, {
width: ref.clientWidth,
height: ref.clientHeight
});
}
});
},
...mapActions(["api"])
},
mounted () {
if (this.data) {
this.plot();
} else {
this.busy = true;
}
this.resizeObserver = new ResizeObserver(this.replot)
this.resizeObserver.observe(this.$refs.graphSeries);
this.resizeObserver.observe(this.$refs.graphViolin);
this.resizeObserver.observe(this.$refs.graphBar);
},
beforeDestroy () {
if (this.resizeObserver) {
this.resizeObserver.unobserve(this.$refs.graphBar);
this.resizeObserver.unobserve(this.$refs.graphViolin);
this.resizeObserver.unobserve(this.$refs.graphSeries);
}
}
};
</script>
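
Unlike the depth series, the pressure average is volume-weighted: each gun's weight is its volume divided by the array total, so the average reduces to sum(p*w)/sum(w) and large guns dominate the array pressure. A sketch with illustrative numbers (the total stands in for `meta.volume`):

import * as d3a from 'd3-array';

const pressures = [1980, 2005, 2010]; // psi, one value per gun (illustrative)
const volumes   = [40, 100, 250];     // in³
const total     = d3a.sum(volumes);   // stands in for meta.volume

const weights = volumes.map(v => v / total);
const weightedAvg =
  d3a.sum(pressures.map((p, i) => p * weights[i])) / d3a.sum(weights);
// With equal volumes this is the plain mean; here the 250 in³ gun dominates.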


@@ -0,0 +1,364 @@
<template>
<v-card style="min-height:400px;">
<v-card-title class="headline">
Gun timing
<v-spacer></v-spacer>
<v-switch v-model="shotpoint" label="Shotpoint"></v-switch>
<v-switch class="ml-4" v-model="violinplot" label="Violin plot"></v-switch>
</v-card-title>
<v-container fluid fill-height>
<v-row>
<v-col>
<div class="graph-container" ref="graphSeries"></div>
</v-col>
</v-row>
<v-row v-show="shotpoint">
<v-col>
<div class="graph-container" ref="graphBar"></div>
</v-col>
</v-row>
<v-row v-show="violinplot">
<v-col>
<div class="graph-container" ref="graphViolin"></div>
</v-col>
</v-row>
</v-container>
<v-overlay :value="busy" absolute z-index="1">
<v-progress-circular indeterminate></v-progress-circular>
</v-overlay>
</v-card>
</template>
<style scoped>
.graph-container {
width: 100%;
height: 100%;
}
</style>
<script>
import * as d3a from 'd3-array';
import Plotly from 'plotly.js-dist';
import { mapActions, mapGetters } from 'vuex';
import unpack from '@/lib/unpack.js';
import * as aes from '@/lib/graphs/aesthetics.js';
export default {
name: 'DougalGraphGunsTiming',
props: [ "data", "settings" ],
data () {
return {
graph: null,
graphHover: null,
busy: false,
resizeObserver: null,
shotpoint: true,
violinplot: false
};
},
computed: {
//...mapGetters(['apiUrl'])
},
watch: {
data (newVal, oldVal) {
console.log("data changed");
if (newVal === null) {
this.busy = true;
} else {
this.busy = false;
this.plot();
}
},
settings () {
for (const key in this.settings) {
this[key] = this.settings[key];
}
},
shotpoint () {
if (this.shotpoint) {
this.replot();
}
this.$emit("update:settings", {[`${this.$options.name}.shotpoint`]: this.shotpoint});
},
violinplot () {
if (this.violinplot) {
this.plotViolin();
}
this.$emit("update:settings", {[`${this.$options.name}.violinplot`]: this.violinplot});
}
},
methods: {
plot () {
this.plotSeries();
if (this.violinplot) {
this.plotViolin();
}
},
async plotSeries () {
function transformSeries (d, src_number, otherParams={}) {
const meta = src_number
? unpack(d, "meta").filter( s => s.src_number == src_number )
: unpack(d, "meta");
const guns = unpack(meta, "guns").map(s => s.filter(g => g[2] == src_number));
const gunTimings = guns.map(s => s.map(g => g[9]));
const gunTimingsSorted = gunTimings.map(s => d3a.sort(s));
const gunsAvgTiming = gunTimings.map( s => d3a.mean(s) );
const x = src_number
? unpack(d.filter(s => s.meta.src_number == src_number), "point")
: unpack(d, "point");
const tracesGunTimings = [{
type: "scatter",
mode: "lines",
x,
y: gunTimingsSorted.map(s => d3a.quantileSorted(s, 0.25)),
...aes.gunArrays[src_number || 1].min
},
{
type: "scatter",
mode: "lines",
fill: "tonexty",
x,
y: gunsAvgTiming,
...aes.gunArrays[src_number || 1].avg
},
{
type: "scatter",
mode: "lines",
fill: "tonexty",
x,
y: gunTimingsSorted.map(s => d3a.quantileSorted(s, 0.75)),
...aes.gunArrays[src_number || 1].max
}];
const tracesGunsTimingsIndividual = {
//name: `Array ${src_number} outliers`,
type: "scatter",
mode: "markers",
marker: {size: 2 },
hoverinfo: "skip",
x: gunTimingsSorted.map( (s, idx) =>
s.filter( g => g < d3a.quantileSorted(s, 0.05) || g > d3a.quantileSorted(s, 0.95))
.map( () => x[idx] ) // one x value per outlying gun
).flat(),
y: gunTimingsSorted.map( (s, idx) =>
s.filter( g => g < d3a.quantileSorted(s, 0.05) || g > d3a.quantileSorted(s, 0.95))
).flat(),
...aes.gunArrays[src_number || 1].out
};
const data = [ ...tracesGunTimings, tracesGunsTimingsIndividual ];
return data;
}
if (!this.data) {
console.log("missing data");
return;
}
const sources = [ ...new Set(unpack(this.data.items, "meta").map( s => s.src_number ))];
const data = sources.map( src_number => transformSeries(this.data.items, src_number) ).flat();
console.log("Sources", sources);
console.log(data);
this.busy = false;
const layout = {
//autosize: true,
title: {text: "Gun timings sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
hovermode: "x",
yaxis: {
title: "Timing (ms)",
//zeroline: false
},
xaxis: {
title: "Shotpoint",
showspikes: true
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphSeries, data, layout, config);
this.$refs.graphSeries.on('plotly_hover', (d) => {
const point = d.points[0].x;
const item = this.data.items.find(s => s.point == point);
const guns = item.meta.guns.filter( g => g[2] == item.meta.src_number );
const gunIds = guns.map( g => "G"+g[1] );
const timings = unpack(guns, 9);
const data = [{
type: "bar",
x: gunIds,
y: timings,
transforms: [{
type: "groupby",
groups: unpack(guns, 0)
}],
}];
const layout = {
title: {text: "Gun timings shot %{meta.point}"},
height: 300,
yaxis: {
title: "Timing (ms)",
range: [ Math.min(d3a.min(timings), 10), Math.max(d3a.max(timings), 20) ]
},
xaxis: {
title: "Gun number",
type: 'category'
},
meta: {
point
}
};
const config = { displaylogo: false };
Plotly.react(this.$refs.graphBar, data, layout, config);
});
},
async plotViolin () {
function transformViolin (d, opts = {}) {
const styles = [];
unpack(unpack(d, "meta"), "guns").flat().forEach(i => {
const gunId = i[1];
const arrayId = i[2];
if (!styles[gunId]) {
styles[gunId] = Object.assign({target: gunId}, aes.gunArrayViolins[arrayId]);
}
});
const data = {
type: 'violin',
x: unpack(unpack(unpack(d, "meta"), "guns").flat(), 1), // Gun number
y: unpack(unpack(unpack(d, "meta"), "guns").flat(), 9), // Gun timing
points: 'none',
box: {
visible: true
},
line: {
color: 'green',
},
meanline: {
visible: true
},
transforms: [{
type: 'groupby',
groups: unpack(unpack(unpack(d, "meta"), "guns").flat(), 1),
styles: styles.filter(i => !!i)
}]
};
return data;
}
console.log("plot violin");
if (!this.data) {
console.log("missing data");
return;
}
console.log("Will plot sequence", this.data.meta.project, this.data.meta.sequence);
const data = [ transformViolin(this.data.items) ];
this.busy = false;
const layout = {
//autosize: true,
showlegend: false,
title: {text: "Individual gun timings sequence %{meta.sequence}"},
autocolorscale: true,
// colorscale: "sequential",
yaxis: {
title: "Timing (ms)",
zeroline: false
},
xaxis: {
title: "Gun number"
},
meta: this.data.meta
};
const config = {
editable: false,
displaylogo: false
};
this.graph = Plotly.newPlot(this.$refs.graphViolin, data, layout, config);
},
replot () {
if (!this.graph) {
return;
}
console.log("Replotting");
Object.values(this.$refs).forEach( ref => {
if (ref.data) {
console.log("Replotting", ref, ref.clientWidth, ref.clientHeight);
Plotly.relayout(ref, {
width: ref.clientWidth,
height: ref.clientHeight
});
}
});
},
...mapActions(["api"])
},
mounted () {
if (this.data) {
this.plot();
} else {
this.busy = true;
}
this.resizeObserver = new ResizeObserver(this.replot)
this.resizeObserver.observe(this.$refs.graphSeries);
this.resizeObserver.observe(this.$refs.graphViolin);
this.resizeObserver.observe(this.$refs.graphBar);
},
beforeDestroy () {
if (this.resizeObserver) {
this.resizeObserver.unobserve(this.$refs.graphBar);
this.resizeObserver.unobserve(this.$refs.graphViolin);
this.resizeObserver.unobserve(this.$refs.graphSeries);
}
}
};
</script>


@@ -0,0 +1,145 @@
<template>
<v-dialog v-model="open">
<template v-slot:activator="{ on, attrs }">
<v-btn icon v-bind="attrs" v-on="on" title="Configure visible aspects">
<v-icon small>mdi-wrench-outline</v-icon>
</v-btn>
</template>
<v-card>
<v-list nav subheader>
<v-subheader>Visualisations</v-subheader>
<v-list-item-group v-model="aspectsVisible" multiple>
<v-list-item value="DougalGraphGunsPressure">
<template v-slot:default="{ active }">
<v-list-item-action>
<v-checkbox :input-value="active"></v-checkbox>
</v-list-item-action>
<v-list-item-content>
<v-list-item-title>Series: Gun pressure</v-list-item-title>
<v-list-item-subtitle>Array pressures weighted averages</v-list-item-subtitle>
</v-list-item-content>
</template>
</v-list-item>
<v-list-item value="DougalGraphGunsTiming">
<template v-slot:default="{ active }">
<v-list-item-action>
<v-checkbox :input-value="active"></v-checkbox>
</v-list-item-action>
<v-list-item-content>
<v-list-item-title>Series: Gun timing</v-list-item-title>
<v-list-item-subtitle>Array timing averages</v-list-item-subtitle>
</v-list-item-content>
</template>
</v-list-item>
<v-list-item value="DougalGraphGunsDepth">
<template v-slot:default="{ active }">
<v-list-item-action>
<v-checkbox :input-value="active"></v-checkbox>
</v-list-item-action>
<v-list-item-content>
<v-list-item-title>Series: Gun depth</v-list-item-title>
<v-list-item-subtitle>Array depths averages</v-list-item-subtitle>
</v-list-item-content>
</template>
</v-list-item>
<v-list-item value="DougalGraphGunsHeatmap">
<template v-slot:default="{ active }">
<v-list-item-action>
<v-checkbox :input-value="active"></v-checkbox>
</v-list-item-action>
<v-list-item-content>
<v-list-item-title>Heatmap: Gun parameters</v-list-item-title>
<v-list-item-subtitle>Detail of every gun × every shotpoint</v-list-item-subtitle>
</v-list-item-content>
</template>
</v-list-item>
<v-list-item value="DougalGraphArraysIJScatter">
<template v-slot:default="{ active }">
<v-list-item-action>
<v-checkbox :input-value="active"></v-checkbox>
</v-list-item-action>
<v-list-item-content>
<v-list-item-title>Series: I/J error</v-list-item-title>
<v-list-item-subtitle>Inline / crossline error</v-list-item-subtitle>
</v-list-item-content>
</template>
</v-list-item>
</v-list-item-group>
</v-list>
<v-divider></v-divider>
<v-card-actions>
<v-btn v-if="user" color="warning" text @click="save" :title="'Save as preference for user '+user.name+' on this computer (other users may have other defaults).'">Save as default</v-btn>
<v-spacer></v-spacer>
<v-btn color="primary" text @click="open=false">Close</v-btn>
</v-card-actions>
</v-card>
</v-dialog>
</template>
<script>
import { mapActions, mapGetters } from 'vuex';
export default {
name: "DougalGraphSettingsSequence",
props: [
"aspects"
],
data () {
return {
open: false,
aspectsVisible: this.aspects || []
}
},
watch: {
aspects () {
// Update the aspects selection list iff the list
// is not currently open.
if (!this.open) {
this.aspectsVisible = this.aspects;
}
}
},
computed: {
...mapGetters(['user', 'writeaccess', 'loading', 'serverEvent'])
},
methods: {
save () {
this.open = false;
this.$nextTick( () => this.$emit("update:aspects", {aspects: [...this.aspectsVisible]}) );
},
reset () {
this.aspectsVisible = this.aspects || [];
}
}
}
</script>


@@ -7,7 +7,7 @@
<template v-slot:activator="{ on, attrs }">
<small class="ml-3">
<a v-on="on">
Get help
<span class="d-none d-sm-inline">Get help </span>
<v-icon small>mdi-account-question</v-icon>
</a>
</small>
@@ -33,7 +33,8 @@
text
:href="`mailto:${email}?Subject=Question`"
>
Ask a question
<v-icon class="d-lg-none">mdi-help-circle</v-icon>
<span class="d-none d-lg-inline">Ask a question</span>
</v-btn>
<v-btn
@@ -41,7 +42,17 @@
text
href="mailto:dougal-support@aaltronav.eu?Subject=Bug report"
>
Report a bug
<v-icon class="d-lg-none">mdi-bug</v-icon>
<span class="d-none d-lg-inline">Report a bug</span>
</v-btn>
<v-btn
color="info"
text
:href='"/feed/"+feed'
title="View development log"
>
<v-icon>mdi-rss</v-icon>
</v-btn>
<v-spacer></v-spacer>
@@ -52,7 +63,8 @@
text
@click="dialog=false"
>
Close
<v-icon class="d-lg-none">mdi-close-circle</v-icon>
<span class="d-none d-lg-inline">Close</span>
</v-btn>
</v-card-actions>
@@ -69,7 +81,8 @@ export default {
data () {
return {
dialog: false,
email: "dougal-support@aaltronav.eu"
email: "dougal-support@aaltronav.eu",
feed: btoa(encodeURIComponent("https://gitlab.com/wgp/dougal/software.atom?feed_token=XSPpvsYEny8YmH75Nz5W"))
};
}


@@ -0,0 +1,113 @@
<template>
<div class="line-status" v-if="sequences.length == 0">
<slot name="empty"></slot>
</div>
<div class="line-status" v-else-if="sequenceHref">
<router-link v-for="sequence in sequences" :key="sequence.sequence"
class="sequence"
:class="sequence.status"
:style="style(sequence)"
:title="title(sequence)"
:to="sequenceHref(sequence)"
>
</router-link>
</div>
<div class="line-status" v-else>
<div v-for="sequence in sequences" :key="sequence.sequence"
class="sequence"
:class="sequence.status"
:style="style(sequence)"
:title="title(sequence)"
>
</div>
</div>
</template>
<style lang="stylus" scoped>
.line-status
display flex
flex-direction column
height 67%
min-width 64px
min-height 16px
background-color #d3d3d314
border-radius 4px
.sequence
flex 1 1 auto
opacity 0.5
border-radius 4px
&.ntbp
background-color red
&.raw
background-color orange
&.final
background-color green
&.online
background-color blue
&.planned
background-color magenta
</style>
<script>
export default {
name: 'DougalLineStatus',
props: {
preplot: Object,
sequences: Array,
"sequence-href": Function
},
methods: {
style (s) {
const values = {};
const fsp = s.status == "final"
? s.fsp_final
: s.status == "ntbp"
? (s.fsp_final || s.fsp)
: s.fsp; /* status == "raw" */
const lsp = s.status == "final"
? s.lsp_final
: s.status == "ntbp"
? (s.lsp_final || s.lsp)
: s.lsp; /* status == "raw" */
const pp0 = Math.min(this.preplot.fsp, this.preplot.lsp);
const pp1 = Math.max(this.preplot.fsp, this.preplot.lsp);
const len = pp1-pp0;
const sp0 = Math.max(Math.min(fsp, lsp), pp0);
const sp1 = Math.min(Math.max(fsp, lsp), pp1);
const left = (sp0-pp0)/len;
const right = 1-((sp1-pp0)/len);
values["margin-left"] = left*100 + "%";
values["margin-right"] = right*100 + "%";
return values;
},
title (s) {
const status = s.status == "final"
? "Final"
: s.status == "raw"
? "Acquired"
: s.status == "ntbp"
? "NTBP"
: s.status == "planned"
? "Planned"
: s.status;
const remarks = "\n"+[s.remarks, s.remarks_final].join("\n").trim();
return `Sequence ${s.sequence} ${status} (${s.fsp_final || s.fsp}-${s.lsp_final || s.lsp})${remarks}`;
}
}
}
</script>
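
The `style` method above maps a sequence's shotpoint extent onto percentage margins within the preplot. A worked example, with an illustrative preplot running 1000 to 2000 and a sequence acquired from SP 1250 to 1500:

const preplot = { fsp: 1000, lsp: 2000 };
const fsp = 1250, lsp = 1500;

const pp0 = Math.min(preplot.fsp, preplot.lsp); // 1000
const pp1 = Math.max(preplot.fsp, preplot.lsp); // 2000
const len = pp1 - pp0;                          // 1000
const sp0 = Math.max(Math.min(fsp, lsp), pp0);  // 1250 (clamped to preplot)
const sp1 = Math.min(Math.max(fsp, lsp), pp1);  // 1500

const marginLeft  = (sp0 - pp0) / len * 100 + "%";       // "25%"
const marginRight = (1 - (sp1 - pp0) / len) * 100 + "%"; // "50%"
// The bar therefore spans the second quarter of the preplot's width.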


@@ -4,7 +4,7 @@
app
clipped-left
>
<v-img src="https://aaltronav.eu/media/aaltronav-logo.svg"
<v-img src="/wgp-logo.png"
contain
max-height="32px" max-width="32px"
></v-img>
@@ -12,17 +12,70 @@
<v-toolbar-title class="mx-2" @click="$router.push('/')" style="cursor: pointer;">Dougal</v-toolbar-title>
<v-spacer></v-spacer>
<v-menu bottom offset-y>
<template v-slot:activator="{on, attrs}">
<v-hover v-slot="{hover}">
<v-btn
class="align-self-center"
:xcolor="hover ? 'secondary' : 'secondary lighten-3'"
small
text
v-bind="attrs"
v-on="on"
title="Settings"
>
<v-icon small>mdi-cog-outline</v-icon>
</v-btn>
</v-hover>
</template>
<v-list dense>
<v-list-item :href="`/settings/equipment`">
<v-list-item-title>Equipment list</v-list-item-title>
<v-list-item-action><v-icon small>mdi-view-list</v-icon></v-list-item-action>
</v-list-item>
</v-list>
</v-menu>
<v-breadcrumbs :items="path"></v-breadcrumbs>
<template v-if="$route.name != 'Login'">
<v-btn text link to="/login" v-if="!$root.user">Log in</v-btn>
<template v-else>
<v-btn title="Edit profile" disabled>
{{$root.user.name}}
</v-btn>
<v-btn class="ml-2" title="Log out" link to="/?logout=1">
<v-icon>mdi-logout</v-icon>
<v-btn text link to="/login" v-if="!user && !loading">Log in</v-btn>
<template v-else-if="user">
<v-menu
offset-y
>
<template v-slot:activator="{on, attrs}">
<v-avatar :color="user.colour || 'primary'" :title="`${user.name} (${user.role})`" v-bind="attrs" v-on="on">
<span class="white--text">{{user.name.slice(0, 5)}}</span>
</v-avatar>
</template>
<v-list dense>
<v-list-item link to="/login" v-if="user.autologin">
<v-list-item-icon><v-icon small>mdi-login</v-icon></v-list-item-icon>
<v-list-item-content>
<v-list-item-title>Log in as a different user</v-list-item-title>
<v-list-item-subtitle>Autologin from {{user.ip}}</v-list-item-subtitle>
</v-list-item-content>
</v-list-item>
<v-list-item link to="/logout" v-else>
<v-list-item-icon><v-icon small>mdi-logout</v-icon></v-list-item-icon>
<v-list-item-title>Log out</v-list-item-title>
</v-list-item>
</v-list>
</v-menu>
<!--
<v-btn small text class="ml-2" title="Log out" link to="/?logout=1">
<v-icon small>mdi-logout</v-icon>
</v-btn>
-->
</template>
</template>
<template v-slot:extension v-if="$route.matched.find(i => i.name == 'Project')">
@@ -35,6 +88,7 @@
</template>
<script>
import { mapActions, mapGetters } from 'vuex';
export default {
name: 'DougalNavigation',
@@ -44,9 +98,12 @@ export default {
tabs: [
{ href: "summary", text: "Summary" },
{ href: "lines", text: "Lines" },
{ href: "plan", text: "Plan" },
{ href: "sequences", text: "Sequences" },
{ href: "calendar", text: "Calendar" },
{ href: "log", text: "Log" },
{ href: "qc", text: "QC" },
{ href: "graphs", text: "Graphs" },
{ href: "map", text: "Map" }
],
path: []
@@ -56,7 +113,9 @@ export default {
computed: {
tab () {
return this.tabs.findIndex(t => t.href == this.$route.path.split(/\/+/)[3]);
}
},
...mapGetters(['user', 'loading'])
},
watch: {


@@ -0,0 +1,113 @@
<template>
<div title="Notifications" v-if="visible">
<v-switch
class="ma-auto"
flat
hide-details
v-model="notify"
:loading="waiting"
:disabled="disabled"
append-icon="mdi-email-outline"
></v-switch>
</div>
</template>
<script>
import { mapActions, mapGetters, mapState } from 'vuex';
export default {
name: 'DougalNotificationsControl',
data () {
return {
visible: true,
notify: false,
waiting: false,
disabled: false
}
},
watch: {
async notify (state) {
if (state) {
console.log("Checking for permission", Notification.permission);
if (Notification.permission == "default") {
console.log("Asking for permission");
this.waiting = true;
const response = await Notification.requestPermission();
console.log("User says", response);
this.waiting = false;
if (response != "granted") {
this.$nextTick( () => this.notify = false );
}
if (response == "denied") {
this.disabled = true;
}
}
} else {
if (this.waiting) {
this.waiting = false;
}
}
},
async serverEvent (event) {
if (this.notify) {
let notification;
//console.log(event.channel);
switch (event.channel) {
case "realtime":
break;
case "event":
//console.log("EVENT",JSON.parse(JSON.stringify(event)));
let title, body, tag;
if (event.payload.new) {
tag = `${event.payload.schema}.${event.payload.table}.${event.payload.new.id}`;
if (event.payload.table == "events_seq") {
const point = event.payload.new.point;
const sequence = event.payload.new.sequence;
title = event.payload.operation == "INSERT"
? `Dougal: Seq. ${sequence.toString().padStart(3, "0")} SP ${point}`
: event.payload.operation == "UPDATE"
? `Dougal: Seq. ${sequence.toString().padStart(3, "0")} SP ${point} (update)`
: "";
body = event.payload.new.remarks;
} else if (event.payload.table == "events_timed") {
const tstamp = event.payload.new.tstamp;
title = event.payload.operation == "INSERT"
? `Dougal: ${tstamp}`
: event.payload.operation == "UPDATE"
? `Dougal: ${tstamp} (update)`
: "";
body = event.payload.new.remarks;
}
if (title && body) {
notification = new Notification(title, {body, tag});
}
}
break;
}
}
}
},
computed: {
...mapGetters(['loading', 'serverEvent']),
...mapState({projectSchema: state => state.project.projectSchema})
},
created () {
this.visible = "Notification" in window;
this.disabled = !this.visible || Notification.permission == "denied";
}
};
</script>
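
The notify watcher above only prompts while `Notification.permission` is "default"; a "denied" answer disables the switch for good. The decision logic, reduced to a standalone helper (illustrative, not part of the component):

async function enableNotifications () {
  if (!("Notification" in window)) return false;          // unsupported browser
  if (Notification.permission === "granted") return true;
  if (Notification.permission === "denied") return false;
  // Permission is "default": ask; best done from a user gesture.
  const response = await Notification.requestPermission();
  return response === "granted";
}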


@@ -0,0 +1,135 @@
<template>
<v-hover v-slot:default="{hover}" v-if="!isEmpty(item)">
<span>
<v-btn v-if="!isAccepted(item)"
:class="{'text--disabled': !hover}"
icon
small
color="primary"
:title="isMultiple(item) ? 'Accept all' : 'Accept'"
@click.stop="accept(item)">
<v-icon small :color="isAccepted(item) ? 'green' : ''">
{{ isMultiple(item) ? 'mdi-check-all' : 'mdi-check' }}
</v-icon>
</v-btn>
<v-btn v-if="someAccepted(item)"
:class="{'text--disabled': !hover}"
icon
small
color="primary"
:title="isMultiple(item) ? 'Restore all' : 'Restore'"
@click.stop="unaccept(item)">
<v-icon small>
mdi-restore
</v-icon>
</v-btn>
</span>
</v-hover>
</template>
<script>
export default {
name: 'DougalQcAcceptance',
props: {
item: { type: Object }
},
methods: {
isAccepted (item) {
if (item._children) {
return item._children.every(child => this.isAccepted(child));
}
if (item.labels) {
return item.labels.includes("QCAccepted");
}
return false;
},
someAccepted (item) {
if (item._children) {
return item._children.some(child => this.someAccepted(child));
}
if (item.labels) {
return item.labels.includes("QCAccepted");
}
return false;
},
isEmpty (item) {
return item._children?.length === 0;
},
isMultiple (item) {
return item._children?.length;
},
action (action, item) {
const items = [];
const iterate = (item) => {
if (item._kind == "point") {
if (this.isAccepted(item)) {
if (action == "unaccept") {
items.push(item);
}
} else {
if (action == "accept") {
items.push(item);
}
}
} else if (item._kind == "sequence" || item._kind == "test") {
if (item._children) {
for (const child of item._children) {
iterate(child);
}
}
if (item._shots) {
for (const child of item._shots) {
iterate(child);
}
}
}
}
iterate(item);
return items;
},
accept (item) {
const items = this.action('accept', item);
if (items.length) {
this.$emit('accept', items);
}
},
unaccept (item) {
const items = this.action('unaccept', item);
if (items.length) {
this.$emit('unaccept', items);
}
}
}
}
</script>


@@ -1,5 +1,5 @@
export default function FormatTimestamp (str) {
const d = new Date(str);
if (isNaN(d)) {


@@ -0,0 +1,88 @@
export const gunArrays = {
1: {
min: {
fillcolor: "rgba(200, 230, 201, 0.2)",
line: {color: "rgba(129, 199, 132, 0.3)", shape: "spline"},
showlegend: false,
name: "Array 1 (min.)",
hoverinfo: "skip"
},
avg: {
fillcolor: "rgba(200, 230, 201, 0.2)",
line: {color: "rgba(129, 199, 132, 0.9)", shape: "spline"},
name: "Array 1 (avg.)"
},
max: {
fillcolor: "rgba(200, 230, 201, 0.2)",
line: {color: "rgba(129, 199, 132, 0.4)", shape: "spline"},
showlegend: false,
name: "Array 1 (max.)",
hoverinfo: "skip"
},
out: {
name: "Array 1 outliers",
line: {color: "rgba(129, 199, 166, 0.7)"},
fillcolor: "rgba(129, 199, 166, 0.5)"
}
},
2: {
min: {
fillcolor: "rgba(255, 205, 210, 0.2)",
line: {color: "rgba(229, 115, 115, 0.3)", shape: "spline"},
showlegend: false,
name: "Array 2 (min.)",
hoverinfo: "skip"
},
avg: {
fillcolor: "rgba(255, 205, 210, 0.2)",
line: {color: "rgba(229, 115, 115, 0.9)", shape: "spline"},
name: "Array 2 (avg.)"
},
max: {
fillcolor: "rgba(255, 205, 210, 0.2)",
line: {color: "rgba(229, 115, 115, 0.4)", shape: "spline"},
showlegend: false,
name: "Array 2 (max.)",
hoverinfo: "skip"
},
out: {
name: "Array 2 outliers",
line: {color: "rgba(229, 153, 115, 0.7)"},
fillcolor: "rgba(229, 153, 115, 0.5)"
}
},
3: {
min: {
fillcolor: "",
line: {color: "", shape: "spline"},
showlegend: false,
name: "Array 3 (min.)",
hoverinfo: "skip"
},
avg: {
fillcolor: "",
line: {color: "", shape: "spline"},
name: "Array 3 (avg.)"
},
max: {
fillcolor: "",
line: {color: "", shape: "spline"},
showlegend: false,
name: "Array 3 (max.)",
hoverinfo: "skip"
},
out: {
name: "Array 3 outliers",
//fillcolor: ""
}
}
};
export const gunArrayViolins = {
1: {
value: {line: {color: "rgba(129, 199, 132, 0.9)"}}
},
2: {
value: {line: {color: "rgba(229, 115, 115, 0.9)"}}
}
};
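
These fragments are spread directly into Plotly traces, which keeps styling separate from the data transforms. For example (x/y values illustrative):

import * as aes from '@/lib/graphs/aesthetics.js';

const trace = {
  type: "scatter",
  mode: "lines",
  x: [1001, 1002],
  y: [6.0, 6.1],
  ...aes.gunArrays[1].avg   // merges fillcolor, line style and trace name
};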

View File

@@ -0,0 +1,11 @@
const marked = require('marked');
function markdown (str) {
return marked(String(str));
}
function markdownInline (str) {
return marked.parseInline(String(str));
}
module.exports = { markdown, markdownInline };


@@ -0,0 +1,33 @@
/**
* Throttle a function call.
*
* It delays `callback` by `delay` ms and ignores any
* repeated calls from `caller` within at most `maxWait`
* milliseconds.
*
* Used to react to server events in cases where we get
* a separate notification for each row of a bulk update.
*/
function throttle (callback, caller, delay = 100, maxWait = 500) {
const schedule = async () => {
caller.triggeredAt = Date.now();
caller.timer = setTimeout(async () => {
await callback();
caller.timer = null;
}, delay);
};
if (!caller.timer) {
schedule();
} else {
const elapsed = Date.now() - caller.triggeredAt;
if (elapsed > maxWait) {
clearTimeout(caller.timer);
schedule();
}
}
}
export default throttle;
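
Because `throttle` keeps its pending-timer state on the `caller` object, any long-lived object (such as a component instance) can serve as the state holder. A hedged usage sketch:

import throttle from '@/lib/throttle.js';

const state = {}; // carries .timer and .triggeredAt between calls

function reload () {
  console.log("reloading at", Date.now());
}

// One notification per updated row arrives in a burst;
// only a single reload runs, 100 ms after the first call.
for (let i = 0; i < 10; i++) {
  throttle(reload, state, 100, 500);
}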


@@ -0,0 +1,4 @@
export default function unpack(rows, key) {
return rows && rows.map( row => row[key] );
};
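
`unpack` is the column extractor used throughout the graph components; the key can be a property name or, as with the gun tuples, an array index:

import unpack from '@/lib/unpack.js';

const rows = [
  { point: 1001, meta: { src_number: 1 } },
  { point: 1002, meta: { src_number: 2 } }
];
unpack(rows, "point");                       // [1001, 1002]
unpack(unpack(rows, "meta"), "src_number");  // [1, 2]
unpack([[10, 20], [30, 40]], 1);             // [20, 40]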


@@ -0,0 +1,141 @@
function withParentProps(item, parent, childrenKey, prop, currentValue) {
if (!Array.isArray(parent)) {
return;
}
let currentPropValue = currentValue || parent[prop];
for (const entry of parent) {
if (entry[prop]) {
currentPropValue = entry[prop];
}
if (entry === item) {
return [item, currentPropValue];
}
if (entry[childrenKey]) {
const res = withParentProps(item, entry[childrenKey], childrenKey, prop, currentPropValue);
if (res && res[1]) {
return res;
}
}
}
return [];
}
function dms (lat, lon) {
const λh = lat < 0 ? "S" : "N";
const φh = lon < 0 ? "W" : "E";
const λn = Math.abs(lat);
const φn = Math.abs(lon);
const λi = Math.trunc(λn);
const φi = Math.trunc(φn);
const λf = λn - λi;
const φf = φn - φi;
const λs = ((λf*3600)%60).toFixed(1);
const φs = ((φf*3600)%60).toFixed(1);
const λm = Math.trunc(λf*60);
const φm = Math.trunc(φf*60);
const λ =
String(λi).padStart(2, "0") + "°" +
String(λm).padStart(2, "0") + "'" +
String(λs).padStart(4, "0") + '" ' +
λh;
const φ =
String(φi).padStart(3, "0") + "°" +
String(φm).padStart(2, "0") + "'" +
String(φs).padStart(4, "0") + '" ' +
φh;
return λ+" "+φ;
}
function geometryAsString (item, opts = {}) {
const key = "key" in opts ? opts.key : "geometry";
const formatDMS = opts.dms;
let str = "";
if (key in item) {
const geometry = item[key];
if (geometry && "coordinates" in geometry) {
if (geometry.type == "Point") {
if (formatDMS) {
str = dms(geometry.coordinates[1], geometry.coordinates[0]);
} else {
str = `${geometry.coordinates[1].toFixed(6)}, ${geometry.coordinates[0].toFixed(6)}`;
}
}
if (str) {
if (opts.url) {
if (typeof opts.url === 'string') {
str = `[${str}](${opts.url.replace("$x", geometry.coordinates[0]).replace("$y", geometry.coordinates[1])})`;
} else {
str = `[${str}](geo:${geometry.coordinates[0]},${geometry.coordinates[1]})`;
}
}
}
}
}
return str;
}
/** Extract preferences by prefix.
*
* This function returns a lambda which, given
* a key or a prefix, extracts the relevant
* preferences from the designated preferences
* store.
*
* For instance, assume preferences = {
* "a.b.c.d": 1,
* "a.b.e.f": 2,
* "g.h": 3
* }
*
* And λ = preferencesλ(preferences). Then:
*
* λ("a.b") → { "a.b.c.d": 1, "a.b.e.f": 2 }
* λ("a.b.e.f") → { "a.b.e.f": 2 }
* λ("g.x", {"g.x.": 99}) → { "g.x.": 99 }
* λ("a.c", {"g.x.": 99}) → { "g.x.": 99 }
*
* Note from the last two examples that a default value
* may be provided and will be returned if a key does
* not exist or is not searched for.
*/
function preferencesλ (preferences) {
return function (key, defaults={}) {
const keys = Object.keys(preferences).filter(str => str.startsWith(key+".") || str == key);
const settings = {...defaults};
for (const str of keys) {
const k = str == key ? str : str.substring(key.length+1);
const v = preferences[str];
settings[k] = v;
}
return settings;
}
}
export {
withParentProps,
geometryAsString,
preferencesλ
}
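
For reference, `dms` pads latitude degrees to two digits and longitude to three, and keeps one decimal on the seconds. Two illustrative calls:

dms(58.44, 1.93);         // 58°26'24.0" N 001°55'48.0" E
dms(-33.8568, -151.2153); // 33°51'24.5" S 151°12'55.1" W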


@@ -5,12 +5,22 @@ import store from './store'
import vuetify from './plugins/vuetify'
import vueDebounce from 'vue-debounce'
import { mapMutations } from 'vuex';
import { markdown, markdownInline } from './lib/markdown';
import { geometryAsString } from './lib/utils';
Vue.config.productionTip = false
Vue.use(vueDebounce);
Vue.filter('markdown', markdown);
Vue.filter('markdownInline', markdownInline);
Vue.filter('position', (str, item, opts) =>
str
.replace(/@POS(ITION)?@/g, geometryAsString(item, opts) || "(position unknown)")
.replace(/@DMS@/g, geometryAsString(item, {...opts, dms:true}) || "(position unknown)")
);
// Vue.filter('position', (str, item, opts) => str.replace(/@POS(ITION)?@/, "☺"));
new Vue({
data () {
return {
@@ -42,8 +52,8 @@ new Vue({
},
initWs () {
if (this.ws && this.ws.readyState == 1) {
console.log("WebSocket already initialised");
if (this.ws) {
console.log("WebSocket initWs already called");
return;
}
@@ -51,30 +61,33 @@ new Vue({
this.ws.addEventListener("message", (ev) => {
const msg = JSON.parse(ev.data);
if (msg.payload) {
msg.payload = JSON.parse(msg.payload);
}
this.setServerEvent(msg);
});
this.ws.addEventListener("open", (ev) => {
console.log("WebSocket connection open", ev);
this.setServerConnectionState(true);
});
this.ws.addEventListener("close", (ev) => {
console.warn("WebSocket connection closed", ev);
delete this.ws;
this.ws = null;
setTimeout( this.initWs, 5000 );
});
this.ws.addEventListener("error", (ev) => {
console.error("WebSocket connection error", ev);
this.ws.close();
delete this.ws;
this.ws = null;
setTimeout( this.initWs, 60000 );
this.setServerConnectionState(false);
});
},
...mapMutations(['setServerEvent'])
...mapMutations(['setServerEvent', 'setServerConnectionState'])
},


@@ -1,15 +1,20 @@
import Vue from 'vue'
import VueRouter from 'vue-router'
import Home from '../views/Home.vue'
import Login from '../views/Login.vue'
import Logout from '../views/Logout.vue'
import Project from '../views/Project.vue'
import ProjectList from '../views/ProjectList.vue'
import ProjectSummary from '../views/ProjectSummary.vue'
import LineList from '../views/LineList.vue'
import Plan from '../views/Plan.vue'
import LineSummary from '../views/LineSummary.vue'
import SequenceList from '../views/SequenceList.vue'
import SequenceSummary from '../views/SequenceSummary.vue'
import Calendar from '../views/Calendar.vue'
import Log from '../views/Log.vue'
import QC from '../views/QC.vue'
import Graphs from '../views/Graphs.vue'
import Map from '../views/Map.vue'
@@ -29,6 +34,40 @@ Vue.use(VueRouter)
// which is lazy-loaded when the route is visited.
component: () => import(/* webpackChunkName: "about" */ '../views/About.vue')
},
{
path: '/feed/:source',
name: 'Feed',
// route level code-splitting
// this generates a separate chunk (about.[hash].js) for this route
// which is lazy-loaded when the route is visited.
component: () => import(/* webpackChunkName: "about" */ '../views/Feed.vue')
},
{
path: "/settings/equipment",
name: "equipment",
component: () => import(/* webpackChunkName: "about" */ '../views/Equipment.vue')
},
{
pathToRegexpOptions: { strict: true },
path: "/login",
redirect: "/login/"
},
{
pathToRegexpOptions: { strict: true },
name: "Login",
path: "/login/",
component: Login,
meta: {
// breadcrumbs: [
// { text: "Projects", href: "/projects", disabled: true }
// ]
}
},
{
// pathToRegexpOptions: { strict: true },
path: "/logout",
component: Logout,
},
{
pathToRegexpOptions: { strict: true },
path: "/projects",
@@ -77,6 +116,11 @@ Vue.use(VueRouter)
name: "LineList",
component: LineList
},
{
path: "plan/",
name: "Plan",
component: Plan
},
{
path: "lines/:line",
name: "Line",
@@ -103,8 +147,23 @@ Vue.use(VueRouter)
{ path: "date/:date0/:date1", name: "logByDates" }
]
},
{
path: "qc",
component: QC
},
{
path: "graphs",
component: Graphs,
children: [
{ path: "sequence/:sequence", name: "graphsBySequence" },
{ path: "sequence/:sequence0/:sequence1", name: "graphsBySequences" },
{ path: "date/:date0", name: "graphsByDate" },
{ path: "date/:date0/:date1", name: "graphsByDates" }
]
},
{
path: "map",
name: "map",
component: Map
}
]


@@ -2,6 +2,7 @@ import Vue from 'vue'
import Vuex from 'vuex'
import api from './modules/api'
import user from './modules/user'
import snack from './modules/snack'
import project from './modules/project'
import notify from './modules/notify'
@@ -11,6 +12,7 @@ Vue.use(Vuex)
export default new Vuex.Store({
modules: {
api,
user,
snack,
project,
notify


@@ -1,5 +1,5 @@
async function api ({state, commit, dispatch}, [resource, init = {}]) {
async function api ({state, commit, dispatch}, [resource, init = {}, cb]) {
try {
commit("queueRequest");
if (init && init.hasOwnProperty("body")) {
@@ -13,10 +13,17 @@ async function api ({state, commit, dispatch}, [resource, init = {}]) {
init.body = JSON.stringify(init.body);
}
}
const res = await fetch(`${state.apiUrl}${resource}`, init);
const url = /^https?:\/\//i.test(resource) ? resource : (state.apiUrl + resource);
const res = await fetch(url, init);
if (typeof cb === 'function') {
cb(null, res);
}
if (res.ok) {
await dispatch('setCredentials');
try {
return await res.json();
return init.text ? (await res.text()) : (await res.json());
} catch (err) {
if (err instanceof SyntaxError) {
if (Number(res.headers.get("Content-Length")) === 0) {
@@ -31,7 +38,11 @@ async function api ({state, commit, dispatch}, [resource, init = {}]) {
await dispatch('showSnack', [res.statusText, "warning"]);
}
} catch (err) {
if (err && err.name == "AbortError") return;
await dispatch('showSnack', [err, "error"]);
if (typeof cb === 'function') {
cb(err);
}
} finally {
commit("dequeueRequest");
}
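
With the changes above, the `api` action accepts an absolute URL, an optional Node-style callback that receives the raw `Response`, and an `init.text` flag for non-JSON bodies. A usage sketch from a component (the resource paths are illustrative):

import { mapActions } from 'vuex';

export default {
  methods: {
    ...mapActions(['api']),
    async load () {
      // Relative path: resolved against state.apiUrl, parsed as JSON.
      const projects = await this.api(["/projects"]);
      // Absolute URL, returned as text, with access to the raw Response.
      const feed = await this.api([
        "https://example.com/feed.atom",
        { text: true },
        (err, res) => { if (res) console.log("HTTP", res.status); }
      ]);
      return { projects, feed };
    }
  }
};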


@@ -32,4 +32,19 @@ function point (state) {
return Number(v) || v;
}
export default { serverEvent, online, lineName, sequence, line, point };
function position (state) {
const λ = Number(_(state, "serverEvent.payload.new.meta.longitude"));
const φ = Number(_(state, "serverEvent.payload.new.meta.latitude"));
if (!isNaN(λ) && !isNaN(φ)) {
return [ λ, φ ];
}
return null;
}
function timestamp (state) {
const v = _(state, "serverEvent.payload.new.meta.time");
return v;
}
export default { serverEvent, online, lineName, sequence, line, point, position, timestamp };


@@ -7,4 +7,8 @@ function clearServerEvent (state) {
state.serverEvent = null;
}
export default { setServerEvent, clearServerEvent };
function setServerConnectionState (state, isConnected) {
state.serverConnected = !!isConnected;
}
export default { setServerEvent, clearServerEvent, setServerConnectionState };


@@ -1,5 +1,6 @@
const state = () => ({
serverEvent: null
serverEvent: null,
serverConnected: false
});
export default state;


@@ -4,6 +4,7 @@ async function getProject ({commit, dispatch}, projectId) {
if (res) {
commit('setProjectName', res.name);
commit('setProjectId', res.pid);
commit('setProjectSchema', res.schema);
const recentProjects = JSON.parse(localStorage.getItem("recentProjects") || "[]")
recentProjects.unshift(res);
localStorage.setItem("recentProjects", JSON.stringify(recentProjects.slice(0, 3)));


@@ -1,4 +1,4 @@
function setProjectId (state, pid) {
state.projectId = pid;
}
@@ -7,4 +7,8 @@ function setProjectName (state, name) {
state.projectName = name;
}
export default { setProjectId, setProjectName };
function setProjectSchema (state, schema) {
state.projectSchema = schema;
}
export default { setProjectId, setProjectName, setProjectSchema };


@@ -1,6 +1,7 @@
const state = () => ({
projectId: null,
projectName: null
projectName: null,
projectSchema: null
});
export default state;


@@ -1,4 +1,4 @@
function showSnack({commit}, [text, colour]) {
commit('setSnackColour', colour || 'primary');
commit('setSnackText', text);


@@ -1,4 +1,4 @@
function setSnackText (state, text) {
state.snackText = text;
}


@@ -0,0 +1,104 @@
import jwt_decode from 'jwt-decode';
async function login ({commit, dispatch}, loginRequest) {
const url = "/login";
const init = {
method: "POST",
headers: {
"Content-Type": "application/json"
},
body: loginRequest
}
const res = await dispatch('api', [url, init]);
if (res && res.ok) {
await dispatch('setCredentials', true);
await dispatch('loadUserPreferences');
}
}
async function logout ({commit, dispatch}) {
commit('setCookie', null);
commit('setUser', null);
// Should delete JWT cookie
await dispatch('api', ["/logout"]);
// Clear preferences
commit('setPreferences', {});
}
function browserCookie (state) {
return document.cookie.split(/; */).find(i => /^JWT=.+/.test(i));
}
function cookieChanged (cookie) {
return browserCookie() != cookie;
}
function setCredentials ({state, commit, getters, dispatch}, force = false) {
if (cookieChanged(state.cookie) || force) {
try {
const cookie = browserCookie();
const decoded = cookie ? jwt_decode(cookie.split("=")[1]) : null;
commit('setCookie', cookie);
commit('setUser', decoded);
} catch (err) {
if (err.name == "InvalidTokenError") {
console.warn("Failed to decode", browserCookie());
} else {
console.error("setCredentials", err);
}
}
}
dispatch('loadUserPreferences');
}
/**
* Save user preferences to localStorage and store.
*
* User preferences are identified by a key that gets
* prefixed with the user name and role. The value can
* be anything that JSON.stringify can parse.
*/
function saveUserPreference ({state, commit}, [key, value]) {
const k = `${state.user?.name}.${state.user?.role}.${key}`;
if (value !== undefined) {
localStorage.setItem(k, JSON.stringify(value));
const preferences = state.preferences;
preferences[key] = value;
commit('setPreferences', preferences);
} else {
localStorage.removeItem(k);
const preferences = state.preferences;
delete preferences[key];
commit('setPreferences', preferences);
}
}
async function loadUserPreferences ({state, commit}) {
// Get all keys which are of interest to us
const prefix = `${state.user?.name}.${state.user?.role}`;
const keys = Object.keys(localStorage).filter( k => k.startsWith(prefix) );
// Build the preferences object
const preferences = {};
keys.forEach(str => {
const value = JSON.parse(localStorage.getItem(str));
const key = str.split(".").slice(2).join(".");
preferences[key] = value;
});
// Commit it
commit('setPreferences', preferences);
}
export default {
login,
logout,
setCredentials,
saveUserPreference,
loadUserPreferences
};
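
Preference keys are namespaced by user name and role before being written to `localStorage`, so users sharing a machine keep separate defaults. For an illustrative user {name: "alice", role: "user"}:

// dispatch('saveUserPreference', ["graphs.aspects", ["DougalGraphGunsPressure"]])
// stores:
localStorage.setItem("alice.user.graphs.aspects",
  JSON.stringify(["DougalGraphGunsPressure"]));
// and loadUserPreferences later rebuilds:
// preferences == { "graphs.aspects": ["DougalGraphGunsPressure"] }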


@@ -0,0 +1,18 @@
function user (state) {
return state.user;
}
function writeaccess (state) {
return state.user && ["user", "admin"].includes(state.user.role);
}
function adminaccess (state) {
return state.user && state.user.role == "admin";
}
function preferences (state) {
return state.preferences;
}
export default { user, writeaccess, adminaccess, preferences };


@@ -0,0 +1,6 @@
import state from './state'
import getters from './getters'
import actions from './actions'
import mutations from './mutations'
export default { state, getters, actions, mutations };


@@ -0,0 +1,14 @@
function setCookie (state, cookie) {
state.cookie = cookie;
}
function setUser (state, user) {
state.user = user;
}
function setPreferences (state, preferences) {
state.preferences = preferences;
}
export default { setCookie, setUser, setPreferences };


@@ -0,0 +1,7 @@
const state = () => ({
cookie: null,
user: null,
preferences: {}
});
export default state;
